DoF - a Robotiq Community
The DoF Community was shut down in June 2023. This is a read-only archive.
If you have questions about Robotiq products please reach out to our support team.
nick

I’m trying to use the object_found function to adjust a frame on a Universal Robots UR10 (software version 3.4.2.65). For some reason the rotation accuracy is terrible. Attached is a video of it in action; the code is below. The first 3 marks in the video are made within the camera subroutine. The next 2 marks are made using a frame adjusted by the difference between two object_found positions (pos_found – pos_original). Is there any fix for this? Do I have to do something special when subtracting rotation values returned by object_found? I know it's not a vision-target issue because the 3 vision points are fine.

Note: the vision system is using the Automatic method (not the Parametric method) so that the T within the circle keeps the orientation accurate. Also, the snapshot position is perpendicular to the target.

UF1 = plane used as a user frame


 Program
   Robot Program
     MoveJ
       TEST_var
     Camera Locate
       MoveL
         Waypoint_1
         Waypoint_2
         Waypoint_4
         Waypoint_5
         Waypoint_6
         Waypoint_7
         Waypoint_8
       frame=UF1_var
       pos_found≔object_location
       x_adjust=pos_found[0]-pos_original[0]
       y_adjust=pos_found[1]-pos_original[1]
       r_adjust=pos_found[5]-pos_original[5]
       UF1[0]=pos_original[0]+x_adjust
       UF1[1]=pos_original[1]+y_adjust
       UF1[5]=pos_original[5]+r_adjust
       frame_post=UF1_var
     MoveL                                'With UF1 as the frame the positions are based off of
       Waypoint_9
       Waypoint_10
       Waypoint_11
       Waypoint_12
       Waypoint_13


nick

@etienne my detection threshold is 85%, but it's not a false detection: if it were, the first 3 marks would have been off as well. The issue seems to be with my math, because I'm not handling the rotation vectors correctly.

Thank you @Vincent, that helped me troubleshoot this a bit, and now I know the better question to ask. What I was doing was taking the difference between pos_found and pos_original, which gave me a good x, y, z offset, but it seems I can't do this with rx, ry, rz because they're rotation vectors. I figure I have to use pose_trans(), but I couldn't get it to work right; below is the simplified version of my code. I just need to know how to properly offset the frame's rotation. FYI, I also tried converting my rotation vectors to RPY and then doing the math, but that didn't work either, though I may have done it incorrectly.
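[Editor's note] The rx, ry, rz values form a single axis-angle rotation vector, so differences have to be taken by composing rotations, not by subtracting components. A minimal pure-Python sketch of why (illustrative only — the function names are mine, not URScript; angles are assumed below pi):

```python
import math

def rotvec_to_matrix(r):
    """Rodrigues' formula: UR-style rotation vector (axis * angle) -> 3x3 matrix."""
    angle = math.sqrt(r[0]**2 + r[1]**2 + r[2]**2)
    if angle < 1e-12:
        return [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    kx, ky, kz = r[0] / angle, r[1] / angle, r[2] / angle
    c, s, v = math.cos(angle), math.sin(angle), 1.0 - math.cos(angle)
    return [[kx*kx*v + c,    kx*ky*v - kz*s, kx*kz*v + ky*s],
            [kx*ky*v + kz*s, ky*ky*v + c,    ky*kz*v - kx*s],
            [kx*kz*v - ky*s, ky*kz*v + kx*s, kz*kz*v + c]]

def matrix_to_rotvec(m):
    """Inverse of the above (rotation angle taken in [0, pi))."""
    cos_a = max(-1.0, min(1.0, (m[0][0] + m[1][1] + m[2][2] - 1.0) / 2.0))
    angle = math.acos(cos_a)
    if angle < 1e-12:
        return [0.0, 0.0, 0.0]
    f = angle / (2.0 * math.sin(angle))
    return [f * (m[2][1] - m[1][2]), f * (m[0][2] - m[2][0]), f * (m[1][0] - m[0][1])]

def compose(r_first, r_second):
    """Rotation r_first followed by r_second, as a single rotation vector."""
    a = rotvec_to_matrix(r_first)
    b = rotvec_to_matrix(r_second)
    prod = [[sum(b[i][k] * a[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
    return matrix_to_rotvec(prod)

# About a single axis, composition does reduce to adding components...
same_axis = compose([0.0, 0.0, 0.3], [0.0, 0.0, 0.2])   # ~ [0, 0, 0.5]
# ...but about different axes it does not: component-wise addition of
# [0.3, 0, 0] and [0, 0, 0.2] would give ry = 0, while the true
# composition picks up a nonzero ry cross-term.
mixed = compose([0.3, 0.0, 0.0], [0.0, 0.0, 0.2])
```

This is why a component-wise subtraction can look fine in a video as long as the target only translates and rotates about one axis, and then drift badly otherwise.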

The goal is to offset UF1 by the offset of pos_found relative to pos_original. Thus if pos_found is 30 mm up in the x direction and rotated 15 degrees (compared to pos_original), then UF1 should be adjusted by that amount.

pos1 is the backup of UF1
pos_original is the reference position

   Robot Program
     UF1=pos1
     MoveJ
       TEST_var                        'My snapshot position
     Camera Locate
       MoveL
         Waypoint_1
         Waypoint_2
         Waypoint_3
         Waypoint_4
         Waypoint_5
         Waypoint_6
         Waypoint_7
         Waypoint_8
       frame=UF1_var
       pos_found≔object_location
       pos2=pose_sub(pos_found,pos_original)
       UF1=pose_add(pos1,pos2)
       frame_post=UF1_var
     MoveL                                'motions are relative to UF1
       Waypoint_9
       Waypoint_10
       Waypoint_11
       Waypoint_12
       Waypoint_13
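[Editor's note] For reference, URScript's pose_trans and pose_inv behave like 4x4 homogeneous-transform multiplication and inversion. Below is a pure-Python sketch of one way the intended frame offset could be expressed with them; the offset_frame helper and the composition order are my assumption, not a fix confirmed in this archived thread:

```python
import math

def rotvec_to_matrix(r):
    """Rodrigues' formula: rotation vector (axis * angle) -> 3x3 matrix."""
    angle = math.sqrt(r[0]**2 + r[1]**2 + r[2]**2)
    if angle < 1e-12:
        return [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    kx, ky, kz = r[0] / angle, r[1] / angle, r[2] / angle
    c, s, v = math.cos(angle), math.sin(angle), 1.0 - math.cos(angle)
    return [[kx*kx*v + c,    kx*ky*v - kz*s, kx*kz*v + ky*s],
            [kx*ky*v + kz*s, ky*ky*v + c,    ky*kz*v - kx*s],
            [kx*kz*v - ky*s, ky*kz*v + kx*s, kz*kz*v + c]]

def matrix_to_rotvec(m):
    """Inverse of the above (rotation angle taken in [0, pi))."""
    cos_a = max(-1.0, min(1.0, (m[0][0] + m[1][1] + m[2][2] - 1.0) / 2.0))
    angle = math.acos(cos_a)
    if angle < 1e-12:
        return [0.0, 0.0, 0.0]
    f = angle / (2.0 * math.sin(angle))
    return [f * (m[2][1] - m[1][2]), f * (m[0][2] - m[2][0]), f * (m[1][0] - m[0][1])]

def pose_trans(p1, p2):
    """Pose p2 expressed in the frame of p1 (URScript pose_trans: T1 * T2)."""
    m1, m2 = rotvec_to_matrix(p1[3:]), rotvec_to_matrix(p2[3:])
    pos = [p1[i] + sum(m1[i][k] * p2[k] for k in range(3)) for i in range(3)]
    rot = [[sum(m1[i][k] * m2[k][j] for k in range(3)) for j in range(3)]
           for i in range(3)]
    return pos + matrix_to_rotvec(rot)

def pose_inv(p):
    """Inverse pose (URScript pose_inv: T^-1, i.e. R^T and -R^T * t)."""
    m = rotvec_to_matrix(p[3:])
    mt = [[m[j][i] for j in range(3)] for i in range(3)]
    pos = [-sum(mt[i][k] * p[k] for k in range(3)) for i in range(3)]
    return pos + matrix_to_rotvec(mt)

def offset_frame(pos1, pos_original, pos_found):
    """Hypothetical frame update (variable names taken from the post):
    apply the pose of pos_found relative to pos_original to the
    backed-up frame pos1."""
    offset = pose_trans(pose_inv(pos_original), pos_found)
    return pose_trans(pos1, offset)
```

In URScript terms this would correspond to something like UF1 = pose_trans(pos1, pose_trans(pose_inv(pos_original), pos_found)), which keeps both the translation and the rotation offsets consistent in one frame — again, offered as an assumption, since the thread closes without a confirmed resolution.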