nick Posts: 5 Apprentice

I’m trying to use the object_found function to adjust a frame on a Universal Robots UR10 (version 3.4.2.65). For some reason the accuracy on the rotation is terrible. Attached is a video of it in action; below is the code. The first 3 marks in the video are within the camera subroutine. The next 2 marks are done by using a frame adjusted with the difference between two object_found positions (pos_found - pos_original). Is there any fix for this? Do I have to do something special when subtracting rotation values found by object_found? I know it's not a vision target issue because the 3 vision points are OK.

Note: the vision system is using the Automatic Method (not the Parametric Method), so that the T within the circle will keep the orientation accurate. Also, the snapshot position is perpendicular to the target.

UF1 = plane used as a user frame


 Program

   Robot Program

     MoveJ

       TEST_var

     Camera Locate

       MoveL

         Waypoint_1

         Waypoint_2

         Waypoint_4

         Waypoint_5

         Waypoint_6

         Waypoint_7

         Waypoint_8

       frame=UF1_var

       pos_found≔object_location

       x_adjust=pos_found[0]-pos_original[0]

       y_adjust=pos_found[1]-pos_original[1]

       r_adjust=pos_found[5]-pos_original[5]

       UF1[0]=pos_original[0]+x_adjust

       UF1[1]=pos_original[1]+y_adjust

       UF1[5]=pos_original[5]+r_adjust

       frame_post=UF1_var

      MoveL                                                          'Positions below are based on frame UF1

       Waypoint_9

       Waypoint_10

       Waypoint_11

       Waypoint_12

       Waypoint_13


Comments

  • Etienne_Samson Posts: 419 Handy
    @nick what is the detection threshold you are using? Could you check in the installation tab, live image of the camera, and see how your model is matched to the real part?

    @Vincent_Paquin can you advise?
    Etienne Samson
    Technical Support Director
    +1 418-380-2788 ext. 207
    [email protected]
  • Vincent_Paquin Posts: 11 Apprentice
    edited September 2017
    @nick, @Etienne_Samson If I understand correctly, the problem here is that the frame is not fully adjusted. The current adjustment is only a 2D adjustment, while in reality it is a 3D problem. The rotation is assumed to be only around the z-axis, but it is a 3D rotation. The x, y and z components of the rotation axis must be taken into account, as well as the z component of the translation. In short, all 6 components of the transformation must be used.
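    As an illustrative sketch in URScript (assuming pos_found and pos_original are poses expressed in the same reference frame; T_offset is just an illustrative name; pose_trans composes two poses and pose_inv inverts one, so all 6 components are handled together):

       # Rigid transform that carried the object from pos_original to pos_found,
       # covering x, y, z translation and the full rx, ry, rz rotation vector
       T_offset = pose_trans(pos_found, pose_inv(pos_original))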

    I hope this helps.


    Vincent Paquin
    Machine Vision Technical Leader
    [email protected]
  • nick Posts: 5 Apprentice
    @etienne my detection threshold is 85%, but it's not getting a false detection. If it was getting a false detection, then the first 3 marks would have been off as well. The issue, it seems, is with my math, because I'm not handling the rotation vectors correctly.

    Thank you @Vincent, that helped me troubleshoot this a bit and now I know the better question to ask. So basically what I was doing was taking the difference between pos_found and pos_original, which gave me a good x, y, z offset, but it seems I can't do this with rx, ry, rz because they're rotation vectors. I figure I have to use pose_trans() but I couldn't get it to work right; below is a simplified version of my code (a possible pose_trans version is sketched after the program). I just need to know how to properly offset the frame's rotation. FYI, I also tried converting my rotation vectors to RPY and then doing the math, but that didn't work, though I may have done that incorrectly.

    The goal is to offset UF1 by the offset of pos_found relative to pos_original. Thus if pos_found is 30 mm up in the x direction and rotated 15 degrees (compared to pos_original), then UF1 should be adjusted by that amount.

    pos1 is the backup of UF1
    pos_original is the reference position

       Robot Program
         UF1=pos1
         MoveJ
           TEST_var                        'My snapshot position
         Camera Locate
           MoveL
             Waypoint_1
             Waypoint_2
             Waypoint_3
             Waypoint_4
             Waypoint_5
             Waypoint_6
             Waypoint_7
             Waypoint_8
           frame=UF1_var
           pos_found≔object_location
           pos2=pose_sub(pos_found,pos_original)
           UF1=pose_add(pos1,pos2)
           frame_post=UF1_var
         MoveL                                'motions are relative to UF1
           Waypoint_9
           Waypoint_10
           Waypoint_11
           Waypoint_12
           Waypoint_13
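    For reference, a sketch in script form of how the frame update could be done with full pose composition instead of pose_sub/pose_add (an assumption, taking pos_found, pos_original and pos1 to be poses expressed in the same reference frame; pose_trans and pose_inv are standard URScript pose functions):

       # Compute the rigid transform that carried the object from pos_original
       # to pos_found, then apply that same transform to the saved frame pos1.
       # Composing full poses handles the rotation vectors correctly instead of
       # subtracting them component-wise.
       UF1 = pose_trans(pose_trans(pos_found, pose_inv(pos_original)), pos1)

    This single line would replace the pose_sub/pose_add pair in the program above.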
  • nick Posts: 5 Apprentice
    Figured out an alternative solution to my problem, though if possible I'd still like to understand how to do the above.