Discussion

T800_412 Posts: 13 Apprentice
What is the proper way to set up a plane feature for a UR robot, and then use it in the program to control the XYZ position of the TCP? I have a UR10 robot with a "pallet" positioned nearby. The robot base is rotated approximately 45 degrees from the XY orientation of the pallet.

I set a plane feature for the pallet, but the Z seems inverted. (When I look at it on the Move tab and set the view to the pallet feature, the robot is upside down, and up/down for the TCP is inverted.) Also, if I try to move the robot to the position, it tries to invert the tool.

I am setting the points for the feature by placing the TCP (pointing straight down) at three separate corners of the pallet and saving each. (I am guessing this is incorrect?)

Best Answer

  • T800_412 Posts: 13 Apprentice
    Accepted Answer
    Another local distributor helped me to get this working correctly. The solution is to create the feature with the TCP Y axis inverted by entering 3.1416 in the TCP configuration. In this case, the pallet feature is a flat plane parallel to the ground. The TCP was set with a tool pointing straight down and the X/Y arrows on the FT300 oriented as desired.

    After teaching the feature, remove the 3.1416 from the TCP configuration and then program the robot with moveL commands with the feature selected as the reference. The pose_add command can now be used to move the TCP from the starting point (point one of the feature) with X, Y, or Z offsets.
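
    A minimal sketch of that last step, assuming the taught plane feature is available to the script as a variable named var_palletPlane (the actual name depends on what the feature is called in the installation):

    # point one of the feature (the pallet corner), expressed in the feature frame
    startPoint = p[0, 0, 0, 0, 0, 0]
    # offset 100 mm in X and 50 mm in Y along the feature axes
    offsetPoint = pose_add(startPoint, p[0.100, 0.050, 0, 0, 0, 0])
    # with the pallet feature selected as the MoveL reference, PolyScope effectively runs:
    movel(pose_trans(var_palletPlane, offsetPoint), a=1.2, v=0.25)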

Comments

  • matthewd92 Posts: 520 Founding Pro, Tactile Sensor Beta Testers Handy
    What function are you using to get the new waypoints, pose_add or pose_trans?
  • T800_412 Posts: 13 Apprentice
    I am currently using the pose_add command. I have a pose defined, then add the offset. This part works fine with moveL or moveJ with the Base as the feature. If I try to use the plane feature I defined as the pallet, the TCP wants to invert 180 degrees.


  • T800_412 Posts: 13 Apprentice
    Here is a top view of the layout I want to use for the robot. The rectangle with the 12 black circles is the pallet the robot needs to go to in order to pick wheels. I want to define the feature so X and Y are oriented left/right and up/down with respect to the drawing view, with Z pointing out of the view towards the person looking down.

    Referencing 1, 2, 3, 4, how do I set up a feature that will orient X, Y, Z as described, so that when it is selected as the Feature on the Move tab the robot image does not go inverted, and when it is used in the program with a moveL command the TCP does not try to invert?



    The UR10 User Manual describes features as an easy way to make the robot move in reference to something other than the robot base, but after reading it several times and attempting several different methods of teaching the feature, I am thoroughly confused. I was even told by a UR distributor to just invert the TCP when teaching it and that would work, which is not true at all.
  • Loïc Posts: 12 Apprentice
    edited November 8
    I may have misunderstood, but I think you're trying to create a feature that is not possible to build.

    If you create a feature that, with respect to your drawing view, has X from left to right and Y from top to bottom, then your Z direction will be the reverse of the base feature's. That's because you can only build Cartesian features. If you want Z in the same direction as the base feature, you'll have to invert your X or your Y direction.
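
    As a toy script-level check of that constraint (plain list vectors here, not robot poses):

    def cross(a, b):
      return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]
    end
    x = [1, 0, 0]    # X pointing right in the drawing view
    y = [0, -1, 0]   # Y pointing down in the drawing view
    z = cross(x, y)  # gives [0, 0, -1]: Z is forced to point into the page, opposite the base Z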
  • matthewd92 Posts: 520 Founding Pro, Tactile Sensor Beta Testers Handy
    When setting up a feature you want to make sure you follow the right hand rule to understand the orientation of the axes. If you’re not familiar with it you can find info here, which is what @Loïc is referring to.

    What’s happening when you use the feature in the moveL as the reference plane is that the system is doing a pose_trans() behind the scenes, converting your TCP frame into your new feature relative to the base. Since the feature's Z axis points 180 degrees from the base frame, it’s inverting your tool orientation.
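
    A rough script-level illustration of that conversion (the pose values here are made up):

    # a plane feature whose Z axis points straight down, i.e. rotated pi about base X
    featureDown = p[0.5, 0.5, 0.0, 3.1416, 0, 0]
    # a waypoint expressed relative to that feature, with no extra rotation of its own
    wpInFeature = p[0.1, 0.1, 0.0, 0, 0, 0]
    # what the controller actually targets in the base frame for the MoveL:
    wpInBase = pose_trans(featureDown, wpInFeature)
    # wpInBase inherits the feature's pi rotation, so the TCP is asked to flip 180 degrees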


  • Loïc Posts: 12 Apprentice

    T800_412 said:
    Another local distributor helped me to get this working correctly. The solution is to create the feature with the TCP Y axis inverted by entering 3.1416 in the TCP configuration. In this case, the pallet feature is a flat plane parallel to the ground. The TCP was set with a tool pointing straight down and the X/Y arrows on the FT300 oriented as desired.

    After teaching the feature, remove the 3.1416 from the TCP configuration and then program the robot with moveL commands with the feature selected as the reference. The pose_add command can now be used to move the TCP from the starting point (point one of the feature) with X, Y, or Z offsets.

    This solution works well because you're not offsetting on Z, rx or rz.

    In fact, you've created a feature with the Z going down and not up as you wanted (the right hand rule, as @matthewd92 said), but pose_add uses a position corresponding to the base frame and an orientation referred to the tool frame, as mentioned here.

    The most important thing is that you've found a way to solve your problem.
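
    A small script-level sketch of that distinction between the two functions (all pose values here are made up):

    basePose = p[0.4, 0.2, 0.3, 0, 3.1416, 0]   # tool pointing straight down
    offset = p[0, 0, 0.05, 0, 0, 0]             # 50 mm along Z
    a = pose_add(basePose, offset)    # moves 50 mm along base Z (straight up), orientation kept
    b = pose_trans(basePose, offset)  # moves 50 mm along the tool's own Z, which points down here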


  • manjunath13 Posts: 9 Apprentice
    @matthewd92,
    Which one is the most effective way of programming a palletising application, considering speed and motion optimisation:
    1. using the palletising wizard
    2. a plane feature with defined points

    when we want to have multiple TCPs and subprograms for each pallet point?
  • matthewd92 Posts: 520 Founding Pro, Tactile Sensor Beta Testers Handy
    @manjunath13 we do not use the palletizing wizard, opting rather to control the pallet function ourselves, so I do not fully know which one is most effective for your application.  We create our own function so that we have full control over all of the parameters, which makes it fastest for us to modify during operation; if things change we can adapt quickly.

    We will do things like set up lists for the number of rows and columns, and the angle that the particular box is at for each position; if one layer is different from the next we can handle it this way.  We have done it a few different ways with the arrays, but they all function pretty much the same: figure out where you are on the pallet, query the system (the index in the list or lists) for what to do at that location, and then perform a calculated move at that new position.
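
    A rough sketch of that list-lookup idea (the feature name var_myPallet, the grid size, offsets and angle values are all placeholders):

    boxAngles = [0, 0, 1.5708, 1.5708, 0, 0]   # yaw of the box at each position on this layer, radians
    rows = 2
    columns = 3
    i = 0
    while i < rows:
      j = 0
      while j < columns:
        # look up what to do at this position and build the placement pose from the pallet feature
        placePose = pose_trans(var_myPallet, p[j * 0.15, i * 0.15, 0, 0, 0, boxAngles[i * columns + j]])
        movel(placePose)
        j = j + 1
      end
      i = i + 1
    end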

    Hope that was helpful.
  • manjunath13 Posts: 9 Apprentice
    @matthewd92, I agree with your point. I initially started with a similar idea but got stuck defining the moveL points in the program.


    Let's say I have a tray containing 64 components; do I then need to define all 64 of them before the start of the program using an if statement?
    I want to make use of the feature plane to adapt to location changes.

    Please share a snippet if possible, to understand how you are creating a custom function.

    THANKS IN ADVANCE

  • T800_412 Posts: 13 Apprentice
    @manjunath13, you do not need to define all the points, provided you know the distances the robot needs to travel to reach them. For example, in the program I have now for my wheel pallet, I use p[0,0,0,0,0,0] as the corner of the pallet, then calculate where to look for the center position of the wheel based on the OD of the wheel. I use pose_add to moveL the TCP to that point, perform the CamLocate function, then calculate the next point to look for the next wheel.

    If the wheel size or the array on the pallet changes, I do not have to reprogram; the operator just enters the new OD and the array size in X, Y, Z (wheels left/right, wheels front/back, number of layers).
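
    A minimal sketch of that calculation (the variable names and values here are hypothetical, and the vision step is only marked as a comment):

    wheelOD = 0.350                 # operator-entered wheel outer diameter, meters
    wheelsPerRow = 4                # operator-entered array size along X
    corner = p[0, 0, 0, 0, 0, 0]    # corner of the pallet, relative to the pallet feature
    n = 0
    while n < wheelsPerRow:
      # center of wheel n along the row: half an OD in from the corner, then one OD per wheel
      lookPoint = pose_add(corner, p[wheelOD / 2 + n * wheelOD, wheelOD / 2, 0, 0, 0, 0])
      movel(lookPoint)   # with the pallet feature selected as the MoveL reference
      # run the camera check (e.g. CamLocate) here, then pick the wheel
      n = n + 1
    end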
  • matthewd92 Posts: 520 Founding Pro, Tactile Sensor Beta Testers Handy
    So the way we do it is all programmatic. There is generally only one taught waypoint, the corner point, and it is taught as a feature; or, if we are concerned with the plane, we will use the plane and the starting point is the corner point.

    So maybe we do something like this (pseudocode), assuming we have taught a feature plane called myPallet:

    rows = 5
    columns = 5
    rowOffset = 0.15       # meters
    columnOffset = 0.15    # meters
    i = 0

    # move to some area near the pallet first
    while i < rows:
      j = 0
      while j < columns:
        # offset from the taught pallet corner, expressed in the pallet feature frame
        pickPoint = pose_trans(var_myPallet, p[j * columnOffset, i * rowOffset, 0, 0, 0, 0])
        # approach point offset 50 mm along the feature Z from the pick point
        overPickPoint = pose_trans(pickPoint, p[0, 0, 0.050, 0, 0, 0])
        movel(overPickPoint)
        movel(pickPoint)
        # do something to pick the part here
        movel(overPickPoint)
        j = j + 1
      end
      i = i + 1
    end
    You may need to account for how the plane was taught and use a rotation in either the X or Y to flip the orientation of the TCP. This would cause the first calculation to look something like this, assuming a flip in the Y axis:
    pickPoint = pose_trans(var_myPallet, p[j * columnOffset, i * rowOffset, 0, 3.14, 0, 0])


  • manjunath13 Posts: 9 Apprentice
    @matthewd92
    Thank you for your help, it's a nice way of programming.
  • matthewd92 Posts: 520 Founding Pro, Tactile Sensor Beta Testers Handy
    You’re welcome, I’m glad I could help.