Puolikas Posts: 5 Apprentice
Hi.

What is the best way to use the feature plane?

My rig is Universal Robots UR10, Robotiq wrist camera and Robotiq 2F140 Gripper.

I have a portable assembly station and I want to move it around, take it to storage and put it back in a different place etc.
Now what I have done is to make a plane feature, with points representing the corners of the station. Then in the program I choose the feature plane under the MoveL command. Everything goes great when the program is very simple.

I get problems when the program is complicated. I cannot use the MoveJ command (I cannot choose the plane feature). And in palletizing, when I make a retraction after I have gripped the part, the added waypoints make weird moves, as if they are calculated from the Base feature.

Everything would be resolved if I knew how to move the base feature origin to the work origin before programming.

This method is widely used in CNC-machining:



So we have the machine origin, and then we tell the machine where the origin of the part is. Then if we tell the machine to move to X0, Y0, Z0, it will go to the origin of the part, not the machine origin. This is like using a feature plane, but every move is automatically related to the part origin. So if you just make a MoveL command to X0, Y0, Z0 without choosing the feature plane, it should go to the part origin, not to the base feature origin.

If you can help me with this I would be very thankful! At our company (Oulu University of Applied Sciences) we have 30 students, in 10 groups, sharing 4 UR robots. That means robot cells move all the time, and without an easy way to move the origin of a program, students always have to re-teach the moved waypoints all over again. That can be frustrating..

Please feel free to ask more specifics if you need. :smiley:

/ Lassi Kaivosoja - Project Engineer OUAS 

Comments

  • louis_bergeron Posts: 94 Handy
    edited June 2017
    @Puolikas

    I think you can't use MoveJ in this case.  MoveJ is a motion in joint space, not in Cartesian space.  That means there are no x, y, z coordinates: the coordinates for a MoveJ are angles in degrees for each joint.  You will also see that acceleration and speed are given in degrees/s and degrees/s².  So MoveJ is not in the Base coordinate system either.  Asking the robot to put joint number 2 at 35 degrees, for example, always puts it in the same place in space.  That's why it looks like the Base coordinate system: it's a fixed angle for each joint, hence a fixed position in space.  The Base feature is in Cartesian space with X, Y and Z.  To use Cartesian coordinates like X, Y, Z, I think you must use MoveL or MoveP.
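    This distinction can be sketched numerically (a Python illustration rather than URScript; the feature offset and all values here are invented): a joint-space target carries no reference frame at all, while the same Cartesian numbers name different physical points depending on which frame they are read in.

```python
# A movej target: six joint angles in radians. There is no x/y/z and no
# frame -- the same angles always produce the same physical arm posture.
joint_target = [0.0, -1.57, 1.57, 0.0, 1.57, 0.0]

# A movel target: a position that only has meaning once you say which
# frame it is expressed in (rotation part omitted for brevity).
pose = [0.125, 0.0, 0.0]            # metres

# Hypothetical feature frame, offset 0.25 m along base X (a real feature
# also carries an orientation, ignored here to keep the math to addition).
feature_origin = [0.250, 0.0, 0.0]

point_in_base    = pose
point_in_feature = [f + p for f, p in zip(feature_origin, pose)]
# Same numbers, two different physical points, 0.25 m apart.
```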

      
    Louis Bergeron
    Integration Coach
    robotiq.com
    [email protected]
  • Puolikas Posts: 5 Apprentice
    Thank you for the clarification, louis_bergeron.
  • matthewd92 Founding Pro, Tactile Sensor Beta Testers Posts: 1,267 Handy
    There is a way to use a MoveJ with a feature, but it requires that the MoveJ use variable waypoints.  If you want to use taught waypoints, you can teach a MoveL sequence in the BeforeStart section of the program and wrap it inside an if statement that never tests true, such as if False.  The next step is to use the fixed waypoints as variables: assign each fixed waypoint to a variable.  Behind the scenes it is already doing a pose_trans() to calculate the pose in base coordinates from the feature coordinates.  You can see this in the screenshot below, where I am basically assigning waypoint1 to a variable, but I put an error in the assignment so that this error screen popped up.


    Then your MoveJ would just use those variable waypoints instead of fixed waypoints.  This still results in a program that is rather simple: you have fixed points to go change if needed, and a way to control your feature frame as well.  In this example I made the feature a variable and used pose_add() to make the feature frame step down in 100 mm increments.  Notice that in the pose_add() I am not using the feature itself but rather the feature's variable; I did this so that it will do more than one step down.
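    The pose bookkeeping described above can be sketched like this (a Python sketch, not URScript; zero-rotation poses are assumed so that pose_trans()/pose_add() reduce to vector addition, and all values are invented for illustration):

```python
# Simplified stand-ins for URScript's pose functions, valid only for
# poses with zero rotation (a real feature also rotates the waypoint).

def pose_trans(p_from, p_from_to):
    # "Pose of the feature in Base" composed with "waypoint in the feature".
    return [a + b for a, b in zip(p_from, p_from_to)]

def pose_add(p1, p2):
    # URScript pose_add sums the position parts (and composes rotations,
    # omitted in this zero-rotation sketch).
    return [a + b for a, b in zip(p1, p2)]

feature = [0.200, 0.300, 0.400]        # plane feature origin in Base, metres
wp_in_feature = [0.050, 0.0, 0.025]    # taught waypoint, feature coordinates

# What Polyscope does behind the scenes for a feature-relative waypoint:
wp_var = pose_trans(feature, wp_in_feature)

# Stepping the feature frame down in 100 mm increments, as in the example:
for _ in range(3):
    feature = pose_add(feature, [0.0, 0.0, -0.100])
# The feature origin is now 300 mm lower; recomputing wp_var would follow it.
```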



  • Puolikas Posts: 5 Apprentice
    Thanks.

    I'm familiar with this method, but it's still not what I'm looking for. Some day I will figure this problem out and bring it to you guys. :smiley:

    /Lassi K.
  • matthewd92 Founding Pro, Tactile Sensor Beta Testers Posts: 1,267 Handy
    @Puolikas have you ever tried making the Base feature a variable?  You could then change the base feature's frame of reference, which might be what you are looking for. Basically, in your BeforeStart you could ask for some offsets in the X, Y and Z directions and then use those offsets to shift the base coordinate system.

    Is this more along the lines of what you are looking for?
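    One way to picture this (a Python sketch with invented numbers, not the Polyscope implementation): ask the operator for an X/Y shift and a yaw angle at program start, and push every taught waypoint through that offset.

```python
import math

def apply_cell_offset(dx, dy, yaw_deg, wp):
    """Shift a taught waypoint by the cell's new position: a translation
    (dx, dy) plus a rotation about base Z. This is what offsetting a
    variable Base feature achieves for each waypoint under it."""
    th = math.radians(yaw_deg)
    x, y, z = wp
    return [dx + x * math.cos(th) - y * math.sin(th),
            dy + x * math.sin(th) + y * math.cos(th),
            z]

# Offsets the students would enter in BeforeStart after moving the cell:
dx, dy, yaw = 0.150, -0.050, 90.0

taught = [0.300, 0.000, 0.100]          # waypoint taught before the move
shifted = apply_cell_offset(dx, dy, yaw, taught)
# The waypoint moves and rotates with the cell instead of being re-taught.
```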
  • Puolikas Posts: 5 Apprentice
    Yes. This is exactly what I need. :smile:
    I didn't know you could do that.

    So in the beginning of the program I show the new base feature offset with pose_trans. I just need to make a jig which helps me show the correct angle of the assembly station.

    I'll get back to you when I have tested this out.

    Thanks for your help.
  • abeachy_HG24 Posts: 79 Handy
    @matthewd92 @Puolikas Whenever one of you gets a chance, could you post a sample of how you would program this? I had a few free moments this morning and was attempting to do this, but I was unsuccessful. Not urgent or anything, just curious. Thanks.
  • matthewd92 Founding Pro, Tactile Sensor Beta Testers Posts: 1,267 Handy
    Fortunately I have to do this for a customer, so I will be working on it over the next week or so as we deploy the robot.  I am not sure yet how I would go about re-teaching the plane feature easily; I have some ideas, but have not completely baked them all the way through, and I am just pulling a robot out of the box this week for the test.  It's hard to do what I am planning on the simulator.  I'll keep you posted though.
  • K_Bing Posts: 2 Recruit
    @matthewd92
    Any updates? I feel like it's a common need to move the cobot around and then re-teach the base feature.
  • matthewd92 Founding Pro, Tactile Sensor Beta Testers Posts: 1,267 Handy
    @K_Bing I don't have an update currently; we are still working on the best solution for getting accurate points and calculating the new coordinates to pass to the base feature.  As soon as we have something working I will post it for the group.
  • Joao_Lourenco Unconfirmed Posts: 1 Recruit
    Hi
    I'm using a UR5 CB3 with a wrist camera and a Hand-E gripper in a palletizing application. I think my question is somewhat similar to Puolikas's, which is why I put it here. The parts come from a conveyor belt; the robot has to put them in a box on its right side, and after this box is full it has to move on to the next box on the opposite side. The robot will run several applications with different parts and boxes. How do I program just one box and then move the program to a different position?
  • Puolikas Posts: 5 Apprentice
    Hello,

    1. Make a point feature (named e.g. box1_corner) in the Installation to show where the corner of the box is.
    2. Then start your box filling with a MoveL command and, on the command tab, change the feature from Base to box1_corner_var.
    3. Now make your moves under the MoveL command.
    4. Then make a new point feature (named e.g. box2_corner).
    5. Copy the first MoveL command and paste it where you need it.
    6. Change box1_corner_var to box2_corner_var in the second MoveL command.
    7. Test with caution! :smile:

    Start with a simple program, then proceed to the actual program.
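    The steps above can be sketched as data flow (a Python sketch with invented corner poses; in Polyscope the pose_trans() happens behind the feature selection, and rotation is omitted here for brevity):

```python
# Zero-rotation simplification of URScript's pose_trans: a corner feature
# with no rotation just shifts the feature-relative waypoint into Base.
def pose_trans(corner, wp):
    return [c + w for c, w in zip(corner, wp)]

box1_corner = [0.350, -0.200, 0.050]   # taught point features, metres
box2_corner = [0.350,  0.250, 0.050]

# The same box-filling moves, expressed relative to the corner (step 3):
fill_moves = [[0.040, 0.040, 0.150],   # approach above the first slot
              [0.040, 0.040, 0.010]]   # place

# Steps 5-6: the copied MoveL just swaps which corner feature is used.
box1_targets = [pose_trans(box1_corner, wp) for wp in fill_moves]
box2_targets = [pose_trans(box2_corner, wp) for wp in fill_moves]
```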

    / Lassi Kaivosoja, email: [email protected]
  • Tesfit Posts: 4 Apprentice
    matthewd92 said:
    Fortunately I have to do this for a customer so will be working on it over the next week or so as we are deploying the robot.  I am not sure yet how I would go about reteaching the plane feature easily, have some ideas but have not completely baked them all the way through and am just pulling a robot out of the box this week for the test.  Hard to do what I am planning to do on the simulator.  I'll keep you posted though.
    Any update about re-teaching the plane easily? I am working on a mobile station where a positional error can happen due to the nature of the design, and I am using a feature plane to program the waypoints. The problem I am having is that when I re-teach the plane it is difficult to get an accurate position for it.
  • matthewd92 Founding Pro, Tactile Sensor Beta Testers Posts: 1,267 Handy
    We have a plane-calculation script that we have been using for several years.  It takes in 3 points and returns a single pose, which is your plane; you just assign this to your plane feature variable, which is used in the move sequences.  We no longer use Polyscope at all; our programs are pure script, so we use this for applying offsets, for instance to a base feature, and then moving to the calculated points.

    This is the same as the feature plane in Polyscope since 3.7 or so, whenever UR changed to defining origin -> X axis -> Y axis.
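    matthewd92 doesn't post the script itself, but the math he describes (3 taught points -> one plane pose, defined origin -> X axis -> Y axis) can be reconstructed roughly like this. This is a Python sketch of that idea, not his code; the point names are invented, and the rotation-matrix-to-rotation-vector step is only valid away from 180° rotations.

```python
import math

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def norm(a):
    n = math.sqrt(sum(x * x for x in a))
    return [x / n for x in a]

def plane_pose(p_origin, p_x, p_y):
    """Pose [x, y, z, rx, ry, rz] of a plane defined by three taught points:
    the origin, a point along the +X axis, and a point on the +Y side."""
    x = norm(sub(p_x, p_origin))             # plane X axis
    z = norm(cross(x, sub(p_y, p_origin)))   # plane normal
    y = cross(z, x)                          # orthogonalized Y axis
    # Rotation matrix with columns x, y, z -> axis-angle rotation vector.
    r = [[x[0], y[0], z[0]], [x[1], y[1], z[1]], [x[2], y[2], z[2]]]
    cos_t = max(-1.0, min(1.0, (r[0][0] + r[1][1] + r[2][2] - 1.0) / 2.0))
    theta = math.acos(cos_t)
    if theta < 1e-9:
        rot = [0.0, 0.0, 0.0]
    else:
        # Breaks down as theta approaches pi; fine for typical re-teaching.
        s = 2.0 * math.sin(theta)
        rot = [theta * (r[2][1] - r[1][2]) / s,
               theta * (r[0][2] - r[2][0]) / s,
               theta * (r[1][0] - r[0][1]) / s]
    return list(p_origin) + rot
```

    With something like this, re-teaching a moved station reduces to re-touching the same three points and assigning the result to the plane feature variable.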


  • cobottiukko Posts: 17 Handy
    Tesfit said:
    Any update about re-teaching the plane easily? I am working on a mobile station where a positional error can happen due to the nature of the design, and I am using a feature plane to program the waypoints. The problem I am having is that when I re-teach the plane it is difficult to get an accurate position for it.
    @Tesfit Our customers use Robotiq's Contact Offset feature a lot. When using it, there is no need to reprogram anything, even if the mobile station is not in exactly the same position it was in when the application was implemented. All the waypoints are programmed relative to a point feature, and Contact Offset adjusts that feature point, so all the waypoints move by the same relative amount.

    Here you can see the idea: https://www.youtube.com/watch?v=Bs_MQslF_v8&t=14s All credit to Ralph W. Earl Company for this awesome video! (@bcastets It would be nice if you could have the same kind of video on YouTube as a public video.)
    Below is also a good application video from one of our customers, where the robot is on a mobile platform (EasyRobotics Flex) and is moved on a daily basis. "Reprogramming" is done by the Robotiq Copilot: a human just places the robot somewhere near the machine and the rest is done automatically.

    Application video: https://www.linkedin.com/posts/mrliljamo_robotti-astui-t%C3%B6ihin-purkamaan-tuotannon-activity-6780388942060539904-zqh7

    Robotiq has a great learning platform where the Contact Offset function is explained in depth: https://elearning.robotiq.com/


  • bcastets Vacuum Beta tester Posts: 673 Expert
    @cobottiukko
    Thank you for your suggestion. I will try to work on a video to show a use case of the Contact Offset function.
    At the moment the best reference we have is probably our elearning video:
    https://elearning.robotiq.com/course/view.php?id=7&section=14

  • cobottiukko Posts: 17 Handy
    bcastets said:
    @cobottiukko
    Thank you for your suggestion. I will try to work on a video to show a use case of the Contact Offset function.
    At the moment the best reference we have is probably our elearning video:
    https://elearning.robotiq.com/course/view.php?id=7&section=14

    Thank you! That's true, but the problem is that those eLearning videos cannot be shared or embedded, since they require being signed in to eLearning.
  • colbyj Posts: 7 Apprentice
    matthewd92 said:
    @Puolikas have you ever tried making the Base feature a variable?  You could then change the base feature frame of reference, this might be what you are looking for. Basically in your before start you could ask for some offsets in the X, Y and Z direction and then use those offsets to offset the base coordinate system.

    Is this more along the line of what you are looking for?
    I've been trying to do this for a similar application recently. I see how to change the Base feature to a variable in Polyscope, but then how do you use it?

    We're attempting to shift our X-Y orientation, so that we can program one set of movements as relative moves and adjust the orientation as needed. I wrote a script that takes the difference in angles between two waypoints (one defined as the "start" orientation, and one as the "new") and adds them to the initial TCP, defined in the Installation (and manually copied into a variable in BeforeStart).

    The issue, however, is that these transformations aren't correct, since get_target_tcp_pose() (and the other ways of pulling the pose value, from what I can tell...) is WRT the base frame, not the flange frame (which the TCP rotations are WRT).

    So I'm wondering, could we make the Base feature a variable and add our angle shifts to it to achieve the intended rotation? And if so, how would we access the Base feature in URScript?

    Thank you in advance!
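    The frame mix-up described above comes down to the fact that rotations compose rather than add. A quick Python check (an invented example using Rodrigues' rotation formula, not URScript) shows that summing rotation-vector components is not the same as applying the rotations in sequence:

```python
import math

def rotate(axis, theta, v):
    """Rodrigues' formula: rotate vector v by theta about the unit axis."""
    kx, ky, kz = axis
    cx = (ky * v[2] - kz * v[1], kz * v[0] - kx * v[2], kx * v[1] - ky * v[0])
    dot = kx * v[0] + ky * v[1] + kz * v[2]
    c, s = math.cos(theta), math.sin(theta)
    return [v[i] * c + cx[i] * s + axis[i] * dot * (1 - c) for i in range(3)]

v = [0.0, 0.0, 1.0]
half_pi = math.pi / 2

# Rotating 90 deg about X, then 90 deg about Y (proper composition):
composed = rotate((0.0, 1.0, 0.0), half_pi,
                  rotate((1.0, 0.0, 0.0), half_pi, v))

# "Adding the angles": one rotation by the summed vector [90, 90, 0] deg:
angle = math.hypot(half_pi, half_pi)
axis = (half_pi / angle, half_pi / angle, 0.0)
summed = rotate(axis, angle, v)

# composed and summed land the point in clearly different places, which is
# why adding angle differences onto a base-frame pose gives wrong results.
```

    In URScript terms, this is what pose_trans() is for: it composes the two orientations in the right frame instead of summing their components.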
  • colbyj Posts: 7 Apprentice
    colbyj said:
    So I'm wondering, could we make the base feature a variable, and add our angle shifts to it to achieve the intended rotation? And if so, how would we access the base feature in URScript?

    Turns out I was looking for the Base variable in the Installation variables, and didn't see it in the program variables tab at first (I have many variables being defined/stored).

    This approach, however, would shift the location of the starting position by the intended amount, but not the actual direction of the motions. Effectively, the robot would just rotate 90 deg about its base and continue the same pattern in the original direction. I likely messed something up, but we're trying this method instead:

    The new approach we're trying is a variable Point feature as the reference, so that we can orient all of our programs based on some initial point of each section (the program is relative movements for a "section", based on some initial position anyway, so this should be much easier to figure out than a plane).

    An example would be:
    Loop_1 = 0 #Set Loop Counter at 0
    global p_ileft = Point_1 #def initial pose, Point_1 is a Point Feature initialized to the initial tool position
    global p_iCenter = pose_trans(p_ileft, p_via) #def center pose
    global p_iright = pose_trans(p_ileft, p_to) #def end pose
    sleep(0.01)
    while (Loop_1 < btm_passes):
        movec(p_iCenter, p_iright)
        stopl(a)
        sleep(0.01)
        if Loop_1 < (temp_btm_passes):
            movec(p_iCenter, p_ileft) #Return on non-last loops
            stopl(a)
            sleep(0.01)
        else:
            break #Stay on Right Side for Last Loop
        end
        Loop_1 = Loop_1 + 1
    end

    We call our script files (within Script Code nodes in Polyscope) from within a MoveL, with the feature set to Point_1_var. It works as intended when we manually set the Point_1 feature equal to the tool, and using the pose_trans() function (I've been going back and forth on when to use pose_trans vs pose_add...). However, if I try to change the initial waypoint and run global Point_1_var = get_target_tcp_pose(), the feature won't re-initialize to the new initial TCP position. Are these changes happening internally and not being stored? Or are the changes not being made at all?

    Also, can we redefine these positions using script? For instance, instead of teaching point features for each initial point of a section (we're going to have many sections) and modifying the script files to refer to each different point feature, could we teach each one as a waypoint, move to that waypoint, and execute Point_1_var = get_target_tcp_pose()?

    I'm thinking that the waypoints in this case would have to be in a separate Move command, relative to Base instead of the point feature (as the feature would be changing)...
  • colbyj Posts: 7 Apprentice
    The code above didn't paste very clearly, so here it is again with "Code" formatting (though that's harder to read too, at least on my end):

    global p_via = p[dr,-1*dy,0,0,0,0]
    global p_to = p[dx,0,0,0,0,0]

    Loop_1 = 0 #loop counter, initialized as in the first posting of this code
    global p_ileft = Point_1
    global p_iCenter = pose_trans(p_ileft, p_via) #def center pose
    global p_iright = pose_trans(p_ileft, p_to) #def end pose
    sleep(0.01)
    while (Loop_1 < btm_passes):
        movec(p_iCenter, p_iright)
        stopl(a)
        sleep(0.01)
        if Loop_1 < (temp_btm_passes):
            movec(p_iCenter, p_ileft)
            stopl(a)
            sleep(0.01)
        else:
            break #Stay on Right Side for Last Loop
        end
        Loop_1 = Loop_1 + 1
    end

  • colbyj Posts: 7 Apprentice

    I've been able to get the intended results using this method, so I thought I'd share in case others are doing something similar. In retrospect, this may have been easier to do using the Pallet command...

    First thing: I changed the code above to global p_ileft = Point_1_var (it was just = Point_1; I think that's where I was getting issues).

    I defined Installation variables i_pose1 and pose_zero, both = p[0,0,0,0,0,0]. The i_pose1 thing is a trick I learned on here to get rid of the Move to Start prompt whenever you run your program: define it as an installation variable, use get_forward_kin(), and MoveJ to it (variable waypoint).

    Then I made a Point feature, selected Variable, and initialized it at the tool (View -> Tool, all values = 0).

    First I use a MoveJ (so it's WRT Base) to a fixed waypoint (Init_Position) and store that TCP pose as my point feature.

    Then, in the MoveL, I select Feature = Point_1_var and move to a variable waypoint pose_zero. This way we're moving zero relative to the point feature, which we defined as the position the robot is already at. The only reason I did this is that MoveL nodes need a waypoint, and were incomplete with only script files.

    The question I still have is: do Point_1 and Point_1_var apply changes within the program while it's running, and then revert back to the Installation value when the program is finished? I appear to be getting the correct motions (when I rotate the initial position), but when I view the Installation features, the Point_1 feature isn't aligned with the tool unless I manually change the Installation variable.