Moving plane feature like in CNC-machining
Puolikas · Posts: 5 Apprentice · in Programming · Answered · Most recent by colbyj
Hi.
What is the best way to use the feature plane?
My rig is Universal Robots UR10, Robotiq wrist camera and Robotiq 2F140 Gripper.
I have a portable assembly station and I want to move it around, take it to storage and put it back in a different place etc.
What I have done so far is to make a plane feature, with points representing the corners of the station. Then, in the program, I choose the feature plane under the MoveL command. Everything goes great when the program is very simple.
I get problems when the program is complicated. I cannot use the MoveJ command (it does not let me choose the plane feature). Then, in palletizing, when I make a retraction after I have gripped the part, the added waypoints make weird moves, as if they were calculated from the Base feature.
Everything would be resolved if I knew how to move the base feature origin to the work origin before programming.
This method is widely used in CNC-machining:

So we have the machine origin, and then we tell the machine where the origin of the part is. If we then tell the machine to move to X0, Y0, Z0, it will go to the origin of the part, not the machine origin. This is like using a feature plane, but every move is automatically related to the part origin. So if you just make a MoveL command to X0, Y0, Z0 without choosing the feature plane, it should go to the part origin, not to the base feature origin.
If you can help me with this I would be very thankful! At our organization (Oulu University of Applied Sciences) we have 30 students in 10 groups sharing 4 UR robots. That means the robot cells move all the time, and without an easy way to move the origin of a program, students always have to re-teach the moved waypoints all over again. That can be frustrating.
Please feel free to ask more specifics if you need.
/ Lassi Kaivosoja - Project Engineer OUAS
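In UR terms, the CNC work-offset idea maps to expressing every target in a feature's frame with pose_trans(). A minimal URScript sketch (part_origin and its value are hypothetical, not from the original post):

```
# part_origin plays the role of a CNC work offset (e.g. G54):
# a taught/measured pose of the part's origin, expressed in Base
part_origin = p[0.500, -0.100, 0.020, 0, 0, 1.571]
# moving to "X0, Y0, Z0 of the part" = transforming the zero pose into the part frame
movel(pose_trans(part_origin, p[0, 0, 0, 0, 0, 0]), a=1.2, v=0.25)
```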
I think you can't use MoveJ in this case. MoveJ is a motion in joint space, not in Cartesian space, which means there are no X, Y, Z coordinates: for a MoveJ, the targets are angles in degrees for each joint. You will also see that its acceleration and speed are given in degrees/s and degrees/s². So a MoveJ is not in the Base frame either; it is not in any Cartesian frame at all. Asking the robot to put joint 2 at 35 degrees, for example, always puts the arm in the same place in space. That's why it looks like it is tied to the Base frame: a fixed angle for each joint means a fixed position in space. The Base feature, by contrast, is a Cartesian frame with X, Y and Z. To use Cartesian coordinates relative to a feature, I think you must use MoveL or MoveP.
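The difference can be seen directly in URScript (the example values below are arbitrary):

```
# movej takes a joint vector: one angle per joint, in radians
# (Polyscope displays degrees; URScript uses radians)
movej([0.0, -1.571, 1.571, -1.571, -1.571, 0.0], a=1.4, v=1.05)
# movel takes a Cartesian pose p[x, y, z, rx, ry, rz] in metres/radians,
# optionally expressed relative to a feature pose
movel(p[0.400, -0.200, 0.300, 0, 3.14, 0], a=1.2, v=0.25)
```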
Integration Coach
robotiq.com
[email protected]
Then your MoveJ would just use those variable waypoints instead of fixed waypoints. This keeps the program fairly simple: you still have fixed points to change if needed, and a way to control your feature frame as well. In this example, I made the feature variable and used pose_add() to make the feature frame step down in 100 mm increments. Notice that in the pose_add() I am not using the feature itself but rather the feature's variable; I did this so that it will do more than one step down.
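A minimal sketch of the stepping idea in URScript (feature_var and the step size are my own names/values, not taken from the original program):

```
# feature_var holds the current feature pose; step it down 100 mm (Base Z)
# each time a layer is finished, so the same relative waypoints land one layer lower
feature_var = pose_add(feature_var, p[0, 0, -0.100, 0, 0, 0])
```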
I'm familiar with this method, but it's still not what I'm looking for. Some day I will figure this problem out and bring it to you guys.
/Lassi K.
Is this more along the line of what you are looking for?
I didn't know that you can do that.
So at the beginning of the program I teach the new base feature offset with pose_trans. I just need to make a jig which helps me teach the correct angle of the assembly station.
I'll get back to you when I have tested this out.
Thanks for your help.
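A sketch of what that start-of-program step could look like in URScript (station_feature_var and jig_taught_pose are hypothetical names for this illustration):

```
# Start of program: re-locate the variable feature before any motion.
# jig_taught_pose = pose of the re-placed station (in Base), taught with the jig
station_feature_var = jig_taught_pose
# a waypoint 50 mm above the station origin, wherever the station now stands:
movel(pose_trans(station_feature_var, p[0, 0, 0.050, 0, 0, 0]), a=1.2, v=0.25)
```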
Any updates? I feel like it's a common need to move the cobot around and then re-teach the base feature.
I'm using a UR5 CB3 with a wrist camera and a Hand-E gripper in a palletizing application. I think my question is somewhat similar to Puolikas's, which is why I put it here. The parts come from a conveyor belt; the robot has to put them in a box on the right side, and after this box is full it has to move to the next box on the opposite side. The robot will have several applications with different parts and boxes. How do I program just one box and then move the program to a different position?
1. Make a point feature in the Installation (name it e.g. box1_corner) to show where the corner of the box is.
2. Then start your box filling with a MoveL command and, on the command tab, change the feature from Base to box1_corner_var.
3. Now make your moves under the MoveL command.
4. Then make a new point feature (name it e.g. box2_corner).
5. Copy the first MoveL command and paste it where you need it.
6. In the second MoveL command, change box1_corner_var to box2_corner_var.
7. Test with caution!
Start with a simple program, then proceed to the actual program.
/ Lassi Kaivosoja, email: [email protected]
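The steps above boil down to expressing the same waypoints in each box's corner frame; in URScript terms (the offset values are illustrative, not from the original steps):

```
# the same drop position, expressed relative to each box's corner feature
drop = p[0.050, 0.050, 0.100, 0, 0, 0]
movel(pose_trans(box1_corner_var, drop), a=1.2, v=0.25)  # box 1
# for the second box, only the feature changes:
movel(pose_trans(box2_corner_var, drop), a=1.2, v=0.25)  # box 2
```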
This is the same as the feature plane in Polyscope since around 3.7, or whenever UR changed plane teaching to defining origin -> X axis -> Y axis.
Here you can see the idea: https://www.youtube.com/watch?v=Bs_MQslF_v8&t=14s All credits to Ralph W. Earl Company for this awesome video! (@bcastets It would be nice if you could have a similar video on YouTube as a public video.)
Below is also a good application video from our customer, where the robot is on a mobile platform (EasyRobotics Flex) and is moved on a daily basis. "Reprogramming" is done by the Robotiq Copilot: a human just places the robot somewhere near the machine and the rest is done automatically.
Application video: https://www.linkedin.com/posts/mrliljamo_robotti-astui-t%C3%B6ihin-purkamaan-tuotannon-activity-6780388942060539904-zqh7
Robotiq has a great learning platform where the Contact Offset function is explained in depth: https://elearning.robotiq.com/
Thank you for your suggestion. I will try to work on a video showing a use case of the Contact Offset function.
At the moment the best reference we have is probably our elearning video:
https://elearning.robotiq.com/course/view.php?id=7&section=14
We're attempting to shift our X-Y orientation so that we can program one set of movements as relative moves and adjust the orientation as needed. I wrote a script that takes the difference in angles between two waypoints (one defined as the "start" orientation and one as the "new") and adds them to the initial TCP, defined in the Installation (and manually copied into a variable in BeforeStart).
The issue, however, is that these translations aren't correct, since get_target_tcp_pose() (and the other ways of pulling the pose value, from what I can tell...) is with respect to the base frame, not the flange frame (which the TCP rotations are with respect to).
So I'm wondering, could we make the base feature a variable, and add our angle shifts to it to achieve the intended rotation? And if so, how would we access the base feature in URScript?
Thank you in advance!
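One way to turn the two base-frame poses into a flange-relative delta and apply it to a variable feature is a pose_inv/pose_trans combination; a sketch under those assumptions (the variable names are mine; pose_inv and pose_trans are standard URScript):

```
# delta of "new" relative to "start", expressed in the start pose's own frame
delta = pose_trans(pose_inv(start_pose), new_pose)
# apply the same relative change to the variable feature the MoveLs reference
feature_var = pose_trans(feature_var, delta)
```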
Turns out I was looking for the Base variable in the installation variables, and didn't see it in the program variables tab at first (I have many variables being defined/stored).
This approach, however, shifts the location of the starting position by the intended amount, but not the actual direction of the motions. Effectively, the robot would just rotate 90 deg about its base and continue the same pattern in the original direction. I likely messed something up, but we're trying this method instead:
The new approach we're trying is a variable Point Feature as the reference, so that we can orient all of our programs based on some initial point of that section (the program is relative movements for a "section", based on some initial position anyway, so this would be much easier to figure out than a plane).
An example would be:
We call our script files (within 'script code' nodes in Polyscope) within a MoveL, with the feature set to Point_1_var. It works as intended when we manually set the Point_1 feature equal to the tool pose and use the pose_trans() function (I've been going back and forth on when to use pose_trans vs. pose_add...). However, if I try to change the initial waypoint and run global Point_1_var = get_target_tcp_pose(), the feature won't re-initialize to the new initial TCP position. Are these changes happening internally and not being stored? Or are the changes not being made at all?
Also, can we redefine these point features using script? For instance, instead of teaching a Point Feature for each initial point of a section (we're going to have many sections) and modifying the script files to refer to each different Point Feature, could we teach each as a waypoint, move to that waypoint, and execute Point_1_var = get_target_tcp_pose()?
I'm thinking that the waypoints in this case would have to be in a separate Move command, relative to Base instead of the Point Feature (as the feature would be changing)...
global p_via = p[dr,-1*dy,0,0,0,0]
I've been able to get the intended results using this method, so I thought I'd share in case others are doing something similar. In retrospect, this may have been easier to do using the Pallet Command...
First thing: I changed the code above to global p_ileft = Point_1_var (it was just = Point_1; I think that's where I was getting issues).
I defined Installation variables i_pose1 and pose_zero, both = p[0,0,0,0,0,0]. The i_pose1 thing is a trick I learned on here to get rid of the Move to Start prompt whenever you run your program: define it as an Installation variable, set it with get_forward_kin(), and MoveJ to it (as a variable waypoint).
Then I made a Point feature, selected Variable, and initialized it at the tool (View -> Tool, all values = 0).
First I use a MoveJ (so it's with respect to Base) to a fixed waypoint (Init_Position) and store that TCP pose in my Point feature variable.
Then, in the MoveL, I select Feature = Point_1_var and move to a variable waypoint pose_zero. This way we're moving zero relative to the point feature, which we defined as the position the robot is already at. The only reason I did this is that MoveL nodes need a waypoint and were incomplete with only script files.
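Put together, my reading of the sequence described above, as a URScript sketch (a reconstruction of the described program, with names as in the posts):

```
# BeforeStart
i_pose1 = get_forward_kin()            # current pose: suppresses the Move-to-Start prompt
pose_zero = p[0, 0, 0, 0, 0, 0]
# ...MoveJ to the fixed waypoint Init_Position in the program tree, then:
global Point_1_var = get_target_tcp_pose()  # anchor the variable point feature here
# ...MoveL with Feature = Point_1_var to variable waypoint pose_zero:
# zero motion relative to the feature, but it initializes the frame for later moves
```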
The question I still have is: do Point_1 and Point_1_var apply changes within the program while it's running, and then revert back to the installation value when the program is finished? I appear to be getting the correct motions (when I rotate the initial position), but when I view the installation features, the Point_1 feature isn't aligned with the tool unless I manually change the installation variable.