Instead of using a move under a Vision Locate, can you use the path recording?
@Sebastien You can use the Path node under a Camera Locate node, but it has to be a relative path. The procedure is to first insert a MoveL node and define a first relative waypoint. This will be the starting point of your relative path. Then you can record the path. Finally, check the relative path checkbox and you are done. At runtime the Camera Locate node will get the object position. It will then offset the first move's reference frame, which will move the robot to the path starting point. Then the trajectory will be played relative to that starting point.
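This is not Polyscope code, but the relative-offset idea can be sketched in plain Python. The sketch below covers only the planar X/Y/Rz case, and all numbers (waypoints, detected object pose) are made-up placeholders:

```python
import math

def offset_point(px, py, ox, oy, theta):
    """Rotate a relative waypoint by theta (Rz) and translate it
    by the detected object position (ox, oy)."""
    c, s = math.cos(theta), math.sin(theta)
    return (ox + c * px - s * py,
            oy + s * px + c * py)

# Waypoints recorded relative to the path starting point (x, y in mm)
relative_path = [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0)]

# Example object pose reported by the vision system at runtime: X, Y + Rz
obj_x, obj_y, obj_rz = 120.0, 40.0, math.radians(30.0)

# Play the recorded path relative to the detected starting point
absolute_path = [offset_point(px, py, obj_x, obj_y, obj_rz)
                 for (px, py) in relative_path]

print(absolute_path[0])  # (120.0, 40.0) -- first waypoint lands on the object
```

The first relative waypoint is (0, 0), so after offsetting it coincides with the detected object position; every later waypoint follows the recorded shape around it, which is exactly what checking the relative path checkbox buys you.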
@Sebastien Here is an example that I made. You can get the program by clicking the link below. In the program I basically programmed the Camera Locate as is. Then, as suggested by @PierreOlivier_Proulx, I added a waypoint relative to the camera feature and then recorded the path, making sure it is relative. In the video you will see that the vision system is not able to locate the part at certain locations. I did not troubleshoot this, but my guess would be that the shadows created by the part as it was lying to the side of the field of view confused the vision system a bit. In that case, to solve the issue I would have added an external light source to eliminate the shadow effect. That opens up a whole lot of applications. What application do you have in mind?
https://youtu.be/JZ8g7fW1_U0
Very nice! One example we often hear is dispensing on parts laid out on a pallet but not in a perfectly structured / fixtured way.
For me it can be useful for gluing and deburring!! What is the error between the acquired path and the real shape?
@Fabio_Facchinetti the Path recording will record the robot pose according to the robot precision. The only thing that you can add to this precision is the deformation of the tool. When you record, you apply a force on the tool. When you play back, there is no force applied. So the imprecision will depend on the tool stiffness.
In this combination, shouldn't we also add the imprecision of the Camera Locate @Etienne_Samson ?
@Samuel_Bouchard yes you're right, I only mentioned the Path imprecision. The Camera Locate will give you the X Y Z + Rz starting point, and that starting point will have its own imprecision. @Fabio_Facchinetti you will also have an imprecision from the Camera Locate to add.

In short, there are two different kinds of imprecision. The Path imprecision is on the actual trajectory: multi-directional, depending on your tool, and quite small to be honest. You can estimate it as:

Total Path error = UR error + Tool deformation

Remember that the UR error is +/- 0.1 mm, so if you have a stiff tool, your error will be very low.

The Camera Locate imprecision will be an offset from the reference frame, uni-directional. This will be larger than the Path recording imprecision, but I have no numbers to give right now.
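The error budget above is just a sum, which can be sketched as follows. Only the +/-0.1 mm UR repeatability comes from the thread; the tool deformation and Camera Locate numbers are made-up placeholders, since no official figures are given:

```python
# Illustrative error budget -- values are examples, not vendor specs,
# except the UR +/-0.1 mm repeatability mentioned above
ur_repeatability = 0.1      # mm
tool_deformation = 0.05     # mm, assumed: depends on tool stiffness
camera_offset_error = 0.5   # mm, assumed: no official number was given

# Path-following error: multi-directional, along the trajectory
total_path_error = ur_repeatability + tool_deformation

# The Camera Locate error adds a (roughly uni-directional) offset
# of the whole path on top of that
worst_case_error = total_path_error + camera_offset_error

print(f"path error:  +/-{total_path_error:.2f} mm")   # +/-0.15 mm
print(f"worst case:  +/-{worst_case_error:.2f} mm")   # +/-0.65 mm
```

The key point is that the two terms behave differently: the Path error scatters around the trajectory, while the Camera Locate error shifts the whole trajectory in one direction.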
Ok sorry... I was wondering whether the path could be learned from the camera through edge shape recognition...
@Fabio_Facchinetti I agree with you that this feature would be a nice-to-have, but automatic trajectory generation for robots is quite hard to do. You have to consider tool orientation, account for singularity positions, and so on. So even if you have the contour points of the object, the work to generate the trajectory is far from over!
Very smart function B)
@Etienne_Samson, I agree that "Total Path error = UR error + Tool deformation" is a sensible way to estimate the UR precision along a path. However, based on recent feedback I received from UR support, apparently the +/-0.1mm repeatability spec does not actually hold up for paths that are calculated by the robot. For example, using a moveL command, I was told that while the robot can reach "point A" and "point B" with high precision, the TCP may deviate from the "linear" path between the programmed waypoints by roughly +/-1mm. That's 10x worse than the listed repeatability! In my own experience, when we encounter robots that have some trouble with path precision, UR has usually suggested that this behavior is to be expected. Has anyone else experienced this sort of "wobbly" path performance? If you used an FTS to teach a linear path, would it improve the precision of a robot with poor moveL performance? I would appreciate any feedback on this, let me know if I can clarify what we're seeing in any way.
@jbahner we should ask UR directly about that; we have @Stefan_Stubgaard, corporate support at UR, who joined us last week. Maybe they have some software improvement on the path planning, or some recommendations. Stefan, any comments?

Concerning the use of the Path with an FTS to improve a linear path, we don't have any numbers to give right now. And one important thing to remember: a human hand usually guides the Path teaching, and I'm pretty sure your hand has much more imprecision than the UR. So unless you plan to use another industrial robot to teach the Path to the UR, it wouldn't make sense from my point of view. But it's still something I will dig into.
I am not surprised that path accuracy for a Universal Robots arm is around 10x the repeatability. The path accuracy is affected by deformation of the robot arm, as the positional error in each joint is amplified by the kinematic design. The rigidity of the construction the robot is mounted on is important as well.
My own personal experience working with other robot brands as an integrator, mainly Fanuc and Hyundai, is that an accuracy below +/-1 mm was often hard to achieve, at least with their standard software (some brands offer a high-accuracy software option for an additional price, where you can reach between +/-0.2 and +/-0.5 mm).
There is an interesting article concerning robot accuracy conducted by Nikon Metrology; see the attached PDF.
I am very interested in what level of accuracy you are in need of and for what purpose you need it. If there is a big need in the market of a certain accuracy level we will definitely be interested in investigating the options to improve accuracy.
@Stefan_Stubgaard, thanks for taking interest. To summarize, the application I was working on was a metal polishing process, and the UR's +/-1 mm fluctuations in path linearity were happening at a high enough frequency to ruin the surface finish. UR support was very reluctant to admit there was a problem, but we ended up pushing through an RMA for a joint. The replacement part ended up fixing the problem, and the robot is now performing well above the previous standard. I'd be happy to share more detail on this particular case if you'd like to e-mail me at firstname.lastname@example.org.

As far as the market for high-accuracy path performance, I've seen quite a bit of interest from people who want to use the UR for path-critical applications like laser cutting and 3D printing, usually seeking a bare minimum of 100 micron precision. They are drawn to the UR for its price, simplicity, and excellent interface, but as soon as we talk technical specs we run into problems. When a customer reads a spec that says 100 micron repeatability, and then finds out that spec might be 10x worse while the robot is moving, they are pretty disappointed. I think these customers would be willing to pay a premium for a robot that combines the performance of more precision-grade products with the excellent ease of use of the UR, but as of now the UR falls short of what they need.
Wouldn't one way to fix path precision on the robot be to slice up the path? If you want to move from pointA to pointB, you could use interpolate_pose with some small increment and loop through until the path is complete. You could use a servoj command and feed it the slices; this should allow for smooth motion through the linear path that interpolate_pose is calculating between the points. If you use movel you have to add a blend radius to keep it from stopping between points, but the speed that you see is much slower due to slicing the move into small parts (this is very evident in the speed difference between the path move and the return move). Would this help to solve the issue with path repeatability?
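The slicing idea can be sketched outside URScript. The Python below mimics what interpolate_pose does for the position part of the pose (URScript also interpolates the orientation, which is omitted here); the endpoint values and increment count are made-up examples:

```python
def interpolate(p_from, p_to, alpha):
    """Linearly interpolate between two positions for alpha in [0, 1].
    (URScript's interpolate_pose also blends the orientation; the
    position part is a straight line exactly like this.)"""
    return tuple(a + alpha * (b - a) for a, b in zip(p_from, p_to))

point_a = (0.0, 0.0, 0.0)   # metres; position part of the pose only
point_b = (0.3, 0.0, 0.1)
n_slices = 100              # assumed increment count

# Each slice would be fed to the controller (servoj in URScript)
# at a fixed control period, instead of one single movel
trajectory = [interpolate(point_a, point_b, i / n_slices)
              for i in range(n_slices + 1)]

print(trajectory[0])    # (0.0, 0.0, 0.0)
print(trajectory[-1])   # (0.3, 0.0, 0.1)
```

Note the trade-off mentioned above: with many small slices the commanded setpoints stay on the line, but the effective speed drops compared to one continuous move, and whether the TCP actually tracks those setpoints better still depends on the joint-level behavior of the arm.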