@ziga004
Do you know about the visual offset function of the wrist camera? It may be helpful for your project.
https://elearning.robotiq.com/course/view.php?id=5&section=14
I did read about this option in the manual and saw a few of your posts about it. From my understanding, even when using the Visual Offset node, you can't update the tag's reference position automatically when the Offset Tag moves. Am I right? As per your post https://dof.robotiq.com/discussion/2282/change-visual-offset-tag-reference-position, you'd still have to keep the old tag reference point and just transpose from it to a new one (Tag_2)?
Also, can you answer one quick question for me? I have an older wrist camera with a license that doesn't support code scanning, so I can't really test this on a tag in the real world. When using the Visual Offset node, what type of Feature do you specify in all the MoveJ's and MoveL's inside the node? Is it the Snapshot Position or a Point Feature, and are you using fixed or variable waypoints for the corresponding moves?
The Visual Offset node uses a Point Feature.
When the visual offset tag is registered in the Installation menu, the reference position of the tag is saved, and all relative waypoints are defined with respect to this reference position.
When the program is executed, the visual offset tag can be moved to a different location and all waypoints will follow.
If you need to edit the relative waypoints, you must do it with the tag at the reference position that was saved when the tag was registered in the Installation menu.
The article you shared presents a way to change the tag's reference position when the relative waypoints have already been created:
https://dof.robotiq.com/discussion/2282/change-visual-offset-tag-reference-position
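The way relative waypoints follow the tag can be sketched as plain pose maths: a waypoint taught against the saved tag reference gets re-expressed against the tag's new pose. This is only an illustration of the geometry, not Robotiq's internal code, and every pose value below is made up. UR-style poses are (x, y, z, rx, ry, rz) where the rotation part is an axis-angle rotation vector.

```python
import numpy as np

def pose_to_matrix(pose):
    """Convert a UR-style pose (x, y, z, rx, ry, rz) to a 4x4 homogeneous
    transform. (rx, ry, rz) is an axis-angle rotation vector (Rodrigues)."""
    x, y, z, rx, ry, rz = pose
    r = np.array([rx, ry, rz])
    angle = np.linalg.norm(r)
    if angle < 1e-12:
        R = np.eye(3)
    else:
        k = r / angle  # unit rotation axis
        K = np.array([[0.0, -k[2], k[1]],
                      [k[2], 0.0, -k[0]],
                      [-k[1], k[0], 0.0]])
        # Rodrigues' rotation formula
        R = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = [x, y, z]
    return T

# Hypothetical poses in metres / radians, purely for illustration:
tag_reference = (0.40, 0.10, 0.02, 0.0, 0.0, 0.0)   # tag pose saved at registration
tag_detected  = (0.45, 0.20, 0.02, 0.0, 0.0, 0.5)   # tag pose found at run time
waypoint_ref  = (0.50, 0.10, 0.15, 0.0, 3.14, 0.0)  # waypoint taught at reference

T_ref = pose_to_matrix(tag_reference)
T_new = pose_to_matrix(tag_detected)
T_wp  = pose_to_matrix(waypoint_ref)

# Re-express the waypoint relative to the moved tag:
# T_wp_new = T_new * inv(T_ref) * T_wp
T_wp_new = T_new @ np.linalg.inv(T_ref) @ T_wp
print(np.round(T_wp_new[:3, 3], 3))
```

If the tag is detected exactly at its saved reference, `T_new @ inv(T_ref)` is the identity and the waypoints stay put, which matches the behaviour described above.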
You can upgrade your license to get the code-scanning feature. Contact support@robotiq.com to get it done.
When using the Visual Offset node, what type of Feature do you specify in all the MoveJ's and MoveL's inside the node?
> The Point Feature used by the visual offset tag registered in the Installation menu.
Is it the Snapshot position, is it a Point Feature and are you using Fixed Waypoints or Variable Waypoints for corresponding Moves?
> The reference position is a Point Feature. All relative waypoints are linked to this feature.
Hi everyone!
I have a basic setup consisting of a UR10e and a wrist camera. Using the RoboDK simulation software, I'm able to export waypoints (by that I mean MoveJ's and MoveL's) in .URP format, readable by Polyscope. Basically, I can transfer pre-made programs (offline programming) either with respect to a local reference frame or with respect to the robot base. The camera would then scan and locate a reference object in the real world.
What I would like to accomplish with your help is correcting these pre-programmed waypoints inside Polyscope with the object-location data from the wrist camera. Imagine a reference object in the real world that gets scanned, yielding (x, y, z, rx, ry, rz). A previously defined Point Feature variable gets overwritten with this data, and all the waypoints shift automatically with respect to the new reference.
I know for a fact that you can assign a Point Feature to a MoveL/MoveJ; however, according to this post (https://dof.robotiq.com/discussion/2302/dispensing-application-with-robotiq-wrist-camera), the waypoints have to be re-taught one by one with the Point Feature selected, and on top of that, the Point Feature has to be re-taught whenever new waypoints are defined. This means an operator has to be present to do it.
My question is: is it possible to update an offline-generated program consisting of MoveL waypoints (a .URP file) on the fly, using the wrist camera, without re-teaching the waypoints or redefining a Point Feature by hand?
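To make the idea concrete, here is the kind of off-robot correction I have in mind, sketched in Python: shift the exported waypoints by the measured offset and stream the corrected `movel` commands to the controller's secondary interface (port 30002). This is only a sketch; the controller IP, offset values, and waypoints are all hypothetical, and a full solution would compose rotations as well as translations.

```python
import socket

def movel_command(pose, a=0.5, v=0.25):
    """Build a URScript movel command string for a pose (x, y, z, rx, ry, rz)."""
    return ("movel(p[{:.5f},{:.5f},{:.5f},{:.5f},{:.5f},{:.5f}], "
            "a={}, v={})\n").format(*pose, a, v)

def apply_offset(pose, offset):
    """Shift only the translation of a pose by (dx, dy, dz).
    NOTE: a real correction would also compose the rotation part."""
    dx, dy, dz = offset
    x, y, z, rx, ry, rz = pose
    return (x + dx, y + dy, z + dz, rx, ry, rz)

def send_program(host, commands, port=30002):
    """Send URScript lines to the UR controller's secondary interface."""
    with socket.create_connection((host, port), timeout=2.0) as s:
        for cmd in commands:
            s.sendall(cmd.encode("ascii"))

# Hypothetical: the camera finds the object 3 cm off in x, 1 cm in y.
offset = (0.03, 0.01, 0.0)
waypoints = [(0.50, 0.10, 0.15, 0.0, 3.14, 0.0),
             (0.55, 0.10, 0.15, 0.0, 3.14, 0.0)]
program = [movel_command(apply_offset(p, offset)) for p in waypoints]
# send_program("192.168.0.100", program)  # uncomment with a real controller IP
```

This bypasses the .URP file entirely, so I'd still prefer a way to do the correction inside Polyscope if one exists.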
Kind regards!