Part Recognition, Synchronous Robots, Part inspection
Samuel_Bouchard Posts: 150 Handy
edited June 2016 in Applications
Hi @gerenga, you've packed a lot of technology in the following demo. Great work! Can you share information on the products used and integration tricks with the other Pros?
CEO & Co-Founder @ Robotiq
We used National Instruments for both systems. The right-hand side is just running the NI software on a notebook with a standard USB microscope, to show that systems don't have to be that complicated.
The left side is running with a smart camera on which the application runs independently; it is controlled via TCP/IP communication. There is a notebook attached, but it is only for display purposes and not actually required.
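For readers curious what "controlled via TCP/IP" looks like in practice, here is a minimal sketch of a client that triggers the smart camera and parses its reply. The command string, port, and `PASS;x;y;rz` reply format are made-up placeholders, not the actual protocol of the camera used in the demo; check your camera's manual for the real command set.

```python
import socket

def query_camera(host, port, command=b"TRIGGER\r\n", timeout=2.0):
    """Send a trigger command to the smart camera and return its raw reply.
    Command and reply framing here are hypothetical."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(command)
        return s.recv(1024).decode().strip()

def parse_result(reply):
    """Parse a hypothetical 'PASS;x;y;rz' style reply into a dict."""
    status, x, y, rz = reply.split(";")
    return {"ok": status == "PASS", "x": float(x), "y": float(y), "rz": float(rz)}
```

The same pattern works from a UR program as well, since Polyscope scripts can open raw TCP sockets to the camera directly.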
Sure, the setup was shown at the Intermach 2016 exhibition in Bangkok.
Thailand is still far behind in technical understanding and in how to apply technology and automation in a production environment.
The main intention of this application was to show the standard situations that are common in pick & place applications.
Application on the right - Pick & Place from fixed position
One of the easiest pick & place applications is a defined material state at a known position. In our case, the round shape of the bearing with its bottom face as the reference gives the defined state, and the precision pin gives the fixed, known position.
We load into a swivel unit from Zimmer Group with a small Zimmer gripper (which we use because their force/size ratio gives us a lot of freedom), which then moves the part into a small inspection process done via the NI software, in this case an inner diameter check.
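As a rough illustration of what an inner diameter check boils down to, here is a simplified stand-in for the circle-fit step: given edge points detected around the full bore circumference, estimate the diameter and compare it against a nominal value with a tolerance. This is not the NI Vision implementation, just a sketch of the principle; for evenly distributed edge points the centroid is the circle centre.

```python
import math

def inner_diameter(edge_points):
    """Estimate bore diameter from edge points spread around the full
    circumference: centroid as centre, mean distance as radius."""
    n = len(edge_points)
    cx = sum(x for x, _ in edge_points) / n
    cy = sum(y for _, y in edge_points) / n
    r = sum(math.hypot(x - cx, y - cy) for x, y in edge_points) / n
    return 2.0 * r

def check_id(diameter_mm, nominal_mm, tol_mm):
    """Pass/fail against nominal +/- tolerance."""
    return abs(diameter_mm - nominal_mm) <= tol_mm
```

In the real setup the edge points come from the NI vision tools and the pass/fail result is what gets reported back over the communication link.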
One thing we wanted to highlight is that cycle time improvements don't always mean you have to speed up the robot; instead you can run processes in parallel. Here the robot is free for other tasks, as it doesn't stay in the actual process. The next step would be a 180° swivel unit with two grippers, which would run loading/unloading and inspection in parallel. However, we wanted to leave some room for discussions and curious questions from the visitors, so we decided not to show it.
Application on the left - Pick & Place from variable position
The application on the left is pick & place from an undefined (relative to the robot, not in general) material state and unknown position.
Using the vision system from NI we achieve a defined state and known position: we get the required position data X, Y, Rz for the nut, which makes it defined and known for the robot. So it basically becomes an easy task for the robot again.
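To make the X, Y, Rz hand-off concrete, here is a sketch of the coordinate mapping from the camera frame to the robot base frame: a scale, a rotation, and an offset, plus an angle correction for Rz. The scale, angle, and offset numbers below are made-up placeholders; in a real cell they come out of a camera-to-robot calibration routine.

```python
import math

def vision_to_robot(x_px, y_px, rz_deg,
                    scale_mm_per_px=0.25, theta_deg=90.0,
                    offset_mm=(400.0, -120.0)):
    """Map camera pixel coordinates and part angle Rz into the robot base
    frame. Calibration values here are illustrative assumptions."""
    t = math.radians(theta_deg)
    xr = offset_mm[0] + scale_mm_per_px * (x_px * math.cos(t) - y_px * math.sin(t))
    yr = offset_mm[1] + scale_mm_per_px * (x_px * math.sin(t) + y_px * math.cos(t))
    return xr, yr, (rz_deg + theta_deg) % 360.0
```

The robot then only needs the resulting (X, Y) pick pose and the Rz rotation for the tool, which is why the task becomes "easy" again once vision has done its part.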
A few tricks on the setup here:
1. We use pattern matching and defined an area around the nut. That way we made sure there is space for the gripper fingers at pick-up.
2. We use the Robotiq position control to move the gripper fingers to the position covered by the pattern match, so we can move down for the pick-up without the risk of crashing into a nut that is too close.
3. We move a box into the station with a barcode that refers to the nut size, as we use two different sizes. Here we show further opportunities the same vision system can provide. It is rather difficult for visitors to see the difference in nut size from a distance, so this is a good example of where vision systems perform better and more reliably than operators. By that we have a sorting function integrated, easily controlled by the barcode. Automated feeding of the barcoded boxes and part counting would be easy upgrades, but they didn't add further benefit to the application.
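The barcode-driven sorting in point 3 is essentially a lookup from barcode payload to nut size, gripper opening, and place target. The barcode strings, opening values, and slot names below are invented for illustration, not the actual demo parameters.

```python
# Hypothetical barcode payloads -> nut size (mm across flats); codes are made up.
NUT_SIZES = {"BOX-M8": 8, "BOX-M10": 10}

def sort_target(barcode):
    """Return nut size, gripper opening, and place slot for a scanned box.
    Opens the gripper a few mm wider than the nut for finger clearance."""
    size = NUT_SIZES.get(barcode)
    if size is None:
        raise ValueError(f"unknown barcode: {barcode}")
    return {"size": size, "grip_open_mm": size + 4, "slot": f"M{size}"}
```

Since the same vision system reads both the pattern match and the barcode, no extra scanner hardware is needed for the sorting function.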
To show the force control feature of the Robotiq grippers, we added moving the box out of the station as well.
As we frequently get requests regarding robots working together, we show the simplest possible "synchronization": the robots exchange Modbus register values to trigger movements, which are simply time-controlled on both sides, and the Robotiq actions are triggered via Modbus registers as well. It is not really synchronization, but the intention was to show simple UR features that most people don't even consider for realizing tasks.
It's a simple solution using the different components to show basic principles in automation with robots.
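The register handshake can be simulated in a few lines. The sketch below models two shared registers: one robot sets a flag when its move finishes, the other polls and consumes it before starting. The register numbers and values are assumptions for illustration; on real URs these would be general-purpose Modbus registers read and written from the Polyscope programs.

```python
# Minimal simulation of a Modbus-register handshake between two robots.
# Register addresses and semantics are illustrative assumptions.
REG_LEFT_DONE = 128   # left robot sets 1 when its move is finished
REG_RIGHT_GO = 129    # right robot polls this before starting

registers = {REG_LEFT_DONE: 0, REG_RIGHT_GO: 0}

def left_robot_cycle():
    """Left robot finishes its (time-controlled) move, then raises flags."""
    registers[REG_LEFT_DONE] = 1   # signal completion
    registers[REG_RIGHT_GO] = 1    # release the right robot

def right_robot_cycle():
    """Right robot polls its trigger register each cycle."""
    if registers[REG_RIGHT_GO] == 1:
        registers[REG_RIGHT_GO] = 0   # consume the trigger
        return "moving"               # start the "synchronized" move
    return "waiting"
```

As in the demo, this is not true synchronization, just flag exchange with time-controlled motions on each side, but it is enough for many coordinated-cell tasks.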
Any questions, please let me know.