Robotics

High-Speed Vision System Used for Human-Robot Collaborative System

The project had two goals: to develop and evaluate a real-time human-robot collaborative (HRC) system and to use it to accomplish concrete tasks such as collaborative peg-in-hole insertion.
May 27, 2022
4 min read

In a research paper titled “Development of a Real-Time Human-Robot Collaborative System Based on 1 kHz Visual Feedback Control and Its Application to a Peg-in-Hole Task” (Yamakawa, Y.; Matsui, Y.; Ishikawa, M. Sensors 2021, 21, 663. https://doi.org/10.3390/s21020663), researchers from the University of Tokyo (Tokyo, Japan; www.u-tokyo.ac.jp) describe work on human-robot collaboration with two goals: to develop and evaluate a real-time human-robot collaborative (HRC) system and to accomplish concrete tasks, such as collaborative peg-in-hole insertion, using the developed system.

Researchers sought to create a high-speed, high-accuracy human-robot interaction (HRI) system to drive the HRC. Toward that end, they used a high-speed vision system as the sensing system. The vision system consisted of a Mikrotron (Unterschleissheim, Germany; www.mikrotron.de) EoSens 4CXP CoaXPress monochrome camera (MC4086) and an image processing PC equipped with an Intel (Santa Clara, CA, USA; www.intel.com) Xeon E5-1603 v3 2.8-GHz processor and 16 Gbytes of RAM. The camera features a 4/3’’ CMOS sensor that delivers 2,336 × 1,728 pixel resolution at up to 563 frames per second (fps); because the experiment required a frame rate of 1000 fps, the camera was configured at a reduced resolution of 1024 × 768 pixels. A Basler AG (Ahrensburg, Germany; www.baslerweb.com) microEnable 5 AQ8-CXP6D frame grabber acquired the raw image data from the EoSens camera. Images were processed on a PC running Windows 7 Professional (64 bit); the image processing software was developed with Microsoft (Redmond, WA, USA; www.microsoft.com) Visual Studio 2017.
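The article does not include the authors' code, but the timing constraint is easy to illustrate: at 1000 fps, image acquisition and measurement must finish within each 1 ms frame period. The C++ sketch below simulates such a loop under stated assumptions; grabFrame() is a stand-in for the frame grabber SDK, and the thresholded-centroid step is a simplified proxy for the retroreflective-marker detection used in the actual system.

    // Minimal sketch of a 1 kHz vision loop (simulation only).
    #include <chrono>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    struct Frame { int width = 1024, height = 768; std::vector<uint8_t> px; };
    struct Point2D { double x, y; };

    // Stand-in for the frame grabber: returns a synthetic 1024 x 768 image
    // with one bright "marker". A real system would pull frames over CoaXPress.
    Frame grabFrame() {
        Frame f;
        f.px.assign(f.width * f.height, 0);
        for (int y = 300; y < 310; ++y)
            for (int x = 500; x < 510; ++x) f.px[y * f.width + x] = 255;
        return f;
    }

    // Centroid of pixels above a threshold -- a simplified version of the
    // retroreflective-marker detection the article describes.
    Point2D brightCentroid(const Frame& f, uint8_t th) {
        double sx = 0, sy = 0; long n = 0;
        for (int y = 0; y < f.height; ++y)
            for (int x = 0; x < f.width; ++x)
                if (f.px[y * f.width + x] > th) { sx += x; sy += y; ++n; }
        return n ? Point2D{sx / n, sy / n} : Point2D{-1, -1};
    }

    int main() {
        for (int i = 0; i < 1000; ++i) {   // one simulated second at 1 kHz
            auto t0 = std::chrono::steady_clock::now();
            Frame f = grabFrame();
            Point2D c = brightCentroid(f, 200);
            // The measurement would be sent to the real-time controller here.
            auto us = std::chrono::duration_cast<std::chrono::microseconds>(
                          std::chrono::steady_clock::now() - t0).count();
            if (i == 0)
                std::printf("marker at (%.1f, %.1f), processed in %lld us\n",
                            c.x, c.y, (long long)us);
        }
    }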

The robot hand features three fingers: a left thumb, an index finger, and a right thumb. It can close its fingers through 180° in 0.1 second, a level of performance beyond that of a human being. Each finger has a top link and a root link, and the left and right thumbs can also rotate around the palm; the index finger therefore has two degrees of freedom, while each thumb has three.

The experiments used a board measuring 220 mm long, 100 mm wide, and 5 mm thick with a mass of about 113 g. Retroreflective markers were attached at the four corners of the board to simplify corner detection by the high-speed camera. The hole used in the collaborative peg-in-hole task was located at the center of the board and had a radius of 6.350 mm. The stainless steel peg had a radius of 6.325 mm, a length of 405 mm, and a 45° chamfer with a chamfer length of 1 mm; it was fixed to a frame by a magnet.
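To put the required accuracy in perspective, the stated dimensions imply a radial clearance of only 0.025 mm between peg and hole. The short calculation below works this out; the "capture window" estimate from the chamfer is our own assumption for illustration, not a figure from the paper.

    // Quick check of the peg-in-hole tolerance implied by the article's
    // dimensions (all values in mm).
    #include <cstdio>

    int main() {
        double holeRadius = 6.350;
        double pegRadius  = 6.325;
        double chamfer    = 1.0;   // 45-degree chamfer, 1 mm long
        double radialClearance = holeRadius - pegRadius;   // 0.025 mm
        std::printf("radial clearance: %.3f mm\n", radialClearance);
        // Assumption: the chamfer lets the peg tip engage the hole even when
        // the board is offset by roughly the chamfer length plus the clearance.
        std::printf("approx. capture window: %.3f mm\n", chamfer + radialClearance);
    }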

During experiments to test the speed of the HRC, a human moved the board, changing the board’s position and orientation. The camera acquired image data every 1 ms; the image-processing PC measured the position and orientation of the board within that 1 ms budget and sent the measurement results to the real-time controller via an Ethernet connection using the UDP protocol. The reference joint angles of the robot hand were obtained by solving the hand’s inverse kinematics from the measured position and orientation of the board. Proportional-derivative control on the reference joint angles generated the torque input to the hand’s servo motors, and the robot hand moved according to that torque input, assisting in placing the board onto the peg.
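The per-millisecond control step can be sketched as follows. This is an illustration of the pipeline the article outlines, not the authors' implementation: the joint count, the inverse-kinematics mapping, and the PD gains are all placeholders.

    // One control step: board pose in, reference joint angles via inverse
    // kinematics, joint torques via proportional-derivative (PD) control.
    #include <array>
    #include <cstdio>

    constexpr int kJoints = 8;                 // placeholder joint count
    using Vec = std::array<double, kJoints>;

    struct BoardPose { double x, y, theta; };  // pose measured by the vision system

    // Placeholder IK: maps the measured board pose to reference joint angles.
    Vec inverseKinematics(const BoardPose& p) {
        Vec q{};
        q[0] = p.theta;                        // illustrative only
        return q;
    }

    // PD law: torque = Kp * (q_ref - q) + Kd * (qdot_ref - qdot).
    Vec pdTorque(const Vec& qRef, const Vec& q, const Vec& qdot,
                 double kp, double kd) {
        Vec tau{};
        for (int i = 0; i < kJoints; ++i)
            tau[i] = kp * (qRef[i] - q[i]) + kd * (0.0 - qdot[i]); // qdot_ref = 0
        return tau;
    }

    int main() {
        Vec q{}, qdot{};                       // current joint state (from encoders)
        BoardPose pose{0.10, 0.05, 0.02};      // example measurement, 1 ms old
        Vec qRef = inverseKinematics(pose);
        Vec tau  = pdTorque(qRef, q, qdot, 50.0, 1.0); // gains are illustrative
        std::printf("torque[0] = %.3f\n", tau[0]);
    }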

In the experiments, the collaborative motion was performed over time spans ranging from 2 to 15 seconds. Because the high frame rate of the camera was combined with the low latency of the CoaXPress interface, the collaborative error was successfully suppressed to within 0.03 rad (about 1.7°), even when the human subject moved the board quickly or randomly. The torque input was also kept small. As a result, collaborative motion between the human and the robot hand using the proposed method was successfully confirmed.

Applying its new system, the University of Tokyo team plans to demonstrate other tasks that cannot currently be achieved with conventional HRC methods and to continue adding flexibility and intelligence to the system.
