Researchers use gesture recognition to control robotic nurse
Future surgeons may benefit from a system that recognizes hand gestures as commands to control a surgical robot used as a "scrub nurse" or to tell a computer to display medical images of a patient during surgery. Both the gesture recognition and robotic nurse innovations might help to reduce the length of surgeries and the potential for infection, says Juan Pablo Wachs, an assistant professor of industrial engineering at Purdue University. Findings from the research are detailed in a paper appearing in the February issue of Communications of the ACM, written by researchers at Purdue, the Naval Postgraduate School in Monterey, CA, and Ben-Gurion University of the Negev, Israel. The work is funded by the US Agency for Healthcare Research and Quality.
The Purdue researchers have developed a prototype service robot, working with faculty in the university's School of Veterinary Medicine. Researchers at other institutions developing robotic scrub nurses have focused on voice recognition. Wachs is developing algorithms that isolate the hands and apply "anthropometry," predicting the position of the hands from the known position of the surgeon's head. The hand-gesture recognition system uses Microsoft's Kinect camera, which senses 3-D space. "Even if you have the best camera, you have to know how to program the camera, how to use the images," Wachs said. "Otherwise, the system will work very slowly."
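To give a rough sense of the anthropometry idea described above, the Python sketch below constrains the search for a hand to a window derived from the head position in a depth frame and picks the point nearest the camera inside that window. This is not the paper's actual algorithm; the window size, the pixel constants, and the synthetic depth data are assumptions made only for illustration.

```python
import numpy as np

# Hypothetical sketch: anthropometry-guided hand search in a depth frame.
# It only illustrates constraining the hand search region from the head position.

def hand_search_window(head_xy, frame_shape, arm_reach_px=220):
    """Return a (row_min, row_max, col_min, col_max) window below/around the head.

    arm_reach_px is an assumed anthropometric constant (arm reach in pixels);
    a real system would derive it from the sensor geometry and subject size.
    """
    hx, hy = head_xy                       # head column, row in the image
    rows, cols = frame_shape
    row_min = min(rows - 1, hy + 40)       # hands assumed below the head
    row_max = min(rows, hy + arm_reach_px)
    col_min = max(0, hx - arm_reach_px)
    col_max = min(cols, hx + arm_reach_px)
    return row_min, row_max, col_min, col_max

def find_hand_candidate(depth, head_xy):
    """Pick the point closest to the camera inside the anthropometric window."""
    r0, r1, c0, c1 = hand_search_window(head_xy, depth.shape)
    window = depth[r0:r1, c0:c1]
    valid = np.where(window > 0, window, np.inf)   # depth 0 = missing reading
    idx = np.unravel_index(np.argmin(valid), window.shape)
    return (c0 + idx[1], r0 + idx[0])              # (column, row) of the candidate hand

if __name__ == "__main__":
    depth = np.full((480, 640), 2000, dtype=np.float32)   # synthetic 2 m background
    depth[300:320, 400:420] = 900                          # a nearer blob: the "hand"
    print(find_hand_candidate(depth, head_xy=(320, 100)))  # -> (400, 300)
```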
Wachs notes that his team is also working on prediction, to anticipate what images a surgeon would need to see next and which instruments would be required (a simple illustration follows below). The vision-based hand gesture recognition technology could have other applications, including the coordination of emergency response activities during disasters, says the Purdue team.
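The article does not say how the team's prediction works. As a purely illustrative sketch, a first-order transition-count model over logged requests could suggest the image or instrument most often requested next; the class, function names, and request labels below are hypothetical.

```python
from collections import Counter, defaultdict

# Hypothetical sketch of the prediction idea: learn which image or instrument
# typically follows the current request from logged procedures, then suggest
# the most frequent successor.

class NextRequestPredictor:
    def __init__(self):
        self.transitions = defaultdict(Counter)

    def train(self, procedures):
        """procedures: list of request sequences, e.g. ['ct_axial', 'scalpel', ...]"""
        for seq in procedures:
            for current, following in zip(seq, seq[1:]):
                self.transitions[current][following] += 1

    def predict(self, current):
        """Return the most frequently observed next request, or None if unseen."""
        followers = self.transitions.get(current)
        if not followers:
            return None
        return followers.most_common(1)[0][0]

if __name__ == "__main__":
    logs = [
        ["ct_axial", "scalpel", "ct_coronal", "suction"],
        ["ct_axial", "scalpel", "ct_coronal", "forceps"],
    ]
    model = NextRequestPredictor()
    model.train(logs)
    print(model.predict("scalpel"))  # -> 'ct_coronal'
```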
SOURCE: Purdue University/ACM
-- Posted by Vision Systems Design