3D camera helps wheelchair users move with facial expressions
Researchers from Brazil have developed a solution that uses 3D camera technology to enable physically challenged adults to move around in a wheelchair via facial, head, or iris movements.
While the equipment developed by the team from the School of Electrical and Computer Engineering, State University of Campinas (FEEC/Unicamp) is still experimental, a project recently approved by the Innovative Research Program for Small Business (PIPE) of FAPESP (São Paulo Research Foundation) aims to adapt the technology to make it more accessible and to put it into the market in Brazil within two years.
"Our goal is that the final product cost no more than twice a common motorized chair, those that are controlled by joystick and now cost around R $ 7,000," said Professor of FEEC / Unicamp Eleri Cardozo, in a translated press release.
The technology, which Cardozo said could benefit people with tetraplegia, stroke victims, patients with amyotrophic lateral sclerosis, and those with other health conditions that prevent precise hand movement, is based on a 3D camera that monitors facial and body expressions. The device uses an Intel RealSense 3D camera, a depth-sensing device that combines three cameras in one: a 1080p HD camera, an infrared camera, and an infrared laser projector. Together, according to Intel, these "see" like the human eye to sense depth and track human motion.
The team removed the motorized wheelchair's normal joystick and replaced it with the depth-sensing 3D camera. The system then translates the expressions monitored by the camera into commands that control the wheelchair.
"The camera can identify more than 70 facial points around the mouth, nose and eyes. By moving these points, it is possible to get simple commands, such as forward, backward, left or right and, most importantly, stop," said Cardozo.
Additionally, the wheelchair is equipped with Wi-Fi and a remote control feature that allows a caregiver to control the wheelchair over an internet connection. When a wheelchair user gets tired, according to FAPESP, a caregiver can take over control of the chair and transport the occupant. With an eye toward patients with even more serious and limiting conditions, including those that would prevent even facial movements, the team is also looking into BCI (brain-computer interface) technology that would extract signals directly from the brain through external electrodes and turn them into commands.
Instead of just creating robotic wheelchairs based on the prototype, however, the team intends to reduce costs and develop software that could be deployed into existing wheelchairs already on the market. The idea, said researcher Paulo Gurgel Pinheiro, is that a user could download the software and begin using it for facial expression-based wheelchair navigation.
Called Wheelie, the system prototype is expected to be ready by the beginning of 2017. Before then, the team is working both to improve the classification of facial expressions, so that interpretation of the signals is not hampered by differences in ambient lighting, and to ensure that only the chair user's facial expressions are captured when other people are nearby.
View the FAPESP press release.
James Carroll
Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.