Seen on MIT Technology Review: Many people with visual impairments find seeing-eye dogs invaluable for avoiding obstacles and negotiating traffic. But even the smartest guide dog can’t distinguish between similar banknotes, read a bus timetable, or give directions. Now robotics researchers at Carnegie Mellon University are developing assistive robots to help blind travelers navigate the modern world.
Read full article on MIT Technology Review.
Our take:
Robotics researchers at Carnegie Mellon University are using the vision-guided Baxter research robot to assist blind travelers in navigating the world.
Many of you may recognize Baxter, a force-limited collaborative robot from Rethink Robotics. The dual-armed robot features three cameras and an infrared sensor that detects whether a human operator is within a 1.5 – 15 in. range of the machine, since the robot is generally intended for automation applications. Researchers from Carnegie Mellon, however, want to use it to help blind people.
"We envision robots being part of society in smart cities and want to make sure that people with visual impairment and other disabilities aren’t left out of that future," said Professor M. Bernardine Dias.
Dias, along with colleague Aaron Steinfeld, decided to station an assistive robot at an information desk in a busy transit center to provide help with visual or physical tasks when human workers are either absent or overwhelmed by disgruntled travelers, according to MIT Technology Review.
Funded by the National Science Foundation, the research has revealed some interesting trends.
“Sighted people tend to be apprehensive when they meet a dexterous humanoid robot for the first time,” said Steinfeld. “But blind people seem to be very comfortable interacting with the robot. They were more comfortable holding the robot’s plastic fingers, in fact, than having physical contact with another human being.”
Baxter was chosen because it is a force-limited robot that lacks dangerous pinch points, making it safe for people who interact with it through their sense of touch. Baxter first introduces itself, then switches off so a visually impaired user can feel its shape and construction. When the person is ready to communicate, a verbal cue turns it back on. Furthermore, the robot can learn new tasks when users physically guide its manipulators, which, according to Dias, opens up a whole world of possibilities for blind people to teach Baxter to do things that are useful to them.
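To make that interaction flow easier to picture, here is a minimal sketch of how such a sequence might be structured in software. The state names, callbacks, and verbal cue below are assumptions chosen purely for illustration and are not taken from the CMU team's actual system.

```python
# Illustrative sketch only: a minimal interaction flow loosely modeled on the
# behavior described above (introduction, tactile exploration, resume on a
# verbal cue). All names and states are hypothetical, not the CMU team's code.

from enum import Enum, auto


class Phase(Enum):
    INTRODUCE = auto()   # robot speaks a short introduction
    EXPLORE = auto()     # arms relaxed so the user can safely feel the robot
    ACTIVE = auto()      # robot resumes normal assistive operation


class TactileIntroduction:
    def __init__(self, speak, set_arms_compliant):
        self.speak = speak                            # text-to-speech callback
        self.set_arms_compliant = set_arms_compliant  # motor-control callback
        self.phase = Phase.INTRODUCE

    def start(self):
        self.speak("Hello, I am an assistive robot. You may touch my arms.")
        self.set_arms_compliant(True)                 # "switch off" for safe touching
        self.phase = Phase.EXPLORE

    def on_verbal_cue(self, utterance: str):
        # A spoken cue (e.g. "I'm ready") ends exploration and re-enables motion.
        if self.phase is Phase.EXPLORE and "ready" in utterance.lower():
            self.set_arms_compliant(False)
            self.speak("Thank you. How can I help you?")
            self.phase = Phase.ACTIVE


if __name__ == "__main__":
    bot = TactileIntroduction(speak=print, set_arms_compliant=lambda c: None)
    bot.start()
    bot.on_verbal_cue("I'm ready")
```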
Future plans include integrating the robot with NavPal, a smartphone app the researchers have already developed that provides "audio breadcrumbs" to warn visually impaired pedestrians of hyperlocal hazards like potholes or construction sites. The long-term goal, according to the article, is to introduce mobile robots that perform tasks similar to those of seeing-eye dogs.
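To make the "audio breadcrumbs" idea concrete, the following is a small illustrative sketch of how proximity warnings of that kind could be generated from a list of known hazard locations. The hazard data, warning radius, and function names are hypothetical and are not drawn from the actual NavPal app.

```python
# Illustrative sketch only: flagging nearby hazards by distance, in the spirit
# of "audio breadcrumbs." Hazard coordinates and the 30 m radius are made up.

from math import radians, sin, cos, asin, sqrt


def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in meters (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))


HAZARDS = [  # hypothetical hazard list: (latitude, longitude, description)
    (40.4433, -79.9436, "construction site blocking the sidewalk"),
    (40.4441, -79.9428, "large pothole near the curb"),
]


def audio_breadcrumbs(lat, lon, radius_m=30):
    """Return spoken-style warnings for hazards within radius_m of the pedestrian."""
    return [
        f"Caution: {desc}, about {distance_m(lat, lon, hlat, hlon):.0f} meters away."
        for hlat, hlon, desc in HAZARDS
        if distance_m(lat, lon, hlat, hlon) <= radius_m
    ]


if __name__ == "__main__":
    for warning in audio_breadcrumbs(40.4434, -79.9435):
        print(warning)  # in practice this text would be routed to text-to-speech
```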
It is interesting to see the researchers using a robot like Baxter, which might otherwise be found performing automated packaging or inspection tasks, in an application such as this one. What else could you see a robot like Baxter being used for?
- James Carroll, Senior Web Editor