How will LS3 “see”? We can look to its predecessor, BigDog, for an idea. BigDog has an integrated stereo vision system and a LIDAR system. The stereo vision system, developed by NASA’s Jet Propulsion Laboratory, consists of a pair of *stereo cameras, a computer, and vision software. It is used to acquire the 3D shape of the terrain immediately in front of the robot and to find a clear path forward. The LIDAR, according to a research paper on the robot, allows BigDog to follow a human leader without requiring an operator to drive it.
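To give a rough sense of how a stereo pair yields terrain shape, the sketch below computes a disparity map and converts it to depth using OpenCV. It is only an illustration of the general technique; the file names, focal length, and camera baseline are assumed for the example and are not details of the JPL/BigDog system.

```python
# Minimal sketch: estimating terrain depth from a stereo camera pair with OpenCV.
import cv2
import numpy as np

# Load a rectified left/right image pair (hypothetical file names).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo correspondence: disparity is inversely related to depth.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Convert disparity to depth (Z = f * B / d) with an assumed focal length and baseline.
focal_px = 700.0    # focal length in pixels (assumed)
baseline_m = 0.12   # distance between the two cameras in meters (assumed)
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = focal_px * baseline_m / disparity[valid]

# A crude "clear path" check: how far away is the ground directly ahead?
ahead = depth[depth.shape[0] // 2:, depth.shape[1] // 3: 2 * depth.shape[1] // 3]
print("Median range ahead (m):", np.median(ahead[ahead > 0]))
```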
The LS3 prototype, which is being referred to as “AlphaDog,” completed its first outdoor assessment in January 2012, climbing and descending a hill while its vision capabilities were tested. The project, which will take place in Waltham, Mass., is expected to be completed by March 31, 2015.
View more information on the LS3 program.
[*Editor’s note: Camera vendor not listed.]
Also check out:
Students develop 3D thermal mapping system for firefighting robots
Teams to compete to program Terminator-style robot
Robot snakes inspect nuclear power plant