Panoramic images help steer robot over desert terrain
Designing a robot capable of long-distance planetary exploration is the goal of researchers at NASA Ames Research Center (NARC; Mountain View, CA) and Carnegie Mellon University (CMU; Pittsburgh, PA). To that end, they developed Nomad, a mobile robot that recently crossed the Atacama Desert in Chile and explored landscapes similar to those on the Moon and Mars (see Fig. 1). Controlled remotely from both NARC and CMU, Nomad uses novel camera, image-compression, and data-communications systems to transmit images to researchers in North America.
Because traditional cameras provide limited resolution and fields of view, Nomad's developers built a panospheric camera capable of capturing images 360° around the robot. This was accomplished by pointing a standard 1k × 1k color camera upward into a convex optical mirror (see Fig. 2). In this way, the camera can acquire 360° panoramic images from 90° below the horizontal to 30° above the horizontal at 6 frames/s. Such spherical images of the horizon give researchers a wider view for driving through and surveying the planetary terrain.
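The geometry behind such a catadioptric camera can be sketched in a few lines: each pixel in the raw annular image maps to an azimuth (its angle around the image center) and an elevation (a function of its radial distance from the center). The sketch below assumes a simple linear radius-to-elevation model and illustrative radii; it is not Nomad's actual mirror calibration.

```python
import math

def pixel_to_bearing(x, y, cx=512.0, cy=512.0, r_min=40.0, r_max=512.0,
                     elev_min=-90.0, elev_max=30.0):
    """Map a pixel in a raw panospheric (mirror) image to a bearing.

    Assumes a 1k x 1k image centered at (cx, cy) and a mirror whose
    radial distance maps linearly onto elevation -- a simplifying
    assumption for illustration, not the camera's real optics.
    Returns (azimuth_deg, elevation_deg).
    """
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)
    if not (r_min <= r <= r_max):
        raise ValueError("pixel lies outside the mirror's annular image")
    # Angle around the image center gives the compass direction.
    azimuth = math.degrees(math.atan2(dy, dx)) % 360.0
    # Inner radius sees +30 deg (above the horizon), outer radius -90 deg.
    t = (r - r_min) / (r_max - r_min)
    elevation = elev_max + t * (elev_min - elev_max)
    return azimuth, elevation
```

With these assumed radii, a pixel on the outer rim of the annulus corresponds to looking straight down (-90°), while a pixel on the inner rim looks 30° above the horizon.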
"Because images contain over 12 Mbits of data," says William Whittaker Fredkin, research professor and director of the Field Robotics Center at CMU, "we needed to compress them before transmission." To do so, Whittaker and his colleagues fitted the robot with an on-board imaging computer based on a dual Pentium Pro computer running Windows NT and a C80-based add-in board. To perform image compression, researchers ported fractal image compression software from Summus (Irmo, SC) to the DSP board to attain a 100:1 compression ratio.
After compression, images are decomposed into packets and transmitted at 6 frames/s to a repeater station using a 1.54-Mbit/s wireless Ethernet bridge under the control of network data-delivery service software from Real-Time Innovations (Sunnyvale, CA). Image data are then relayed to a satellite ground terminal and transmitted via satellite to a receiver station in Pittsburgh. Then, using image-processing software developed with the GRK Laboratory at the University of Iowa (Des Moines, IA), the images are dewarped and displayed. The view from the panospheric camera is then projected on a 220°, 35-ft-wide screen at the Carnegie Science Center (Pittsburgh, PA).
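Dewarping of this kind amounts to unwrapping the annular mirror image into a flat rectangular panorama: each output column is an azimuth, each output row a radius in the source image. The sketch below uses nearest-neighbor sampling and illustrative parameters; the function name, radii, and sampling scheme are assumptions for illustration, not the actual software developed with the GRK Laboratory.

```python
import math

def unwrap_panosphere(src, size=1024, out_w=720, out_h=120,
                      r_min=40.0, r_max=512.0):
    """Dewarp an annular panospheric image into a flat panorama.

    `src(x, y)` returns the pixel value at integer coordinates in the
    raw mirror image. Output columns sweep azimuth 0..360 degrees;
    output rows sweep the annulus from inner to outer radius.
    """
    cx = cy = size / 2
    panorama = []
    for row in range(out_h):
        # Map rows linearly onto the annulus radii (a simplification).
        r = r_min + (r_max - r_min) * row / (out_h - 1)
        line = []
        for col in range(out_w):
            theta = 2 * math.pi * col / out_w
            # Nearest-neighbor sample from the circular source image.
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            line.append(src(x, y))
        panorama.append(line)
    return panorama
```

A production dewarper would interpolate between source pixels and use the mirror's calibrated profile rather than a linear radius map, but the inner loop has the same shape.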
To control the robot, NASA created a virtual dashboard, an operator interface for driving the robot remotely (see Fig. 3). This control panel allows an operator to see the robot's state in both graphical and numeric formats. The robot's position can be plotted on aerial images and rendered in 3-D, all in real time. Consequently, an operator can quickly assess Nomad's condition and send instructions to it with a click of a mouse.