
NASA’s Raven vision system to enable autonomous rendezvous at the International Space Station

Feb. 16, 2017

Set to launch aboard the 10th SpaceX commercial resupply mission is NASA’s Raven technology module, which features a vision system composed of visible, infrared, and lidar sensors. The module will be affixed outside the International Space Station (ISS) to test technologies that enable autonomous rendezvous in space.

Through Raven, NASA says it will be one step closer to having a relative navigation capability that it can take “off the shelf” and use with minimum modifications for many missions, and for decades to come. Raven’s technology demonstration objectives are three-fold:

  • Provide an orbital testbed for satellite-servicing relative navigation algorithms and software.
  • Demonstrate that multiple rendezvous paradigms can be accomplished with a similar hardware suite.
  • Demonstrate an independent visiting vehicle monitoring capability.

Raven’s visible camera, the VisCam, was originally manufactured for Hubble Space Telescope Servicing Mission 4 (STS-125). The 28 V camera features an IBIS5 1300 B CMOS image sensor from Cypress Semiconductor: a 1280 x 1024 focal plane array with a 6.7 µm pixel size that outputs a 1000 x 1000-pixel monochrome image over a dual data-strobed Low Voltage Differential Signaling (LVDS) physical interface.

The camera is paired with a commercially available, radiation-tolerant 8 – 24 mm zoom lens that has been ruggedized for spaceflight by NASA’s Goddard Space Flight Center. The motorized lens provides zoom and focus adjustment via two half-inch stepper motors, and the adjustable iris on the commercial version of the lens has been replaced with a fixed f/4.0 aperture. The VisCam provides a 45° x 45° FOV at the 8 mm lens setting and a 16° x 16° FOV at the 24 mm setting. The combination of the fixed aperture and the variable focal length and focus adjustments yields a depth of field that extends from approximately four inches in front of the lens out to infinity, according to NASA. The VisCam assembly also includes a stray light baffle coated with N-Science’s ultra-black Deep Space Black coating, which protects the camera from unwanted optical artifacts that arise from the dynamic lighting conditions in low Earth orbit.
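These figures are consistent with a simple pinhole-camera model relating focal length, pixel pitch, and the active image window. The short Python sketch below is illustrative only (the pinhole assumption and the rounding are ours, not NASA’s optical prescription); it reproduces the quoted fields of view from the 6.7 µm pixels and 1000 x 1000-pixel output described above:

    # Illustrative pinhole-model check of the VisCam field-of-view figures.
    # Pixel pitch and active window come from the sensor description above;
    # the pinhole assumption is ours, not NASA's optical prescription.
    import math

    PIXEL_PITCH_UM = 6.7     # IBIS5 1300 B pixel size
    ACTIVE_PIXELS = 1000     # 1000 x 1000-pixel output window

    def full_fov_deg(focal_length_mm):
        """Full field of view (degrees) across the active image window."""
        sensor_mm = ACTIVE_PIXELS * PIXEL_PITCH_UM / 1000.0   # 6.7 mm
        return 2.0 * math.degrees(math.atan(sensor_mm / (2.0 * focal_length_mm)))

    for f_mm in (8.0, 24.0):
        print(f"{f_mm:4.0f} mm lens -> {full_fov_deg(f_mm):.1f} deg full FOV")
    # Prints ~45.4 deg at 8 mm and ~15.9 deg at 24 mm, matching the quoted
    # 45 deg x 45 deg and 16 deg x 16 deg fields of view.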

Raven’s infrared camera, the IRCam, is a longwave infrared (LWIR) camera sensitive in the 8 – 14 µm wavelength range. It features a 640 x 480-pixel U6010 vanadium oxide microbolometer array from DRS Technologies and includes an internal shutter for on-orbit calibration and flat field correction. The camera operates over a USB 2.0 interface and uses an athermalized, 50 mm f/1.0 lens that yields an 18° x 14° FOV.
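The shutter gives every pixel a view of the same uniform scene, which is the usual reference for microbolometer flat field (non-uniformity) correction. The sketch below shows that kind of two-point correction; the gain and offset maps are synthetic placeholders, not Raven calibration data.

    # Illustrative flat field (non-uniformity) correction for a microbolometer,
    # using a frame captured while the internal shutter is closed as the offset
    # reference. Gain and offset values here are synthetic, not Raven data.
    import numpy as np

    def flat_field_correct(raw, shutter_frame, gain):
        """Per-pixel gain applied after subtracting the closed-shutter offset."""
        return gain * (raw.astype(np.float32) - shutter_frame.astype(np.float32))

    rng = np.random.default_rng(0)
    gain = rng.normal(1.0, 0.02, size=(480, 640)).astype(np.float32)     # per-pixel gain map
    shutter = rng.normal(3000.0, 20.0, size=(480, 640)).astype(np.float32)
    scene = shutter + 150.0        # a uniform scene 150 counts above the shutter level
    print(flat_field_correct(scene, shutter, gain).mean())   # ~150 after correction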

Also included in Raven’s sensor payload is a flash lidar, which collects data by first illuminating the scene with a wide-beam 1572 nm laser pulse and then collecting the reflected light on a 256 x 256 focal plane array. By clocking the time between transmission and reception, the focal plane array can accurately measure the distance to the reflecting surface, as well as the return intensity, at a rate of up to 30 Hz.
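That time-of-flight measurement reduces to range = c x Δt / 2, since the pulse travels out to the surface and back. A quick numeric sketch (illustrative, not flight code):

    # Range from the round-trip time of a flash lidar pulse (illustrative).
    C = 299_792_458.0   # speed of light, m/s

    def range_from_tof(round_trip_s):
        """One-way distance to the reflecting surface, in meters."""
        return C * round_trip_s / 2.0

    # A return arriving 1 microsecond after the 1572 nm pulse was transmitted
    # corresponds to a surface roughly 150 m away.
    print(f"{range_from_tof(1e-6):.1f} m")   # ~149.9 m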

Five days after launch, Raven will be removed from the unpressurized “trunk” of the SpaceX Dragon spacecraft by the Dextre robotic arm and attached to a payload platform outside the space station. From this location, Raven will begin providing information for the development of a mature real-time relative navigation system.

"Two spacecraft autonomously rendezvousing is crucial for many future NASA missions and Raven is maturing this never-before-attempted technology," said Ben Reed, deputy division director, for the Satellite Servicing Projects Division (SSPD) at NASA’s Goddard Space Flight Center in Greenbelt, Maryland — the office developing and managing this demonstration mission.

While on the ISS, Raven’s components will gather images and track visiting spacecraft as they arrive at and depart from the space station. Raven’s sensors will feed image data to a processor, which will run pose algorithms to estimate the relative distance between Raven and the spacecraft it is tracking. Based on these calculations, the processor will autonomously send commands that swivel the Raven module on its gimbal to keep the sensors trained on the vehicle while continuing to track it. Throughout, NASA operators on the ground will evaluate Raven’s capabilities and make adjustments to improve its tracking performance.
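To make that pointing step concrete, the sketch below converts an estimated relative position into proportional gimbal commands. The frame convention, function names, and gain are hypothetical illustrations; they are not Raven’s actual pose algorithms or control law.

    # Hypothetical gimbal-pointing step: turn an estimated relative position into
    # azimuth/elevation commands that keep the sensors trained on the vehicle.
    # Frame convention, names, and gain are illustrative, not Raven flight software.
    import math

    def pointing_error_deg(target_xyz):
        """Angular offset of the target from the sensor boresight, in degrees.
        target_xyz: estimated vehicle position in the gimbal frame
        (x along the boresight, y right, z down), in meters."""
        x, y, z = target_xyz
        az = math.degrees(math.atan2(y, x))
        el = math.degrees(math.atan2(-z, math.hypot(x, y)))
        return az, el

    def gimbal_command(target_xyz, gain=0.5):
        """Proportional step toward the target; a real system would also limit
        slew rates, filter the pose estimate, and handle loss of track."""
        az_err, el_err = pointing_error_deg(target_xyz)
        return gain * az_err, gain * el_err

    d_az, d_el = gimbal_command((120.0, 4.0, -2.5))   # vehicle ~120 m out, slightly off-axis
    print(f"slew {d_az:+.2f} deg in azimuth, {d_el:+.2f} deg in elevation")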

Over a two-year lifespan, Raven will test these technologies, which are expected to support future NASA missions for decades to come.

View the NASA press release.
View a technical paper on Raven.


About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
