
Researchers produce wide-angle video with ball camera

Researchers from Carnegie Mellon University and the University of Electro-Communications have shown that a camera embedded in the side of a rubber-sheathed plastic foam football can record video while the ball is in flight, and have developed a computer algorithm that converts the raw video into a stable, wide-angle view.
Feb. 27, 2013

Football fans have become accustomed to viewing televised games from a dozen or more camera angles, but researchers at Carnegie Mellon University (Pittsburgh, PA, USA) and the University of Electro-Communications (UEC; Tokyo, Japan) suggest another possible camera position: inside the ball itself.

The researchers have shown that a camera embedded in the side of a rubber-sheathed plastic foam football can record video while the ball is in flight and could give spectators a unique, ball's-eye view of the playing field. Because a football can spin at 600 rpm, the raw video is an unwatchable blur. But the researchers developed a computer algorithm that converts the raw video into a stable, wide-angle view.
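
To put that spin rate in perspective: 600 rpm is 10 revolutions per second, so a conventional 30-frame/s camera would see the view rotate 120 degrees between consecutive frames. The short sketch below works through the arithmetic; the 600 rpm figure comes from the article, while the frame rates are illustrative assumptions.

```python
# Back-of-the-envelope: how far does the view rotate between frames?
# 600 rpm is the spin rate cited in the article; the frame rates
# below are illustrative assumptions, not figures from the researchers.

SPIN_RPM = 600.0

def degrees_per_frame(fps: float) -> float:
    """Degrees the camera's view rotates between consecutive frames."""
    revolutions_per_second = SPIN_RPM / 60.0   # 10 rev/s
    return revolutions_per_second * 360.0 / fps

for fps in (30.0, 60.0, 120.0):
    print(f"{fps:5.0f} fps -> {degrees_per_frame(fps):6.1f} deg/frame")

# At 30 fps the view swings 120 degrees per frame, so consecutive
# frames barely overlap -- hence the unwatchable blur, and the need
# to select and stitch frames rather than simply stabilize the video.
```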

Kris Kitani, a postdoctoral fellow in Carnegie Mellon's Robotics Institute, is aware that a football league is unlikely to approve camera-embedded footballs for regular play. Even so, BallCam might be useful for TV and movie productions, or for training purposes. Two demonstration videos are available on his website at http://www.cs.cmu.edu/~kkitani/Top.html.

Other researchers have developed throwable cameras that produce static images or use multiple cameras to capture stabilized video. Most notably, Steve Hollinger, president of S.H. Pierce and Co. (Boston, MA, USA), patented such an idea more than four years ago. The BallCam system developed by Kitani and Kodai Horita, along with Hideki Sasaki and Professor Hideki Koike of UEC, uses a single camera with a narrow field of view to generate a dynamic, wide-angle video.

When the ball is thrown in a clean spiral, the camera records a succession of frames as the ball rotates. When processing these frames, the algorithm uses the sky to determine which frames were captured while the camera was looking up and which while it was looking down. The upward frames are discarded, and the remaining, overlapping frames are stitched together to create a large panorama. Similar stitching software is used by NASA to combine images from Mars rovers into large panoramas and is increasingly found in digital cameras.
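
The article does not publish the researchers' code, but the pipeline it describes maps naturally onto standard tools. The sketch below is a minimal illustration under two loud assumptions: sky frames are detected with a simple brightness heuristic (the actual classifier is not described), and the stitching uses OpenCV's generic panorama stitcher rather than the researchers' own software.

```python
# Minimal sketch of the pipeline described above: drop the
# sky-facing frames, keep the overlapping ground-facing frames,
# and stitch them into one wide-angle panorama.
# Assumptions: a mean-brightness sky test and OpenCV's generic
# stitcher stand in for the researchers' unpublished algorithm.
import cv2
import numpy as np

def is_sky_frame(frame: np.ndarray, thresh: float = 170.0) -> bool:
    """Heuristic: upward-looking frames are dominated by bright sky.
    The threshold is a guess and would need tuning per camera."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return float(gray.mean()) > thresh

def ballcam_panorama(video_path: str) -> np.ndarray:
    cap = cv2.VideoCapture(video_path)
    ground_frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if not is_sky_frame(frame):   # discard the upward frames
            ground_frames.append(frame)
    cap.release()

    # Stitch the remaining, overlapping frames into a panorama,
    # as the article describes.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(ground_frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed (status {status})")
    return panorama

# Usage (file name is hypothetical):
# pano = ballcam_panorama("ballcam_throw.mp4")
# cv2.imwrite("ballcam_panorama.jpg", pano)
```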

The algorithm also corrects some of the image distortions, such as twisted yard lines, that arise from the speed of the ball's rotation. Further work will be necessary to eliminate all of the distortion, Kitani said, and a faster camera sensor or other techniques will be needed to reduce blurring. Multiple cameras might also be added to the football to improve the finished video.
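
The twisted yard lines are consistent with rolling-shutter skew: a CMOS sensor exposes each image row slightly later than the one above it, and a fast-spinning ball rotates appreciably during that readout. The sketch below illustrates one standard row-shift correction under that interpretation; the rolling-shutter model, field of view, and readout time are all illustrative guesses, not details from the paper.

```python
# Illustrative rolling-shutter de-skew: shift each row sideways to
# compensate for the rotation that occurred during sensor readout.
# The readout time and field of view are assumed values; the
# article does not specify the researchers' correction method.
import numpy as np

def unskew_rows(frame: np.ndarray,
                spin_rpm: float = 600.0,     # spin rate cited in the article
                readout_s: float = 1 / 120,  # assumed full-frame readout time
                hfov_deg: float = 90.0       # assumed horizontal field of view
                ) -> np.ndarray:
    """Undo horizontal skew caused by rotation during row-by-row readout."""
    h, w = frame.shape[:2]
    deg_per_s = spin_rpm / 60.0 * 360.0      # 3600 deg/s at 600 rpm
    px_per_deg = w / hfov_deg
    out = np.empty_like(frame)
    for y in range(h):
        # Rotation accumulated by the time row y was read out.
        skew_deg = deg_per_s * readout_s * (y / h)
        shift_px = int(round(skew_deg * px_per_deg))
        # np.roll wraps at the borders; a real correction would
        # crop or pad instead of wrapping.
        out[y] = np.roll(frame[y], -shift_px, axis=0)
    return out
```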

Co-author Horita, a visiting graduate student last year at the Robotics Institute, will present a paper about BallCam on March 8 at the Augmented Human International Conference in Stuttgart, Germany.

-- Dave Wilson, Senior Editor, Vision Systems Design
