CARNEGIE MELLON (US) — Embedding a camera in the side of a football could give spectators a new, ball’s-eye view of the playing field.
Football fans have become accustomed to viewing televised games from a dozen or more camera angles, but researchers suggest another possible camera position: inside the ball itself.
Because a football can spin at 600 rpm, the raw video is an unwatchable blur. But the researchers developed a computer algorithm that converts the raw video into a stable, wide-angle view.
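A back-of-the-envelope calculation shows why stabilization is essential. Assuming a conventional 30 fps camera (the article does not state the frame rate), a ball spinning at 600 rpm rotates a large fraction of a full turn between consecutive frames:

```python
# How far does a 600 rpm football rotate between video frames?
# Assumption: a conventional 30 fps camera (frame rate not given in the article).
rpm = 600
fps = 30

revs_per_second = rpm / 60                        # 10 full revolutions per second
degrees_per_frame = revs_per_second * 360 / fps   # rotation between consecutive frames

print(degrees_per_frame)  # 120.0 degrees of rotation per frame
```

At a third of a revolution per frame, no two consecutive frames look in remotely the same direction, which is why the raw footage reads as a blur of sky and grass.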
Kris Kitani, a post-doctoral fellow in Carnegie Mellon University’s Robotics Institute, is aware that a football league is unlikely to approve camera-embedded footballs for regular play. Even so, the BallCam might be useful for TV, movie productions or training purposes.
Study co-author Kodai Horita, a visiting graduate student last year at the Robotics Institute, will present a paper about BallCam on March 8 at the Augmented Human International Conference in Stuttgart, Germany.
Kitani says BallCam was developed as part of a larger exploration of digital sports. “We’re interested in how technology can be used to enhance existing sports and how it might be used to create new sports,” he explains.
In some cases, athletic play may be combined with arts or entertainment; a camera-embedded ball, for instance, might be used to capture the expressions on the faces of players as they play catch with it.
Other researchers have developed throwable cameras that produce static images or use multiple cameras to capture stabilized video. The BallCam system, made from a rubber-sheathed plastic foam football, uses a single camera with a narrow field of view to generate a dynamic, wide-angle video.
When the ball is thrown in a clean spiral, the camera records a succession of frames as the ball rotates. When processing these frames, the algorithm uses the sky to determine which frames were captured when the camera was looking up and which when it was looking down.
The upward frames are discarded and the remaining, overlapping frames are stitched together with image-stitching software to create a large panorama. Similar stitching software is used by NASA to combine images from Mars rovers into large panoramas and is increasingly found in digital cameras.
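The discard-the-sky step above can be sketched in a few lines. This is a minimal illustration, not the paper's actual method: it assumes sky-facing frames are simply brighter on average than grass-facing ones, and the threshold and helper names are invented for the example.

```python
import numpy as np

def is_sky_frame(frame, brightness_threshold=170):
    """Classify a frame as sky-facing by its mean brightness.

    Assumption (illustrative only): frames looking at the sky are
    markedly brighter than frames looking at the grass. The article
    does not describe the researchers' actual classifier.
    """
    return frame.mean() > brightness_threshold

def select_ground_frames(frames):
    """Keep only the downward-looking frames, which would later be
    stitched into the wide-angle panorama."""
    return [f for f in frames if not is_sky_frame(f)]

# Synthetic stand-ins for video frames: bright "sky" vs. dark "grass".
sky = np.full((8, 8), 220, dtype=np.uint8)
grass = np.full((8, 8), 70, dtype=np.uint8)
frames = [sky, grass, sky, grass, grass]

kept = select_ground_frames(frames)
print(len(kept))  # 3 ground-facing frames remain
```

The surviving frames still overlap heavily (each covers a narrow field of view), which is what makes panorama stitching possible.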
The algorithm also corrects some of the image distortions, caused by the speed of the ball's rotation, that bend straight features such as yard lines. Further work will be necessary to eliminate all of the distortion, Kitani says, and a faster camera sensor or other techniques will be needed to reduce blurring. Multiple cameras might also be added to the football to improve the finished video.
Kitani and Horita developed BallCam along with Hideki Sasaki and Professor Hideki Koike of the University of Electro-Communications (UEC) in Tokyo, where Horita is a graduate student.
Source: Carnegie Mellon University