Supercomputers with a driver’s license?

YALE (US)— Researchers have developed a supercomputer based on the human visual system, mimicking its neural network to quickly interpret the world around it.

“One of our first prototypes of this system is already capable of outperforming graphic processors on vision tasks,” says Eugenio Culurciello, associate professor of electrical engineering at Yale University.

Culurciello embedded the supercomputer on a single chip, making the system smaller, yet more powerful and efficient, than full-scale computers.

“The complete system is going to be no bigger than a wallet, so it could easily be embedded in cars and other places,” he says.

The NeuFlow system uses complex vision algorithms developed by Yann LeCun at New York University to run large neural networks for synthetic vision applications.
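
The article does not spell out those algorithms, but LeCun's synthetic-vision work is built around convolutional networks, which pass an image through repeated stages of filtering with learned kernels, a pointwise nonlinearity, and pooling. The NumPy sketch below, with made-up filter values and sizes, only illustrates that general pattern; it is not the NeuFlow implementation.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution of a single-channel image with a small kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Pointwise nonlinearity."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling, shrinking each spatial dimension."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# A toy "frame" and random filters standing in for learned weights.
rng = np.random.default_rng(0)
frame = rng.random((64, 64))
filters = [rng.standard_normal((5, 5)) for _ in range(4)]

# One convolution -> nonlinearity -> pooling stage per filter.
feature_maps = [max_pool(relu(conv2d(frame, k))) for k in filters]
print([fm.shape for fm in feature_maps])  # four 30x30 feature maps
```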

The idea that Culurciello and LeCun are focusing on is a system that would allow cars to drive themselves. In order to recognize the various objects encountered on the road, such as other cars, people, stoplights, sidewalks, and the road itself, NeuFlow processes tens of megapixel images in real time.

The system is also extremely efficient: it runs more than 100 billion operations per second on only a few watts, less power than a cell phone uses, to accomplish what bench-top computers with multiple graphics processors need more than 300 watts to achieve.
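
As a rough sanity check on those figures, the short calculation below turns them into an operations-per-watt estimate. The article says only "a few watts," so the 5 W value is an assumption used for illustration.

```python
# Back-of-the-envelope efficiency comparison using the figures quoted above.
neuflow_ops_per_sec = 100e9   # >100 billion operations per second (from the article)
neuflow_watts = 5.0           # assumed; the article says only "a few watts"
gpu_bench_watts = 300.0       # multi-GPU bench-top system (from the article)

neuflow_gops_per_watt = neuflow_ops_per_sec / 1e9 / neuflow_watts
print(f"NeuFlow: ~{neuflow_gops_per_watt:.0f} GOPS per watt")
print(f"Power ratio vs. the 300 W bench-top system: ~{gpu_bench_watts / neuflow_watts:.0f}x")
```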

Beyond autonomous car navigation, the system could help robots navigate dangerous or difficult-to-reach locations, provide 360-degree synthetic vision for soldiers in combat, or monitor motion in assisted-living settings and call for help should an elderly person fall, for example.

More news from Yale University: http://opa.yale.edu/

You are free to share this article under the Creative Commons Attribution-NoDerivs 3.0 Unported license.
