A new flying robot can detect and avoid fast-moving objects, researchers report.
The advance brings scientists a step closer to drones that can fly fast through harsh environments.
Although many flying robots have cameras to detect obstacles, it typically takes from 20 to 40 milliseconds for the drone to process the image and react.
That may seem fast, but it's not quick enough to avoid a bird, another drone, or even a static obstacle when the drone itself is flying at high speed. This poses a problem for drones operating in unpredictable environments, or when many fly in the same area.
To solve the problem, researchers equipped a quadcopter (a drone with four propellers) with special cameras and algorithms that cut its reaction time to just a few milliseconds—enough to avoid a ball thrown at it from a short distance.
The results, published in Science Robotics, could make drones more effective in situations such as the aftermath of a natural disaster.
“For search and rescue applications, such as after an earthquake, time is very critical, so we need drones that can navigate as fast as possible in order to accomplish more within their limited battery life,” says Davide Scaramuzza, who leads the Robotics and Perception Group at the University of Zurich as well as the NCCR Robotics Search and Rescue Grand Challenge.
“However, by navigating fast, drones are also more exposed to the risk of colliding with obstacles, and even more so if those obstacles are moving. We realized that a novel type of camera, called an event camera, is a perfect fit for this purpose.”
Traditional video cameras, such as the ones found in every smartphone, regularly take snapshots of the whole scene, exposing all the pixels of the image at the same time. As a result, a moving object can be detected only after the on-board computer has analyzed every pixel.
Event cameras, on the other hand, have smart pixels that work independently of each other. The pixels that detect no changes remain silent, while the ones that see a change in light intensity immediately send out the information.
This means that only a tiny fraction of all the pixels in the image needs to be processed by the on-board computer, greatly speeding up the computation.
Event cameras are a recent innovation, and existing object-detection algorithms for drones don't work well with them. So the researchers had to invent their own algorithms, which collect all the events the camera records over a very short time, then subtract the effect of the drone's own movement, which typically accounts for most of the changes in what the camera sees.
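The study's actual algorithm is more involved, but the basic idea—warp each event back to a common reference time using the drone's known ego-motion, then flag pixels whose events are not explained by that motion—can be sketched roughly. In this illustrative sketch (not the authors' code; all function names, flow values, and thresholds are invented for the example), pixels belonging to the static scene collect motion-compensated events spread across the whole time window, while an independently moving object leaves pixels whose events arrive only at one end of it:

```python
import numpy as np

def moving_object_mask(events, ego_flow, t0, t1, shape, dev=0.3):
    """events: (N, 3) array of (x, y, t) event coordinates and timestamps.
    Warp each event to time t0 using the image-plane flow induced by the
    drone's own motion (assumed uniform here for brevity), then score each
    pixel by the mean normalized timestamp of its warped events: static
    scene pixels average ~0.5, moving-object pixels deviate strongly."""
    counts = np.zeros(shape)
    tsum = np.zeros(shape)
    tn = (events[:, 2] - t0) / (t1 - t0)               # normalized timestamps
    warped = events[:, :2] - np.asarray(ego_flow) * (events[:, 2:3] - t0)
    for (x, y), s in zip(np.rint(warped).astype(int), tn):
        if 0 <= y < shape[0] and 0 <= x < shape[1]:
            counts[y, x] += 1
            tsum[y, x] += s
    mean_ts = np.where(counts > 0, tsum / np.maximum(counts, 1), 0.5)
    return np.abs(mean_ts - 0.5) > dev                 # True = likely mover

# Synthetic 10 ms window: the camera pans so the whole image drifts at
# 1000 px/s; a static edge and an independently moving point emit events.
t = np.linspace(0.0, 0.01, 11)
static = np.stack([20 + 1000 * t, np.full(11, 15.0), t], axis=1)
mover = np.stack([40 + 2000 * t, np.full(11, 5.0), t], axis=1)
mask = moving_object_mask(np.vstack([static, mover]), (1000.0, 0.0),
                          0.0, 0.01, (32, 64))
```

After compensation the static edge collapses onto a single pixel with events spread over the full window, so it is correctly ignored; the mover's residual motion smears its events across pixels that each fire only briefly, so they stand out.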
Drone detection in 3.5 milliseconds
Scaramuzza and his team first tested the cameras and algorithms alone. They threw objects of various shapes and sizes towards the camera, and measured how efficiently the algorithm detected them. The success rate varied between 81 and 97%, depending on the size of the object and the distance of the throw, and the system only took 3.5 milliseconds to detect incoming objects.
Then came the most serious test: mounting the cameras on an actual drone, flying it both indoors and outdoors, and throwing objects directly at it. The drone avoided the objects, including a ball thrown from three meters (about 10 feet) away and traveling at 10 meters (32 feet) per second, more than 90% of the time.
When the drone “knew” the size of the object in advance, one camera was enough. When, instead, it had to face objects of varying size, two cameras gave it stereoscopic vision.
The results show that event cameras can increase the speed at which drones navigate by up to 10 times, expanding their possible applications, Scaramuzza says.
“One day drones will be used for a large variety of applications, such as delivery of goods, transportation of people, aerial filmography and, of course, search and rescue,” he says. “But enabling robots to perceive and make decisions faster can be a game changer also for other domains where reliably detecting incoming obstacles plays a crucial role, such as automotive, goods delivery, transportation, mining, and remote inspection with robots.”
Nearly as reliable as human pilots
In the future, the team aims to test this system on an even more agile quadrotor.
“Our ultimate goal is to one day have autonomous drones navigate as well as human drone pilots. Currently, in all search and rescue applications where drones are involved, the human is actually in control,” says Davide Falanga, a PhD student and the study’s primary author.
“If we could have autonomous drones navigate as reliably as human pilots, we would then be able to use them for missions that fall beyond the line of sight or beyond the reach of the remote control.”
The Swiss National Science Foundation through the National Center of Competence in Research (NCCR) Robotics funded the work.
Source: University of Zurich