It’s difficult enough to see things in the dark, but if you’re a hummingbird-sized hawkmoth you also have to juggle hovering in midair while tracking a flower that’s moving in the wind.
Using high-speed infrared cameras and 3D-printed robotic flowers, scientists have learned that the insects manage these complex sensing and control challenges by slowing their brains to improve vision under low-light conditions—while continuing to perform demanding tasks.
The findings could help the next generation of small flying robots operate efficiently under a broad range of lighting conditions, researchers say.
“There has been a lot of interest in understanding how animals deal with challenging sensing environments, especially when they are also doing difficult tasks like hovering in mid-air,” says lead author Simon Sponberg, a former postdoctoral researcher at the University of Washington who is now an assistant professor in the School of Physics and School of Applied Physiology at Georgia Institute of Technology.
“This is also a very significant challenge for micro air vehicles.”
The hawkmoth has been studied extensively to investigate the fundamental principles governing the development and function of its neural system, says coauthor Tom Daniel, a biology professor and director of the new Air Force Center of Excellence on Nature-Inspired Flight Technologies and Ideas at the University of Washington.
Daniel’s research group has experimentally characterized the response of flying hawkmoths using a sensory input composed of a linear sum of sine waves. The new paper, published in Science, extends application of the “sum of sines” approach, he says.
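For intuition, a sum-of-sines stimulus of this kind can be sketched in a few lines. The frequencies, amplitudes, and phases below are illustrative choices, not the values used in the study:

```python
import numpy as np

# Illustrative parameters -- the actual frequencies, amplitudes, and
# phases used in the experiments are not reproduced here.
rng = np.random.default_rng(0)
freqs = np.array([0.2, 0.5, 0.7, 1.1, 1.7, 2.9, 5.3, 7.9, 11.3, 13.7])  # Hz
amps = 1.0 / freqs                       # smaller excursions at higher frequency
phases = rng.uniform(0, 2 * np.pi, freqs.size)

def flower_position(t):
    """Lateral flower position: a linear sum of sine components.

    Using frequencies that are not simple multiples of one another lets
    each component be separated cleanly in the animal's tracking
    response, which is what makes the "sum of sines" approach useful.
    """
    return np.sum(
        amps[:, None] * np.sin(2 * np.pi * freqs[:, None] * t + phases[:, None]),
        axis=0,
    )

t = np.linspace(0, 10, 1000)   # 10 seconds of stimulus
x = flower_position(t)
```

Because every frequency component is present simultaneously, a single trial probes the moth's response across the whole frequency band at once.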
“Simon’s work took the formal methods of control theory to dissect out how neural circuits adapt to vast ranges of luminance levels,” adds Daniel. “By looking at the time delays in the movement dynamics of a freely flying moth—interacting with the input of a robotically moved flower—Simon was able to extract the luminance-dependent processing of the moth’s central nervous system.”
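One way such luminance-dependent time delays can be estimated from tracking data is to compare the flower and moth trajectories at each stimulus frequency. This is a minimal sketch of the idea, not the study's analysis code; the function name and the synthetic check are our own:

```python
import numpy as np

def gain_and_delay(flower, moth, dt, freq):
    """Estimate tracking gain and effective time delay at one frequency.

    Projects both trajectories onto a complex exponential at `freq`
    (a single-frequency Fourier coefficient), then compares magnitudes
    and phases. Phase lag divided by angular frequency gives an
    effective time delay -- the quantity that grows as visual
    processing slows in dim light.
    """
    t = np.arange(len(flower)) * dt
    basis = np.exp(-2j * np.pi * freq * t)
    F = np.sum(flower * basis)
    M = np.sum(moth * basis)
    H = M / F                                # complex frequency response
    gain = np.abs(H)
    delay = -np.angle(H) / (2 * np.pi * freq)
    return gain, delay

# Synthetic check: a "moth" that tracks perfectly but 50 ms late.
dt, f = 0.001, 2.0
t = np.arange(0, 10, dt)
flower = np.sin(2 * np.pi * f * t)
moth = np.sin(2 * np.pi * f * (t - 0.05))
g, d = gain_and_delay(flower, moth, dt, f)   # gain ≈ 1.0, delay ≈ 0.05 s
```

Repeating this at every frequency in the sum-of-sines stimulus, under bright and dim light, would reveal how much extra delay the dimmer condition introduces.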
Scientists already knew that the moths, which feed on flower nectar during the evening and at dusk and dawn, use specialized eye structures to maximize the amount of light they can capture. But they also surmised that the insects might be slowing their nervous systems to make the best use of this limited light.
But if they were slowing their brains to see better, wouldn’t that hurt their ability to hover and track the motion of flowers?
To study this question, researchers used high-speed infrared cameras and nectar-dispensing robotic flowers that could be moved from side to side at different rates. While varying both the light conditions and the frequency at which the flowers moved, they were able to study how well free-flying moths kept their tongues—or proboscises—in the flowers.
They also measured real flowers blowing in the wind to determine the range of motion the insects had to contend with in the wild.
“We expected to see a tradeoff with the moths doing significantly worse at tracking flowers in low light conditions,” Sponberg says. “What we saw was that while the moths did slow down, that only made a difference if the flower was moving rapidly—faster than they actually move in nature.”
In the experiments, the moths tracked robotic flowers that were oscillating at rates of up to 20 hertz—20 oscillations per second. That was considerably faster than the two-hertz maximum rate observed in real flowers. Because the moth’s wings beat at a rate of about 25 strokes per second, they had to adjust their direction of movement with nearly every wingstroke—a major sensing, computational, and control accomplishment.
“This is really an extreme behavior, though the moth makes it look simple and elegant,” Sponberg says. “To maneuver like this is really quite challenging.”
What it means for robots
In the natural world, light intensity varies 10 billion-fold from noon on a sunny day to midnight on a cloudy evening. Operating in that range of luminosity is a challenge for both moths and the sensors on human-engineered systems. Understanding how natural systems adjust to this range of conditions could therefore have broader benefits.
“If we want to have robots or machine vision systems that are working under this broad range of conditions, understanding how these moths function under these varying light conditions would be very useful,” Sponberg says.
To gather the data reported in this paper, the researchers used a robotic flower able to move in one dimension. Recently, they’ve used actuators from a 3D printer to build a robotic flower that moves in two or three dimensions, providing an additional challenge for the moths.
In future research, they hope to incorporate their robotic flower into a low-speed wind tunnel to study the aerodynamic challenges the moths overcome—including the role of wing vortices and the interaction of airflow from the insect’s wings with the flowers.
Robert Hall from the University of Washington and Jonathan Dyhr from Northwest University are study coauthors.
The National Science Foundation and the Air Force Office of Scientific Research funded the work.