Involuntary, fixational eye movements play a bigger role in vision than researchers previously thought, according to a new study.
Our eyes are never at rest. Instead, they remain in motion, even between our voluntary gaze shifts, through fixational eye movements—small, continuous movements that we are not aware of making.
Scientists have long sought to understand how humans can perceive the world as stable even though our eyes are constantly moving.
Past research has suggested that, in the intervals between voluntary gaze shifts, the human visual system builds a picture of a stable world by relying solely on sensory inputs from fixational eye movements.
According to the new study, however, there may be another contributing factor.
The researchers report that the visual system not only receives sensory inputs from fixational eye movements but also possesses knowledge of the motor behavior involved in those movements.
“The human brain has a very precise knowledge of how the eyes move, even if humans are not aware of moving them, and it uses this knowledge to infer spatial relations and perceive the world not as blurry but as stable,” says Michele Rucci, a professor in the brain and cognitive sciences department and Center for Visual Science at the University of Rochester.
The results of the research reveal that spatial representations—that is, the locations of objects in relation to other objects—are based on a combination of sensory and motor activity from both voluntary and involuntary eye movements, which is contrary to the prevailing understanding, Rucci explains.
“It was already clear that the visual system uses sensory and motor knowledge from large voluntary movements, either gaze shifts we perform to look at different parts of a scene, or tracking movements for following moving objects,” he says.
“But scientists didn’t think smaller, involuntary movements like fixational eye movements could be used to convey information through motor signals.”
Instead, the research shows that the visual system continually monitors motor activity, even when people believe they are holding a steady gaze. It also shows that vision relies on computational strategies similar to those of other senses, such as touch and smell, in which motor behavior profoundly shapes incoming sensory signals.
The results have important implications for future studies of visual perception and will help researchers better understand visual impairments that involve abnormal eye movements.
“Our study unveils that involuntary eye movements, which are often dismissed as motor noise, make major contributions to spatial representations of the world,” says Zhetuo Zhao, a PhD student in Rucci’s lab and the study’s first author. “As we show, studying spatial representations without considering motor activity—as is often done in current neuroscience—is severely limiting.”
The study appears in Nature Communications.
Source: University of Rochester