Hearing helps us see what we can’t see

U. PENN (US)—Hearing the name of an object improves chances of seeing it, even when the object is flashed onscreen in conditions and speeds (50 milliseconds) that would render it invisible, recent experiments show.

The findings, published in the journal PLoS One, suggest that the effect is specific to language: getting a good look at the object before the experiment did nothing to help participants see it when it was flashed.

Language changes what we see and also enhances perceptual sensitivity, the research shows. Verbal cues can influence even the most elementary visual processing and inform our understanding of how language affects perception.

The research team, led by psychologist Gary Lupyan, an assistant professor at the University of Pennsylvania, had participants complete an object detection task in which they judged whether briefly presented capital letters were present or absent on screen.

Other experiments within the study further defined the relationship between auditory cues and identification of visual images.

For example, researchers reasoned that if auditory cues help with object detection by encouraging participants to mentally picture the image, then the cuing effect might disappear when the target moved on screen.

The study found that verbal cues still clued participants in. No matter where on screen the target appeared, the effect of the auditory cue was undiminished, an advantage over visual cues.

Researchers also found that the magnitude of the cuing effect correlated with each participant’s own estimation of the vividness of their mental imagery. Using a common questionnaire, researchers discovered that those who consider their mental imagery particularly vivid scored higher when provided an auditory cue.

The team went on to determine that the auditory cue improved detection only when the cue was correct—that is, the target image and the verbal cue had to match. According to researchers, hearing the image labeled evokes an image of the object, strengthening its visual representation and thus making it visible.

“This research speaks to the idea that perception is shaped moment-by-moment by language,” says Lupyan. “Although only English speakers were tested, the results suggest that because words in different languages pick out different things in the environment, learning different languages can shape perception in subtle, but pervasive ways.”

Michael Spivey of the University of California, Merced, collaborated on the work, which was funded by the National Science Foundation.

More news from Penn: www.upenn.edu/pennnews


You are free to share this article under the Creative Commons Attribution-NoDerivs 3.0 Unported license.

