Tickling a baby’s toes may be cute, but those touches may also help babies learn the words of their language.
New research shows that a caregiver’s touch could help babies to find words in the continuous stream of speech.
“We found that infants treat touches as if they are related to what they hear and thus these touches could have an impact on their word learning,” says Amanda Seidl, an associate professor of speech, language, and hearing sciences at Purdue University who studies language acquisition.
“We think of touch as conveying affection, but our recent research shows that infants can relate touches to their incoming speech signal. Others have looked at the role of touch with respect to babies forming an attachment and physical development. But until now the impact of touch on language learning has not been explored.”
The findings appear in Developmental Science. Seidl is interested in the many cues, or sources of information, that babies may combine to learn their language. Learning words is a challenge for infants because caregivers utter most words in a continuous stream of speech rather than in isolation.
“Parents may pause before saying an infant’s name, but they almost never do so for other words. This research explored whether touches could help infants find where words begin and end in the continuous stream of speech. Infants need to find words before they can attach real meaning to them,” Seidl says.
“Because names of body parts are often the first words that babies learn and touching is often involved when caregivers talk about body parts, we speculated that touch could act as a cue to word edges.”
A total of 48 English-learning 4-month-olds were tested at Purdue’s Infant Speech Lab in two groups as they sat on a parent’s lap facing an experimenter while listening to a pre-recorded continuous stream of speech of nonsense words.
In the first experiment, the experimenter touched the baby’s knee every time a nonsense word, such as “dobita,” was spoken, for a total of 24 touches. The word “lepoga” was also played 24 times, but the baby’s elbow was touched only once while that word played; the other 23 elbow touches occurred during other syllable sequences.
After this listening phase, the babies took part in a language preference test, and almost all showed that they had pulled “dobita,” the word reinforced by the aligned touches, out of the continuous stream of speech.
In the second experiment, the same format of continuous speech and new words was played, but the experimenter touched his or her own eyebrow or chin instead of touching the baby. The children in this experiment showed no evidence of having pulled out any words.
“It didn’t matter how much time the infant spent looking at the experimenter’s face; the babies were not able to use these cues in the same way as when their own body was touched,” says Seidl, who is now looking at individual differences in how parents speak to and touch their babies.
“I am interested in whether we can predict babies’ later language from early measures of speech perception,” Seidl says. “If we look at speech perception and learning in a 6-month-old, can we predict their language ability at 3 years? If we can find out what kinds of learners young children are, we could target their learning environment to their learning style.”
Also part of the research team are Ruth Tincoff, an assistant professor at Bucknell University; former Purdue undergraduate student Christopher Baker; and former Purdue graduate student Alejandrina Cristia.
The National Science Foundation supported the research.
Source: Purdue University