That Look on Your Face May Affect Your Speech Perception

by Jim Schnabel

March 30, 2009

How you understand spoken words may be influenced by the expression on your face, according to new research. The finding sheds light on the neural pathways the brain uses both to perceive and to produce speech correctly.

“When we—scientists included—think about speech, we generally think about the auditory system and hearing,” says the study’s corresponding author, David Ostry, a psychology professor at McGill University in Montreal and a senior scientist at the Yale-affiliated Haskins Laboratories in Connecticut. “Over the past several years, we have started to recognize that the somatosensory system, the other sensory modality that is involved when we talk, may play an equally important role.”

Ostry, along with Haskins researchers Takayuki Ito and Mark Tiede, used a programmable skin-stretching apparatus on 75 volunteers as they listened to a test word whose sound varied randomly between “had” and “head.” On each trial, the device stretched the subject’s facial skin in a specific direction for a set interval.

The researchers found that the subjects were more likely to perceive the word “had” when the imposed skin movements approximated those that would normally accompany saying “had.” Similarly, they were more likely to perceive “head” when their facial skin was moved as if they were saying that word.

As the researchers expected, the influence of facial motion on the word the subjects perceived was clearest when the sound of the word was ambiguous, falling between “had” and “head.” That influence tended to disappear when the test word became less ambiguous and could be perceived easily from its sound alone, or when the skin stretcher made movements unrelated to forming either word. The research was reported in December in the Proceedings of the National Academy of Sciences.

“It’s a neat study,” says Marco Iacoboni, a neuroscientist at UCLA’s David Geffen School of Medicine who has done research in this area. Iacoboni suspects that the kind of skin stretching used in the Haskins study activates part of a perceptual system involving so-called mirror neurons.

As work by Iacoboni and other researchers has made clear, the mirror neuron system connects muscle-controlling neurons in the premotor cortex to sensory neurons in other brain areas. Through this system, for example, the process of hearing a word spoken by someone else activates the motor program used for speaking that word. Many researchers believe that this mirroring property effectively turns the human sensorimotor system, with its millions of nerve connections throughout the body and brain, into an exquisitely sensitive antenna for the deep perception of anything another human happens to be doing.

In the case of the subjects whose facial expressions were manipulated with the skin-stretching device, says Iacoboni, “I think what happens is that the somatosensory signals are sent to the motor system, and because the motor system is involved in simulating the motor plan for speaking when we perceive the speech of other people, these somatosensory signals injected into the motor system alter its activity and thus alter speech perception.”

Ostry agrees that the mirror system is likely to be involved. But “there are other multisensory areas in auditory and somatosensory cortex as well that might be involved, and we are pursuing this in current work,” he says.

In a study published online in Nature Neuroscience last September, Ostry and Sazzad Nasir found that deaf people were highly sensitive to speech-relevant somatosensory inputs and apparently relied on those inputs to produce intelligible speech in the absence of auditory feedback. “It means that when we talk, we are just as concerned about getting our speech movements right as we are about the sounds themselves,” says Ostry.