From Monkey Calls to Human Speech


by Jim Schnabel

July 13, 2009

The uniqueness of spoken language has made it relatively difficult to study: For ethical reasons, scientists are strictly limited in the experiments they can perform on humans, so much of the research in the field has focused on rare patients with brain damage in speech-related areas. In recent years, however, speech scientists have been making major advances based on studies of monkeys. Such studies indicate that primate communications, though much simpler than ours, rely on the same neural pathways and share many of the same basic features.

“These studies of how auditory processes work in primate brains give us an extremely useful framework for understanding speech processing in human brains,” says Sophie Scott, a neuroscientist at University College London who co-authored a review article on the subject published online May 26 in Nature Neuroscience.

Animal experiments have helped to confirm a fundamental theory about language processing: It is divided into two broad paths of activity in the brain. Inspired by the finding of a similar division of neural labor in visual processing, hearing researchers in the 1990s proposed that one auditory stream runs from the primary auditory cortex, deep in the center of the brain, to the back of the parietal cortex (the back half of the top of the brain), while another runs from the auditory cortex to the front of the temporal cortex (toward the bottom of the brain). As with vision, the first stream is believed to process spatial, “where” information, while the second processes “what” information concerning patterns and objects.

Several recent experiments in macaques and other animals have correlated “where” or “what” processing (zeroing in on the location of another monkey, or identifying its distinctive “voice”) with these anatomically separate processing streams in the brain. Even in cats, researchers confirmed in a study last year that the two functions dissociate: when the back portion of the auditory cortex is disrupted, the animal partially loses its ability to localize sounds but keeps its auditory pattern recognition, and when the front portion is disrupted, the animal loses its auditory pattern-recognition ability but can still localize sounds.

Hierarchies of hearing

Studies such as these have strongly suggested that within these processing streams, a basic “hierarchical” processing structure exists, similar to the one noted for vision. At the base of the hierarchy is the primary auditory cortex, where the sound information coming from the inner ear is represented “tonotopically,” according to its frequency, “just as in the visual system you have the retinotopic organization of [the] primary visual cortex,” says Josef Rauschecker, a neuroscientist and primate researcher at Georgetown University Medical Center in Washington, D.C., and first author of the review paper. From this core of more or less raw data, the neuronal circuits of the auditory cortex “combine feature elements to build representations of more complex [auditory] objects,” he says.

Higher up in this processing hierarchy, neurons represent more stable auditory objects that are less dependent on the physical properties of the incoming sound. Such objects include words or, in the case of monkeys, species-specific calls, which are perceived as having stable meanings despite a wide range of pitches and pronunciations. Rauschecker and colleagues have found evidence that the apex of this processing hierarchy, for both the “what” and the “where” streams, lies outside the auditory cortex, within the prefrontal cortex.

Subsequent studies have suggested that these prefrontal areas targeted by the auditory streams may be necessary for translating perception into action and decision. Yale Cohen, a researcher at the University of Pennsylvania School of Medicine, notes that he and his colleagues have recently done studies, published and unpublished, of prefrontal neuron responses in macaques trained to respond to auditory stimuli. “In the prefrontal cortex, what the neurons are telling us is not what the stimuli sound like, but what the animal reports,” he says, even if the animal’s report happens to be incorrect. “If you go one stage back, say in the temporal lobe, those neurons don’t care about what the animal is telling us; all they care about is the perception of the stimulus.”

The human speech system, for all its complexity, appears to be firmly rooted in the less developed neural structures of other primates. “We think the same anatomical pathways are available in monkeys and in humans,” says Rauschecker. “From an anatomical point of view there’s almost no difference, as far as we can see, between the two species.”

The challenge of future research, then, is to find out what else explains the enormous interspecies difference in speech capacity. “Finding out what primates don’t have in their connectivity and in their physiology I think will be crucial to understanding what makes speech so special,” Rauschecker says.