
Matthew H. Davis and Ingrid S. Johnsrude (2003)

Hierarchical Processing in Spoken Language Comprehension

In press at The Journal of Neuroscience, 23(8), 3423-3431

Abstract:

Understanding spoken language requires a complex series of processing stages to translate speech sounds into meaning. In this study, we use functional magnetic resonance imaging to explore the brain regions that are involved in spoken language comprehension, fractionating this system into sound-based and more abstract higher-level processes. We distorted English sentences in three acoustically different ways, applying each distortion to varying degrees to produce a range of intelligibility (quantified as the number of words that could be reported) and collected whole-brain echo-planar imaging data from 12 listeners using sparse imaging. The blood oxygenation level-dependent signal correlated with intelligibility along the superior and middle temporal gyri in the left hemisphere and in a less extensive homologous area on the right, the left inferior frontal gyrus (LIFG), and the left hippocampus. Regions surrounding auditory cortex, bilaterally, were sensitive to intelligibility but also showed a differential response to the three forms of distortion, consistent with sound-form-based processes. More distant intelligibility-sensitive regions within the superior and middle temporal gyri, hippocampus, and LIFG were insensitive to the acoustic form of sentences, suggesting more abstract nonacoustic processes. The hierarchical organization suggested by these results is consistent with cognitive models and auditory processing in nonhuman primates. Areas that were particularly active for distorted speech conditions and, thus, might be involved in compensating for distortion, were found exclusively in the left hemisphere and partially overlapped with areas sensitive to intelligibility, perhaps reflecting attentional modulation of auditory and linguistic processes.
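The logic of the analysis can be illustrated with a small sketch (this is not the actual imaging analysis used in the paper, and all numbers and variable names below are hypothetical): a region counts as intelligibility-sensitive if its response tracks the proportion of words reported, and as acoustically sensitive if, after removing that intelligibility effect, its response still differs between the three forms of distortion.

    # Illustrative sketch only (hypothetical data, not the authors' SPM analysis):
    # for a single simulated "voxel", test (1) correlation with intelligibility and
    # (2) whether the residual response differs across distortion types.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Hypothetical design: 3 distortion types x 4 intelligibility levels
    # (proportion of words reported), mirroring the range-of-intelligibility manipulation.
    intelligibility = np.tile([0.2, 0.5, 0.8, 1.0], 3)   # regressor of interest
    distortion_type = np.repeat([0, 1, 2], 4)            # acoustic form of the sentences

    def simulate_voxel(slope, type_offsets, noise=0.1):
        """Fake per-condition response: linear in intelligibility plus a per-type offset."""
        return (slope * intelligibility
                + np.array(type_offsets)[distortion_type]
                + rng.normal(0, noise, intelligibility.size))

    # Voxel A: intelligibility-sensitive, form-independent (like the more "abstract" regions).
    voxel_a = simulate_voxel(slope=1.0, type_offsets=[0.0, 0.0, 0.0])
    # Voxel B: intelligibility-sensitive but also form-sensitive (like peri-auditory regions).
    voxel_b = simulate_voxel(slope=1.0, type_offsets=[0.0, 0.4, -0.4])

    for name, y in [("A", voxel_a), ("B", voxel_b)]:
        r, p_int = stats.pearsonr(intelligibility, y)
        # Remove the intelligibility effect, then test for a residual effect of
        # distortion type: a significant result indicates sensitivity to acoustic form.
        resid = y - np.polyval(np.polyfit(intelligibility, y, 1), intelligibility)
        groups = [resid[distortion_type == t] for t in range(3)]
        f, p_form = stats.f_oneway(*groups)
        print(f"Voxel {name}: intelligibility r={r:.2f} (p={p_int:.3f}); "
              f"form effect F={f:.2f} (p={p_form:.3f})")

Under these assumptions, voxel A shows only the intelligibility correlation, whereas voxel B additionally shows a distortion-form effect, the dissociation that motivates the hierarchical interpretation in the abstract.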

Request a pdf by emailing: matt.davis@mrc-cbu.cam.ac.uk

Listen to example stimuli.


This page was created on 21st March 2003. Comments and suggestions to matt.davis@mrc-cbu.cam.ac.uk.
