CBSU bibliography search

Hierarchical Processing in Spoken Language Comprehension
Journal of Neuroscience, 23(8), 3423-3431
Understanding spoken language requires a complex series of processing stages to translate speech sounds into meaning. In this paper we use fMRI to explore the brain regions involved in spoken language comprehension, fractionating this system into sound-based and more abstract, higher-level processes. We distorted English sentences in three acoustically different ways, applying each distortion to varying degrees to produce a range of intelligibility (quantified as the number of words that could be reported), and collected whole-brain EPI data from 12 listeners using sparse imaging. BOLD signal correlated with intelligibility along the superior and middle temporal gyri, in the left inferior frontal gyrus (LIFG), and in the left hippocampus. Regions surrounding auditory cortex, bilaterally, were sensitive to intelligibility but also showed a differential response to the three forms of distortion, consistent with sound-form-based processes. More distant intelligibility-sensitive regions within the superior and middle temporal gyri, hippocampus, and LIFG were insensitive to the acoustic form of sentences, suggesting more abstract, non-acoustic processes. The hierarchical organisation suggested by these results is consistent with cognitive models and with auditory processing in non-human primates. Areas that were particularly active for distorted speech conditions, and thus might be involved in compensating for distortion, were found exclusively in the left hemisphere and partially overlapped with areas sensitive to intelligibility, perhaps reflecting attentional modulation of auditory and linguistic processes.