Speech perception relies on integrating sensory information with predictions about which words are likely to be heard. Researchers from the University of Cambridge's MRC Cognition and Brain Sciences Unit (MRC CBU) and Department of Clinical Neurosciences, with collaborators in Newcastle, have shown that regions of the frontal lobe allow listeners to use these predictions flexibly when understanding degraded speech.
In a paper recently published in Nature Communications (https://www.nature.com/articles/s41467-017-01958-7), a team led by Thomas Cope asked patients with language problems (aphasia), caused by selective neurodegeneration of frontal brain regions, to listen to speech. These patients typically have difficulty speaking fluently. The work provides new evidence that brain regions long known to be involved in speech production also contribute to speech perception.
The researchers measured the tiny magnetic and electric fields produced by brain activity during the perception of speech that had been degraded so it was hard to understand. In healthy elderly individuals, the magnitude, timing and location of brain activity depended on how clearly the spoken word was presented, and also on whether or not listeners were expecting that particular word (based on written subtitles presented beforehand). In patients with frontal lobe damage, the brain responded to changes in speech clarity in exactly the same way as in healthy individuals. However, the same patients showed a delayed brain response to words that matched or mismatched the written subtitles. The length of this delay was directly related to the extent of frontal-lobe neurodegeneration, and it made patients with aphasia inflexible when listening to degraded speech: they continued to expect a spoken word to match the written cue even after healthy individuals had already recognised that it did not.
These findings help resolve a long-standing controversy concerning the functional role of frontal brain regions during speech perception. They also explain a number of previously puzzling symptoms of frontal aphasia. For example, inflexible predictions can account for why patients with frontal aphasia have problems understanding grammar, and why they report hearing difficulties even though their ability to detect quiet sounds (i.e. their audiogram) remains normal. The findings further support the hypothesis that the brain represents information in terms of how well it matches predictions (a 'predictive coding' theory) and provide a direct link between the roles of the frontal lobe in language and in flexible behaviour.
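The predictive-coding idea can be sketched as a toy Bayesian cue-integration model. This is purely illustrative and not from the paper: the example words, probabilities and the "rigidity" exponent are assumptions chosen to show how an over-weighted prediction from a written cue could override conflicting acoustic evidence.

```python
# Toy sketch of prediction-evidence integration (illustrative only).
# A written subtitle sets a prior over candidate words; degraded audio
# supplies a likelihood. A listener who cannot relax the prior (a crude
# model of the patients' inflexibility) keeps choosing the cued word
# even when the audio favours a different one.

def posterior(prior, likelihood):
    """Combine prior and likelihood, normalising to a distribution."""
    unnorm = {w: prior[w] * likelihood[w] for w in prior}
    total = sum(unnorm.values())
    return {w: p / total for w, p in unnorm.items()}

subtitle_prior = {"clay": 0.7, "play": 0.3}    # subtitle showed "clay"
audio_likelihood = {"clay": 0.1, "play": 0.9}  # audio actually says "play"

# Flexible listener: evidence can overturn the prediction.
flexible = posterior(subtitle_prior, audio_likelihood)

# Inflexible listener: prior is over-weighted (hypothetical exponent).
rigid_prior = {w: p ** 3 for w, p in subtitle_prior.items()}
inflexible = posterior(rigid_prior, audio_likelihood)

def best(dist):
    return max(dist, key=dist.get)

print(best(flexible))    # the acoustically supported word, "play"
print(best(inflexible))  # still the cued word, "clay"
```

Under these made-up numbers, the flexible listener settles on the word the audio supports, while the rigid listener stays with the written cue, mirroring the inflexibility the study reports in patients.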
The full article can be read here: https://www.nature.com/articles/s41467-017-01958-7