Matt Davis
Programme Leader, Hearing and Language Group

matt.davis@mrc-cbu.cam.ac.uk
01223 273637
  • Predicting and perceiving spoken words
  • Oscillatory neural responses to connected speech
  • Predictive coding and higher-level language comprehension
  • Lexical learning of spoken words

You can view a full list of publications here.

Brain stimulation synchronised with speech. New paper on entrainment to connected speech:

Working with Benedikt Zoefel and Alan Archer-Boyd, we used transcranial alternating current stimulation (tACS) to modulate rhythmic electrical activity in the brain. We demonstrated that blood oxygenation responses measured with fMRI: (1) depend on the alignment between oscillations and speech rhythm, (2) do so in brain regions critical for processing speech (the superior temporal gyrus), (3) predict individual differences in rhythmic processing of speech, and (4) arise only if the presented speech is intelligible.

Zoefel, B., Archer-Boyd, A., & Davis, M.H. (2018). Phase entrainment of brain oscillations causally modulates neural responses to intelligible speech. Current Biology, 28(3), 401-408. CBU News Story

This and other research was featured in a recent CBU workshop on oscillatory brain stimulation held at the MRC CBU. PEPA home page

Current CBU Staff:

Becky Gilbert

Lucy MacGregor

Ed Sohoglu

Benedikt Zoefel

Current CBU Students:

Heidi Solberg-Økland

Carol (YingCan) Wang

CBU / Cambridge Collaborators:

Bob Carlyon

Thomas Cope

Rik Henson

James Rowe

External Collaborators:

Helen Blank, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

Gareth Gaskell, Jelena Mirkovic, Department of Psychology, University of York, UK

Ingrid Johnsrude, Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada

Jonathan Peelle, Department of Otolaryngology, Washington University in St. Louis, MO, USA.

Kathy Rastle, Jakke Tamminen, Department of Psychology, Royal Holloway, University of London, UK

Jenni Rodd, Experimental Psychology, University College London, UK

Jack Rogers, School of Social Sciences, Birmingham City University, UK

Jo Taylor, School of Life & Health Sciences, Aston University, UK

Previous research highlights:

Perceptual learning of degraded speech by minimising prediction error. Example stimuli from our experiments on speech intelligibility, vocoded speech, and sine-wave speech.

How the brain uses predictive coding to recognise and learn words. What changes in your brain when you learn new words. Semantic ambiguity cartoon


Thanks to Berger & Wyse, and the Guardian Weekend for a summary of our research on speech comprehension at reduced levels of awareness.

You can also read about research using brain imaging to detect speech comprehension and awareness during sedation and in vegetative state patients.

Psycholinguistic "research" from the internet:

Aoccdrnig to a rscheearch at Cmabrigde Uinervtisy, it deosn't mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht the frist and lsat ltteer be at the rghit pclae. The rset can be a toatl mses and you can sitll raed it wouthit porbelm. Tihs is bcuseae the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a wlohe. Here's a very old page that I wrote about the problems of reading jumbled texts.
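For readers who want to generate their own jumbled text of the kind parodied above, the scrambling rule it describes (keep the first and last letter of each word, shuffle the interior) can be sketched in a few lines of Python. This is an illustrative sketch, not code from the original page; the function names are hypothetical.

```python
import random

def jumble_word(word: str, rng: random.Random) -> str:
    """Shuffle a word's interior letters, keeping the first and last fixed."""
    if len(word) <= 3:
        return word  # too short to have a shuffleable interior
    interior = list(word[1:-1])
    rng.shuffle(interior)
    return word[0] + "".join(interior) + word[-1]

def jumble_text(text: str, seed: int = 0) -> str:
    """Apply interior-letter scrambling to each whitespace-separated word."""
    rng = random.Random(seed)
    return " ".join(jumble_word(w, rng) for w in text.split())
```

Note that a real stimulus generator would also need to handle punctuation attached to words; as the linked page on reading jumbled texts discusses, the claimed effortlessness of reading such text is overstated anyway.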
