Lip movements entrain the observers’ low-frequency brain oscillations to facilitate speech intelligibility

Hyojin Park, Christoph Kayser, Gregor Thut, Joachim Gross

Research output: Contribution to journal › Article › peer-review


Abstract

During continuous speech, lip movements provide visual temporal signals that facilitate speech processing. Here, using MEG, we directly investigated how these visual signals interact with rhythmic brain activity in participants listening to and seeing the speaker. First, we investigated coherence between oscillatory brain activity and the speaker's lip movements and demonstrated significant entrainment in visual cortex. We then used partial coherence to remove contributions of the coherent auditory speech signal from the lip-brain coherence. Comparing this synchronization between different attention conditions revealed that attending to visual speech enhances the coherence between activity in visual cortex and the speaker's lips. Further, we identified a significant partial coherence between left motor cortex and lip movements, and this partial coherence directly predicted comprehension accuracy. Our results emphasize the importance of visually entrained and attention-modulated rhythmic brain activity for the enhancement of audiovisual speech processing.
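The two key quantities in the abstract are spectral coherence between brain activity and lip movements, and partial coherence with the auditory speech signal removed. The sketch below illustrates, on simulated signals, how coherence and partial coherence spectra can be computed from cross-spectral densities. It is a minimal illustration of the partialization logic, not the authors' MEG source-level pipeline; the signal names, sampling rate, simulated data, and the 1-7 Hz band are all assumptions for demonstration.

```python
# Minimal sketch: coherence and partial coherence between a simulated "brain"
# signal, a lip signal, and the auditory speech envelope. Hypothetical toy
# data; not the authors' actual analysis.
import numpy as np
from scipy.signal import csd

fs = 250      # sampling rate in Hz (assumed)
nper = 512    # Welch segment length

rng = np.random.default_rng(0)
n = 60 * fs
audio = rng.standard_normal(n)                               # stand-in speech envelope
lips = 0.7 * audio + 0.7 * rng.standard_normal(n)            # lip signal correlated with audio
brain = 0.5 * lips + 0.5 * audio + rng.standard_normal(n)    # simulated brain signal

def cross_spectrum(x, y):
    """Welch estimate of the cross-spectral density S_xy(f)."""
    f, Sxy = csd(x, y, fs=fs, nperseg=nper)
    return f, Sxy

# Ordinary lip-brain coherence: |S_bl|^2 / (S_bb * S_ll)
f, Sbl = cross_spectrum(brain, lips)
_, Sbb = cross_spectrum(brain, brain)
_, Sll = cross_spectrum(lips, lips)
coh = np.abs(Sbl) ** 2 / (Sbb.real * Sll.real)

# Partial coherence: remove the part of the lip-brain coupling that is
# explained by the auditory envelope.
_, Sba = cross_spectrum(brain, audio)
_, Sla = cross_spectrum(lips, audio)
_, Saa = cross_spectrum(audio, audio)

Sbl_a = Sbl - Sba * np.conj(Sla) / Saa.real           # partial cross-spectrum
Sbb_a = Sbb.real - np.abs(Sba) ** 2 / Saa.real        # partial auto-spectra
Sll_a = Sll.real - np.abs(Sla) ** 2 / Saa.real
pcoh = np.abs(Sbl_a) ** 2 / (Sbb_a * Sll_a)

low = (f >= 1) & (f <= 7)  # low-frequency band, roughly the syllable rate
print("lip-brain coherence (1-7 Hz):     %.3f" % coh[low].mean())
print("partial coherence, audio removed: %.3f" % pcoh[low].mean())
```

In this toy example the partial coherence is lower than the ordinary coherence because part of the simulated lip-brain coupling is shared with the auditory envelope, mirroring the logic of the partialization step described in the abstract.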
Original language: English
Article number: e14521
Journal: eLife
Volume: 5
DOIs
Publication status: Published - 5 May 2016
