Lip movements entrain the observers’ low-frequency brain oscillations to facilitate speech intelligibility

Research output: Contribution to journal › Article › peer-review

Standard

Lip movements entrain the observers’ low-frequency brain oscillations to facilitate speech intelligibility. / Park, Hyojin; Kayser, Christoph; Thut, Gregor; Gross, Joachim.

In: eLife, Vol. 5, e14521, 05.05.2016.

Bibtex

@article{ec3fb6b1f59c4f27bc43e9b393471d52,
title = "Lip movements entrain the observers{\textquoteright} low-frequency brain oscillations to facilitate speech intelligibility",
abstract = "During continuous speech, lip movements provide visual temporal signals that facilitate speech processing. Here, using MEG we directly investigated how these visual signals interact with rhythmic brain activity in participants listening to and seeing the speaker. First, we investigated coherence between oscillatory brain activity and speaker{\textquoteright}s lip movements and demonstrated significant entrainment in visual cortex. We then used partial coherence to remove contributions of the coherent auditory speech signal from the lip-brain coherence. Comparing this synchronization between different attention conditions revealed that attending visual speech enhances the coherence between activity in visual cortex and the speaker{\textquoteright}s lips. Further, we identified a significant partial coherence between left motor cortex and lip movements and this partial coherence directly predicted comprehension accuracy. Our results emphasize the importance of visually entrained and attention-modulated rhythmic brain activity for the enhancement of audiovisual speech processing.",
author = "Hyojin Park and Christoph Kayser and Gregor Thut and Joachim Gross",
year = "2016",
month = may,
day = "5",
doi = "10.7554/eLife.14521",
language = "English",
volume = "5",
journal = "eLife",
pages = "e14521",
issn = "2050-084X",
publisher = "eLife Sciences Publications",
}

RIS

TY - JOUR

T1 - Lip movements entrain the observers’ low-frequency brain oscillations to facilitate speech intelligibility

AU - Park, Hyojin

AU - Kayser, Christoph

AU - Thut, Gregor

AU - Gross, Joachim

PY - 2016/5/5

Y1 - 2016/5/5

N2 - During continuous speech, lip movements provide visual temporal signals that facilitate speech processing. Here, using MEG we directly investigated how these visual signals interact with rhythmic brain activity in participants listening to and seeing the speaker. First, we investigated coherence between oscillatory brain activity and speaker’s lip movements and demonstrated significant entrainment in visual cortex. We then used partial coherence to remove contributions of the coherent auditory speech signal from the lip-brain coherence. Comparing this synchronization between different attention conditions revealed that attending visual speech enhances the coherence between activity in visual cortex and the speaker’s lips. Further, we identified a significant partial coherence between left motor cortex and lip movements and this partial coherence directly predicted comprehension accuracy. Our results emphasize the importance of visually entrained and attention-modulated rhythmic brain activity for the enhancement of audiovisual speech processing.

AB - During continuous speech, lip movements provide visual temporal signals that facilitate speech processing. Here, using MEG we directly investigated how these visual signals interact with rhythmic brain activity in participants listening to and seeing the speaker. First, we investigated coherence between oscillatory brain activity and speaker’s lip movements and demonstrated significant entrainment in visual cortex. We then used partial coherence to remove contributions of the coherent auditory speech signal from the lip-brain coherence. Comparing this synchronization between different attention conditions revealed that attending visual speech enhances the coherence between activity in visual cortex and the speaker’s lips. Further, we identified a significant partial coherence between left motor cortex and lip movements and this partial coherence directly predicted comprehension accuracy. Our results emphasize the importance of visually entrained and attention-modulated rhythmic brain activity for the enhancement of audiovisual speech processing.

U2 - 10.7554/eLife.14521

DO - 10.7554/eLife.14521

M3 - Article

VL - 5

JO - eLife

JF - eLife

SN - 2050-084X

M1 - e14521

ER -