Multimodal language processing: How preceding discourse constrains gesture interpretation and affects gesture integration when gestures do not synchronise with semantic affiliates.

Isabella Fritz, Sotaro Kita, Jeannette Littlemore, Andrea Krott

Research output: Contribution to journal › Article › peer-review


Abstract

Previous studies have suggested that a co-speech gesture needs to be synchronous with semantically related speech (its semantic affiliate) for successful semantic integration into a discourse model, because co-speech gestures are often highly ambiguous on their own. But not all gestures synchronise with their semantic affiliates; some precede them. The current study tested whether the interpretation of a gesture that does not synchronise with its semantic affiliate can be constrained by preceding verbal discourse and integrated into a recipient’s discourse model. A behavioural experiment (Experiment 1) showed that related discourse information can indeed constrain recipients’ interpretations of such gestures. Results from an ERP experiment (Experiment 2) confirmed that synchronisation between gesture and semantic affiliate is not essential for the gesture to become part of a discourse model, but only if the preceding context constrains the gesture’s meaning. In this case, we found evidence for post-semantic integration (P600, time-locked to the gesture’s semantic affiliate).
Original language: English
Article number: 104191
Number of pages: 17
Journal: Journal of Memory and Language
Volume: 117
Early online date: 24 Dec 2020
DOIs
Publication status: Published - Apr 2021

Keywords

  • ERP
  • discourse
  • gesture
  • synchrony

ASJC Scopus subject areas

  • Experimental and Cognitive Psychology
