The neural dynamics of hierarchical Bayesian causal inference in multisensory perception

Tim Rohe, Ann-Christine Ehlis, Uta Noppeney

Research output: Contribution to journal › Article › peer-review

28 Citations (Scopus)
325 Downloads (Pure)

Abstract

Transforming the barrage of sensory signals into a coherent multisensory percept relies on solving the binding problem – deciding whether signals come from a common cause and should be integrated or, instead, segregated. Human observers typically arbitrate between integration and segregation in a manner consistent with Bayesian Causal Inference, but the underlying neural mechanisms remain poorly understood. Here, we presented observers with audiovisual sequences that varied in the number of flashes and beeps, and combined Bayesian modelling with EEG representational similarity analyses. Our data suggest that the brain initially represents the number of flashes and beeps independently. Later, it estimates their numbers by averaging the forced-fusion and full-segregation estimates, weighted by the posterior probabilities of the common-cause and independent-causes models (i.e. model averaging). Crucially, prestimulus oscillatory alpha power and phase correlate with observers’ prior beliefs about the world’s causal structure, which guide their arbitration between sensory integration and segregation.
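For readers unfamiliar with the model, the sketch below illustrates Bayesian Causal Inference with model averaging for a single audiovisual trial, following the standard formulation of Körding et al. (2007). It is a minimal, illustrative implementation, not the fitted model from the paper; the function name and the parameter values (sigma_a, sigma_v, sigma_p, mu_p, p_common) are assumptions chosen for illustration.

# Minimal sketch of Bayesian Causal Inference with model averaging for
# audiovisual numerosity (beeps x_a, flashes x_v). Gaussian likelihoods;
# all parameter values below are illustrative assumptions, not the
# paper's fitted estimates.
import numpy as np

def bci_model_averaging(x_a, x_v, sigma_a=1.0, sigma_v=1.5,
                        sigma_p=2.0, mu_p=2.0, p_common=0.5):
    """Return audiovisual numerosity estimates (s_hat_a, s_hat_v)."""
    # Forced fusion (common cause, C = 1): reliability-weighted average
    # of the auditory signal, the visual signal and the prior.
    w_a, w_v, w_p = 1/sigma_a**2, 1/sigma_v**2, 1/sigma_p**2
    s_fused = (w_a*x_a + w_v*x_v + w_p*mu_p) / (w_a + w_v + w_p)

    # Full segregation (independent causes, C = 2): each signal is
    # combined with the prior only.
    s_a_seg = (w_a*x_a + w_p*mu_p) / (w_a + w_p)
    s_v_seg = (w_v*x_v + w_p*mu_p) / (w_v + w_p)

    # Marginal likelihood of the signals under each causal structure
    # (Gaussian integrals as in Koerding et al., 2007).
    var_c1 = sigma_a**2*sigma_v**2 + sigma_a**2*sigma_p**2 + sigma_v**2*sigma_p**2
    like_c1 = np.exp(-((x_a - x_v)**2*sigma_p**2
                       + (x_a - mu_p)**2*sigma_v**2
                       + (x_v - mu_p)**2*sigma_a**2) / (2*var_c1)) \
              / (2*np.pi*np.sqrt(var_c1))
    like_c2 = (np.exp(-(x_a - mu_p)**2 / (2*(sigma_a**2 + sigma_p**2)))
               / np.sqrt(2*np.pi*(sigma_a**2 + sigma_p**2))) \
            * (np.exp(-(x_v - mu_p)**2 / (2*(sigma_v**2 + sigma_p**2)))
               / np.sqrt(2*np.pi*(sigma_v**2 + sigma_p**2)))

    # Posterior probability of a common cause given the signals.
    post_c1 = like_c1*p_common / (like_c1*p_common + like_c2*(1 - p_common))

    # Model averaging: weight the fusion and segregation estimates by
    # the posterior probabilities of the two causal structures.
    s_hat_a = post_c1*s_fused + (1 - post_c1)*s_a_seg
    s_hat_v = post_c1*s_fused + (1 - post_c1)*s_v_seg
    return s_hat_a, s_hat_v

# Example: three beeps paired with two flashes.
print(bci_model_averaging(x_a=3.0, x_v=2.0))

Lowering p_common shifts both estimates toward the segregation solution, mirroring how a prior belief in independent causes favours segregation over integration.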

Original language: English
Article number: 1907
Journal: Nature Communications
Volume: 10
Issue number: 1
DOIs
Publication status: Published - 23 Apr 2019

ASJC Scopus subject areas

  • General Chemistry
  • General Biochemistry, Genetics and Molecular Biology
  • General Physics and Astronomy

