Several theories propose that emotions and self-awareness arise from the integration of internal and external signals and their respective precision-weighted expectations. Supporting these mechanisms, research indicates that the brain uses temporal cues from cardiac signals to predict auditory stimuli and that these predictions and their prediction errors can be observed in the scalp heartbeat-evoked potential (HEP). We investigated the effect of precision modulations on these cross-modal predictive mechanisms via attention and interoceptive ability. We presented auditory sequences at short (perceived synchronous) or long (perceived asynchronous) cardio-audio delays, with half of the trials including an omission. Participants attended either to the cardio-audio synchronicity of the tones (internal attention) or to the auditory stimuli alone (external attention). Comparing HEPs during omissions allowed for the observation of pure predictive signals, without contamination from auditory input. We observed an early effect of cardio-audio delay, reflecting a difference in heartbeat-driven expectations. We also observed a larger positivity to the omissions of sounds perceived as synchronous than to the omissions of sounds perceived as asynchronous, but only when attending internally, consistent with the role of attentional precision in enhancing predictions. These results provide support for attentionally modulated cross-modal predictive coding and suggest a potential tool for investigating its role in emotion and self-awareness.
- predictive coding
Title: 'Skipping a Beat: Heartbeat-Evoked Potentials Reflect Predictions during Interoceptive-Exteroceptive Integration'