Steady-state responses in MEG demonstrate information integration within but not across the auditory and visual senses

Research output: Contribution to journal › Article

Authors

  • Anette S. Giani
  • Erick Ortiz
  • Paolo Belardinelli
  • Mario Kleiner
  • Hubert Preissl

External organisations

  • MEG Center
  • Department of Obstetrics and Gynecology, Croydon University Hospital, Croydon, UK.
  • Department of Clinical Psychology
  • Max Planck Institute for Biological Cybernetics
  • University of Tuebingen
  • University of Arkansas for Medical Sciences

Abstract

To form a unified percept of our environment, the human brain integrates information within and across the senses. This MEG study investigated interactions within and between sensory modalities using a frequency analysis of steady-state responses (SSRs) that are elicited time-locked to periodically modulated stimuli. Critically, in the frequency domain, interactions between sensory signals are indexed by crossmodulation terms (i.e. the sums and differences of the fundamental frequencies). The 3 × 2 factorial design manipulated (1) modality: auditory, visual or audiovisual; and (2) steady-state modulation: the auditory and visual signals were modulated either in one sensory feature (e.g. visual gratings modulated in luminance at 6 Hz) or in two features (e.g. tones modulated in frequency at 40 Hz and amplitude at 0.2 Hz). This design enabled us to investigate crossmodulation frequencies that are elicited when two stimulus features are modulated concurrently (i) in one sensory modality or (ii) in auditory and visual modalities. In support of within-modality integration, we reliably identified crossmodulation frequencies when two stimulus features in one sensory modality were modulated at different frequencies. In contrast, no crossmodulation frequencies were identified when information needed to be combined from auditory and visual modalities. The absence of audiovisual crossmodulation frequencies suggests that the previously reported audiovisual interactions in primary sensory areas may mediate low-level spatiotemporal coincidence detection that is prominent for stimulus transients but less relevant for sustained SSRs. In conclusion, our results indicate that information in SSRs is integrated over multiple time scales within but not across sensory modalities at the primary cortical level.
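The crossmodulation logic described above can be sketched numerically. This is an illustrative example, not the study's analysis pipeline: a nonlinear interaction between two steady-state signals is modeled here as simple multiplication, which by the product-to-sum identity concentrates spectral power at the sum and difference of the fundamental frequencies. The sampling rate, duration, and frequencies are hypothetical (chosen to echo the 40 Hz and 6 Hz examples in the abstract).

```python
import numpy as np

# Hypothetical parameters for illustration only.
fs = 1000                       # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)    # 10 s of signal -> 0.1 Hz frequency resolution
f1, f2 = 40.0, 6.0              # two fundamental modulation frequencies (Hz)

s1 = np.sin(2 * np.pi * f1 * t)
s2 = np.sin(2 * np.pi * f2 * t)

# A multiplicative (nonlinear) interaction between the two signals:
# sin(a)*sin(b) = 0.5*[cos(a-b) - cos(a+b)], so power appears at
# the crossmodulation terms f1 - f2 and f1 + f2, not at f1 or f2.
interaction = s1 * s2

spectrum = np.abs(np.fft.rfft(interaction))
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# The two largest spectral peaks sit at the crossmodulation frequencies.
peak_freqs = freqs[np.argsort(spectrum)[-2:]]
print(sorted(np.round(peak_freqs, 1)))   # → [34.0, 46.0]
```

A purely linear superposition (`s1 + s2`) would instead show peaks only at the fundamentals, which is why the presence or absence of crossmodulation terms serves as an index of integration.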

Details

Original language: English
Pages (from-to): 1478-1489
Number of pages: 12
Journal: NeuroImage
Volume: 60
Issue number: 2
Publication status: Published - 2 Apr 2012

Keywords

  • Audiovisual, Crossmodal, Crossmodulation frequencies, MEG, Multisensory integration, Steady-state responses
