Abstract
In dynamic cluttered environments, audition and vision may benefit from each other in determining what deserves further attention and what does not. We investigated the underlying neural mechanisms responsible for attentional guidance by audiovisual stimuli in such an environment. Event-related potentials (ERPs) were measured during visual search through dynamic displays consisting of line elements that randomly changed orientation. Search accuracy improved when a target orientation change was synchronized with an auditory signal as compared to when the auditory signal was absent or synchronized with a distractor orientation change. The ERP data show that behavioral benefits were related to an early multisensory interaction over left parieto-occipital cortex (50-60 ms post-stimulus onset), which was followed by an early positive modulation (80-100 ms) over occipital and temporal areas contralateral to the audiovisual event, an enhanced N2pc (210-250 ms), and a contralateral negative slow wave (CNSW). The early multisensory interaction was correlated with behavioral search benefits, indicating that participants with a strong multisensory interaction benefited the most from the synchronized auditory signal. We suggest that an auditory signal enhances the neural response to a synchronized visual event, which increases the chances of selection in a multiple-object environment.
| Original language | English |
| --- | --- |
| Pages (from-to) | 1208-1218 |
| Number of pages | 11 |
| Journal | NeuroImage |
| Volume | 55 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 1 Apr 2011 |
Keywords
- Attention
- Audiovisual integration
- Event-related potential
- Visual search
ASJC Scopus subject areas
- Neurology
- Cognitive Neuroscience