Abstract
To form a coherent percept of the environment, the brain must integrate sensory signals emanating from a common source but segregate those from different sources. Temporal regularities are prominent cues for multisensory integration, particularly for speech and music perception. In line with models of predictive coding, we suggest that the brain adapts an internal model to the statistical regularities in its environment. This internal model enables cross‐sensory and sensorimotor temporal predictions as a mechanism to arbitrate between integration and segregation of signals from different senses.
| Original language | English |
|---|---|
| Journal | Annals of the New York Academy of Sciences |
| Early online date | 31 Mar 2018 |
| DOIs | |
| Publication status | E-pub ahead of print - 31 Mar 2018 |