Learning to use an invisible visual signal for perception
Research output: Contribution to journal › Article › peer-review
Colleges, School and Institutes
- Max Planck Institute for Biological Cybernetics
- SUNY College of Optometry
How does the brain construct a percept from sensory signals? One approach to this fundamental question is to investigate perceptual learning as induced by exposure to statistical regularities in sensory signals [1-7]. Recent studies showed that exposure to novel correlations between sensory signals can cause a signal to have new perceptual effects [2, 3]. In those studies, however, the signals were clearly visible, so the automaticity of the learning was difficult to determine. Here we investigate whether learning of this sort, which causes new effects on appearance, can be low-level and automatic, by employing a visual signal whose perceptual consequences were made invisible: a vertical disparity gradient masked by other depth cues. This approach excluded high-level influences such as attention or consciousness. Our stimulus for probing perceptual appearance was a rotating cylinder. During exposure, we introduced a new contingency between the invisible signal and the rotation direction of the cylinder. When we subsequently presented an ambiguously rotating version of the cylinder, the invisible signal influenced the perceived rotation direction. This demonstrates that perception can rapidly undergo "structure learning" by automatically picking up novel contingencies between sensory signals, thus automatically recruiting signals for novel uses during the construction of a percept.
Copyright © 2010 Elsevier Ltd. All rights reserved.
Number of pages: 4
Publication status: Published - 26 Oct 2010
Keywords: Adult, Cues, Germany, Humans, Learning, Photic Stimulation, Sensory Thresholds, Vision, Ocular, Visual Perception