Abstract
When they process sentences, language comprehenders activate perceptual and motor representations of described scenes. On the "immersed experiencer" account, comprehenders engage motor and perceptual systems to create the experiences that someone participating in the described scene would have. We tested two predictions of this view. First, the distance of mentioned objects from the protagonist of a described scene should produce perceptual correlates in mental simulations. Second, mental simulation of perceptual features should be multimodal, like actual perception of such features. In Experiment 1, we found that language about objects at different distances modulated the size of visually simulated objects. In Experiment 2, we found a similar effect for volume in the auditory modality. These experiments lend support to the view that language-driven mental simulation encodes experiencer-specific spatial details. The fact that we obtained similar simulation effects for two different modalities, audition and vision, confirms the multimodal nature of mental simulations during language understanding.
| Original language | English |
| --- | --- |
| Pages (from-to) | 1-16 |
| Number of pages | 16 |
| Journal | Language and Cognition |
| Volume | 4 |
| Issue number | 1 |
| DOIs | 10.1515/langcog-2012-0001 |
| Publication status | Published - 1 Mar 2012 |
Bibliographical note
10.1515/langcog-2012-0001

Keywords
- mental simulation
- distance
- perception
- psycholinguistics
- reaction time study
- experimental psychology
Activities
- 1 Guest lecture or Invited talk
- Language-induced mental simulation: Distance, grammar, and the senses
  Bodo Winter (Speaker)
  21 Nov 2016
  Activity: Academic and Industrial events › Guest lecture or Invited talk