Neural coding of global form in the human visual cortex

Dirk Ostwald, JM Lam, Sheng Li, Zoe Kourtzi

Research output: Contribution to journal › Article

85 Citations (Scopus)

Abstract

Extensive psychophysical and computational work proposes that the perception of coherent and meaningful structures in natural images relies on neural processes that convert information about local edges in primary visual cortex to complex object features represented in the temporal cortex. However, the neural basis of these mid-level vision mechanisms in the human brain remains largely unknown. Here, we examine functional MRI (fMRI) selectivity for global forms in the human visual pathways using sensitive multivariate analysis methods that take advantage of information across brain activation patterns. We use Glass patterns, parametrically varying the perceived global form (concentric, radial, translational) while ensuring that the local statistics remain similar. Our findings show a continuum of integration processes that convert selectivity for local signals (orientation, position) in early visual areas to selectivity for global form structure in higher occipitotemporal areas. Interestingly, higher occipitotemporal areas discern differences in global form structure rather than low-level stimulus properties with higher accuracy than early visual areas while relying on information from smaller but more selective neural populations (smaller voxel pattern size), consistent with global pooling mechanisms of local orientation signals. These findings suggest that the human visual system uses a code of increasing efficiency across stages of analysis that is critical for the successful detection and recognition of objects in complex environments.
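The stimuli described here are Glass patterns: fields of dot pairs whose pair orientations follow a global template (concentric, radial, or translational), so the global form changes while local dot statistics stay matched. Below is a minimal Python sketch of this standard construction; the function name `glass_pattern`, the dot count, and the pair separation are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np

def glass_pattern(form="concentric", n_pairs=300, dot_sep=0.03, rng=None):
    """Generate dot-pair coordinates for a Glass pattern.

    Each pair has an anchor dot at a random position in the unit square
    (centred on the origin) and a partner dot offset along a local
    orientation set by the global form:
      - concentric:    tangential to circles around the centre
      - radial:        along the radius from the centre
      - translational: one fixed orientation everywhere
    Returns an (n_pairs * 2, 2) array of (x, y) dot positions.
    """
    rng = np.random.default_rng(rng)
    anchors = rng.uniform(-0.5, 0.5, size=(n_pairs, 2))
    if form == "concentric":
        theta = np.arctan2(anchors[:, 1], anchors[:, 0]) + np.pi / 2
    elif form == "radial":
        theta = np.arctan2(anchors[:, 1], anchors[:, 0])
    elif form == "translational":
        theta = np.zeros(n_pairs)  # horizontal pairs everywhere
    else:
        raise ValueError(f"unknown form: {form}")
    offset = dot_sep * np.column_stack([np.cos(theta), np.sin(theta)])
    return np.vstack([anchors, anchors + offset])

# All three forms share identical local statistics (same dot density and
# pair separation); only the global orientation field differs.
for form in ("concentric", "radial", "translational"):
    dots = glass_pattern(form, rng=0)
    print(form, dots.shape)
```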
Original language: English
Pages (from-to): 2456-2469
Number of pages: 14
Journal: Journal of Neurophysiology
Volume: 99
Issue number: 5
DOIs
Publication status: Published - 1 May 2008
