Deep temporal models and active inference

Research output: Contribution to journal › Article

Standard

Deep temporal models and active inference. / Friston, Karl J.; Rosch, Richard; Parr, Thomas; Price, Cathy; Bowman, Howard.

In: Neuroscience and Biobehavioral Reviews, Vol. 77, 01.06.2017, p. 388-402.

Author

Friston, Karl J.; Rosch, Richard; Parr, Thomas; Price, Cathy; Bowman, Howard. / Deep temporal models and active inference. In: Neuroscience and Biobehavioral Reviews. 2017; Vol. 77. pp. 388-402.

BibTeX

@article{183bb0c924bf4b22a5cafb89f167407d,
title = "Deep temporal models and active inference",
abstract = "How do we navigate a deeply structured world? Why are you reading this sentence first – and did you actually look at the fifth word? This review offers some answers by appealing to active inference based on deep temporal models. It builds on previous formulations of active inference to simulate behavioural and electrophysiological responses under hierarchical generative models of state transitions. Inverting these models corresponds to sequential inference, such that the state at any hierarchical level entails a sequence of transitions in the level below. The deep temporal aspect of these models means that evidence is accumulated over nested time scales, enabling inferences about narratives (i.e., temporal scenes). We illustrate this behaviour with Bayesian belief updating – and neuronal process theories – to simulate the epistemic foraging seen in reading. These simulations reproduce perisaccadic delay period activity and local field potentials seen empirically. Finally, we exploit the deep structure of these models to simulate responses to local (e.g., font type) and global (e.g., semantic) violations; reproducing mismatch negativity and P300 responses respectively.",
keywords = "Active inference, Bayesian, Hierarchical, Reading, Violation, Free energy, P300, MMN",
author = "Friston, {Karl J.} and Richard Rosch and Thomas Parr and Cathy Price and Howard Bowman",
year = "2017",
month = "6",
day = "1",
doi = "10.1016/j.neubiorev.2017.04.009",
language = "English",
volume = "77",
pages = "388--402",
journal = "Neuroscience and Biobehavioral Reviews",
issn = "0149-7634",
publisher = "Elsevier",
}

RIS

TY - JOUR

T1 - Deep temporal models and active inference

AU - Friston, Karl J.

AU - Rosch, Richard

AU - Parr, Thomas

AU - Price, Cathy

AU - Bowman, Howard

PY - 2017/6/1

Y1 - 2017/6/1

N2 - How do we navigate a deeply structured world? Why are you reading this sentence first – and did you actually look at the fifth word? This review offers some answers by appealing to active inference based on deep temporal models. It builds on previous formulations of active inference to simulate behavioural and electrophysiological responses under hierarchical generative models of state transitions. Inverting these models corresponds to sequential inference, such that the state at any hierarchical level entails a sequence of transitions in the level below. The deep temporal aspect of these models means that evidence is accumulated over nested time scales, enabling inferences about narratives (i.e., temporal scenes). We illustrate this behaviour with Bayesian belief updating – and neuronal process theories – to simulate the epistemic foraging seen in reading. These simulations reproduce perisaccadic delay period activity and local field potentials seen empirically. Finally, we exploit the deep structure of these models to simulate responses to local (e.g., font type) and global (e.g., semantic) violations; reproducing mismatch negativity and P300 responses respectively.

AB - How do we navigate a deeply structured world? Why are you reading this sentence first – and did you actually look at the fifth word? This review offers some answers by appealing to active inference based on deep temporal models. It builds on previous formulations of active inference to simulate behavioural and electrophysiological responses under hierarchical generative models of state transitions. Inverting these models corresponds to sequential inference, such that the state at any hierarchical level entails a sequence of transitions in the level below. The deep temporal aspect of these models means that evidence is accumulated over nested time scales, enabling inferences about narratives (i.e., temporal scenes). We illustrate this behaviour with Bayesian belief updating – and neuronal process theories – to simulate the epistemic foraging seen in reading. These simulations reproduce perisaccadic delay period activity and local field potentials seen empirically. Finally, we exploit the deep structure of these models to simulate responses to local (e.g., font type) and global (e.g., semantic) violations; reproducing mismatch negativity and P300 responses respectively.

KW - Active inference

KW - Bayesian

KW - Hierarchical

KW - Reading

KW - Violation

KW - Free energy

KW - P300

KW - MMN

U2 - 10.1016/j.neubiorev.2017.04.009

DO - 10.1016/j.neubiorev.2017.04.009

M3 - Article

VL - 77

SP - 388

EP - 402

JO - Neuroscience and Biobehavioral Reviews

JF - Neuroscience and Biobehavioral Reviews

SN - 0149-7634

ER -