Realizing active inference in variational message passing: the outcome-blind certainty seeker

Théophile Champion, Marek Grześ, Howard Bowman

Research output: Contribution to journal › Article › peer-review



Active inference is a state-of-the-art framework in neuroscience that offers a unified theory of brain function. It has also been proposed as a framework for planning in AI. Unfortunately, the complex mathematics required to create new models can impede the application of active inference in neuroscience and AI research. This letter addresses this problem by providing a complete mathematical treatment of the active inference framework in discrete time and state spaces, and by deriving the update equations for any new model. We leverage the theoretical connection between active inference and variational message passing, as described by John Winn and Christopher M. Bishop in 2005. Since variational message passing is a well-defined methodology for deriving Bayesian belief update equations, this letter opens the door to advanced generative models for active inference. We show that using a fully factorized variational distribution simplifies the expected free energy, which furnishes priors over policies so that agents seek unambiguous states. Finally, we consider future extensions that support deep tree searches for sequential policy optimization based on structure learning and belief propagation.
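As a rough illustration of the "certainty-seeking" behaviour described above, the sketch below evaluates a risk-plus-ambiguity decomposition of the expected free energy for a toy discrete model. The likelihood matrix `A`, preference vector `C`, and state distributions are hypothetical stand-ins, not taken from the paper; this is only a minimal sketch of the general idea, assuming the standard discrete-state formulation.

```python
import numpy as np

# Hypothetical discrete model: 3 hidden states, 2 outcomes.
# A[o, s] = P(o | s): likelihood mapping (each column sums to 1).
A = np.array([[0.9, 0.5, 0.1],
              [0.1, 0.5, 0.9]])

# C: log prior preferences over outcomes (illustrative values).
C = np.log(np.array([0.7, 0.3]))

def expected_free_energy(qs, A, C):
    """Risk + ambiguity form of the expected free energy for one
    time step, given a predicted state distribution qs under a policy."""
    qo = A @ qs                                  # predicted outcome distribution
    risk = qo @ (np.log(qo + 1e-16) - C)         # KL[q(o) || p(o)]
    # Ambiguity: entropy of P(o|s), averaged under q(s). Policies that
    # concentrate belief on states with low-entropy outcome mappings
    # score lower, i.e. the agent seeks unambiguous states.
    H = -np.sum(A * np.log(A + 1e-16), axis=0)   # outcome entropy per state
    ambiguity = qs @ H
    return risk + ambiguity

# A policy expected to land in state 0 (nearly deterministic outcomes)
# is preferred over one landing in state 1 (maximally ambiguous outcomes).
g_clear = expected_free_energy(np.array([1.0, 0.0, 0.0]), A, C)
g_vague = expected_free_energy(np.array([0.0, 1.0, 0.0]), A, C)
print(g_clear < g_vague)  # → True
```

Since the expected free energy furnishes (negative log) priors over policies, lower values translate into higher prior probability of selecting that policy.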

Original language: English
Pages (from-to): 2662-2626
Journal: Neural Computation
Issue number: 10
Early online date: 19 Jul 2021
Publication status: E-pub ahead of print - 19 Jul 2021

Bibliographical note

© 2021 Massachusetts Institute of Technology.

ASJC Scopus subject areas

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience


