On an adaptive preconditioned Crank–Nicolson MCMC algorithm for infinite dimensional Bayesian inference

Zixi Hu, Zhewei Yao, Jinglai Li

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

Many scientific and engineering problems require performing Bayesian inference for unknowns of infinite dimension. In such problems, many standard Markov Chain Monte Carlo (MCMC) algorithms become arbitrarily slow under mesh refinement, a behaviour referred to as being dimension dependent. To address this issue, a family of dimension-independent MCMC algorithms, known as the preconditioned Crank–Nicolson (pCN) methods, was proposed to sample the infinite dimensional parameters. In this work we develop an adaptive version of the pCN algorithm, in which the covariance operator of the proposal distribution is adjusted based on the sampling history to improve simulation efficiency. We show that the proposed algorithm satisfies an important ergodicity condition under some mild assumptions. Finally, we provide numerical examples to demonstrate the performance of the proposed method.
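For orientation, the following is a minimal sketch of the standard (non-adaptive) pCN step on a finite-dimensional discretization, assuming a Gaussian prior N(0, C) given through a factor C_sqrt with C = C_sqrt C_sqrtᵀ; the function and parameter names (pcn_mcmc, log_likelihood, C_sqrt, beta) are illustrative, and the adaptive adjustment of the proposal covariance described in the paper is not shown here.

```python
import numpy as np

def pcn_mcmc(log_likelihood, C_sqrt, u0, beta=0.2, n_iter=5000, rng=None):
    """Sketch of a vanilla pCN sampler on a discretized parameter space.

    log_likelihood : callable returning the log-likelihood (negative data misfit).
    C_sqrt         : factor of the prior covariance, C = C_sqrt @ C_sqrt.T.
    beta           : pCN step-size parameter in (0, 1].
    """
    rng = np.random.default_rng() if rng is None else rng
    d = u0.size
    u = u0.copy()
    ll_u = log_likelihood(u)
    samples = np.empty((n_iter, d))
    for i in range(n_iter):
        # pCN proposal: v = sqrt(1 - beta^2) * u + beta * xi, with xi ~ N(0, C).
        xi = C_sqrt @ rng.standard_normal(d)
        v = np.sqrt(1.0 - beta**2) * u + beta * xi
        ll_v = log_likelihood(v)
        # The acceptance ratio involves only the likelihood: the prior terms
        # cancel because the pCN proposal is reversible with respect to the prior.
        if np.log(rng.uniform()) < ll_v - ll_u:
            u, ll_u = v, ll_v
        samples[i] = u
    return samples
```

Because the acceptance probability does not degenerate as the discretization is refined, the acceptance rate of such a sampler is essentially mesh-independent, which is the property the adaptive variant in the paper seeks to retain while tuning the proposal covariance from past samples.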
Original language: English
Pages (from-to): 492-503
Journal: Journal of Computational Physics
Volume: 332
Early online date: 7 Dec 2016
Publication status: Published - Mar 2017

Keywords

  • Bayesian inference
  • Infinite dimensional inverse problems
  • Adaptive Markov Chain Monte Carlo
