Adaptive construction of surrogates for the Bayesian solution of inverse problems

Jinglai Li, Youssef M. Marzouk

Research output: Contribution to journal › Article › peer-review

46 Citations (Scopus)

Abstract

The Bayesian approach to inverse problems typically relies on posterior sampling approaches, such as Markov chain Monte Carlo, for which the generation of each sample requires one or more evaluations of the parameter-to-observable map or forward model. When these evaluations are computationally intensive, approximations of the forward model are essential to accelerating sample-based inference. Yet the construction of globally accurate approximations for nonlinear forward models can be computationally prohibitive and in fact unnecessary, as the posterior distribution typically concentrates on a small fraction of the support of the prior distribution. We present a new approach that uses stochastic optimization to construct polynomial approximations over a sequence of distributions adaptively determined from the data, eventually concentrating on the posterior distribution. The approach yields substantial gains in efficiency and accuracy over prior-based surrogates, as demonstrated via application to inverse problems in partial differential equations.
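The following is a minimal, hypothetical sketch (not the authors' implementation) of the idea summarized above: a polynomial surrogate of an expensive forward model is refit over a biasing distribution that is adapted toward the posterior by importance sampling and a cross-entropy-style moment update. The forward model, prior, noise level, and all parameter choices below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_model(x):
    # Stand-in for an expensive parameter-to-observable map (assumed).
    return np.exp(-x) * np.sin(3.0 * x)

# Synthetic data and noise level (assumed).
x_true, sigma = 0.7, 0.05
y_obs = forward_model(x_true) + sigma * rng.normal()

# Gaussian prior on the scalar parameter (assumed).
prior_mean, prior_std = 0.0, 1.0

def log_prior(x):
    return -0.5 * ((x - prior_mean) / prior_std) ** 2

def log_likelihood(g):
    return -0.5 * ((y_obs - g) / sigma) ** 2

# Biasing distribution starts at the prior and is adapted toward the posterior.
mu, tau = prior_mean, prior_std
degree, n_train = 5, 30

for it in range(5):
    # 1) Fit a local polynomial surrogate of the forward model over the
    #    current biasing distribution (few expensive model evaluations).
    x_train = rng.normal(mu, tau, n_train)
    coeffs = np.polyfit(x_train, forward_model(x_train), degree)
    surrogate = np.poly1d(coeffs)

    # 2) Importance sampling with the cheap surrogate: weight samples from the
    #    biasing density by (unnormalized posterior) / (biasing density).
    x_samp = rng.normal(mu, tau, 5000)
    log_bias = -0.5 * ((x_samp - mu) / tau) ** 2 - np.log(tau)
    log_w = log_prior(x_samp) + log_likelihood(surrogate(x_samp)) - log_bias
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    # 3) Cross-entropy-style update: refit the biasing Gaussian to the weighted
    #    samples so it concentrates where the posterior does.
    mu = np.sum(w * x_samp)
    tau = max(np.sqrt(np.sum(w * (x_samp - mu) ** 2)), 1e-3)
    print(f"iter {it}: biasing mean {mu:.3f}, std {tau:.3f}")

print(f"true parameter: {x_true:.3f}")
```

In this toy setting the biasing distribution shrinks onto the posterior after a few iterations, so later surrogates only need to be accurate in the region the posterior actually occupies rather than over the full prior support.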
Original language: English
Pages (from-to): A1163-A1186
Number of pages: 24
Journal: SIAM Journal on Scientific Computing
Volume: 36
Issue number: 3
DOIs
Publication status: Published - 12 Jun 2014

Keywords

  • Bayesian inference
  • cross-entropy method
  • importance sampling
  • inverse problem
  • Kullback–Leibler divergence
  • Markov chain Monte Carlo
  • polynomial chaos
