On Estimating the Gradient of the Expected Information Gain in Bayesian Experimental Design

Ziqiao Ao, Jinglai Li*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Bayesian Experimental Design (BED), which aims to find the optimal experimental conditions for Bayesian inference, is usually posed as the optimization of the expected information gain (EIG). Gradient information is often needed for efficient EIG optimization, and as a result the ability to estimate the gradient of the EIG is essential for BED problems. The primary goal of this work is to develop methods for estimating the gradient of the EIG, which, combined with stochastic gradient descent algorithms, result in efficient optimization of the EIG. Specifically, we first introduce a representation of the EIG gradient with respect to the design variables as a posterior expectation. Based on this, we propose two methods for estimating the EIG gradient: UEEG-MCMC, which leverages posterior samples generated by Markov Chain Monte Carlo (MCMC) to estimate the EIG gradient, and BEEG-AP, which focuses on achieving high simulation efficiency by repeatedly using parameter samples. Theoretical analysis and numerical studies illustrate that UEEG-MCMC is robust with respect to the actual EIG value, while BEEG-AP is more efficient when the EIG value to be optimized is small. Moreover, both methods show superior performance compared to several popular benchmarks in our numerical experiments.
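For context, the EIG targeted by these gradient estimators admits the standard definition below; this is background notation only (prior p(θ), likelihood p(y | θ, d) under design d, and evidence p(y | d)), not the paper's posterior-expectation representation of the gradient:

\[
\mathrm{EIG}(d)
  = \mathbb{E}_{p(\theta)\,p(y \mid \theta, d)}\!\left[ \log \frac{p(y \mid \theta, d)}{p(y \mid d)} \right],
\qquad
p(y \mid d) = \int p(y \mid \theta, d)\, p(\theta)\, \mathrm{d}\theta .
\]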
Original language: English
Title of host publication: Proceedings of the 38th AAAI Conference on Artificial Intelligence
Editors: Michael Wooldridge, Jennifer Dy, Sriraam Natarajan
Publisher: Association for the Advancement of Artificial Intelligence
Pages: 20311-20319
Number of pages: 9
ISBN (Print): 1577358872, 9781577358879
DOIs
Publication status: Published - 24 Mar 2024
Event: The 38th Annual AAAI Conference on Artificial Intelligence - Vancouver Convention Centre – West Building, Vancouver, Canada
Duration: 20 Feb 2024 - 27 Feb 2024
https://aaai.org/aaai-conference/

Publication series

Name: Proceedings of the AAAI Conference on Artificial Intelligence
Number: 18
Volume: 38
ISSN (Print): 2159-5399
ISSN (Electronic): 2374-3468

Conference

Conference: The 38th Annual AAAI Conference on Artificial Intelligence
Abbreviated title: AAAI-24
Country/Territory: Canada
City: Vancouver
Period: 20/02/24 - 27/02/24
Internet address: https://aaai.org/aaai-conference/

Keywords

  • RU: Decision/Utility Theory
  • RU: Probabilistic Inference
  • RU: Stochastic Optimization

