Maximum conditional entropy Hamiltonian Monte Carlo sampler

Research output: Contribution to journal › Article › peer-review

Abstract

The performance of the Hamiltonian Monte Carlo (HMC) sampler depends critically on algorithm parameters such as the total integration time and the numerical integration stepsize. Tuning these parameters is particularly challenging when the mass matrix of the HMC sampler is adapted. In this work we propose a Kolmogorov-Sinai entropy (KSE) based design criterion for optimizing these algorithm parameters, which avoids some potential issues of the commonly used jumping-distance-based measures. For near-Gaussian distributions, we derive the optimal algorithm parameters with respect to the KSE criterion analytically. As a byproduct, the KSE criterion also provides a theoretical justification for the need to adapt the mass matrix in the HMC sampler. Based on these results, we propose an adaptive HMC algorithm and demonstrate its performance with numerical examples.
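
For context, the sketch below shows a standard HMC transition in Python (NumPy), indicating where the parameters discussed in the abstract enter: the stepsize, the number of leapfrog steps (stepsize times number of steps gives the total integration time), and the mass matrix. This is not the paper's KSE-based adaptive scheme; the names hmc_step, log_prob, grad_log_prob, and mass_diag are illustrative placeholders, and a diagonal mass matrix is assumed for simplicity.

    import numpy as np

    def hmc_step(q, log_prob, grad_log_prob, stepsize, n_steps, mass_diag, rng):
        """One standard HMC transition from position q with a diagonal mass matrix M."""
        # Sample momentum from N(0, M); the kinetic energy is 0.5 * p^T M^{-1} p.
        p = rng.normal(size=q.shape) * np.sqrt(mass_diag)

        def hamiltonian(q_, p_):
            return -log_prob(q_) + 0.5 * np.sum(p_ ** 2 / mass_diag)

        q_new, p_new = q.copy(), p.copy()
        # Leapfrog integration; stepsize * n_steps is the total integration time.
        p_new += 0.5 * stepsize * grad_log_prob(q_new)
        for i in range(n_steps):
            q_new += stepsize * p_new / mass_diag
            if i < n_steps - 1:
                p_new += stepsize * grad_log_prob(q_new)
        p_new += 0.5 * stepsize * grad_log_prob(q_new)

        # Metropolis accept/reject based on the change in the Hamiltonian.
        if np.log(rng.uniform()) < hamiltonian(q, p) - hamiltonian(q_new, p_new):
            return q_new
        return q

    # Illustrative usage on a standard 2-D Gaussian target.
    rng = np.random.default_rng(0)
    def log_prob(x): return -0.5 * np.sum(x ** 2)
    def grad_log_prob(x): return -x
    q = np.zeros(2)
    samples = []
    for _ in range(1000):
        q = hmc_step(q, log_prob, grad_log_prob,
                     stepsize=0.2, n_steps=10, mass_diag=np.ones(2), rng=rng)
        samples.append(q)

In this sketch the stepsize, the number of leapfrog steps, and the mass matrix are fixed by hand; the paper's contribution is a KSE-based criterion for choosing them, including the analytic solution for near-Gaussian targets, which is not reproduced here.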

Bibliographic note

Not yet published as of 08/06/2021.

Details

Original language: English
Journal: SIAM Journal on Scientific Computing
Publication status: Accepted/In press - 5 May 2021