Maximum conditional entropy Hamiltonian Monte Carlo sampler

Tengchao Yu, Hongqiao Wang, Jinglai Li

Research output: Contribution to journal › Article › peer-review



The performance of the Hamiltonian Monte Carlo (HMC) sampler depends critically on certain algorithm parameters, such as the total integration time and the numerical integration stepsize. Tuning these parameters is particularly challenging when the mass matrix of the HMC sampler is adapted. In this work we propose a Kolmogorov–Sinai entropy (KSE) based design criterion to optimize these algorithm parameters, which avoids some potential issues in the commonly used jumping-distance based measures. For near-Gaussian distributions, we derive the optimal algorithm parameters with respect to the KSE criterion analytically. As a byproduct, the KSE criterion also provides a theoretical justification for the need to adapt the mass matrix in the HMC sampler. Based on these results, we propose an adaptive HMC algorithm and demonstrate its performance with numerical examples.
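To make the tuning parameters concrete, the following is a minimal sketch of a standard HMC transition (leapfrog integration plus a Metropolis accept/reject step), not the paper's adaptive algorithm. The arguments `eps` (stepsize), `n_steps` (so the total integration time is `eps * n_steps`) and `M_diag` (a diagonal mass matrix) are exactly the quantities the abstract refers to; all names here are illustrative.

```python
import numpy as np

def hmc_step(x, log_prob, grad_log_prob, eps, n_steps, M_diag, rng):
    """One standard HMC transition with a diagonal mass matrix M.

    eps, n_steps and M_diag are the tuning parameters whose choice the
    KSE criterion in the paper is designed to guide.
    """
    # Sample momentum p ~ N(0, M)
    p = rng.normal(size=x.shape) * np.sqrt(M_diag)
    x_new, p_new = x.copy(), p.copy()

    # Leapfrog integration of the Hamiltonian dynamics
    p_new = p_new + 0.5 * eps * grad_log_prob(x_new)
    for _ in range(n_steps - 1):
        x_new = x_new + eps * p_new / M_diag
        p_new = p_new + eps * grad_log_prob(x_new)
    x_new = x_new + eps * p_new / M_diag
    p_new = p_new + 0.5 * eps * grad_log_prob(x_new)

    # Metropolis correction with H(x, p) = -log pi(x) + p^T M^{-1} p / 2
    h_old = -log_prob(x) + 0.5 * np.sum(p ** 2 / M_diag)
    h_new = -log_prob(x_new) + 0.5 * np.sum(p_new ** 2 / M_diag)
    if rng.uniform() < np.exp(h_old - h_new):
        return x_new, True
    return x, False
```

For example, with a standard Gaussian target (`log_prob = lambda x: -0.5 * np.sum(x**2)`), iterating `hmc_step` produces samples whose mean and variance approach 0 and 1; a poor choice of `eps * n_steps` or of `M_diag` degrades mixing, which is the tuning problem the paper addresses.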
Original language: English
Pages (from-to): A3607–A3626
Journal: SIAM Journal on Scientific Computing
Issue number: 5
Publication status: Published - 26 Oct 2021


