Variational Hybrid Monte Carlo for Efficient Multi-Modal Data Sampling

  • Shiliang Sun
  • Jing Zhao*
  • Minghao Gu
  • Shanhu Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

The Hamiltonian Monte Carlo (HMC) sampling algorithm exploits Hamiltonian dynamics to construct efficient Markov chain Monte Carlo (MCMC) proposals, and has become increasingly popular in machine learning and statistics. Because HMC uses the gradient information of the target distribution, it can explore the state space much more efficiently than random-walk proposals, but it may suffer from high autocorrelation. In this paper, we propose Langevin Hamiltonian Monte Carlo (LHMC) to reduce the autocorrelation of the samples. Probabilistic inference involving multi-modal distributions is very difficult for dynamics-based MCMC samplers, which are easily trapped in a mode far away from the other modes. To tackle this issue, we further propose a variational hybrid Monte Carlo (VHMC) method, which uses a variational distribution to explore the phase space and find new modes, and is thereby capable of sampling from multi-modal distributions effectively. A formal proof is provided showing that the proposed method converges to the target distribution. Both synthetic and real datasets are used to evaluate its properties and performance. The experimental results verify the theory and show superior performance in multi-modal sampling.
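For readers unfamiliar with the dynamics-based samplers the abstract builds on, the following is a minimal sketch of plain HMC with leapfrog integration and a Metropolis acceptance step, applied to a standard 1-D Gaussian target. This is a generic illustration only, not the paper's LHMC or VHMC algorithms; the target distribution, step size, and path length are illustrative assumptions.

```python
import numpy as np

# Minimal HMC sketch: leapfrog dynamics + Metropolis correction.
# Target: standard normal N(0, 1); all tuning values are assumptions.

def log_prob(x):
    return -0.5 * x ** 2          # log density of N(0, 1), up to a constant

def grad_log_prob(x):
    return -x                     # gradient of the log density

def hmc_sample(n_samples, step_size=0.2, n_leapfrog=20, seed=0):
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        p = rng.normal()          # resample momentum each iteration
        x_new, p_new = x, p
        # Leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step_size * p_new
            p_new += step_size * grad_log_prob(x_new)
        x_new += step_size * p_new
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        # Accept or reject based on the change in total energy
        current_h = -log_prob(x) + 0.5 * p ** 2
        proposed_h = -log_prob(x_new) + 0.5 * p_new ** 2
        if rng.uniform() < np.exp(current_h - proposed_h):
            x = x_new
        samples.append(x)
    return np.array(samples)

samples = hmc_sample(5000)
```

Because the gradient steers each trajectory toward high-density regions, such a sampler mixes far faster than a random walk on unimodal targets; the trapping problem the paper addresses arises when the target has several well-separated modes, since the dynamics rarely cross the low-density regions between them.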

Original language: English
Article number: 560
Journal: Entropy
Volume: 25
Issue number: 4
DOIs
State: Published - Apr 2023

Keywords

  • Hamiltonian Monte Carlo
  • Langevin dynamics
  • Markov chain Monte Carlo
  • multi-modal sampling
  • variational distribution
