About this Event
3620 South Vermont Avenue, Los Angeles, CA 90089
Abstract: The efficient generation of random samples is a central task within many of the quantitative sciences. The workhorse of Bayesian statistics and statistical physics, Markov chain Monte Carlo (MCMC) comprises a large class of algorithms for sampling from arbitrarily complex and/or high-dimensional probability distributions. The Metropolis-Hastings method (MH) stands as the seminal MCMC algorithm, and its basic operation underlies most of the MCMC techniques developed to this day.
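The basic MH operation mentioned above can be illustrated with a minimal random-walk Metropolis-Hastings sampler. This is a generic textbook sketch, not code from the talk; the function names and the Gaussian example target are illustrative choices.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal.

    log_target: log-density of the (possibly unnormalized) target.
    Because the proposal is symmetric, the proposal ratio cancels in
    the acceptance probability.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples, x.size))
    log_p = log_target(x)
    for i in range(n_samples):
        # Propose a move from the symmetric random-walk kernel.
        x_new = x + step * rng.standard_normal(x.size)
        log_p_new = log_target(x_new)
        # Accept with probability min(1, target(x_new) / target(x)).
        if np.log(rng.random()) < log_p_new - log_p:
            x, log_p = x_new, log_p_new
        samples[i] = x
    return samples

# Example: sample from a standard 1D Gaussian target.
chain = metropolis_hastings(lambda x: -0.5 * np.sum(x**2), np.zeros(1), 5000)
```

Working in log space, as above, avoids numerical underflow for high-dimensional or sharply peaked targets.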
We provide an all-encompassing, measure-theoretic mathematical formalism that describes essentially any Metropolis-Hastings algorithm using three ingredients: a random proposal, an involution on an extended phase space, and an accept-reject mechanism. This unified framework illuminates under-appreciated relationships among a variety of known algorithms while yielding a means of deriving new methods.
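The three ingredients can be sketched in the finite-dimensional, volume-preserving case (so the Jacobian of the involution drops out of the acceptance ratio). This is an illustrative rendering of the general involutive scheme, not the paper's construction; all names here are hypothetical.

```python
import numpy as np

def involutive_mh_step(x, log_pi, sample_aux, log_q, involution, rng):
    """One step built from the three ingredients: a random proposal
    (auxiliary draw v ~ q(.|x)), an involution S on the extended space
    (x, v), and an accept-reject mechanism. Assumes |det DS| = 1.
    """
    v = sample_aux(x, rng)        # ingredient 1: random proposal
    y, u = involution(x, v)       # ingredient 2: involution S(x, v)
    # Ingredient 3: accept with prob min(1, pi(y) q(u|y) / (pi(x) q(v|x))).
    log_alpha = (log_pi(y) + log_q(u, y)) - (log_pi(x) + log_q(v, x))
    return y if np.log(rng.random()) < log_alpha else x

# The swap involution S(x, v) = (v, x) with a random-walk auxiliary draw
# recovers classical random-walk Metropolis as a special case.
rng = np.random.default_rng(1)
log_pi = lambda x: -0.5 * np.sum(x**2)
sample_aux = lambda x, rng: x + 0.5 * rng.standard_normal(x.shape)
log_q = lambda v, x: -0.5 * np.sum((v - x)**2) / 0.5**2
x = np.zeros(1)
for _ in range(2000):
    x = involutive_mh_step(x, log_pi, sample_aux, log_q,
                           involution=lambda a, b: (b, a), rng=rng)
```

Swapping in a different involution (e.g. a Hamiltonian flow with momentum flip) yields other familiar samplers from the same template.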
As an immediate application we identify several novel algorithms, including a multiproposal version of the popular preconditioned Crank–Nicolson (pCN) sampler suitable for infinite-dimensional target measures that are absolutely continuous with respect to a Gaussian base measure. We also develop a new class of ‘extended phase space’ methods based on Hamiltonian mechanics. These methods provide a versatile approach to bypassing expensive gradient computations through skillful reduced-order modeling and/or data-driven approaches. Case studies will be presented that use our multiproposal pCN algorithm (mpCN) to resolve problems in Bayesian statistical inversion for partial differential equations motivated by fluid flow measurement.
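For context, the standard single-proposal pCN step (not the multiproposal mpCN variant presented in the talk) looks as follows in a finite-dimensional discretization with a standard Gaussian prior; the function names are illustrative.

```python
import numpy as np

def pcn_step(x, phi, beta, rng):
    """One step of the standard preconditioned Crank-Nicolson sampler
    for a target absolutely continuous w.r.t. a Gaussian prior N(0, I).

    phi: the potential (negative log-likelihood). The prior terms cancel
    in the acceptance ratio by construction, which is what makes the
    method well defined on infinite-dimensional spaces.
    """
    # pCN proposal: an autoregressive move that exactly preserves the prior.
    x_new = np.sqrt(1.0 - beta**2) * x + beta * rng.standard_normal(x.shape)
    # Accept with probability min(1, exp(phi(x) - phi(x_new))).
    if np.log(rng.random()) < phi(x) - phi(x_new):
        return x_new
    return x

# Example: a simple Gaussian likelihood potential centered at 1.
rng = np.random.default_rng(0)
x = np.zeros(2)
for _ in range(1000):
    x = pcn_step(x, lambda u: 0.5 * np.sum((u - 1.0)**2), beta=0.3, rng=rng)
```

Because the acceptance probability depends only on the likelihood potential, the acceptance rate does not degenerate as the discretization is refined, unlike standard random-walk Metropolis.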
This is joint work with Andrew J. Holbrook (UCLA), Justin Krometis (Virginia Tech) and Cecilia Mondaini (Drexel).