From Markov Chain to Hamiltonian Monte Carlo
Despite the promise of big data, inferences are often limited not by sample size but by systematic effects. Only by carefully modeling these effects can we take full advantage of the data -- big data must be complemented with big models, and with algorithms capable of fitting them. In this workshop I will discuss the challenges inherent in the Bayesian computation of complex models, and their consequences for Markov chain Monte Carlo in particular, demonstrating these ideas through interactive exercises in Python. We will conclude with a conceptual introduction to Hamiltonian Monte Carlo and its ability to surmount these challenges by exploiting a model’s own differential information.
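As a conceptual preview of that final topic, here is a minimal pure-Python sketch of Hamiltonian Monte Carlo for a one-dimensional standard normal target. It is illustrative only -- the function names, tuning parameters, and target are assumptions of this sketch, not part of the workshop materials:

```python
import math
import random

def hmc_sample(neg_log_p, grad_neg_log_p, q0, n_samples=5000,
               step_size=0.1, n_leapfrog=20, seed=1):
    """Hamiltonian Monte Carlo for a one-dimensional target density.

    neg_log_p:      negative log density U(q), up to a constant
    grad_neg_log_p: its derivative dU/dq (the "differential information")
    """
    random.seed(seed)
    q = q0
    samples = []
    for _ in range(n_samples):
        p = random.gauss(0.0, 1.0)  # resample an auxiliary momentum
        q_new, p_new = q, p
        # Leapfrog integration of Hamilton's equations
        p_new -= 0.5 * step_size * grad_neg_log_p(q_new)
        for i in range(n_leapfrog):
            q_new += step_size * p_new
            if i != n_leapfrog - 1:
                p_new -= step_size * grad_neg_log_p(q_new)
        p_new -= 0.5 * step_size * grad_neg_log_p(q_new)
        # Metropolis correction on the total energy H(q, p) = U(q) + p^2/2
        h_old = neg_log_p(q) + 0.5 * p * p
        h_new = neg_log_p(q_new) + 0.5 * p_new * p_new
        if random.random() < math.exp(min(0.0, h_old - h_new)):
            q = q_new
        samples.append(q)
    return samples

# Standard normal target: U(q) = q^2 / 2, so dU/dq = q
samples = hmc_sample(lambda q: 0.5 * q * q, lambda q: q, q0=0.0)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Because each proposal follows a long gradient-guided trajectory rather than a diffusive random walk, the chain explores the target far more efficiently than simple Metropolis schemes, which is the key idea developed in the workshop.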
The course will assume familiarity with the basics of calculus, linear algebra, probability theory, Bayesian inference, and Bayesian computation.
For a self-contained introduction to probability theory one can review
while an introduction to Bayesian inference is given in
In order to participate in all of the interactive exercises, attendees will need a laptop with a working Python installation.
One exercise will also require that the latest version of PyStan (http://pystan.readthedocs.io/en/latest/) be installed.
Please verify that you can run the 8schools model as discussed in
and report any installation issues at http://discourse.mc-stan.org as early as possible.
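For reference, the eight-schools program is a short hierarchical model; a version close to the one in the PyStan documentation (details may differ from the copy linked above) looks like this:

```stan
data {
  int<lower=0> J;          // number of schools
  real y[J];               // estimated treatment effects
  real<lower=0> sigma[J];  // standard errors of the estimates
}
parameters {
  real mu;                 // population mean effect
  real<lower=0> tau;       // population scale
  real eta[J];             // standardized school-level effects
}
transformed parameters {
  real theta[J];
  for (j in 1:J)
    theta[j] = mu + tau * eta[j];
}
model {
  eta ~ normal(0, 1);
  y ~ normal(theta, sigma);
}
```

If PyStan compiles and samples from this model without errors, your installation should be ready for the exercises.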