This is an experimental implementation of a low rank approximation of mass matrices for Hamiltonian MCMC samplers, specifically for PyMC.
This is for experimentation only! Do not use for actual work (yet)!
But feel welcome to try it out, and tell me how it worked for your models!
pip install git+https://github.com/aseyboldt/covadapt.git
See the `notebooks/covadapt_intro.ipynb` notebook.
When we use a mass matrix estimated from previous draws to precondition HMC,
we often ignore information that we have already computed: the gradients of
the posterior log density at those draws. But those gradients contain a lot of
information about the posterior geometry, and therefore also about possible
preconditioners.
about possible preconditioners. If for example we assume that the posterior is
an
We can evaluate a precondition matrix
is small. (Where
is minimal.
Given an arbitrary but sufficiently nice posterior
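As a quick check in the normal example above: the gradient of the log density
is $\nabla \log \pi(x) = -\Sigma^{-1}(x - \mu)$, so

$$
A^T \nabla \log \pi(x) + A^{-1}(x - \mu) = \left(A^{-1} - A^T \Sigma^{-1}\right)(x - \mu),
$$

which vanishes for every $x$, and hence makes the loss zero (and minimal),
exactly when $A A^T = \Sigma$.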
If we only allow diagonal preconditioning matrices
$A = \operatorname{diag}(\sigma_1, \dots, \sigma_n)$, we can find the minimum
analytically as

$$
\sigma_i^2 = \sqrt{\frac{\operatorname{Var}(x_i)}{\operatorname{E}\!\left[\left(\partial_i \log \pi(x)\right)^2\right]}},
$$

the square root of the ratio of the draw variance and the gradient second
moment in each dimension.
This diagonal preconditioner is already implemented in PyMC and nuts-rs.
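A minimal numpy sketch of this diagonal estimate (the function name and the
assumption that `draws` and `grads` are arrays of shape `(n_draws, n_dim)` are
just for illustration):

```python
import numpy as np

def diag_preconditioner(draws, grads):
    """Diagonal covariance (inverse mass matrix) estimate from draws and gradients.

    Computes sigma_i^2 = sqrt(Var(x_i) / E[g_i^2]) per dimension, the analytic
    minimizer of the diagonal-only loss above.
    """
    draw_var = draws.var(axis=0)                    # Var(x_i) for each dimension
    grad_second_moment = (grads ** 2).mean(axis=0)  # E[g_i^2] for each dimension
    return np.sqrt(draw_var / grad_second_moment)
```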
If we approximate the integral in the loss by the empirical average over the
stored draws and gradients, the objective only depends on quantities we have
already computed. If we have more dimensions than draws this does not have a
unique solution, so we introduce regularization. Regularization terms based on
the logdet or trace of the covariance estimate make the minimizer unique again.
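As an illustration only (the exact parameterization and regularizer used here
may differ), an empirical version of the loss with a simple trace/logdet
penalty could look like this:

```python
import numpy as np

def empirical_loss(A, draws, grads, reg=1e-2):
    """Empirical version of the loss for a full preconditioning matrix A.

    draws, grads: arrays of shape (n_draws, n_dim) with posterior draws and
    gradients of the log density. reg: strength of an illustrative penalty
    based on the trace and logdet of the covariance estimate A @ A.T.
    """
    centered = draws - draws.mean(axis=0)
    # residual_k = A^T g_k + A^{-1} (x_k - mean) for every stored draw k
    residuals = grads @ A + np.linalg.solve(A, centered.T).T
    loss = (residuals ** 2).sum(axis=1).mean()
    cov = A @ A.T
    # keeps the minimizer unique when there are fewer draws than dimensions
    penalty = reg * (np.trace(cov) - np.linalg.slogdet(cov)[1])
    return loss + penalty
```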
To avoid memory and computational costs that grow quadratically with the
dimensionality, we write the covariance estimate as a low-rank update of a
diagonal matrix, so we only ever store a diagonal and a small number of
vectors, and matrix-vector products stay cheap.
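For illustration, assuming a diagonal-plus-low-rank form
$\Sigma_A = \operatorname{diag}(d) + U U^T$ (one possible such
parameterization, not necessarily the exact one used in this package),
matrix-vector products never need the full $n \times n$ matrix:

```python
import numpy as np

def lowrank_matvec(d, U, x):
    """Compute (diag(d) + U @ U.T) @ x without forming the n x n matrix.

    d: shape (n,), U: shape (n, k) with k << n, x: shape (n,).
    Cost and memory are O(n * k) instead of O(n ** 2).
    """
    return d * x + U @ (U.T @ x)
```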
We can now define a Riemannian metric on the space of all matrices of this
form and minimize the regularized loss by optimizing over that manifold.
A lot of the work that went into this package was done during my time at Quantopian, while trying to improve sampling of a (pretty awesome) model for portfolio optimization. Thanks a lot for making that possible!