I wanted to open this issue to have a place to discuss the ways in which SSMProblems must be designed to integrate effectively with AbstractMCMC/Turing.
A few notes on the PMCMC side of things:
PMMH
For PMMH the only requirement is that our filtering algorithm returns an unbiased estimate of the marginal likelihood.
PMMH is also commonly used to jointly infer parameters and trajectories, in which case it makes sense for the filtering algorithm to also return a sampled trajectory along with the likelihood estimate.
We could use the AbstractMCMC interface here, but it seems a bit strange since each sample is independent of the last.
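To make the requirement concrete, here is a minimal PMMH sketch. The names `run_filter`, `logprior`, and `propose` are hypothetical placeholders, not existing SSMProblems API; the only hard requirement is that `run_filter` returns an unbiased (log) marginal likelihood estimate, optionally together with a sampled trajectory.

```julia
using Random

# Hypothetical PMMH step: `run_filter`, `logprior`, and `propose` are assumed
# stand-ins for whatever interface SSMProblems ends up exposing.
function pmmh_step(rng::AbstractRNG, model, θ, logZ, traj; propose, logprior, run_filter)
    θ′ = propose(rng, θ)                       # e.g. a random-walk proposal on the parameters
    logZ′, traj′ = run_filter(rng, model, θ′)  # particle filter: log-likelihood estimate + sampled trajectory
    logα = logZ′ + logprior(θ′) - logZ - logprior(θ)
    if log(rand(rng)) < logα
        return θ′, logZ′, traj′                # accept: store the new estimate and trajectory
    else
        return θ, logZ, traj                   # reject: reuse the stored estimate (crucial for exactness)
    end
end
```

Note that the previously computed estimate `logZ` is reused on rejection rather than re-estimated; this is what keeps the pseudo-marginal construction exact.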
PG
The terminology for particle Gibbs is a bit inconsistent between publications. I think the terminology that is most convenient for us is that the particle Gibbs algorithm is split up into two alternating steps:
Conditional-SMC: sample a trajectory conditioned on the current parameters θ, the observations, and a reference trajectory (this is called particle Gibbs in AdvancedPS)
Parameter update: sample new parameters θ, given the reference trajectory and observations
The parameter update step is fiddly. Sometimes there is a closed form, often one uses MH within Gibbs. Chopin's particles, for example, lets the user define an update_theta function to do this.
It makes sense to use AbstractMCMC for the C-SMC samples since these form a Markov chain.
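As a rough illustration of what that could look like, here is a sketch built on the AbstractMCMC `step` interface. `CSMC`, `PGState`, `csmc_sweep`, and `update_theta` are assumed names for this sketch, not existing SSMProblems/AdvancedPS API: `csmc_sweep` would run conditional SMC given θ and the reference trajectory, and `update_theta` plays the role of the user-supplied parameter update (closed form or MH within Gibbs).

```julia
using AbstractMCMC, Random

# Hypothetical sampler type: a conditional-SMC sweep with a fixed number of particles.
struct CSMC <: AbstractMCMC.AbstractSampler
    n_particles::Int
end

# The sampler state carries the current parameters and the retained reference trajectory.
struct PGState{Θ,T}
    θ::Θ
    reference::T
end

function AbstractMCMC.step(rng::AbstractRNG, model, sampler::CSMC, state::PGState; kwargs...)
    # 1. Conditional SMC: sample a trajectory given θ, the observations, and the reference.
    trajectory = csmc_sweep(rng, model, state.θ, state.reference, sampler.n_particles)
    # 2. Parameter update: user-defined, given the sampled trajectory and the observations.
    θ = update_theta(rng, model, trajectory)
    new_state = PGState(θ, trajectory)
    return new_state, new_state  # (sample, state), as the AbstractMCMC interface expects
end
```

Structuring it this way keeps the Markov-chain bookkeeping (state, iteration, chain storage) in AbstractMCMC, while the model-specific pieces (the C-SMC sweep and the θ update) stay pluggable.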