HSSM Seems to Return Priors as Posterior #606
Unanswered
theonlydvr asked this question in Q&A
Replies: 1 comment · 1 reply
-
Hi @theonlydvr, I don't have an immediate answer from looking at this; I'd need to actually look at the likelihood. I can't confirm that this is what's happening here just from inspection. Will get back to you.
-
I've been working on implementing an RLDDM in HSSM, following some advice from @krishnbera, and so far it has worked great! One issue I've run into, though, is speed, so I've been setting up a PyTensor version of the code to take advantage of the faster samplers and a GPU. To test this approach, I started with a simple RL-only case rather than the full RLDDM. However, whenever I sample this model, I seem to just get the priors back. I've included my code and some other model details below. Any help or advice would be greatly appreciated!
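For context, here is a minimal NumPy sketch of the kind of RL-only likelihood being described (a delta-rule Q-learning model with a softmax choice rule). This is not the poster's actual code or HSSM's API; the function name and parameters (`alpha`, `beta`) are hypothetical, and the real implementation would be written with PyTensor ops so gradients flow to the samplers. One classic cause of "the posterior looks like the prior" is a likelihood that ends up disconnected from the parameters, so a pure-Python reference like this can be useful for checking that the PyTensor version returns the same log-likelihood values.

```python
import numpy as np

def rl_loglik(alpha, beta, choices, rewards, n_options=2):
    """Log-likelihood of choices under a delta-rule RL model with a
    softmax choice rule. Hypothetical reference sketch, not HSSM code.

    alpha: learning rate in [0, 1]
    beta:  inverse temperature (>= 0)
    """
    q = np.zeros(n_options)  # Q-values, initialized at zero
    ll = 0.0
    for c, r in zip(choices, rewards):
        logits = beta * q
        # numerically stable log-softmax
        logits = logits - logits.max()
        logp = logits - np.log(np.sum(np.exp(logits)))
        ll += logp[c]
        # delta-rule update for the chosen option
        q[c] += alpha * (r - q[c])
    return ll
```

If a PyTensor port of this (e.g. using `pytensor.scan` for the trial loop) gives a log-likelihood that does not change when `alpha` or `beta` change, the sampler will simply return the priors, which matches the symptom described above.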