nextos 13 hours ago

These days, the big advantage is that a generative model can be cleanly decoupled from inference. With probabilistic languages such as Stan, Turing, or Pyro, you can encode a model once and then perform maximum likelihood estimation, variational Bayes, full MCMC, or other more specialized approaches, depending on the problem at hand.
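To make the decoupling concrete, here is a minimal sketch in plain numpy (not any of the libraries above): a toy model, y_i ~ Normal(mu, 1) with a Normal(0, 5) prior on mu, is written once as a log-joint function, and two different inference routines (a MAP optimizer and a Laplace/Gaussian posterior approximation) consume it without knowing anything about its internals. All function names here are hypothetical.

```python
import numpy as np

# Toy model: y_i ~ Normal(mu, 1), prior mu ~ Normal(0, 5).
# The model is just a log-joint function; each inference routine
# below takes it as an argument, so model and inference stay decoupled.

def log_joint(mu, y, prior_sd=5.0):
    log_prior = -0.5 * (mu / prior_sd) ** 2
    log_lik = -0.5 * np.sum((y - mu) ** 2)
    return log_prior + log_lik

def map_estimate(log_joint, y, lr=0.01, steps=2000, eps=1e-5):
    """Inference routine 1: gradient ascent on the log joint,
    with a finite-difference gradient (no autodiff needed here)."""
    mu = 0.0
    for _ in range(steps):
        grad = (log_joint(mu + eps, y) - log_joint(mu - eps, y)) / (2 * eps)
        mu += lr * grad
    return mu

def laplace_posterior(log_joint, y, eps=1e-4):
    """Inference routine 2: Gaussian (Laplace) approximation,
    mean at the MAP, variance from the curvature at the mode."""
    mu_hat = map_estimate(log_joint, y)
    curv = (log_joint(mu_hat + eps, y) - 2 * log_joint(mu_hat, y)
            + log_joint(mu_hat - eps, y)) / eps ** 2
    return mu_hat, np.sqrt(-1.0 / curv)

rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.0, size=50)

mu_map = map_estimate(log_joint, y)          # point estimate
mu_mean, mu_sd = laplace_posterior(log_joint, y)  # approximate posterior
```

Swapping inference strategies means calling a different routine on the same `log_joint`; this is essentially what Stan, Turing, and Pyro do for you, with autodiff and far better algorithms.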

If you have experienced problems with convergence, give Stan a try. Stan is really robust, polished, and simple. Besides, models are statically typed and it warns you when you do something odd.

Personally, I think once you start doing multilevel modeling to shrink estimates, there's no way back. At least in my case, I now see it everywhere. Thanks to efficient variational Bayes methods built on top of JAX, it is feasible even for high-dimensional models.
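The shrinkage effect is easy to see in a toy partial-pooling setup (a numpy sketch, with the variance components assumed known rather than learned, which is where a real multilevel model earns its keep): noisy per-group means get pulled toward the grand mean by precision weighting.

```python
import numpy as np

# Hypothetical setup: 8 groups, 5 noisy observations each.
# True group means come from a population with sd tau = 1;
# observations have noise sd sigma = 2.
rng = np.random.default_rng(1)
n_groups, n_obs = 8, 5
true_means = rng.normal(0.0, 1.0, size=n_groups)
y = true_means[:, None] + rng.normal(0.0, 2.0, size=(n_groups, n_obs))

ybar = y.mean(axis=1)    # no pooling: raw per-group means
grand = ybar.mean()      # complete pooling: one shared mean

# Partial pooling with known variances (sigma^2 = 4, tau^2 = 1):
# each group mean is weighted by its data precision vs. the prior precision.
sigma2, tau2 = 4.0, 1.0
w = (n_obs / sigma2) / (n_obs / sigma2 + 1.0 / tau2)
shrunk = w * ybar + (1.0 - w) * grand

# Every shrunk estimate sits strictly between its raw group mean and
# the grand mean, so the shrunk estimates are less spread out overall.
```

Groups with fewer observations would get a smaller `w` and be pulled harder toward the grand mean, which is exactly the shrinkage behavior the comment is pointing at.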