This repository contains examples for the Bayesian inference libraries I maintain. Install it with
```julia
pkg> add https://github.com/tpapp/DynamicHMCExamples.jl
```
which will download a working and tested set of versions. Optionally, you can run
```julia
pkg> up
```
after this.
Note that this is not an introduction to Bayesian inference, merely an implementation in Julia using a certain approach that I find advantageous. The focus is on coding the (log) posterior as a function, then passing this to a modern Hamiltonian Monte Carlo sampler (a variant of NUTS, as described in Betancourt (2017)). This means that
- you don't need to use a DSL,
- you are not formulating your model as a directed acyclic graph,
- and you can calculate some or all derivatives manually.
The implicit requirement for this approach is of course that you need to understand how to translate your model to a posterior function and code it in Julia.
The examples show how to perform parameter transformations and automatic differentiation with related libraries that wrap a log posterior function. However, if you prefer, you can use other approaches, such as manually coding the transformations or symbolic differentiation.
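To give a flavor of the workflow, here is a minimal sketch (not one of the examples in this repository) of estimating the success probability of a Bernoulli model with a flat prior. It follows the documented DynamicHMC/TransformVariables/LogDensityProblems workflow, but package split and field names (e.g. `posterior_matrix`) have changed across versions, so treat it as an illustration rather than a drop-in script.

```julia
using TransformVariables, TransformedLogDensities, LogDensityProblems,
      LogDensityProblemsAD, DynamicHMC, Random

# The model: α successes out of n draws, flat prior on p ∈ (0, 1).
struct BernoulliProblem
    n::Int  # total number of draws
    α::Int  # number of successes
end

# Make the problem callable: the log posterior as a plain Julia function
# of a NamedTuple of parameters.
function (problem::BernoulliProblem)((; p))
    (; n, α) = problem
    α * log(p) + (n - α) * log(1 - p)
end

problem = BernoulliProblem(100, 37)

# Transform ℝ → (0, 1) so the sampler works on an unconstrained space.
t = as((p = as𝕀,))
P = TransformedLogDensity(t, problem)
∇P = ADgradient(:ForwardDiff, P)   # automatic differentiation of the log posterior

# Run NUTS with warmup; results hold the posterior draws in unconstrained space.
results = mcmc_with_warmup(Random.default_rng(), ∇P, 1000)

# Map the draws back to the constrained parametrization.
posterior = transform.(t, eachcol(results.posterior_matrix))
```

The key point is that `BernoulliProblem` is ordinary Julia code: the sampler only ever sees a log density (and its gradient), with no DSL or graph in between.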