# LogDensityProblemsAD.jl
Automatic differentiation backends for LogDensityProblems.jl.
The only exposed function is `ADgradient`. Example:
```julia
using LogDensityProblemsAD, ForwardDiff

∇ℓ = ADgradient(:ForwardDiff, ℓ) # assumes ℓ implements the LogDensityProblems interface
```
Currently, the following backends are supported:
| backend | notes |
|---|---|
| ForwardDiff.jl | robust, but forward-mode AD scales poorly with input dimension, so it is not ideal for ℝⁿ→ℝ functions with large n |
| ReverseDiff.jl | |
| Zygote.jl | |
| Enzyme.jl | experimental |
| Tracker.jl | not heavily maintained; you may prefer Zygote |
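To make the example above self-contained, here is a minimal sketch of a type implementing the LogDensityProblems.jl interface (`logdensity`, `dimension`, `capabilities`) and wrapping it with `ADgradient`. The `StdNormal` type and its density are hypothetical illustrations, not part of either package:

```julia
using LogDensityProblems, LogDensityProblemsAD, ForwardDiff

# Hypothetical example problem: a standard multivariate normal
# log density (up to an additive constant).
struct StdNormal
    dim::Int
end

LogDensityProblems.logdensity(ℓ::StdNormal, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(ℓ::StdNormal) = ℓ.dim
LogDensityProblems.capabilities(::Type{StdNormal}) =
    LogDensityProblems.LogDensityOrder{0}() # log density only, no gradient

ℓ = StdNormal(3)
∇ℓ = ADgradient(:ForwardDiff, ℓ)

# Returns the log density and its gradient as a tuple.
value, gradient = LogDensityProblems.logdensity_and_gradient(∇ℓ, [1.0, 0.0, 0.0])
```

Since `∇ℓ` itself satisfies the LogDensityProblems interface (at order 1), it can be passed directly to samplers and optimizers that consume that interface.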