API reference

DifferentiationInterface - Module
DifferentiationInterface

An interface to various automatic differentiation backends in Julia.

Exports

Derivative

DifferentiationInterface.prepare_derivative - Function
prepare_derivative(f,     backend, x) -> extras
prepare_derivative(f!, y, backend, x) -> extras

Create an extras object subtyping DerivativeExtras that can be given to derivative operators.

Warning

If the function changes in any way, the result of preparation will be invalidated, and you will need to run it again. In the two-argument case, y is mutated by f! during preparation.
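
For illustration, here is a minimal sketch of the prepare-then-reuse workflow, assuming ForwardDiff.jl is installed and that the derivative operator accepts the prepared extras as its last argument:

using DifferentiationInterface
import ForwardDiff

f(x) = sin(x) + x^2

backend = AutoForwardDiff()
extras = prepare_derivative(f, backend, 1.0)  # one-time preparation

# the extras can be reused at other points of the same type
derivative(f, backend, 1.0, extras)  # ≈ cos(1.0) + 2.0
derivative(f, backend, 2.0, extras)  # ≈ cos(2.0) + 4.0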

Gradient

DifferentiationInterface.prepare_gradient - Function
prepare_gradient(f, backend, x) -> extras

Create an extras object subtyping GradientExtras that can be given to gradient operators.

Warning

If the function changes in any way, the result of preparation will be invalidated, and you will need to run it again.
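
As a hedged sketch of how the prepared extras are meant to be reused (assuming ForwardDiff.jl is installed and that gradient accepts the extras as its last argument):

using DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)

backend = AutoForwardDiff()
x = rand(3)
extras = prepare_gradient(f, backend, x)  # one-time preparation

gradient(f, backend, x, extras)  # ≈ 2 .* x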

Jacobian

DifferentiationInterface.prepare_jacobian - Function
prepare_jacobian(f,     backend, x) -> extras
prepare_jacobian(f!, y, backend, x) -> extras

Create an extras object subtyping JacobianExtras that can be given to Jacobian operators.

Warning

If the function changes in any way, the result of preparation will be invalidated, and you will need to run it again. In the two-argument case, y is mutated by f! during preparation.
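
For the two-argument case, a minimal sketch (assuming ForwardDiff.jl is installed and that jacobian accepts the extras as its last argument):

using DifferentiationInterface
import ForwardDiff

f!(y, x) = (y .= 2 .* x; nothing)  # in-place function mutating y

backend = AutoForwardDiff()
x = rand(3)
y = zeros(3)  # mutated by f! during preparation
extras = prepare_jacobian(f!, y, backend, x)

jacobian(f!, y, backend, x, extras)  # ≈ 2I as a 3×3 matrix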

Second order

DifferentiationInterface.SecondOrder - Type
SecondOrder

Combination of two backends for second-order differentiation.

Constructor

SecondOrder(outer, inner)

Fields

  • outer::ADTypes.AbstractADType: backend for the outer differentiation

  • inner::ADTypes.AbstractADType: backend for the inner differentiation
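
For instance, a forward-over-reverse combination can be sketched as follows (assuming ForwardDiff.jl and Zygote.jl are installed):

using DifferentiationInterface
import ForwardDiff, Zygote

f(x) = sum(abs2, x)

# outer ForwardDiff over inner Zygote
backend = SecondOrder(AutoForwardDiff(), AutoZygote())

hessian(f, backend, rand(3))  # ≈ 2I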

DifferentiationInterface.prepare_hvp - Function
prepare_hvp(f, backend, x, v) -> extras

Create an extras object subtyping HVPExtras that can be given to Hessian-vector product operators.

Warning

If the function changes in any way, the result of preparation will be invalidated, and you will need to run it again.
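
A minimal sketch of the prepared workflow (assuming ForwardDiff.jl and Zygote.jl are installed and that hvp accepts the extras as its last argument):

using DifferentiationInterface
import ForwardDiff, Zygote

f(x) = sum(abs2, x)

backend = SecondOrder(AutoForwardDiff(), AutoZygote())
x, v = rand(3), rand(3)
extras = prepare_hvp(f, backend, x, v)

hvp(f, backend, x, v, extras)  # ≈ 2 .* v since the Hessian is 2I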

DifferentiationInterface.prepare_hvp_same_point - Function
prepare_hvp_same_point(f, backend, x, v) -> extras_same

Create an extras_same object subtyping HVPExtras that can be given to Hessian-vector product operators if they are applied at the same point x.

Warning

If the function or the point changes in any way, the result of preparation will be invalidated, and you will need to run it again.
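
For illustration, a sketch of reusing same-point preparation with several seeds (assuming ForwardDiff.jl and Zygote.jl are installed):

using DifferentiationInterface
import ForwardDiff, Zygote

f(x) = sum(abs2, x)

backend = SecondOrder(AutoForwardDiff(), AutoZygote())
x = rand(3)
v1, v2 = rand(3), rand(3)
extras_same = prepare_hvp_same_point(f, backend, x, v1)

# both products are taken at the same point x, so extras_same can be reused
hvp(f, backend, x, v1, extras_same)
hvp(f, backend, x, v2, extras_same)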

DifferentiationInterface.prepare_hessian - Function
prepare_hessian(f, backend, x) -> extras

Create an extras object subtyping HessianExtras that can be given to Hessian operators.

Warning

If the function changes in any way, the result of preparation will be invalidated, and you will need to run it again.
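
A minimal sketch of the prepared workflow (assuming ForwardDiff.jl is installed and that hessian accepts the extras as its last argument):

using DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)

backend = AutoForwardDiff()
x = rand(3)
extras = prepare_hessian(f, backend, x)

hessian(f, backend, x, extras)  # ≈ 2I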

Primitives

DifferentiationInterface.prepare_pushforward - Function
prepare_pushforward(f,     backend, x, dx) -> extras
prepare_pushforward(f!, y, backend, x, dx) -> extras

Create an extras object subtyping PushforwardExtras that can be given to pushforward operators.

Warning

If the function changes in any way, the result of preparation will be invalidated, and you will need to run it again. In the two-argument case, y is mutated by f! during preparation.

DifferentiationInterface.prepare_pushforward_same_point - Function
prepare_pushforward_same_point(f,     backend, x, dx) -> extras_same
prepare_pushforward_same_point(f!, y, backend, x, dx) -> extras_same

Create an extras_same object subtyping PushforwardExtras that can be given to pushforward operators if they are applied at the same point x.

Warning

If the function or the point changes in any way, the result of preparation will be invalidated, and you will need to run it again. In the two-argument case, y is mutated by f! during preparation.

DifferentiationInterface.value_and_pushforward - Function
value_and_pushforward(f,     backend, x, dx, [extras]) -> (y, dy)
value_and_pushforward(f!, y, backend, x, dx, [extras]) -> (y, dy)

Compute the value y and the pushforward dy (Jacobian-vector product) of the function at point x with seed dx.

Info

Required primitive for forward mode backends.
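
As a hedged sketch combining preparation and the pushforward primitive (assuming ForwardDiff.jl is installed):

using DifferentiationInterface
import ForwardDiff

f(x) = [sum(abs2, x), prod(x)]

backend = AutoForwardDiff()
x, dx = rand(3), rand(3)
extras = prepare_pushforward(f, backend, x, dx)

# y == f(x) and dy is the Jacobian-vector product J(x) * dx
y, dy = value_and_pushforward(f, backend, x, dx, extras)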

DifferentiationInterface.prepare_pullback - Function
prepare_pullback(f,     backend, x, dy) -> extras
prepare_pullback(f!, y, backend, x, dy) -> extras

Create an extras object subtyping PullbackExtras that can be given to pullback operators.

Warning

If the function changes in any way, the result of preparation will be invalidated, and you will need to run it again. In the two-argument case, y is mutated by f! during preparation.

DifferentiationInterface.prepare_pullback_same_point - Function
prepare_pullback_same_point(f,     backend, x, dy) -> extras_same
prepare_pullback_same_point(f!, y, backend, x, dy) -> extras_same

Create an extras_same object subtyping PullbackExtras that can be given to pullback operators if they are applied at the same point x.

Warning

If the function or the point changes in any way, the result of preparation will be invalidated, and you will need to run it again. In the two-argument case, y is mutated by f! during preparation.

DifferentiationInterface.value_and_pullback - Function
value_and_pullback(f,     backend, x, dy, [extras]) -> (y, dx)
value_and_pullback(f!, y, backend, x, dy, [extras]) -> (y, dx)

Compute the value y and the pullback dx (vector-Jacobian product) of the function at point x with seed dy.

Info

Required primitive for reverse mode backends.
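
As a hedged sketch combining preparation and the pullback primitive (assuming Zygote.jl is installed):

using DifferentiationInterface
import Zygote

f(x) = [sum(abs2, x), prod(x)]

backend = AutoZygote()
x = rand(3)
dy = rand(2)  # seed with the same shape as f(x)
extras = prepare_pullback(f, backend, x, dy)

# y == f(x) and dx is the vector-Jacobian product J(x)' * dy
y, dx = value_and_pullback(f, backend, x, dy, extras)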

Backend queries

DifferentiationInterface.check_hessian - Function
check_hessian(backend)

Check whether backend supports second-order differentiation by trying to compute a Hessian.

Warning

Might take a while due to compilation time.
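
A minimal usage sketch (assuming ForwardDiff.jl is installed; the query is expected to return a Bool):

using DifferentiationInterface
import ForwardDiff

DifferentiationInterface.check_hessian(AutoForwardDiff())  # true when a Hessian could be computed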

Miscellaneous

DifferentiationInterface.DifferentiateWith - Type
DifferentiateWith

Callable function wrapper that enforces differentiation with a specified (inner) backend.

This works by defining new rules overriding the behavior of the outer backend that would normally be used.

Warning

This is an experimental functionality, whose API cannot yet be considered stable. At the moment, it only supports one-argument functions, and rules are only defined for ChainRules.jl-compatible outer backends.

Fields

  • f: the function in question
  • backend::AbstractADType: the inner backend to use for differentiation

Constructor

DifferentiateWith(f, backend)

Example

using DifferentiationInterface
import ForwardDiff, Zygote

function f(x)
    a = Vector{eltype(x)}(undef, 1)
    a[1] = sum(x)  # mutation that breaks Zygote
    return a[1]
end

dw = DifferentiateWith(f, AutoForwardDiff());

gradient(dw, AutoZygote(), [2.0])  # calls ForwardDiff instead

# output

1-element Vector{Float64}:
 1.0

DifferentiationInterface.GreedyColoringAlgorithm - Type
GreedyColoringAlgorithm <: ADTypes.AbstractColoringAlgorithm

Matrix coloring algorithm for sparse Jacobians and Hessians, in which vertices are colored sequentially by order of decreasing degree.

Compatible with the ADTypes.jl coloring framework.

Implements

  • ADTypes.column_coloring
  • ADTypes.row_coloring

Warning

Symmetric coloring is not used by DifferentiationInterface.jl at the moment: Hessians are colored by columns just like Jacobians.

Reference

What Color Is Your Jacobian? Graph Coloring for Computing Derivatives, Gebremedhin et al. (2005)
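
For illustration, a hedged sketch of plugging this algorithm into a sparse backend via ADTypes.AutoSparse (assuming ADTypes.jl, ForwardDiff.jl and Symbolics.jl are installed, and that sparse backends are supported by the operators):

using DifferentiationInterface
using ADTypes: AutoSparse
import ForwardDiff, Symbolics

sparse_backend = AutoSparse(
    AutoForwardDiff();
    sparsity_detector=DifferentiationInterface.SymbolicsSparsityDetector(),
    coloring_algorithm=DifferentiationInterface.GreedyColoringAlgorithm(),
)

f(x) = abs2.(x)                       # diagonal Jacobian
jacobian(f, sparse_backend, rand(4))  # sparse 4×4 diagonal matrix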

Internals

This is not part of the public API.

DifferentiationInterface.BipartiteGraph - Type
BipartiteGraph

Represent a bipartite graph between the rows and the columns of a non-symmetric m × n matrix A.

This graph is defined as G = (R, C, E) where R = 1:m is the set of row indices, C = 1:n is the set of column indices and (i, j) ∈ E whenever A[i, j] is nonzero.

Fields

  • A_colmajor::AbstractMatrix: output of col_major applied to A
  • A_rowmajor::AbstractMatrix: output of row_major applied to A

Reference

What Color Is Your Jacobian? Graph Coloring for Computing Derivatives, Gebremedhin et al. (2005)

DifferentiationInterface.CompressedMatrix - Type
CompressedMatrix{dir}

Compressed representation B of an (m, n) sparse matrix A, obtained by summing some of its columns (if dir == :col) or rows (if dir == :row) when they share the same color.

Fields

  • sparsity::AbstractMatrix{Bool}, size (m, n): sparsity pattern of A (column-major if dir == :col, row-major if dir == :row)
  • colors::Vector{Int}, length n or m: color assignments in 1:c, i.e. colors[j] is the color of column j (if dir == :col) or colors[i] the color of row i (if dir == :row)
  • groups::Vector{Vector{Int}}, length c: groups of indices sharing a color, i.e. groups[k] = {j : colors[j] = k} for columns or groups[k] = {i : colors[i] = k} for rows
  • aggregates::AbstractMatrix{<:Real}, size (m, c) or (c, n): color-summed values B, with B[:, k] = sum(A[:, groups[k]]) for columns or B[k, :] = sum(A[groups[k], :]) for rows

DifferentiationInterface.SymbolicsSparsityDetector - Type
SymbolicsSparsityDetector <: ADTypes.AbstractSparsityDetector

Sparsity detection algorithm based on the Symbolics.jl tracing system.

Compatible with the ADTypes.jl sparsity detection framework.

Danger

This functionality is in a package extension, and requires Symbolics.jl to be loaded.

Implements

  • ADTypes.jacobian_sparsity
  • ADTypes.hessian_sparsity

Reference

Sparsity Programming: Automated Sparsity-Aware Optimizations in Differentiable Programming, Gowda et al. (2019)

ADTypes.mode - Method
mode(backend::SecondOrder)

Return the outer mode of the second-order backend.

DifferentiationInterface.basis - Method
basis(backend, a::AbstractArray, i::CartesianIndex)

Construct the i-th standard basis array in the vector space of a with element type eltype(a).

Note

If an AD backend benefits from a more specialized basis array implementation, this function can be extended on the backend type.
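
A minimal usage sketch of this internal helper (assuming ForwardDiff.jl is installed):

using DifferentiationInterface
import ForwardDiff

a = zeros(3)
# third standard basis vector in the space of a, i.e. [0.0, 0.0, 1.0]
DifferentiationInterface.basis(AutoForwardDiff(), a, CartesianIndex(3))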

DifferentiationInterface.color_groups - Method
color_groups(colors)

Return groups::Vector{Vector{Int}} such that i ∈ groups[c] iff colors[i] == c.

Assumes the colors are contiguously numbered from 1 to some cmax.
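
For illustration, a small sketch of the expected behavior (the function is internal, hence the explicit import):

using DifferentiationInterface: color_groups

colors = [1, 2, 1, 3, 2]
color_groups(colors)  # groups[1] == [1, 3], groups[2] == [2, 5], groups[3] == [4]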

DifferentiationInterface.pick_chunksize - Method
pick_chunksize(input_length)

Pick a reasonable chunk size for chunked derivative evaluation with an input of length input_length.

The result cannot be larger than DEFAULT_CHUNKSIZE=8.
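
A usage sketch restating only what the docstring guarantees (the function is internal, hence the explicit import):

using DifferentiationInterface: pick_chunksize

pick_chunksize(3)     # a reasonable chunk size for a length-3 input
pick_chunksize(1000)  # never larger than DEFAULT_CHUNKSIZE = 8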