Scalar Functions

Instead of working with vector-valued functions, such as dynamics functions, we often need to define scalar functions that accept our state and control vectors, such as cost, objective, or reward functions. This page provides the API for working with these types of functions, represented by the abstract ScalarFunction type, which is a specialization of an AbstractFunction with an output dimension of 1.

RobotDynamics.ScalarFunction — Type
ScalarFunction <: AbstractFunction

Represents a scalar function of the form:

\[c = f(x,u)\]

where $c \in \mathbb{R}$.

Evaluation

Since the function returns a scalar, both evaluate and evaluate! call the same function methods. To avoid confusion, evaluate should always be preferred when working with a ScalarFunction. To use, simply implement one of the following methods:

evaluate(fun, x, u, p)
evaluate(fun, x, u)

where p is a tuple of parameters.
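As a minimal sketch, a custom scalar function implementing one of the evaluate methods listed above might look like the following quadratic cost. The type name QuadraticCost and the weights Q and R are illustrative, not part of RobotDynamics; the sketch assumes the state_dim and control_dim methods required of every AbstractFunction.

```julia
using RobotDynamics
using LinearAlgebra

# Hypothetical scalar cost c = 0.5 * (x'Q x + u'R u)
struct QuadraticCost{T} <: RobotDynamics.ScalarFunction
    Q::Matrix{T}  # state weight matrix, n × n
    R::Matrix{T}  # control weight matrix, m × m
end

# State and control dimensions required of every AbstractFunction
RobotDynamics.state_dim(fun::QuadraticCost) = size(fun.Q, 1)
RobotDynamics.control_dim(fun::QuadraticCost) = size(fun.R, 1)

# Implement one of the evaluate methods listed above
RobotDynamics.evaluate(fun::QuadraticCost, x, u) =
    0.5 * (dot(x, fun.Q, x) + dot(u, fun.R, u))
```

With this in place, evaluate(cost, x, u) returns the scalar cost value directly.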

Differentiation

First and second-order derivatives of scalar functions are commonly referred to as gradients and Hessians. We use the convention that a gradient is a 1-dimensional array (i.e. an AbstractVector with size (n,)) while the Jacobian of a scalar function is a row vector (i.e. an AbstractMatrix with size (1,n)). These methods can be called using:

gradient!(::DiffMethod, fun, grad, z)
hessian!(::DiffMethod, fun, hess, z)

which allows the user to dispatch on the DiffMethod. These derivatives can also be obtained through the more generic jacobian! and ∇jacobian! methods:

jacobian!(sig, diff, fun, J, y, z)
∇jacobian!(sig, diff, fun, H, b, y, z)

where y and b both have length 1, and b[1] == one(eltype(b)).
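For concreteness, the shape conventions above can be sketched as follows (the dimensions n and m are illustrative):

```julia
n, m = 4, 2              # illustrative state and control dimensions

grad = zeros(n + m)      # gradient: a 1-dimensional array, size (n+m,)
J    = zeros(1, n + m)   # Jacobian of a scalar function: a row vector, size (1, n+m)

y = zeros(1)             # output vector passed to jacobian!; its length is 1
b = ones(1)              # multiplier passed to ∇jacobian!; b[1] == one(eltype(b))
```

The gradient and the Jacobian hold the same entries; only the array shape differs.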

To implement UserDefined methods, implement any one of the following gradient methods:

gradient!(::UserDefined, fun, grad, z)
gradient!(fun, grad, z)
gradient!(fun, grad, x, u, p)
gradient!(fun, grad, x, u)

and any one of the following Hessian methods:

hessian!(::UserDefined, fun, hess, z)
hessian!(fun, hess, z)
hessian!(fun, hess, x, u, p)
hessian!(fun, hess, x, u)
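As a sketch of UserDefined derivatives, hand-coded gradient! and hessian! methods for a hypothetical quadratic cost c = 0.5 * (x'Q x + u'R u) might look like this. The type QuadraticCost and its fields are illustrative, not part of the package; the method signatures follow the lists above.

```julia
using RobotDynamics
using LinearAlgebra

# Hypothetical scalar cost c = 0.5 * (x'Q x + u'R u)
struct QuadraticCost{T} <: RobotDynamics.ScalarFunction
    Q::Matrix{T}  # state weight matrix, n × n
    R::Matrix{T}  # control weight matrix, m × m
end
RobotDynamics.state_dim(fun::QuadraticCost) = size(fun.Q, 1)
RobotDynamics.control_dim(fun::QuadraticCost) = size(fun.R, 1)

# Gradient: ∂c/∂x = Q x and ∂c/∂u = R u, stacked into a vector of length n + m
function RobotDynamics.gradient!(fun::QuadraticCost, grad, x, u)
    n = length(x)
    grad[1:n]     .= fun.Q * x
    grad[n+1:end] .= fun.R * u
    return nothing
end

# Hessian: the block-diagonal matrix [Q 0; 0 R] of size (n + m) × (n + m)
function RobotDynamics.hessian!(fun::QuadraticCost, hess, x, u)
    n, m = length(x), length(u)
    hess .= 0
    hess[1:n, 1:n]         .= fun.Q
    hess[n+1:n+m, n+1:n+m] .= fun.R
    return nothing
end
```

Implementing these methods lets the UserDefined dispatch fill in exact derivatives instead of falling back to automatic or finite differencing.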