DataDrivenSparse.ADMM — Type
mutable struct ADMM{T, R<:Number} <: DataDrivenSparse.AbstractSparseRegressionAlgorithm

ADMM is an implementation of Lasso using the alternating direction method of multipliers, loosely based on this implementation. It solves the following problem

\[\argmin_{x} \frac{1}{2} \| Ax-b\|_2^2 + \lambda \|x\|_1\]

Fields

  • thresholds

    Sparsity threshold parameter

  • rho

    Augmented Lagrangian parameter

Example

opt = ADMM()
opt = ADMM(1e-1, 2.0)
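
The following is a minimal plain-Julia sketch of the ADMM iteration for the problem above, assuming a dense matrix A and a single target vector b; admm_lasso and soft_threshold are illustrative names and not part of the package API.

using LinearAlgebra

# Proximal operator of the l1-norm (soft thresholding); illustrative only
soft_threshold(x, λ) = sign.(x) .* max.(abs.(x) .- λ, 0)

# Sketch of the ADMM iteration for 1/2 ‖Ax - b‖₂² + λ‖x‖₁
function admm_lasso(A, b; λ = 1e-1, ρ = 2.0, maxiter = 1000)
    n = size(A, 2)
    x, z, u = zeros(n), zeros(n), zeros(n)
    F = cholesky(Symmetric(A'A + ρ * I))    # factorize once, reuse each step
    for _ in 1:maxiter
        x = F \ (A'b .+ ρ .* (z .- u))      # x-update: ridge-like solve
        z = soft_threshold(x .+ u, λ / ρ)   # z-update: proximal step
        u .+= x .- z                        # dual update
    end
    return z                                # sparse iterate
end
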
DataDrivenSparse.ClippedAbsoluteDeviation — Type
struct ClippedAbsoluteDeviation{T} <: DataDrivenSparse.AbstractProximalOperator

Proximal operator which implements the (smoothly) clipped absolute deviation operator.

abs(x) > ρ ? x : sign(x) * max(abs(x) - λ, 0)

where ρ = 5λ by default.

Fields

  • ρ

    Upper threshold

Example

opt = ClippedAbsoluteDeviation()
opt = ClippedAbsoluteDeviation(1e-1)

See Zheng et al., 2018.
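
As a sketch, the element-wise rule above can be written directly in Julia; cad is an illustrative name and not the package API.

# Element-wise clipped absolute deviation rule from above (illustrative)
cad(x, λ, ρ = 5λ) = abs(x) > ρ ? x : sign(x) * max(abs(x) - λ, 0)

cad.([-0.3, 0.05, 0.8], 0.1)   # broadcast over coefficients, returns [-0.2, 0.0, 0.8]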

DataDrivenSparse.ImplicitOptimizer — Type
mutable struct ImplicitOptimizer{T<:DataDrivenSparse.AbstractSparseRegressionAlgorithm} <: DataDrivenSparse.AbstractSparseRegressionAlgorithm

Optimizer for finding a sparse implicit relationship by alternating over the left-hand side of the problem and solving the explicit problem, as introduced here.

\[\argmin_{x} \|x\|_0 ~\text{s.t.}~ Ax = 0\]

Fields

  • optimizer

    Explicit Optimizer

Example

ImplicitOptimizer(STLSQ())
ImplicitOptimizer(0.1f0, ADMM)
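
To picture the alternation, here is a rough plain-Julia sketch under the assumption that each library column is tried in turn as the left-hand side and the remaining columns are fit with an explicit sparse solver; implicit_sketch and explicit_solve are placeholder names, not the package internals.

using LinearAlgebra

# Rough sketch of the implicit alternation: every column of the library Θ
# is tried as the left-hand side of an explicit regression against the
# remaining columns, and the candidate with the smallest residual is kept.
# explicit_solve(A, b) stands in for any explicit sparse regression.
function implicit_sketch(Θ, explicit_solve)
    best = nothing
    for i in axes(Θ, 2)
        lhs = Θ[:, i]
        idx = setdiff(axes(Θ, 2), i)
        ξ   = explicit_solve(Θ[:, idx], lhs)       # explicit subproblem
        res = norm(Θ[:, idx] * ξ - lhs)
        best = (best === nothing || res < best.res) ? (i = i, ξ = ξ, res = res) : best
    end
    return best
end
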
DataDrivenSparse.SR3 — Type
mutable struct SR3{T, V, P<:DataDrivenSparse.AbstractProximalOperator} <: DataDrivenSparse.AbstractSparseRegressionAlgorithm

SR3 is an optimizer framework introduced by Zheng et al., 2018 and used within Champion et al., 2019. SR3 contains a sparsification parameter λ and a relaxation parameter ν. It solves the following problem

\[\argmin_{x, w} \frac{1}{2} \| Ax-b\|_2^2 + \lambda R(w) + \frac{\nu}{2}\|x-w\|_2^2\]

where R is the regularizer applied via its proximal operator and the result is given by w.

Fields

  • thresholds

    Sparsity threshold

  • nu

    Relaxation parameter

  • proximal

    Proximal operator

Example

opt = SR3()
opt = SR3(1e-2)
opt = SR3(1e-3, 1.0)
opt = SR3(1e-3, 1.0, SoftThreshold())

Note

As opposed to the original formulation, we use ν as a relaxation parameter, as given in Champion et al., 2019. In the standard case of hard thresholding, the sparsity parameter is interpreted as λ = threshold^2 / 2, otherwise λ = threshold.
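
A minimal sketch of the relaxed iteration stated above, assuming a soft-thresholding proximal operator and a single target vector; sr3_sketch is an illustrative name, not the package API.

using LinearAlgebra

soft_threshold(x, λ) = sign.(x) .* max.(abs.(x) .- λ, 0)

# Sketch of the relaxed SR3 iteration for 1/2 ‖Ax - b‖₂² + λR(w) + ν/2 ‖x - w‖₂²
function sr3_sketch(A, b; λ = 1e-3, ν = 1.0, maxiter = 100)
    w = zeros(size(A, 2))
    F = cholesky(Symmetric(A'A + ν * I))   # factorize once, reuse each step
    for _ in 1:maxiter
        x = F \ (A'b .+ ν .* w)            # x-update: relaxed least squares
        w = soft_threshold(x, λ / ν)       # w-update: proximal step on R
    end
    return w                               # sparse relaxed coefficients
end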

DataDrivenSparse.STLSQ — Type
struct STLSQ{T<:Union{Number, AbstractVector}, R<:Number} <: DataDrivenSparse.AbstractSparseRegressionAlgorithm

STLSQ is taken from the original paper on SINDy and implements a sequentially thresholded least-squares iteration, where λ is the threshold of the iteration. It is based upon this MATLAB implementation. It solves the following problem

\[\argmin_{x} \frac{1}{2} \| Ax-b\|_2^2 + \rho \|x\|_2^2\]

with the additional constraint

\[\lvert x_i \rvert > \lambda\]

If the parameter ρ > 0, ridge regression will be performed using the normal equations of the corresponding regression problem.

Fields

  • thresholds

    Sparsity threshold

  • rho

    Ridge regression parameter

Example

opt = STLSQ()
opt = STLSQ(1e-1)
opt = STLSQ(1e-1, 1.0) # Set rho to 1.0
opt = STLSQ(Float32[1e-2; 1e-1])

Note

This was formerly STRRidge and has been renamed.
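
A minimal sketch of the sequentially thresholded (ridge) least-squares loop described above, assuming a single target vector; stlsq_sketch is an illustrative name, not the package API.

using LinearAlgebra

# Sketch of sequentially thresholded least squares with optional ridge term ρ
function stlsq_sketch(A, b; λ = 1e-1, ρ = 0.0, maxiter = 10)
    x = (A'A + ρ * I) \ (A'b)              # initial (ridge) least squares
    for _ in 1:maxiter
        small = abs.(x) .< λ               # coefficients below the threshold
        x[small] .= 0
        big = .!small
        # refit only the active coefficients on the reduced problem
        x[big] = (A[:, big]'A[:, big] + ρ * I) \ (A[:, big]'b)
    end
    return x
end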