FirstOrderSolvers.GAPA - Type

GAPA(α=1.0, β=0.0; kwargs...)

GAP Adaptive. Same as GAP, but with adaptive α1, α2 set to the optimal αopt = 2/(1 + sinθ) according to an estimate of the Friedrichs angle θ. β is an averaging factor between αopt and 2: α1 = α2 = (1-β)*αopt + β*2.0.
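A minimal usage sketch, assuming the Convex.jl integration described in the package README (the exact solve! call depends on the Convex.jl version the package targets):

```julia
using Convex, FirstOrderSolvers

x = Variable(4)
problem = minimize(sumsquares(x - [1.0, 2.0, 3.0, 4.0]), x >= 0)

# Default arguments: α = 1.0, β = 0.0, i.e. α1 = α2 = αopt throughout
solve!(problem, GAPA())
```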

FirstOrderSolvers.checkstatus - Method

checkstatus(stat::FeasibilityStatus, x)

Returns false if no check was made. If a convergence check is done, returns true and sets stat.status to one of :Continue, :Optimal, :Infeasible.

FirstOrderSolvers.checkstatus - Method

checkstatus(stat::HSDEStatus, x)

Returns false if no check was made. If a convergence check is done, returns true and sets stat.status to one of :Continue, :Optimal, :Unbounded, :Infeasible.
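A self-contained mock (not the package's types or implementation) illustrating the calling convention both methods follow: the function returns false when no check is made, and otherwise returns true after setting stat.status:

```julia
# Mock status type; the real FeasibilityStatus/HSDEStatus constructors differ.
mutable struct MockStatus
    status::Symbol
    checki::Int          # hypothetical: only check every `checki` iterations
    i::Int
end

function checkstatus(stat::MockStatus, x)
    stat.i += 1
    stat.i % stat.checki == 0 || return false      # no check was made
    stat.status = maximum(abs, x) < 1e-9 ? :Optimal : :Continue
    return true
end

stat = MockStatus(:Continue, 10, 0)
x = zeros(5)
for k in 1:100
    checkstatus(stat, x) && stat.status != :Continue && break
end
stat.status          # :Optimal after the first actual check (iteration 10)
```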

FirstOrderSolvers.conjugategradient! - Method

conjugategradient!(x,A,b,r,p,Ap; tol = size(A,2)*eps(), max_iters = 10000)

Solve A*x == b with the conjugate gradient method. Uses x as a warm start and stores the solution in x. r, p, Ap should be the same size as x and will be overwritten.

Implemented as in "Matrix Computations", 2nd edition, Golub and Van Loan (1989).
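A usage sketch based on the signature above (calling the function by its qualified name, since it may not be exported):

```julia
using FirstOrderSolvers, LinearAlgebra

# A small symmetric positive definite system (CG assumes A is SPD)
n = 50
M = randn(n, n)
A = M'M + n*I
b = randn(n)

x  = zeros(n)        # warm start; the solution is written back into x
r  = similar(x)      # work buffers, overwritten by the call
p  = similar(x)
Ap = similar(x)

FirstOrderSolvers.conjugategradient!(x, A, b, r, p, Ap; tol=1e-10, max_iters=1000)
norm(A*x - b)        # should be small
```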

FirstOrderSolvers.support_linesearch - Method

Line search types:

Val{:False} : No line search available.

Val{:True} : Line search available, but at the cost of extra evaluations. Should provide a nonexpansive operator y = T(x) as

    T!(y, alg<:FOSAlgorithm, data<:FOSSolverData, x, i, status, longstep)

so the algorithm becomes

    x^(k+1) = (1-αk)*x^k + αk*T(x^k)

Val{:Fast} : Line search available with low extra cost. Should provide two operators y = S1(x), z = S2(y) as

    S1!(y, alg<:FOSAlgorithm, data<:FOSSolverData, x, i, status, longstep)
    S2!(y, alg<:FOSAlgorithm, data<:FOSSolverData, x, i, status, longstep)

so the algorithm becomes

    x^(k+1) = (1-αk)*x^k + αk*S2(S1(x^k))

S1 has to be an affine operator, i.e. S1(x+y) + S1(0) = S1(x) + S1(y). A sketch of why this matters is given below.
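A self-contained sketch (the operators here are illustrative, not package API) of why S1 must be affine for the Val{:Fast} variant: along the candidate line between x^k and S2(S1(x^k)), S1 can be recombined from already-computed values, so only S2 needs to be re-evaluated for each trial step length:

```julia
using LinearAlgebra

# Illustrative operators (not package API): S1 is affine, S2 is nonlinear
S1(x) = [2.0 0.0; 0.0 0.5] * x .+ [1.0, -1.0]
S2(y) = y ./ max(1.0, norm(y))          # projection onto the unit ball

x  = [1.0, 2.0]
x̄  = S2(S1(x))                          # one full evaluation of the step
S1x, S1x̄ = S1(x), S1(x̄)

α = 1.7                                 # a candidate step length
# Along (1-α)*x + α*x̄ the affine S1 combines linearly (coefficients sum to 1),
# so the line search only has to re-apply S2:
lhs = S1((1 - α) .* x .+ α .* x̄)
rhs = (1 - α) .* S1x .+ α .* S1x̄
@assert lhs ≈ rhs
```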