
BackwardsLinalg.jl

copyltu!(A::AbstractMatrix) -> AbstractMatrix

Copy the lower triangular part of A onto its upper triangular part, returning the modified matrix.
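
The docstring leaves the details implicit; below is a minimal sketch of the operation, assuming the convention commonly used in QR/LQ backward rules (keep the diagonal and lower triangle, mirror the conjugated lower part into the upper triangle) and using a hypothetical name so as not to shadow the package's function.

```julia
import LinearAlgebra

# Sketch only: the package's copyltu! may treat the diagonal differently.
function copyltu_sketch!(A::AbstractMatrix)
    n = LinearAlgebra.checksquare(A)     # the operation assumes a square matrix
    for j in 2:n, i in 1:j-1
        A[i, j] = conj(A[j, i])          # mirror the lower triangle (conjugated)
    end
    return A
end
```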

lq_back(A, L, Q, dL, dQ) -> Matrix

Backward rule for the LQ decomposition, for an arbitrarily shaped input matrix.

References: Seeger, M., Hetzel, A., Dai, Z., Meissner, E., & Lawrence, N. D. (2018). Auto-Differentiating Linear Algebra. Differentiable Programming Tensor Networks, Hai-Jun Liao, Jin-Guo Liu, Lei Wang, Tao Xiang
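
A minimal usage sketch, assuming only the positional signature documented above; the square input, the loss f = sum(abs2, L), and the resulting cotangents are chosen purely for illustration, and calls are qualified in case the two packages export clashing names.

```julia
import LinearAlgebra
import BackwardsLinalg

A    = randn(4, 4)
F    = LinearAlgebra.lq(A)
L, Q = Matrix(F.L), Matrix(F.Q)          # A ≈ L * Q

# Cotangents for an illustrative downstream loss f = sum(abs2, L).
dL, dQ = 2 .* L, zero(Q)

dA = BackwardsLinalg.lq_back(A, L, Q, dL, dQ)   # pull the cotangent back to A
```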

qr_back(A, q, r, dq, dr) -> Matrix

Backward rule for the QR decomposition, for an arbitrarily shaped input matrix.

References: Seeger, M., Hetzel, A., Dai, Z., Meissner, E., & Lawrence, N. D. (2018). Auto-Differentiating Linear Algebra. Differentiable Programming Tensor Networks, Hai-Jun Liao, Jin-Guo Liu, Lei Wang, Tao Xiang
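
Analogously for QR, a sketch under the same assumptions (documented positional signature only, square input, illustrative loss f = sum(abs2, r)):

```julia
import LinearAlgebra
import BackwardsLinalg

A    = randn(4, 4)
F    = LinearAlgebra.qr(A)
q, r = Matrix(F.Q), F.R                  # A ≈ q * r

# Cotangents for an illustrative loss f = sum(abs2, r).
dq, dr = zero(q), 2 .* r

dA = BackwardsLinalg.qr_back(A, q, r, dq, dr)   # gradient of f with respect to A
```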

Randomized SVD.
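
The docstring above does not spell the algorithm out; as background, a generic range-finder randomized SVD in the style of Halko, Martinsson & Tropp can be sketched as follows. The function name, rank k, and oversampling parameter are illustrative assumptions, not this package's API.

```julia
import LinearAlgebra

# Generic randomized-SVD sketch: project A onto a random low-dimensional subspace,
# orthonormalize the projection, and take an exact SVD of the small matrix.
function rsvd_sketch(A::AbstractMatrix, k::Integer; oversample::Integer = 10)
    n = size(A, 2)
    l = min(n, k + oversample)
    Ω = randn(eltype(A), n, l)                  # random test matrix
    Q = Matrix(LinearAlgebra.qr(A * Ω).Q)       # orthonormal basis for the range of A*Ω
    F = LinearAlgebra.svd(Q' * A)               # exact SVD of the small l×n matrix
    return (Q * F.U)[:, 1:k], F.S[1:k], F.V[:, 1:k]
end

U, S, V = rsvd_sketch(randn(200, 100), 10)      # rank-10 approximation of a 200×100 matrix
```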

lq_back(A, l, q, dl, dq) -> Matrix

Backward rule for the LQ decomposition, for an arbitrarily shaped input matrix.

References: Seeger, M., Hetzel, A., Dai, Z., Meissner, E., & Lawrence, N. D. (2018). Auto-Differentiating Linear Algebra. Differentiable Programming Tensor Networks, Hai-Jun Liao, Jin-Guo Liu, Lei Wang, Tao Xiang
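
Backward rules like this are conveniently validated with a finite-difference check against a gauge-invariant scalar loss; a sketch, again assuming only the documented signature and using f(A) = sum(abs2, l):

```julia
import LinearAlgebra
import BackwardsLinalg

# Loss f(A) = sum(abs2, l) and its gradient obtained through lq_back.
function loss_and_grad(A)
    F    = LinearAlgebra.lq(A)
    l, q = Matrix(F.L), Matrix(F.Q)
    return sum(abs2, l), BackwardsLinalg.lq_back(A, l, q, 2 .* l, zero(q))
end

A      = randn(3, 3)
f0, dA = loss_and_grad(A)

# Perturb one entry and compare against the corresponding gradient entry.
ϵ = 1e-6
Δ = zeros(3, 3); Δ[1, 2] = ϵ
fd = (first(loss_and_grad(A + Δ)) - f0) / ϵ
@show fd dA[1,2]                         # should agree to a few digits
```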

qr_back_fullrank(q, r, dq, dr) -> Matrix

Backward rule for the QR decomposition, for an input matrix (in the forward pass) with M > N.

References: Seeger, M., Hetzel, A., Dai, Z., Meissner, E., & Lawrence, N. D. (2018). Auto-Differentiating Linear Algebra.
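
Note that the A argument is dropped here; a usage sketch for the tall case, under the same illustrative-loss assumptions as above:

```julia
import LinearAlgebra
import BackwardsLinalg

A    = randn(6, 3)                       # M > N
F    = LinearAlgebra.qr(A)
q, r = Matrix(F.Q), F.R                  # thin factors: q is 6×3, r is 3×3

dq, dr = zero(q), 2 .* r                 # cotangents of an illustrative loss f = sum(abs2, r)

dA = BackwardsLinalg.qr_back_fullrank(q, r, dq, dr)
```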

svd_back(U, S, V, dU, dS, dV)

Adjoint (backward) rule for the singular value decomposition (SVD).

References: https://j-towns.github.io/papers/svd-derivative.pdf; https://giggleliu.github.io/2019/04/02/einsumbp.html; Seeger, M., Hetzel, A., Dai, Z., Meissner, E., & Lawrence, N. D. (2018). Auto-Differentiating Linear Algebra.
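
A usage sketch, assuming only the documented positional arguments; the loss f = sum(S) is chosen because its gradient has the well-known closed form U * V', which makes the result easy to check.

```julia
import LinearAlgebra
import BackwardsLinalg

A = randn(5, 3)
F = LinearAlgebra.svd(A)
U, S, V = F.U, F.S, F.V                  # A ≈ U * Diagonal(S) * V'

# Cotangents for the illustrative loss f = sum(S).
dU, dS, dV = zero(U), ones(length(S)), zero(V)

dA = BackwardsLinalg.svd_back(U, S, V, dU, dS, dV)
@show dA ≈ U * V'                        # expected closed-form gradient of sum(S)
```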