BackwardsLinalg.jl
BackwardsLinalg.copyltu!
BackwardsLinalg.lq_back
BackwardsLinalg.lq_back_fullrank
BackwardsLinalg.qr_back
BackwardsLinalg.qr_back_fullrank
BackwardsLinalg.rsvd
BackwardsLinalg.svd_back
BackwardsLinalg.symeigen_back
BackwardsLinalg.copyltu!
— Method. copyltu!(A::AbstractMatrix) -> AbstractMatrix
Copy the lower triangular part of A to its upper triangular part.
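For illustration, a routine with this behavior might look like the sketch below; the name copyltu_sketch! and the Hermitian treatment of the diagonal are assumptions for this sketch, not necessarily the package's exact implementation.

```julia
using LinearAlgebra

# Sketch (assumed behavior): overwrite the upper triangle of A with the
# conjugate transpose of its lower triangle, dropping the imaginary part
# of the diagonal so the result is Hermitian.
function copyltu_sketch!(A::AbstractMatrix)
    n = LinearAlgebra.checksquare(A)
    for i in 1:n
        A[i, i] = real(A[i, i])
        for j in i+1:n
            A[i, j] = conj(A[j, i])
        end
    end
    return A
end
```

In the real case this is equivalent to tril(A) + tril(A, -1)'.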
BackwardsLinalg.lq_back
— Method. lq_back(A, L, Q, dL, dQ) -> Matrix
Backward rule for LQ decomposition, for an arbitrarily shaped input matrix.
References: Seeger, M., Hetzel, A., Dai, Z., Meissner, E., & Lawrence, N. D. (2018). Auto-Differentiating Linear Algebra; Liao, H.-J., Liu, J.-G., Wang, L., & Xiang, T. Differentiable Programming Tensor Networks.
BackwardsLinalg.qr_back
— Method. qr_back(A, q, r, dq, dr) -> Matrix
Backward rule for QR decomposition, for an arbitrarily shaped input matrix.
References: Seeger, M., Hetzel, A., Dai, Z., Meissner, E., & Lawrence, N. D. (2018). Auto-Differentiating Linear Algebra; Liao, H.-J., Liu, J.-G., Wang, L., & Xiang, T. Differentiable Programming Tensor Networks.
BackwardsLinalg.rsvd
— Method. Randomized SVD.
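A minimal sketch of the randomized-SVD idea (range finding with a Gaussian test matrix, then an SVD of the small projected matrix, in the style of Halko, Martinsson & Tropp); the name rsvd_sketch, the signature, and the oversampling default are illustrative assumptions, not the package's API.

```julia
using LinearAlgebra, Random

# Hedged sketch of a basic randomized SVD; not the package's implementation.
function rsvd_sketch(A, k; oversample = 5, rng = Random.default_rng())
    m, n = size(A)
    Ω = randn(rng, n, k + oversample)   # Gaussian test matrix
    Q = Matrix(qr(A * Ω).Q)             # orthonormal basis for range(A*Ω)
    B = Q' * A                          # small (k + oversample) × n matrix
    F = svd(B)                          # cheap SVD of the small matrix
    return Q * F.U[:, 1:k], F.S[1:k], F.V[:, 1:k]
end
```

For a matrix of exact rank k, the sampled subspace captures the full range almost surely, so the truncated factors reconstruct the input.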
BackwardsLinalg.lq_back_fullrank
— Method. lq_back_fullrank(A, l, q, dl, dq) -> Matrix
Backward rule for LQ decomposition, for a full-rank input matrix.
References: Seeger, M., Hetzel, A., Dai, Z., Meissner, E., & Lawrence, N. D. (2018). Auto-Differentiating Linear Algebra; Liao, H.-J., Liu, J.-G., Wang, L., & Xiang, T. Differentiable Programming Tensor Networks.
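Since A = L*Q implies A' = Q'*L' is a QR factorization of A', the full-rank LQ rule can be obtained from the full-rank QR rule by transposition. A hedged sketch of that reduction for a real, wide, full-rank A with square L (the names here are illustrative, not the package's code):

```julia
using LinearAlgebra

# Copy the lower triangle onto the upper one (real case).
copyltu(M) = tril(M) + tril(M, -1)'

# Hedged sketch: LQ backward via the QR backward rule applied to A' = Q'*L'.
function lq_back_sketch(L, Q, dL, dQ)
    M = L' * dL - dQ * Q'                # QR-rule M for the transposed problem
    dAt = (dQ' + Q' * copyltu(M)) / L    # gradient with respect to A'
    return Matrix(dAt')                  # gradient with respect to A
end
```

A finite-difference check against Julia's own lq factorization is a quick way to validate such a rule.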
BackwardsLinalg.qr_back_fullrank
— Method. qr_back_fullrank(q, r, dq, dr) -> Matrix
Backward rule for QR decomposition, for an input matrix (in the forward pass) with M > N.
References: Seeger, M., Hetzel, A., Dai, Z., Meissner, E., & Lawrence, N. D. (2018). Auto-Differentiating Linear Algebra.
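In the real, full-rank case the rule from these references can be written compactly. The following is a hedged sketch (names are illustrative; copyltu here copies the lower triangle onto the upper one, as in the copyltu! docstring above):

```julia
using LinearAlgebra

# Copy the lower triangle onto the upper one (real case).
copyltu(M) = tril(M) + tril(M, -1)'

# Hedged sketch of the full-rank QR backward rule of Seeger et al. /
# Liao et al.: given thin factors A = Q*R and adjoints dQ, dR, return dA.
function qr_back_fullrank_sketch(Q, R, dQ, dR)
    M = R * dR' - dQ' * Q
    return (dQ + Q * copyltu(M)) / R'   # right division = multiply by inv(R')
end
```

The rule only uses the constraints Q'Q = I and R upper triangular, so it holds for whatever sign convention the forward factorization uses.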
BackwardsLinalg.svd_back
— Method. svd_back(U, S, V, dU, dS, dV)
Adjoint rule for the singular value decomposition (SVD).
References: https://j-towns.github.io/papers/svd-derivative.pdf https://giggleliu.github.io/2019/04/02/einsumbp.html
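A hedged sketch of the real thin-SVD backward rule, following the first reference above; the names and conventions here are illustrative, not the package's exact implementation. A = U * Diagonal(S) * V' with U of size m×k, V of size n×k, k = min(m, n):

```julia
using LinearAlgebra

# Hedged sketch of the real thin-SVD backward rule (after Townsend's note).
function svd_back_sketch(U, S, V, dU, dS, dV)
    k = length(S)
    Σ = Diagonal(S)
    F = [i == j ? 0.0 : 1.0 / (S[j]^2 - S[i]^2) for i in 1:k, j in 1:k]
    UdU = U' * dU
    VdV = V' * dV
    inner = (F .* (UdU - UdU')) * Σ +   # U-gauge term
            Σ * (F .* (VdV - VdV')) +   # V-gauge term
            Diagonal(dS)                # singular-value term
    dA = U * inner * V'
    m, n = size(U, 1), size(V, 1)
    if m > k   # part of dU orthogonal to span(U)
        dA += (I - U * U') * dU * inv(Σ) * V'
    end
    if n > k   # part of dV orthogonal to span(V)
        dA += U * inv(Σ) * dV' * (I - V * V')
    end
    return dA
end
```

Note the rule assumes non-degenerate singular values; degenerate spectra need extra care.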
BackwardsLinalg.symeigen_back
— Method. Backward rule for the symmetric eigendecomposition.
References: Seeger, M., Hetzel, A., Dai, Z., Meissner, E., & Lawrence, N. D. (2018). Auto-Differentiating Linear Algebra.
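The standard backward rule for a symmetric eigendecomposition A = U * Diagonal(E) * U' (real case, distinct eigenvalues) can be sketched as follows; the signature mirrors the other *_back functions here but is an assumption, not the package's actual one:

```julia
using LinearAlgebra

# Hedged sketch of the symmetric-eigendecomposition backward rule:
# E are eigenvalues, U the orthogonal eigenvector matrix, dE/dU adjoints.
function symeigen_back_sketch(E, U, dE, dU)
    n = length(E)
    F = [i == j ? 0.0 : 1.0 / (E[j] - E[i]) for i in 1:n, j in 1:n]
    return U * (Diagonal(dE) + F .* (U' * dU)) * U'
end
```

Because the input is constrained to be symmetric, the returned adjoint pairs with symmetric perturbations: the sensitivity to a symmetric change in entries (i, j) and (j, i) is dA[i, j] + dA[j, i].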