Linear Algebra - Derivative (Or Differential) of Symmetric Square Root of A Matrix - Mathematics Stack Exchange
I know the expression for the differential of a matrix inverse, which is sort of like a matrix version of the power rule. Would this approach work for a symmetric square root as well (i.e., something like $\tfrac{1}{2}S^{-1/2}$)?
edited Nov 19, 2013 at 2:21 · asked Oct 26, 2013 at 14:25 by Scott
You can easily get an expression for 𝑑𝑆 in terms of 𝐴 and 𝑑𝐴 using the product rule applied to 𝑑(𝑆 2) .
This will involve some matrix multiplication; I'm not sure what the final expression in terms of "half-
vectorization" would look like. – Anthony Carapetis Oct 26, 2013 at 14:40
Ah, thanks. Yes, it amounts to finding d(A), then isolating the d(S) terms. – Scott Oct 26, 2013 at
16:06
$$\left(\mathrm{d}\sqrt{A}\right)\sqrt{A} + \sqrt{A}\left(\mathrm{d}\sqrt{A}\right) = \mathrm{d}A,$$

which, upon vectorization, gives

$$\operatorname{vec}\left(\mathrm{d}\sqrt{A}\right) = \left(\sqrt{A}^{\top} \oplus \sqrt{A}\right)^{-1} \operatorname{vec}\left(\mathrm{d}A\right).$$
Since $A$ is positive definite, $\sqrt{A}$ is unique and positive definite, and hence the Kronecker sum $\sqrt{A}^{\top} \oplus \sqrt{A}$ is positive definite (thus non-singular). Further, since the differential and vec operators can be interchanged on the left-hand side of the equation above, the Jacobian identification rule (Magnus and Neudecker, *Matrix Differential Calculus with Applications in Statistics and Econometrics*, 3rd ed., chapter 9, section 5, p. 198) yields

$$\mathrm{D}\sqrt{A} = \left(\sqrt{A}^{\top} \oplus \sqrt{A}\right)^{-1}.$$
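As a numerical sanity check (a sketch using NumPy/SciPy; the test matrices are arbitrary), the Kronecker-sum formula above can be compared against a finite-difference approximation of $\mathrm{d}\sqrt{A}$:

```python
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(0)
n = 4
# Arbitrary SPD matrix A and symmetric perturbation H
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)
H = rng.standard_normal((n, n))
H = (H + H.T) / 2

S = sqrtm(A).real
# Kronecker sum S^T ⊕ S = I⊗S + S^T⊗I (column-stacking vec convention)
M = np.kron(np.eye(n), S) + np.kron(S.T, np.eye(n))
dS = np.linalg.solve(M, H.reshape(-1, order="F")).reshape(n, n, order="F")

# Finite-difference check: sqrtm(A + eps*H) ≈ sqrtm(A) + eps*dS
eps = 1e-6
fd = (sqrtm(A + eps * H).real - S) / eps
print(np.max(np.abs(fd - dS)))  # should be small
```

Note the `order="F"` reshapes: the identification $\operatorname{vec}(SX + XS) = (S^{\top} \oplus S)\operatorname{vec}(X)$ assumes column-stacking vec.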
edited Jun 10, 2015 at 22:37 · answered Jun 10, 2015 at 21:36 by Abhishek Halder
The function $A \to S = \sqrt{A}$ is defined and differentiable on the set of SPD matrices. Let $K = DS_A(H)$ be the derivative of $S$ at $A$, where $H$ is a variable SYMMETRIC matrix. Here $SS = A$ implies $KS + SK = H$, a Sylvester equation in the unknown $K$. Up to an orthogonal change of basis, we may assume $A = \mathrm{diag}((\lambda_i)_i)$ where $\lambda_i > 0$; then $S = \mathrm{diag}((\sqrt{\lambda_i})_i)$. If $H = [h_{i,j}]$, $K = [k_{i,j}]$, then, by an easy identification, we obtain
$$k_{i,j} = \frac{h_{i,j}}{\sqrt{\lambda_i} + \sqrt{\lambda_j}}.$$
Of course, if $n = 1$, we recover the usual derivative $h_{1,1} \to \dfrac{h_{1,1}}{2\sqrt{\lambda_1}}$.
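In code, this eigenbasis recipe is direct (a sketch; `sqrtm_frechet` is a hypothetical helper name):

```python
import numpy as np

def sqrtm_frechet(A, H):
    """Derivative of the SPD square root at A in direction H,
    via the eigenbasis formula k_ij = h_ij / (sqrt(l_i) + sqrt(l_j))."""
    lam, Q = np.linalg.eigh(A)          # A = Q diag(lam) Q^T
    Ht = Q.T @ H @ Q                    # H in the eigenbasis of A
    s = np.sqrt(lam)
    K = Ht / (s[:, None] + s[None, :])  # elementwise Sylvester solve
    return Q @ K @ Q.T                  # back to the original basis
```

Diagonalizing once and dividing elementwise avoids forming the $n^2 \times n^2$ Kronecker sum explicitly.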
EDIT 1. About the half-vectorization operator: we can store only half of the matrix $K$ because it is symmetric, like the matrix $H$.
EDIT 2. Another form of $K$ is $\int_0^{\infty} e^{-tS} H e^{-tS}\,dt$. That implies that if $H$ is a small symmetric matrix, then
$$\sqrt{A+H} \approx \sqrt{A} + \int_0^{\infty} e^{-t\sqrt{A}} H e^{-t\sqrt{A}}\,dt.$$
EDIT 3. Proof of the above result. The integral converges (easy) and it suffices to prove that $KS + SK = H$ (the solution in $K$ of this equation is unique). One has
$$KS + SK = \int_0^{+\infty} \left(e^{-tS} H e^{-tS} S + S e^{-tS} H e^{-tS}\right) dt = -\int_0^{+\infty} \left(e^{-tS} H e^{-tS}\right)' dt = H.$$
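The integral form and the Sylvester characterization can be checked against each other numerically (a sketch; truncating the integral at $t = 50$ is an assumption justified by the exponential decay of the integrand):

```python
import numpy as np
from scipy.linalg import expm, solve_sylvester, sqrtm
from scipy.integrate import quad_vec

rng = np.random.default_rng(1)
n = 3
# Arbitrary SPD matrix A and symmetric perturbation H
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)
H = rng.standard_normal((n, n))
H = (H + H.T) / 2
S = sqrtm(A).real

# K as the integral of Edit 2 (upper limit truncated at t = 50)
K_int, _ = quad_vec(lambda t: expm(-t * S) @ H @ expm(-t * S), 0, 50)

# K as the solution of the Sylvester equation S K + K S = H
K_syl = solve_sylvester(S, S, H)
print(np.max(np.abs(K_int - K_syl)))  # should be small
```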
edited Mar 9, 2019 at 18:55 by Pink Panther · answered Oct 27, 2013 at 3:42 by user91684
Given $A = S^2$, the other answers derive
$$ds = M^{-1}\,da, \qquad a = \operatorname{vec}(A),\quad s = \operatorname{vec}(S),\quad M = S^{\top} \oplus S.$$
However, to answer the original question, one must introduce Duplication and Elimination matrices.
$$\alpha = \operatorname{vech}(A),\quad \alpha = L_n a,\quad a = D_n \alpha$$
$$\sigma = \operatorname{vech}(S),\quad \sigma = L_n s,\quad s = D_n \sigma$$
$$L_n\,ds = L_n M^{-1}\left(D_n L_n\,da\right)$$
$$d\sigma = L_n M^{-1} D_n\,d\alpha$$
$$\frac{\partial \sigma}{\partial \alpha} = L_n M^{-1} D_n$$
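The elimination and duplication matrices are easy to build explicitly (a sketch; `elimination_duplication` is a hypothetical helper, using the column-stacking vec convention and lower-triangular vech ordering):

```python
import numpy as np

def elimination_duplication(n):
    """Build L_n (elimination) and D_n (duplication) such that, for
    symmetric A: vech(A) = L_n vec(A) and vec(A) = D_n vech(A)."""
    m = n * (n + 1) // 2
    L = np.zeros((m, n * n))
    D = np.zeros((n * n, m))
    k = 0
    for j in range(n):
        for i in range(j, n):        # lower-triangular (i >= j) order
            L[k, j * n + i] = 1      # pick entry (i, j) out of vec(A)
            D[j * n + i, k] = 1      # place it back at (i, j) ...
            D[i * n + j, k] = 1      # ... and at (j, i)
            k += 1
    return L, D
```

With these, the half-vectorized Jacobian above is simply `L @ np.linalg.inv(M) @ D`.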
edited May 6, 2019 at 12:42 · answered May 6, 2019 at 12:34 by greg
https://math.stackexchange.com/questions/540361/derivative-or-differential-of-symmetric-square-root-of-a-matrix