Talk: Dynamical Systems On Graphs
Nithin Govindarajan
December 4, 2024
Overview
The problem: what are the low-rank properties of inverses of sparse matrices?
Motivation: shortcomings of existing rank-structured representations in applications
A potential framework: GIRS matrices and their representations
GIRS representations on acyclic graphs: tree quasi-separable matrices
Conclusions & future work
Question: what is the algebraic structure of the inverse of a tridiagonal matrix?
$$A = \begin{bmatrix} a_1 & b_1 & & & \\ c_1 & a_2 & b_2 & & \\ & c_2 & a_3 & b_3 & \\ & & c_3 & a_4 & b_4 \\ & & & c_4 & a_5 \end{bmatrix}$$
Answer: quasi-separable structure

$$A^{-1} = \begin{bmatrix}
d_1 & p_1 r_1 q_2 & p_1 r_1 r_2 q_3 & p_1 r_1 r_2 r_3 q_4 & p_1 r_1 r_2 r_3 r_4 q_5 \\
u_2 t_1 v_1 & d_2 & p_2 r_2 q_3 & p_2 r_2 r_3 q_4 & p_2 r_2 r_3 r_4 q_5 \\
u_3 t_2 t_1 v_1 & u_3 t_2 v_2 & d_3 & p_3 r_3 q_4 & p_3 r_3 r_4 q_5 \\
u_4 t_3 t_2 t_1 v_1 & u_4 t_3 t_2 v_2 & u_4 t_3 v_3 & d_4 & p_4 r_4 q_5 \\
u_5 t_4 t_3 t_2 t_1 v_1 & u_5 t_4 t_3 t_2 v_2 & u_5 t_4 t_3 v_3 & u_5 t_4 v_4 & d_5
\end{bmatrix}$$
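This structure is easy to observe numerically. A minimal sketch (NumPy, illustrative sizes and a diagonal shift of my choosing to guarantee invertibility): invert a random tridiagonal matrix and check that every off-diagonal block of the inverse has rank at most one.

```python
import numpy as np

# Sketch: invert a random (diagonally shifted, hence safely invertible)
# tridiagonal matrix and check the quasi-separable structure of the
# inverse: every off-diagonal block has rank at most 1.
rng = np.random.default_rng(0)
n = 5
A = (np.diag(rng.standard_normal(n))
     + np.diag(rng.standard_normal(n - 1), 1)
     + np.diag(rng.standard_normal(n - 1), -1)
     + 3 * n * np.eye(n))
B = np.linalg.inv(A)

for k in range(1, n):
    assert np.linalg.matrix_rank(B[k:, :k]) <= 1  # lower Hankel block
    assert np.linalg.matrix_rank(B[:k, k:]) <= 1  # upper Hankel block
```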
Continuous case: the tridiagonal matrix $A$ is a discretization of the operator $\mathcal{A} := w(x)\,\frac{d^2}{dx^2}$

$$w(x)\,\frac{d^2 q(x)}{dx^2} - \lambda q(x) = 1, \qquad q(0) = 0, \quad q(1) = 0, \quad w(x) = 1$$

⇔

Integral formulation:
$$q(x) - \lambda \int_0^1 K(x, y)\, q(y)\, dy = f(x)$$

with semi-separable kernel
$$K(x, y) = \begin{cases} x(y - 1), & 0 \le x \le y \\ y(x - 1), & y \le x \le 1 \end{cases}, \qquad f(x) = \tfrac{1}{2}\, x(x - 1)$$
Quasi-separable matrices are closed under inversion!
(Figure: the tridiagonal matrices form a subset of the quasi-separable matrices.)
Focus of this talk: what can we say for more general sparse matrices?
Rank-structured matrices in practice: boundary element method for BVPs
Rank-structured matrices in practice: Schur complements in PDE discretizations
Low off-diagonal rank structure in the Schur complements
$$S_k = A_k - C_k S_{k-1}^{-1} B_k, \qquad S_0 = A_0$$
of the block-tridiagonal matrix
$$\begin{bmatrix} A_0 & B_1 & & \\ C_1 & A_1 & \ddots & \\ & \ddots & \ddots & B_{n-1} \\ & & C_{n-1} & A_{n-1} \end{bmatrix}.$$
Chandrasekaran, Shiv, Patrick Dewilde, Ming Gu, and Naveen Somasunderam. "On the numerical rank of the off-diagonal blocks of Schur complements of discretized elliptic PDEs." SIAM Journal on Matrix Analysis and Applications 31, no. 5 (2010): 2261-2290.
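A quick numerical illustration of this observation (a sketch of my own, not the paper's experiment): run the Schur recursion for the 5-point 2D Laplacian, where each $A_k = \mathrm{tridiag}(-1, 4, -1)$ and $B_k = C_k = -I$, and look at the numerical rank of an off-diagonal block of the final Schur complement.

```python
import numpy as np

# Sketch: Schur complement recursion S_k = A_k - C_k S_{k-1}^{-1} B_k
# for the block-tridiagonal 2D Laplacian, with A_k = tridiag(-1, 4, -1)
# and B_k = C_k = -I, so that C_k S^{-1} B_k = S^{-1}.
m = 20
Ak = 4 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)
S = Ak.copy()                       # S_0 = A_0
for _ in range(m - 1):
    S = Ak - np.linalg.inv(S)       # S_k = A_k - S_{k-1}^{-1}

# numerical rank of an off-diagonal block of the final Schur complement
sv = np.linalg.svd(S[:m // 2, m // 2:], compute_uv=False)
num_rank = int(np.sum(sv > 1e-6 * sv[0]))
print("numerical rank:", num_rank, "out of block size", m // 2)
```

The singular values of the off-diagonal block decay rapidly, so the numerical rank stays far below the block size.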
Rank-structured matrices in practice: optimal control of spatially distributed systems
Vibrating string:
$$\frac{\partial^2 v(x,t)}{\partial t^2} = k(x)\,\frac{\partial^2 v(x,t)}{\partial x^2} + b(x)\,u(x,t)$$

$$\dot{x} = Ax + Bu, \qquad y = Cx + Du$$
Rice, Justin K., and Michel Verhaegen. "Distributed control: A sequentially semi-separable approach for spatially heterogeneous linear systems." IEEE Transactions on Automatic Control 54, no. 6 (2009): 1270-1283.

Bamieh, Bassam, Fernando Paganini, and Munther A. Dahleh. "Distributed control of spatially invariant systems." IEEE Transactions on Automatic Control 47, no. 7 (2002): 1091-1107.
Many frameworks for efficient linear algebra with rank-structured matrices
All have their benefits and special use cases:
FMM matrices (Rokhlin & Greengard)
Hierarchically semi-separable (HSS) matrices (Chandrasekaran & Gu)
Sequentially Semi-Separable (SSS) matrices (Chandrasekaran, Dewilde, van der Veen)
HODLR matrices
H-matrices and H²-matrices (Hackbusch)
Quasi-separable matrices (Eidelman, Gohberg)
Semi-separable matrices (Van Barel, Vandebril, Mastronardi)
Our interest:
Rank-structured matrices with closure property → direct solvers & preconditioners
SSS matrices: input-output map of mixed linear time-variant (LTV) system
(Figure: a chain of inputs x_1, …, x_5 and outputs y_1, …, y_5 coupled through forward and backward state-space dynamics.)
The ranks of the so-called Hankel blocks dictate the state dimension sizes
It is quite easy to write down the SSS representation for the tridiagonal matrix!
$$\begin{bmatrix} a_1 & b_1 & 0 & 0 & 0 \\ c_1 & a_2 & b_2 & 0 & 0 \\ 0 & c_2 & a_3 & b_3 & 0 \\ 0 & 0 & c_3 & a_4 & b_4 \\ 0 & 0 & 0 & c_4 & a_5 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \\ y_3 \\ y_4 \\ y_5 \end{bmatrix}$$

$$g_k = 0 \cdot g_{k+1} + 1 \cdot x_k, \qquad g_n = 1 \cdot x_n$$
$$h_k = 0 \cdot h_{k-1} + 1 \cdot x_k, \qquad h_1 = 1 \cdot x_1$$
$$y_k = b_k \cdot g_{k+1} + c_{k-1} \cdot h_{k-1} + a_k x_k.$$
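The recursions above can be tested directly. A minimal sketch (NumPy, 0-based indexing, so the sub-diagonal entry appearing in row k is c_{k-1}): run the upward and downward passes and compare against a dense mat-vec.

```python
import numpy as np

# Sketch of the SSS mat-vec for a tridiagonal matrix (0-based indexing).
# Since the transition coefficients are 0 and the input maps are 1, the
# states just carry neighbouring entries of x: g_{k+1} = x_{k+1} from the
# upward pass, h_{k-1} = x_{k-1} from the downward pass.
rng = np.random.default_rng(1)
n = 5
a = rng.standard_normal(n)       # diagonal
b = rng.standard_normal(n - 1)   # super-diagonal
c = rng.standard_normal(n - 1)   # sub-diagonal

x = rng.standard_normal(n)
y = np.empty(n)
for k in range(n):
    g = x[k + 1] if k + 1 < n else 0.0   # g_{k+1}
    h = x[k - 1] if k - 1 >= 0 else 0.0  # h_{k-1}
    y[k] = ((b[k] if k < n - 1 else 0.0) * g
            + (c[k - 1] if k > 0 else 0.0) * h
            + a[k] * x[k])

A = np.diag(a) + np.diag(b, 1) + np.diag(c, -1)
assert np.allclose(y, A @ x)   # matches the dense mat-vec
```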
Square partitions: Hankel block ranks are preserved during inversion
Lemma
Let
$$\begin{bmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{bmatrix} = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}^{-1} \in \mathbb{F}^{(n_1+n_2)\times(n_1+n_2)}$$
with square $A_{11} \in \mathbb{F}^{n_1 \times n_1}$. Then
$$\operatorname{rank} B_{21} = \operatorname{rank} A_{21}.$$

In particular, for SSS matrices with the recursions
$$g_k = w_k \cdot g_{k+1} + v_k \cdot x_k, \qquad h_k = r_k \cdot h_{k-1} + q_k \cdot x_k, \qquad y_k = b_k \cdot g_{k+1} + c_k \cdot h_{k-1} + a_k x_k,$$
the Hankel block ranks are preserved under inversion.
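A numerical sanity check of this rank-preservation statement (a sketch with arbitrary sizes n1 = 4, n2 = 3 and a rank-1 (2,1) block; the diagonal shifts are my own choice to keep everything invertible):

```python
import numpy as np

# Sketch: for an invertible 2x2-block matrix with square (1,1) block,
# the (2,1) block of the inverse has the same rank as the (2,1) block.
rng = np.random.default_rng(2)
n1, n2, r = 4, 3, 1
A11 = rng.standard_normal((n1, n1)) + 3 * n1 * np.eye(n1)
A22 = rng.standard_normal((n2, n2)) + 3 * n2 * np.eye(n2)
A12 = rng.standard_normal((n1, n2))
A21 = rng.standard_normal((n2, r)) @ rng.standard_normal((r, n1))  # rank r
A = np.block([[A11, A12], [A21, A22]])
B = np.linalg.inv(A)
assert np.linalg.matrix_rank(B[n1:, :n1]) == np.linalg.matrix_rank(A21) == r
```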
Algebraic properties of SSS: closure under sums, products, and inverses!
Inverse of an SSS matrix (with square partitions) is again an SSS matrix of the
same state dimensions.
Sums of SSS matrices are SSS, but with a doubling of the state dimensions.
Products of SSS matrices are also SSS with a doubling of the state dimensions.
From mat-vec to solving Ax = b: matrix representation of state-space equations
Stacking the state recursions $g_k = W_k g_{k+1} + V_k^T x_k$ and $h_k = R_k h_{k-1} + Q_k^T x_k$ together with the output equations $b_k = U_k g_{k+1} + P_k h_{k-1} + D_k x_k$ (here for $n = 5$) gives one sparse block system in the states $g$, $h$ and the unknown $x$:
$$\begin{bmatrix} I - \mathcal{W} & 0 & -\mathcal{V}^T \\ 0 & I - \mathcal{R} & -\mathcal{Q}^T \\ \mathcal{U} & \mathcal{P} & \mathcal{D} \end{bmatrix} \begin{bmatrix} g \\ h \\ x \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ b \end{bmatrix}$$
From mat-vec to solving Ax = b: re-ordering yields a fast solver
SSS matrices not suitable for 2D Laplacians: Hankel ranks grow with O(√n)

(Sparsity pattern of the 2D Laplacian: block tridiagonal, with tridiagonal diagonal blocks and identity off-diagonal blocks.)

With √n-by-√n block partitioning → approx. $n^{1.5}$ parameters.
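The √n growth can be seen directly. A sketch (NumPy, 5-point stencil, natural row-by-row ordering — all illustrative choices): the Hankel block of the Laplacian at the middle split picks up exactly the m = √n grid edges crossing the cut, so its rank is √n.

```python
import numpy as np

# Sketch: 2D Laplacian on an m-by-m grid, natural (row-major) ordering.
# The Hankel block L[k:, :k] at the middle split k = n/2 contains exactly
# the m grid edges crossing the cut, so its rank is m = sqrt(n).
def laplacian_2d(m):
    T = 4 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)
    return (np.kron(np.eye(m), T)
            - np.kron(np.eye(m, k=1) + np.eye(m, k=-1), np.eye(m)))

for m in (4, 6, 8):
    L = laplacian_2d(m)
    k = (m * m) // 2
    assert np.linalg.matrix_rank(L[k:, :k]) == m  # rank grows like sqrt(n)
```

By the inversion lemma from earlier, the Hankel blocks of L⁻¹ have the same ranks, so an SSS representation would need state dimensions of order √n.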
Graph-partitioned matrices: associate a directed graph with a block-partitioned matrix
(Figure: a 4-by-4 grid graph with input $x_{(i,j)}$ and output $y_{(i,j)}$ at each node.)

Associate $G = (V, E)$ with a block-partitioned matrix:
$$y_i = \sum_{j \in V} T\{i, j\}\, x_j, \qquad i \in V.$$
Hankel blocks induced by graph cuts
(Figure: the same grid graph, with a vertex subset $A$ separated from its complement.)

Let $A \subset V$ and $\bar{A} = V \setminus A$ so that
$$\Pi_1 T \Pi_2 = \begin{bmatrix} T\{A, A\} & T\{A, \bar{A}\} \\ T\{\bar{A}, A\} & T\{\bar{A}, \bar{A}\} \end{bmatrix}.$$
Call $T\{\bar{A}, A\}$ the Hankel block induced by $A$.
GIRS: a full characterization of all low-rank structures in (T, G)
(Figure: a cut of the grid graph into $A$ and $\bar{A}$.)

Definition (GIRS property)
$(T, G)$ satisfies the graph-induced rank structure for a constant $c \ge 0$ if, for all $A \subset V$,
$$\operatorname{rank} T\{\bar{A}, A\} \le c \cdot E(A),$$
where $E(A)$ is the number of border edges of $A$.
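A numerical sketch of the definition (and, anticipating the invariance result on the next slide, of the same bound for the inverse): for the 2D Laplacian on a small grid and random vertex subsets A, compare the rank of the Hankel block against the border-edge count.

```python
import numpy as np

# Sketch: check rank T{Abar, A} <= 1 * E(A) for the 2D Laplacian on a
# 5x5 grid, and for its inverse, over random vertex subsets A.
def laplacian_2d(m):
    T = 4 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)
    return (np.kron(np.eye(m), T)
            - np.kron(np.eye(m, k=1) + np.eye(m, k=-1), np.eye(m)))

m = 5
L = laplacian_2d(m)
adj = (L != 0) & ~np.eye(m * m, dtype=bool)   # adjacency of the grid graph
rng = np.random.default_rng(3)
for _ in range(20):
    A = rng.random(m * m) < 0.5               # random vertex subset A
    border = int(adj[np.ix_(~A, A)].sum())    # border edges E(A)
    for T_ in (L, np.linalg.inv(L)):
        assert np.linalg.matrix_rank(T_[np.ix_(~A, A)]) <= border
```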
The GIRS property is invariant under inversion

(Figure: the same cut of the grid graph into $A$ and $\bar{A}$.)

Theorem (GIRS property)
If $(T, G)$ satisfies the graph-induced rank structure for a constant $c \ge 0$, then so does $(T^{-1}, G)$.

Proof.
Recall the lemma from earlier...
The 2D-Laplacian satisfies the GIRS property for c = 1 if G is the adjacency graph
(Figure: the 2D Laplacian's sparsity pattern alongside its adjacency graph; every nonzero off-diagonal entry corresponds to an edge of the grid.)
In fact, all sparse matrices are GIRS with c = 1 w.r.t. their adjacency graph...
GIRS representations: run “LTV systems” on arbitrary graphs
Associate with every edge $(i, j) \in E$ the state vector $h_{(i,j)} \in \mathbb{F}^{\rho_{(i,j)}}$.

"State-space" dynamics at a node $i$ with neighbors $j_1, \dots, j_p$:
$$\begin{bmatrix} h_{(i,j_1)} \\ \vdots \\ h_{(i,j_p)} \\ y_i \end{bmatrix} = \begin{bmatrix} A^i_{j_1,j_1} & \cdots & A^i_{j_1,j_p} & B^i_{j_1} \\ \vdots & \ddots & \vdots & \vdots \\ A^i_{j_p,j_1} & \cdots & A^i_{j_p,j_p} & B^i_{j_p} \\ C^i_{j_1} & \cdots & C^i_{j_p} & D_i \end{bmatrix} \begin{bmatrix} h_{(j_1,i)} \\ \vdots \\ h_{(j_p,i)} \\ x_i \end{bmatrix}$$
GIRS representations generalize SSS matrices
On the line graph $1 - 2 - \cdots - n$, the node operators reduce to
$$S_1 = \begin{bmatrix} 0 & B^i_{i+1} \\ C^i_{i+1} & D_i \end{bmatrix}, \qquad S_n = \begin{bmatrix} 0 & B^i_{i-1} \\ C^i_{i-1} & D_i \end{bmatrix}, \qquad S_i = \begin{bmatrix} 0 & A^i_{i+1,i-1} & B^i_{i+1} \\ A^i_{i-1,i+1} & 0 & B^i_{i-1} \\ C^i_{i+1} & C^i_{i-1} & D_i \end{bmatrix}.$$
For general GIRS representation, Gauss elimination is needed for mat-vec operation!
$$h_{(i,j)} - \sum_{w \in N(i)} A^i_{j,w}\, h_{(w,i)} - B^i_j\, x_i = 0, \qquad (i,j) \in E$$
$$\sum_{i \in N(j)} C^j_i\, h_{(i,j)} + D_j\, x_j = y_j, \qquad j \in V.$$
↓
$$\begin{bmatrix} I - \mathcal{A} & -\mathcal{B} \\ \mathcal{C} & \mathcal{D} \end{bmatrix} \begin{bmatrix} h \\ x \end{bmatrix} = \begin{bmatrix} 0 \\ y \end{bmatrix}$$
↓
Group adjoining variables: $\theta_j = (h_{(i_1,j)}, \dots, h_{(i_p,j)}, x_j)$ and $\gamma_j = (0, \dots, 0, y_j)$.
↓
The block-sparsity pattern of $\Xi \theta = \gamma$ satisfies $\Xi_{ij} = 0$ if $(i, j) \notin E$.

Conditions for a fast solver:
1. the state dimensions $\rho_{(i,j)}$ are small,
2. the degrees of the nodes are small,
3. $G$ admits a good elimination ordering.
Example: 2-by-3 mesh
(The assembled block system $\Xi\theta = \gamma$ for the 2-by-3 mesh: each node $k$ contributes the blocks of $S_k$ on the diagonal, with $-I$ blocks coupling the paired edge states $h_{(i,j)}$ and $h_{(j,i)}$ of neighboring nodes; the right-hand side collects a $y_k$ in the output row of each node and zeros elsewhere.)
GIRS representations (generically) satisfy the closure property
$$\begin{bmatrix} I - \mathcal{A} & -\mathcal{B} \\ \mathcal{C} & \mathcal{D} \end{bmatrix} \begin{bmatrix} h \\ x \end{bmatrix} = \begin{bmatrix} 0 \\ y \end{bmatrix}$$
↓
$$\begin{bmatrix} I - (\mathcal{A} - \mathcal{B}\mathcal{D}^{-1}\mathcal{C}) & -\mathcal{B}\mathcal{D}^{-1} \\ -\mathcal{D}^{-1}\mathcal{C} & \mathcal{D}^{-1} \end{bmatrix} \begin{bmatrix} h \\ y \end{bmatrix} = \begin{bmatrix} 0 \\ x \end{bmatrix}$$

so that the node operators of the inverse are
$$S^i = \begin{bmatrix} A^i_{j_1,j_1} - B^i_{j_1} D_i^{-1} C^i_{j_1} & \cdots & A^i_{j_1,j_p} - B^i_{j_1} D_i^{-1} C^i_{j_p} & B^i_{j_1} D_i^{-1} \\ \vdots & \ddots & \vdots & \vdots \\ A^i_{j_p,j_1} - B^i_{j_p} D_i^{-1} C^i_{j_1} & \cdots & A^i_{j_p,j_p} - B^i_{j_p} D_i^{-1} C^i_{j_p} & B^i_{j_p} D_i^{-1} \\ D_i^{-1} C^i_{j_1} & \cdots & D_i^{-1} C^i_{j_p} & D_i^{-1} \end{bmatrix}$$

Theorem
A GIRS representation with rank-profile $\{\rho_e\}_{e \in E}$ of a graph-partitioned matrix $(T, G)$ satisfies the GIRS property for
$$c = \max_{e \in E} \rho_e.$$

Proof.
A proof of this theorem was given by Shiv Chandrasekaran in a talk at CAM23 in Selva di Fasano.
SSS matrices: the result can be extended in the other direction as well.
Can the implication go in both directions in general?

Conjecture
A graph-partitioned matrix $(T, G)$ is GIRS-$c$ if, and only if, there exists a GIRS representation for $(T, G)$ with $\rho_e < c$ for all $e \in E$.
A partial verification of the GIRS conjecture
Theorem
The GIRS conjecture holds for acyclic graphs, i.e., graphs with no cycles.
Proving GIRS conjecture: a special tree quasi-separable (TQS) realization always exists
(Figure: node $k$ with parent edge $(k, j)$ and child edges $(k, i_1), \dots, (k, i_p)$.)

$$S_k = \begin{bmatrix} 0 & \cdots & A^k_{i_1,i_p} & A^k_{i_1,j} & B^k_{i_1} \\ \vdots & \ddots & \vdots & \vdots & \vdots \\ A^k_{i_p,i_1} & \cdots & 0 & A^k_{i_p,j} & B^k_{i_p} \\ A^k_{j,i_1} & \cdots & A^k_{j,i_p} & 0 & B^k_j \\ C^k_{i_1} & \cdots & C^k_{i_p} & C^k_j & D_k \end{bmatrix}$$
Proving GIRS conjecture: a special tree quasi-separable (TQS) realization always exists
(Figure: a tree on nodes $1, \dots, 7$; the path from node 2 to node 3 runs $2 \to 5 \to 7 \to 6 \to 3$.)

$$T\{3, 2\} = C^3_6\, A^6_{3,7}\, A^7_{6,5}\, A^5_{7,2}\, B^2_5$$
SSS generalization: Hankel block ranks specify dimensions of minimal TQS representation
TQS is a strict generalization of SSS and HSS
The state of affairs: acyclic graph-partitioned matrices
The state of affairs: general graph-partitioned matrices
Future work
Special thanks to collaborators
Shivkumar Chandrasekaran
Patrick Dewilde
Ethan Epperly
Vamshi C. Madala
Lieven De Lathauwer