Talk: Dynamical Systems On Graphs

The document discusses the low-rank properties of inverses of sparse matrices, particularly focusing on GIRS matrices and their representations in the context of dynamical systems on graphs. It highlights the quasi-separable structure of tridiagonal matrices and explores the algebraic structures preserved by the inverses of these matrices. The document also addresses practical applications of rank-structured matrices in boundary element methods and optimal control of spatially distributed systems.

Rank-structured matrices induced by dynamical systems

on graphs

Nithin Govindarajan

December 4, 2024
Overview

The problem: what are the low-rank properties of inverses of sparse matrices?
Motivation: shortcomings of existing rank-structured representations in applications
A potential framework: GIRS matrices and their representations
GIRS representations on acyclic graphs: tree quasi-separable matrices
Conclusions & future work

2
Question: what is the algebraic structure of the inverse of a tridiagonal matrix?

 
$$A = \begin{pmatrix}
a_1 & b_1 & & & \\
c_1 & a_2 & b_2 & & \\
 & c_2 & a_3 & b_3 & \\
 & & c_3 & a_4 & b_4 \\
 & & & c_4 & a_5
\end{pmatrix}$$

Adjacency graph G(A):

4
Answer: quasi-separable structure

 
$$A^{-1} = \begin{pmatrix}
d_1 & p_1 r_1 q_2 & p_1 r_1 r_2 q_3 & p_1 r_1 r_2 r_3 q_4 & p_1 r_1 r_2 r_3 r_4 q_5 \\
u_2 t_1 v_1 & d_2 & p_2 r_2 q_3 & p_2 r_2 r_3 q_4 & p_2 r_2 r_3 r_4 q_5 \\
u_3 t_2 t_1 v_1 & u_3 t_2 v_2 & d_3 & p_3 r_3 q_4 & p_3 r_3 r_4 q_5 \\
u_4 t_3 t_2 t_1 v_1 & u_4 t_3 t_2 v_2 & u_4 t_3 v_3 & d_4 & p_4 r_4 q_5 \\
u_5 t_4 t_3 t_2 t_1 v_1 & u_5 t_4 t_3 t_2 v_2 & u_5 t_4 t_3 v_3 & u_5 t_4 v_4 & d_5
\end{pmatrix}$$

algebraic structure: All off-diagonal blocks are unit rank...


representation: Quasi-separable matrices (there are others)

5
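The unit-rank claim is easy to check numerically. Below is a minimal, self-contained sketch (the concrete choice $a_k = 2$, $b_k = c_k = -1$ is an assumption for illustration; the slide leaves the entries symbolic) that inverts the tridiagonal matrix exactly over the rationals, so the computed ranks are not polluted by round-off:

```python
from fractions import Fraction

def rank(M):
    """Exact rank over the rationals via Gaussian elimination."""
    M = [row[:] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def inverse(M):
    """Exact inverse via Gauss-Jordan elimination on [M | I]."""
    n = len(M)
    A = [[Fraction(M[i][j]) for j in range(n)]
         + [Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for c in range(n):
        piv = next(i for i in range(c, n) if A[i][c] != 0)
        A[c], A[piv] = A[piv], A[c]
        A[c] = [a / A[c][c] for a in A[c]]
        for i in range(n):
            if i != c and A[i][c] != 0:
                f = A[i][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[c])]
    return [row[n:] for row in A]

n = 5
A = [[2 if i == j else -1 if abs(i - j) == 1 else 0 for j in range(n)]
     for i in range(n)]
Ainv = inverse(A)

# A is sparse; A^{-1} is dense ...
assert all(Ainv[0][j] != 0 for j in range(n))
# ... yet every off-diagonal block of A^{-1} has rank at most 1
for k in range(1, n):
    assert rank([row[:k] for row in Ainv[k:]]) <= 1   # lower blocks
    assert rank([row[k:] for row in Ainv[:k]]) <= 1   # upper blocks
print("quasi-separable: all off-diagonal blocks of A^{-1} have rank <= 1")
```

The same helper functions are reused in the later sketches; exact arithmetic matters here, since floating-point rank tests would need a tolerance.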
Answer: quasi-separable structure

 
$$A^{-1} = \begin{pmatrix}
d_1 & p_1 r_1 q_2 & p_1 r_1 r_2 q_3 & p_1 r_1 r_2 r_3 q_4 & p_1 r_1 r_2 r_3 r_4 q_5 \\
u_2 t_1 v_1 & d_2 & p_2 r_2 q_3 & p_2 r_2 r_3 q_4 & p_2 r_2 r_3 r_4 q_5 \\
u_3 t_2 t_1 v_1 & u_3 t_2 v_2 & d_3 & p_3 r_3 q_4 & p_3 r_3 r_4 q_5 \\
u_4 t_3 t_2 t_1 v_1 & u_4 t_3 t_2 v_2 & u_4 t_3 v_3 & d_4 & p_4 r_4 q_5 \\
u_5 t_4 t_3 t_2 t_1 v_1 & u_5 t_4 t_3 t_2 v_2 & u_5 t_4 t_3 v_3 & u_5 t_4 v_4 & d_5
\end{pmatrix}$$

algebraic structure: All off-diagonal blocks are unit rank...


representation: Quasi-separable matrices (there are others)

no. of parameters in the quasi-separable representation of $A^{-1}$ ≈ no. of nonzero entries in $A$

8
Continuous case: tridiagonal matrix $A$ is a discretization of the operator $\mathcal{A} := w(x)\,\frac{d^2}{dx^2}$

A simple boundary value ODE problem:

$$w(x)\frac{d^2 q(x)}{dx^2} - \lambda q(x) = 1, \qquad q(0) = 0, \; q(1) = 0, \; w(x) = 1$$

Integral formulation:

$$q(x) - \lambda \int_0^1 K(x, y)\, q(y)\, dy = f(x)$$

with semi-separable kernel

$$K(x, y) = \begin{cases} x(y - 1), & 0 \le x \le y \\ y(x - 1), & y \le x \le 1 \end{cases}, \qquad f(x) = \tfrac{1}{2} x(x - 1)$$

9
Continuous case: tridiagonal matrix $A$ is a discretization of the operator $\mathcal{A} := w(x)\,\frac{d^2}{dx^2}$

Discretization (e.g. using Nyström’s method) of


$$\int_0^1 \big(\delta(x - y) - \lambda K(x, y)\big)\, q(y)\, dy = f(x)$$

yields the linear system:


    
$$\begin{pmatrix}
d_1 & p_1 q_2 & p_1 q_3 & p_1 q_4 & p_1 q_5 \\
u_2 v_1 & d_2 & p_2 q_3 & p_2 q_4 & p_2 q_5 \\
u_3 v_1 & u_3 v_2 & d_3 & p_3 q_4 & p_3 q_5 \\
u_4 v_1 & u_4 v_2 & u_4 v_3 & d_4 & p_4 q_5 \\
u_5 v_1 & u_5 v_2 & u_5 v_3 & u_5 v_4 & d_5
\end{pmatrix}
\begin{pmatrix} q_1 \\ q_2 \\ q_3 \\ q_4 \\ q_5 \end{pmatrix}
=
\begin{pmatrix} b_1 \\ b_2 \\ b_3 \\ b_4 \\ b_5 \end{pmatrix}$$

10
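To make the Nyström step concrete, here is a small sketch (the uniform interior grid, trapezoid-like weights, and λ = 1 are assumptions for illustration; the slide does not fix a quadrature rule) showing that the discretized integral operator is dense but inherits the unit-rank off-diagonal structure of the semi-separable kernel:

```python
from fractions import Fraction

def rank(M):
    """Exact rank over the rationals via Gaussian elimination."""
    M = [row[:] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def K(x, y):
    """Semi-separable kernel from the slide (Green's function of d^2/dx^2)."""
    return x * (y - 1) if x <= y else y * (x - 1)

n, lam = 5, Fraction(1)
xs = [Fraction(k, n + 1) for k in range(1, n + 1)]  # interior quadrature nodes
w = Fraction(1, n + 1)                              # uniform quadrature weight

# entries delta_ij - lam * w * K(x_i, x_j): dense, but semi-separable
T = [[Fraction(int(i == j)) - lam * w * K(xs[i], xs[j]) for j in range(n)]
     for i in range(n)]

# every off-diagonal block has rank at most 1, mirroring the kernel
for k in range(1, n):
    assert rank([row[:k] for row in T[k:]]) <= 1
    assert rank([row[k:] for row in T[:k]]) <= 1
print("Nystrom matrix: all off-diagonal blocks have rank <= 1")
```

The rank-1 structure comes directly from $K(x,y) = x(y-1)$ (resp. $y(x-1)$) being a product of a function of $x$ and a function of $y$ on each triangle.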
Quasi-separable matrices are closed under inversion!

Closure property: The inverse of a quasi-separable matrix is again quasi-separable.


Tridiagonal matrices are a special case of quasi-separable.
Note: the inverse of a quasi-separable is generally not tridiagonal.
Addition & products preserve “quasi-separable structures” (more later!)

(Venn diagram: tridiagonal matrices ⊂ quasi-separable matrices)

11
Focus of this talk: what can we say for more general sparse matrices?

Given a sparse matrix A ∈ Cn×n with adjacency graph G(A):


1. What are the algebraic structures preserved by $A^{-1}$?
2. Does there exist a suitable representation of $A^{-1}$ satisfying the closure property?

1. graph-induced rank-structure (GIRS)


2. Rank-structured matrices induced by dynamical systems on graphs:
Acyclic adjacency graphs: a complete answer
Non-acyclic adjacency graphs: a partial answer

12
Overview

The problem: what are the low-rank properties of inverses of sparse matrices?

Motivation: shortcomings of existing rank-structured representations in applications

A potential framework: GIRS matrices and their representations

GIRS representations on acyclic graphs: tree quasi-separable matrices

Conclusions & future work

13
Rank-structured matrices in practice: boundary element method for BVPs

Exterior Helmholtz problem with Dirichlet boundary conditions on $\partial\Omega$:

$$\nabla^2 u(x) + k^2 u(x) = 0, \quad x \in \mathbb{R}^2 \setminus \Omega,$$
$$u(x) = g(x), \quad x \in \partial\Omega, \qquad u(x) \sim \frac{e^{ikr}}{\sqrt{r}}, \; r \to \infty$$

Reformulate to Fredholm integral equation on ∂Ω



Fast multipole method (FMM): exploit low-rank structures in far-field
Rokhlin, Vladimir. ”Rapid solution of integral equations of classical potential theory.” Journal of
computational physics 60, no. 2 (1985): 187-207.

14
Rank-structured matrices in practice: Schur complements in PDE discretizations
Low off-diagonal rank structure in the Schur complements $S_k = A_k - C_k S_{k-1}^{-1} B_k$, $S_0 = A_0$

(figure: the banded matrix of a discretized elliptic PDE, partitioned into block-tridiagonal form with diagonal blocks $A_0, \dots, A_{n-1}$ and off-diagonal blocks $B_k$, $C_k$)
Chandrasekaran, Shiv, Patrick Dewilde, Ming Gu, and Naveen Somasunderam. ”On the numerical rank
of the off-diagonal blocks of Schur complements of discretized elliptic PDEs.” SIAM Journal on Matrix
Analysis and Applications 31, no. 5 (2010): 2261-2290.

15
Rank-structured matrices in practice: optimal control of spatially distributed systems

Vibrating string: $\dfrac{\partial^2 v(x,t)}{\partial t^2} = k(x)\dfrac{\partial^2 v(x,t)}{\partial x^2} + b(x)\, u(x,t)$

Low-rank structure in state-space models:

ẋ = Ax + Bu
y = Cx + Du

Vehicle platoon example: communication constraints on the feedback $u = Fx$!

Rice, Justin K., and Michel Verhaegen. ”Distributed control: A sequentially semi-separable approach
for spatially heterogeneous linear systems.” IEEE Transactions on Automatic Control 54, no. 6 (2009):
1270-1283.
Bamieh, Bassam, Fernando Paganini, and Munther A. Dahleh. ”Distributed control of spatially
invariant systems.” IEEE Transactions on automatic control 47, no. 7 (2002): 1091-1107.
16
Many frameworks for efficient linear algebra with rank-structured matrices
All have their benefits and special use cases:
FMM matrices (Rokhlin & Greengard)
Hierarchically semi-separable (HSS) matrices (Chandrasekaran & Gu)
Sequentially semi-separable (SSS) matrices (Chandrasekaran, Dewilde, van der Veen)
HODLR matrices
H-matrices and H²-matrices (Hackbusch)
Quasi-separable matrices (Eidelman, Gohberg)
Semi-separable matrices (Van Barel, Vandebril, Mastronardi)

Our interest:
Rank-structured matrices with closure property → direct solvers & preconditioners

17
SSS matrices: input-output map of mixed linear time-variant (LTV) system

(diagram: line graph with inputs $x_1, \dots, x_5$ and outputs $y_1, \dots, y_5$)

state-space dynamics:

$$g_k = W_k g_{k+1} + V_k^\top x_k, \qquad g_n = V_n^\top x_n$$
$$h_k = R_k h_{k-1} + Q_k^\top x_k, \qquad h_1 = Q_1^\top x_1$$
$$y_k = U_k g_{k+1} + P_k h_{k-1} + D_k x_k.$$

18
SSS matrices: input-output map of mixed linear time-variant (LTV) system

(diagram: line graph with inputs $x_1, \dots, x_5$ and outputs $y_1, \dots, y_5$)

Resulting input-output relation:

$$\begin{pmatrix}
D_1 & U_1 V_2^\top & U_1 W_2 V_3^\top & U_1 W_2 W_3 V_4^\top & U_1 W_2 W_3 W_4 V_5^\top \\
P_2 Q_1^\top & D_2 & U_2 V_3^\top & U_2 W_3 V_4^\top & U_2 W_3 W_4 V_5^\top \\
P_3 R_2 Q_1^\top & P_3 Q_2^\top & D_3 & U_3 V_4^\top & U_3 W_4 V_5^\top \\
P_4 R_3 R_2 Q_1^\top & P_4 R_3 Q_2^\top & P_4 Q_3^\top & D_4 & U_4 V_5^\top \\
P_5 R_4 R_3 R_2 Q_1^\top & P_5 R_4 R_3 Q_2^\top & P_5 R_4 Q_3^\top & P_5 Q_4^\top & D_5
\end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{pmatrix}
=
\begin{pmatrix} y_1 \\ y_2 \\ y_3 \\ y_4 \\ y_5 \end{pmatrix}$$

19
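The correspondence between the recursions and the explicit entry formula can be checked directly. The sketch below uses arbitrary scalar generator values (my illustrative choices, not from the talk) and verifies that the $O(n)$ recursions reproduce the dense matrix-vector product:

```python
from fractions import Fraction

# arbitrary scalar SSS generators for n = 5 (illustrative values only)
n = 5
D = [Fraction(k + 2) for k in range(n)]
U = [Fraction(1)] * n;  V = [Fraction(1, 2)] * n;  W = [Fraction(1, 3)] * n
P = [Fraction(2)] * n;  Q = [Fraction(1, 4)] * n;  R = [Fraction(1, 5)] * n

def sss_entry(i, j):
    """T[i][j] read off from the explicit SSS formula (0-indexed)."""
    if i == j:
        return D[i]
    if i < j:                          # upper: U_i W_{i+1} ... W_{j-1} V_j
        prod = U[i]
        for k in range(i + 1, j):
            prod *= W[k]
        return prod * V[j]
    prod = P[i]                        # lower: P_i R_{i-1} ... R_{j+1} Q_j
    for k in range(j + 1, i):
        prod *= R[k]
    return prod * Q[j]

def sss_matvec(x):
    """y = T x in O(n) via the backward/forward LTV recursions."""
    g = [Fraction(0)] * n
    g[n - 1] = V[n - 1] * x[n - 1]     # g_n = V_n^T x_n
    for k in range(n - 2, -1, -1):     # g_k = W_k g_{k+1} + V_k^T x_k
        g[k] = W[k] * g[k + 1] + V[k] * x[k]
    h = [Fraction(0)] * n
    h[0] = Q[0] * x[0]                 # h_1 = Q_1^T x_1
    for k in range(1, n):              # h_k = R_k h_{k-1} + Q_k^T x_k
        h[k] = R[k] * h[k - 1] + Q[k] * x[k]
    y = []
    for k in range(n):                 # y_k = U_k g_{k+1} + P_k h_{k-1} + D_k x_k
        yk = D[k] * x[k]
        if k < n - 1:
            yk += U[k] * g[k + 1]
        if k > 0:
            yk += P[k] * h[k - 1]
        y.append(yk)
    return y

x = [Fraction(k + 1) for k in range(n)]
dense = [sum(sss_entry(i, j) * x[j] for j in range(n)) for i in range(n)]
assert sss_matvec(x) == dense
print("O(n) recursions reproduce the dense SSS matrix-vector product")
```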
The ranks of the so-called Hankel blocks dictate the state dimension sizes

(diagram: line graph with a cut between nodes $k$ and $k+1$)

state dimension of $h_k$ ⇔ rank $H_k$

$$\begin{pmatrix}
A_{11} & A_{12} & A_{13} & A_{14} & A_{15} \\
A_{21} & A_{22} & A_{23} & A_{24} & A_{25} \\
A_{31} & A_{32} & A_{33} & A_{34} & A_{35} \\
A_{41} & A_{42} & A_{43} & A_{44} & A_{45} \\
A_{51} & A_{52} & A_{53} & A_{54} & A_{55}
\end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{pmatrix}
=
\begin{pmatrix} y_1 \\ y_2 \\ y_3 \\ y_4 \\ y_5 \end{pmatrix}$$

Notice how the cuts correspond to the Hankel blocks

20
It is quite easy to write down the SSS representation for the tridiagonal matrix!

    
$$\begin{pmatrix}
a_1 & b_1 & 0 & 0 & 0 \\
c_1 & a_2 & b_2 & 0 & 0 \\
0 & c_2 & a_3 & b_3 & 0 \\
0 & 0 & c_3 & a_4 & b_4 \\
0 & 0 & 0 & c_4 & a_5
\end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{pmatrix}
=
\begin{pmatrix} y_1 \\ y_2 \\ y_3 \\ y_4 \\ y_5 \end{pmatrix}$$

The w’s and r’s are zero for the mixed LTV system:

$$g_k = 0 \cdot g_{k+1} + 1 \cdot x_k, \qquad g_n = 1 \cdot x_n$$
$$h_k = 0 \cdot h_{k-1} + 1 \cdot x_k, \qquad h_1 = 1 \cdot x_1$$
$$y_k = b_k \cdot g_{k+1} + c_k \cdot h_{k-1} + a_k x_k.$$

24
Square partitions: Hankel block ranks are preserved during inversion

Lemma
Let $\begin{pmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{pmatrix} = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix}^{-1} \in \mathbb{F}^{(n_1+n_2)\times(n_1+n_2)}$ with square $A_{11} \in \mathbb{F}^{n_1 \times n_1}$. Then,

$$\operatorname{rank} B_{21} = \operatorname{rank} A_{21}, \qquad \operatorname{rank} B_{12} = \operatorname{rank} A_{12}.$$

The inverse of the tridiagonal is also described by a mixed LTV system:

$$g_k = w_k \cdot g_{k+1} + v_k \cdot x_k,$$
$$h_k = r_k \cdot h_{k-1} + q_k \cdot x_k,$$
$$y_k = b_k \cdot g_{k+1} + c_k \cdot h_{k-1} + a_k x_k,$$

but the w’s and r’s will no longer be zero!

25
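A quick exact-arithmetic sanity check of the lemma (the particular invertible 4×4 matrix is an arbitrary choice for illustration; its lower-left 2×2 block has rank 1):

```python
from fractions import Fraction

def rank(M):
    """Exact rank over the rationals via Gaussian elimination."""
    M = [row[:] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def inverse(M):
    """Exact inverse via Gauss-Jordan elimination on [M | I]."""
    n = len(M)
    A = [[Fraction(M[i][j]) for j in range(n)]
         + [Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for c in range(n):
        piv = next(i for i in range(c, n) if A[i][c] != 0)
        A[c], A[piv] = A[piv], A[c]
        A[c] = [a / A[c][c] for a in A[c]]
        for i in range(n):
            if i != c and A[i][c] != 0:
                f = A[i][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[c])]
    return [row[n:] for row in A]

A = [[Fraction(v) for v in row] for row in
     [[2, 1, 0, 0],
      [1, 2, 1, 0],
      [0, 1, 2, 1],
      [0, 0, 1, 2]]]
B = inverse(A)

# square A11 (2x2): off-diagonal block ranks survive inversion
A21 = [row[:2] for row in A[2:]];  B21 = [row[:2] for row in B[2:]]
A12 = [row[2:] for row in A[:2]];  B12 = [row[2:] for row in B[:2]]
assert rank(B21) == rank(A21) == 1
assert rank(B12) == rank(A12) == 1
print("off-diagonal block ranks are preserved under inversion")
```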
Algebraic properties of SSS: closure under sums, products, and inverses!

Inverse of an SSS matrix (with square partitions) is again an SSS matrix of the
same state dimensions.
Sums of SSS matrices are SSS, but with a doubling of the state dimensions.
Products of SSS matrices are also SSS with a doubling of the state dimensions.

26
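The doubling of state dimensions under addition shows up as a doubling of the off-diagonal rank bound. A small sketch (the two tridiagonal matrices are arbitrary invertible choices of mine) adds two quasi-separable matrices and checks that the off-diagonal block ranks stay at most 2:

```python
from fractions import Fraction

def rank(M):
    """Exact rank over the rationals via Gaussian elimination."""
    M = [row[:] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def inverse(M):
    """Exact inverse via Gauss-Jordan elimination on [M | I]."""
    n = len(M)
    A = [[Fraction(M[i][j]) for j in range(n)]
         + [Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for c in range(n):
        piv = next(i for i in range(c, n) if A[i][c] != 0)
        A[c], A[piv] = A[piv], A[c]
        A[c] = [a / A[c][c] for a in A[c]]
        for i in range(n):
            if i != c and A[i][c] != 0:
                f = A[i][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[c])]
    return [row[n:] for row in A]

def tridiag(a, b, c, n):
    """Tridiagonal matrix with diagonal a, superdiagonal b, subdiagonal c."""
    return [[a if i == j else b if j == i + 1 else c if j == i - 1 else 0
             for j in range(n)] for i in range(n)]

n = 5
T1 = inverse(tridiag(2, -1, -1, n))  # quasi-separable: off-diag ranks <= 1
T2 = inverse(tridiag(3, 1, -2, n))   # another quasi-separable matrix
S = [[T1[i][j] + T2[i][j] for j in range(n)] for i in range(n)]

# off-diagonal block ranks of the sum are at most doubled
for k in range(1, n):
    assert rank([row[:k] for row in S[k:]]) <= 2
    assert rank([row[k:] for row in S[:k]]) <= 2
print("sum of two quasi-separable matrices: off-diagonal ranks <= 2")
```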
From mat-vec to solving Ax = b: matrix representation of state-space equations

$$\begin{pmatrix}
I & -W_2 & & & & & & & & -V_2^\top & & & \\
 & I & -W_3 & & & & & & & & -V_3^\top & & \\
 & & I & -W_4 & & & & & & & & -V_4^\top & \\
 & & & I & & & & & & & & & -V_5^\top \\
 & & & & I & & & & -Q_1^\top & & & & \\
 & & & & -R_2 & I & & & & -Q_2^\top & & & \\
 & & & & & -R_3 & I & & & & -Q_3^\top & & \\
 & & & & & & -R_4 & I & & & & -Q_4^\top & \\
U_1 & & & & & & & & D_1 & & & & \\
 & U_2 & & & & P_2 & & & & D_2 & & & \\
 & & U_3 & & & & P_3 & & & & D_3 & & \\
 & & & U_4 & & & & P_4 & & & & D_4 & \\
 & & & & & & & P_5 & & & & & D_5
\end{pmatrix}
\begin{pmatrix} g_2 \\ g_3 \\ g_4 \\ g_5 \\ h_1 \\ h_2 \\ h_3 \\ h_4 \\ x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{pmatrix}
=
\begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \\ 0 \\ 0 \\ 0 \\ 0 \\ b_1 \\ b_2 \\ b_3 \\ b_4 \\ b_5 \end{pmatrix}$$

27
From mat-vec to solving Ax = b: re-ordering yields a fast solver

Block-sparsity pattern matches the underlying graph!


$$\begin{pmatrix}
I & -Q_1^\top & & & & & & & & & & & \\
 & D_1 & U_1 & & & & & & & & & & \\
 & & I & & -V_2^\top & -W_2 & & & & & & & \\
-R_2 & & & I & -Q_2^\top & & & & & & & & \\
P_2 & & & & D_2 & U_2 & & & & & & & \\
 & & & & & I & & -V_3^\top & -W_3 & & & & \\
 & & & -R_3 & & & I & -Q_3^\top & & & & & \\
 & & & P_3 & & & & D_3 & U_3 & & & & \\
 & & & & & & & & I & & -V_4^\top & -W_4 & \\
 & & & & & & -R_4 & & & I & -Q_4^\top & & \\
 & & & & & & P_4 & & & & D_4 & U_4 & \\
 & & & & & & & & & & & I & -V_5^\top \\
 & & & & & & & & & P_5 & & & D_5
\end{pmatrix}
\begin{pmatrix} h_1 \\ x_1 \\ g_2 \\ h_2 \\ x_2 \\ g_3 \\ h_3 \\ x_3 \\ g_4 \\ h_4 \\ x_4 \\ g_5 \\ x_5 \end{pmatrix}
=
\begin{pmatrix} 0 \\ b_1 \\ 0 \\ 0 \\ b_2 \\ 0 \\ 0 \\ b_3 \\ 0 \\ 0 \\ b_4 \\ 0 \\ b_5 \end{pmatrix}$$

Recall: Line graphs have a perfect elimination order!

28

SSS matrices not suitable for 2D Laplacians: Hankel ranks grow with $O(\sqrt{n})$

(figure: sparsity pattern of the discretized 2D Laplacian — block-tridiagonal, with tridiagonal diagonal blocks and diagonal off-diagonal blocks)

with $\sqrt{n}$-by-$\sqrt{n}$ block partitioning → approx. $n^{1.5}$ parameters

29
Overview

The problem: what are the low-rank properties of inverses of sparse matrices?

Motivation: shortcomings of existing rank-structured representations in applications

A potential framework: GIRS matrices and their representations

GIRS representations on acyclic graphs: tree quasi-separable matrices

Conclusions & future work

30
Graph-partitioned matrices: associate a directed graph with a block-partitioned matrix

(figure: grid graph with an input $x_{(i,j)}$ and an output $y_{(i,j)}$ attached to each node)

Associate $G = (V, E)$ with a block-partitioned matrix:

$$y_i = \sum_{j \in V} T\{i, j\}\, x_j, \qquad i \in V.$$
31
Hankel blocks induced by graph cuts
(figure: the same grid graph, with a subset of nodes $A$ cut out)

Let $A \subset V$ and $\bar{A} = V \setminus A$ so that

$$\Pi_1 T \Pi_2 = \begin{pmatrix} T\{A, A\} & T\{A, \bar{A}\} \\ T\{\bar{A}, A\} & T\{\bar{A}, \bar{A}\} \end{pmatrix}.$$

Call $T\{\bar{A}, A\}$ the Hankel block induced by $A$.
This generalizes the Hankel blocks from earlier!

32
GIRS: a full characterization of all low-rank structures in (T, G)

(figure: a cut $A$ / $\bar{A}$ on the grid graph)

Definition (GIRS property)
$(T, G)$ satisfies the graph-induced rank structure for a constant $c \geq 0$ if for all $A \subset V$,

$$\operatorname{rank} T\{\bar{A}, A\} \leq c \cdot E(A),$$

where $E(A)$ is the number of border edges.

33
The GIRS property is an invariant under inversion

(figure: a cut $A$ / $\bar{A}$ on the grid graph)

Theorem (GIRS property)
If $(T, G)$ satisfies the graph-induced rank structure for a constant $c \geq 0$, then so does $(T^{-1}, G)$.

Proof.
Recall the lemma from earlier...

34
The 2D-Laplacian satisfies the GIRS property for c = 1 if G is the adjacency graph

(figure: the grid graph of the 2D Laplacian alongside its sparsity pattern)
In fact, all sparse matrices are GIRS with c = 1 w.r.t. their adjacency graph...

35
GIRS representations: run “LTV systems” on arbitrary graphs

Associate with every edge $(i, j) \in E$ the state vector $h_{(i,j)} \in \mathbb{F}^{\rho_{(i,j)}}$.

(figure: node $i$ with neighbors $j_1, \dots, j_p$)

“State-space” dynamics:

$$\begin{pmatrix} h_{(i,j_1)} \\ \vdots \\ h_{(i,j_p)} \\ y_i \end{pmatrix}
=
\begin{pmatrix}
A^i_{j_1,j_1} & \cdots & A^i_{j_1,j_p} & B^i_{j_1} \\
\vdots & \ddots & \vdots & \vdots \\
A^i_{j_p,j_1} & \cdots & A^i_{j_p,j_p} & B^i_{j_p} \\
C^i_{j_1} & \cdots & C^i_{j_p} & D_i
\end{pmatrix}
\begin{pmatrix} h_{(j_1,i)} \\ \vdots \\ h_{(j_p,i)} \\ x_i \end{pmatrix}$$

36
GIRS representations generalize SSS matrices

(diagram: line graph $1 - 2 - 3 - \cdots - n$)

$$S^1 = \begin{pmatrix} 0 & B^i_{i+1} \\ C^i_{i+1} & D_i \end{pmatrix}, \qquad
S^n = \begin{pmatrix} 0 & B^i_{i-1} \\ C^i_{i-1} & D_i \end{pmatrix}, \qquad
S^i = \begin{pmatrix} 0 & A^i_{i+1,i-1} & B^i_{i+1} \\ A^i_{i-1,i+1} & 0 & B^i_{i-1} \\ C^i_{i+1} & C^i_{i-1} & D_i \end{pmatrix}$$

Line graph: diagonal edge-to-edge operators can be set to zero without loss of generality!

Decouples dynamics in upstream & downstream flow!
37
GIRS representation allow for linear parametrizations of 2D Laplacians
Edge-to-edge operators again zero (similar to tridiagonal matrices): with incoming states from the neighbors $(i{+}1,j)$, $(i{-}1,j)$, $(i,j{+}1)$, $(i,j{-}1)$ and input $x_{(i,j)}$,

$$S^{(i,j)} = \begin{pmatrix}
0 & 0 & 0 & 0 & * \\
0 & 0 & 0 & 0 & * \\
0 & 0 & 0 & 0 & * \\
0 & 0 & 0 & 0 & * \\
* & * & * & * & *
\end{pmatrix}$$

Using a scalar partitioning → approx. $5^2 \cdot n$ parameters

38
For general GIRS representation, Gauss elimination is needed for mat-vec operation!

$$h_{(i,j)} - \sum_{w \in N(i)} A^i_{j,w}\, h_{(w,i)} - B^i_j\, x_i = 0, \qquad (i, j) \in E$$

$$\sum_{i \in N(j)} C^j_i\, h_{(i,j)} + D_j\, x_j = y_j, \qquad j \in V.$$

$$\begin{pmatrix} I - A & B \\ C & D \end{pmatrix} \begin{pmatrix} h \\ x \end{pmatrix} = \begin{pmatrix} 0 \\ y \end{pmatrix}$$

Solve $(I - A)h = -Bx$ to find $h$ first!

One needs to be cautious that $I - A$ is not singular!
39
GIRS representations admit fast solvers through the sparse embedding trick!

(figure: node $j$ with neighbors $i_1, \dots, i_p$)

$$h_{(i,j)} - \sum_{w \in N(i)} A^i_{j,w}\, h_{(w,i)} - B^i_j\, x_i = 0, \qquad (i, j) \in E$$

$$\sum_{i \in N(j)} C^j_i\, h_{(i,j)} + D_j\, x_j = y_j, \qquad j \in V.$$

Group adjoining variables: $\theta_j = (h_{(i_1,j)}, \dots, h_{(i_p,j)}, x_j)$ and $\gamma_j = (0, \dots, 0, y_j)$; the block-sparsity pattern of $\Xi \theta = \gamma$ then satisfies $\Xi_{ij} = 0$ if $(i, j) \notin E$.

Conditions for a fast solver:
1. state dimensions $\rho_{(i,j)}$ are small,
2. degrees of the nodes are small,
3. $G$ has a good elimination order.

40
Example: 2-by-3 mesh

 
(the assembled sparse embedding $\Xi \theta = \gamma$ for the 2-by-3 mesh: a block-sparse linear system in the edge states $h_{(i,j)}$ and node inputs $x_1, \dots, x_6$, with right-hand side entries $0$ and $y_1, \dots, y_6$; each block row involves only the variables of adjacent nodes)
41
GIRS representations (generically) satisfy the closure property
    
$$\begin{pmatrix} I - A & B \\ C & D \end{pmatrix} \begin{pmatrix} h \\ x \end{pmatrix} = \begin{pmatrix} 0 \\ y \end{pmatrix}
\qquad \Longrightarrow \qquad
\begin{pmatrix} I - (A - B D^{-1} C) & B D^{-1} \\ D^{-1} C & D^{-1} \end{pmatrix} \begin{pmatrix} h \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ x \end{pmatrix}$$

$$S^i = \begin{pmatrix}
A^i_{j_1,j_1} - B^i_{j_1} D_i^{-1} C^i_{j_1} & \cdots & A^i_{j_1,j_p} - B^i_{j_1} D_i^{-1} C^i_{j_p} & B^i_{j_1} D_i^{-1} \\
\vdots & \ddots & \vdots & \vdots \\
A^i_{j_p,j_1} - B^i_{j_p} D_i^{-1} C^i_{j_1} & \cdots & A^i_{j_p,j_p} - B^i_{j_p} D_i^{-1} C^i_{j_p} & B^i_{j_p} D_i^{-1} \\
D_i^{-1} C^i_{j_1} & \cdots & D_i^{-1} C^i_{j_p} & D_i^{-1}
\end{pmatrix}$$

Also, addition and product have nice formulas


42
SSS inverse and GIRS representation inverse: a subtle difference
(diagram: line graph $1 - 2 - 3 - \cdots - n$)

The formula on the previous slide gives:

$$S^i = \begin{pmatrix}
-B^i_{i+1} D_i^{-1} C^i_{i+1} & A^i_{i+1,i-1} - B^i_{i+1} D_i^{-1} C^i_{i-1} & B^i_{i+1} D_i^{-1} \\
A^i_{i-1,i+1} - B^i_{i-1} D_i^{-1} C^i_{i+1} & -B^i_{i-1} D_i^{-1} C^i_{i-1} & B^i_{i-1} D_i^{-1} \\
D_i^{-1} C^i_{i+1} & D_i^{-1} C^i_{i-1} & D_i^{-1}
\end{pmatrix}$$

SSS theory guarantees more! A realization of the form:

$$S^i = \begin{pmatrix} 0 & A^i_{i+1,i-1} & B^i_{i+1} \\ A^i_{i-1,i+1} & 0 & B^i_{i-1} \\ C^i_{i+1} & C^i_{i-1} & D_i \end{pmatrix}$$

The latter requires no Gauss elimination for mat-vec!


43
GIRS representations satisfy the GIRS property by construction

Theorem
A GIRS representation with rank-profile $\{\rho_e\}_{e \in E}$ of a graph-partitioned matrix $(T, G)$ satisfies the GIRS property for
$$c = \max_{e \in E} \rho_e.$$

Proof.
A proof of this theorem was given in a talk in CAM23 at Selva di Fasano by Shiv
Chandrasekaran.

44
SSS matrices: the result can be extended in the other direction as well.

(diagram: line graph cut at the edge $(i, j)$)

A one-to-one relationship between Hankel block ranks and state dimensions:

$$\rho_{(i,j)} = \operatorname{rank} H_{(i,j)} := \operatorname{rank} T\{\bar{A}, A\}$$

Stronger result: the implication goes in both directions:

$$\rho_{(i,j)} < c \;\Leftrightarrow\; T \text{ is GIRS-}c$$

45
Can the implication be in both directions in general?

Conjecture
A graph-partitioned matrix $(T, G)$ is GIRS-$c$ if, and only if, there exists a GIRS representation for $(T, G)$ with $\rho_e < c$ for all $e \in E$.

Small GIRS constant implies compact GIRS representation!

This conjecture is intimately tied to the construction/realization problem!

46
Overview

The problem: what are the low-rank properties of inverses of sparse matrices?

Motivation: shortcomings of existing rank-structured representations in applications

A potential framework: GIRS matrices and their representations

GIRS representations on acyclic graphs: tree quasi-separable matrices

Conclusions & future work

47
A partial verification of the GIRS conjecture

Acyclic graphs: tree interpretation

chordal structure → “fast” solvers through elimination of leaf nodes

Theorem
The GIRS conjecture holds for acyclic graphs, i.e., graphs with no cycles.

48
Proving GIRS conjecture: a special tree quasi-separable (TQS) realization always exists

(figure: node $k$ with parent $j$ along edge $(k, j)$ and children $i_1, \dots, i_p$ along edges $(k, i_1), \dots, (k, i_p)$)

Node $k$ with parent $j$ and children $i_1, \dots, i_p$:

$$S^k = \begin{pmatrix}
0 & \cdots & A^k_{i_1,i_p} & A^k_{i_1,j} & B^k_{i_1} \\
\vdots & \ddots & \vdots & \vdots & \vdots \\
A^k_{i_p,i_1} & \cdots & 0 & A^k_{i_p,j} & B^k_{i_p} \\
A^k_{j,i_1} & \cdots & A^k_{j,i_p} & 0 & B^k_j \\
C^k_{i_1} & \cdots & C^k_{i_p} & C^k_j & D_k
\end{pmatrix}$$

Tree graph: diagonal edge-to-edge operators are set to zero!

Decouples dynamics into an explicit flow starting from the leaves!

49
Proving GIRS conjecture: a special tree quasi-separable (TQS) realization always exists

(figure: 7-node tree; the path from node 2 to node 3 runs $2 \to 5 \to 7 \to 6 \to 3$)

Entries of $T$ are products of operators along paths in the tree, e.g.

$$T\{3, 2\} = C^3_6\, A^6_{3,7}\, A^7_{6,5}\, A^5_{7,2}\, B^2_5$$

Tree graph: diagonal edge-to-edge operators are set to zero!

Decouples dynamics into an explicit flow starting from the leaves!

50
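A tiny sketch of the entry formula with hypothetical scalar stand-ins for the generators (the slide gives only the symbolic product; the numeric values here are made up), checking that propagating a unit input at node 2 up the path reproduces the closed-form entry:

```python
from fractions import Fraction

# Hypothetical scalar generators along the path 2 -> 5 -> 7 -> 6 -> 3
B_2_5  = Fraction(1, 2)   # B^2_5     : x_2     -> h_(2,5)
A_5_72 = Fraction(3)      # A^5_{7,2} : h_(2,5) -> h_(5,7)
A_7_65 = Fraction(1, 3)   # A^7_{6,5} : h_(5,7) -> h_(7,6)
A_6_37 = Fraction(4)      # A^6_{3,7} : h_(7,6) -> h_(6,3)
C_3_6  = Fraction(1, 2)   # C^3_6     : h_(6,3) -> y_3

# explicit leaf-to-root flow: push a unit input at node 2 along the path
h_25 = B_2_5 * 1
h_57 = A_5_72 * h_25
h_76 = A_7_65 * h_57
h_63 = A_6_37 * h_76
y_3  = C_3_6 * h_63

# ... and compare with the closed-form entry T{3,2}
T_32 = C_3_6 * A_6_37 * A_7_65 * A_5_72 * B_2_5
assert y_3 == T_32
print("T{3,2} =", T_32)  # -> 1
```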
SSS generalization: Hankel block ranks specify dimensions of minimal TQS representation

$$\rho_{(i,j)} = \operatorname{rank} H_{(i,j)} := \operatorname{rank} T\{\bar{A}, A\}$$

Construction from a finite number of low-rank factorizations:


Govindarajan, N., Chandrasekaran, S., Dewilde, P. (2024). Tree quasi-separable matrices: a
simultaneous generalization of sequentially and hierarchically semi-separable representations. arXiv
preprint.

51
TQS is a strict generalization of SSS and HSS

TQS reduces to SSS if G is the line graph.


TQS reduces to HSS if G is a binary tree with empty non-leaf nodes.
In all other cases, it is neither SSS nor HSS.

Many of the algorithms for SSS and HSS generalize to TQS:

development of more flexible and powerful code possible!

52
Overview

The problem: what are the low-rank properties of inverses of sparse matrices?

Motivation: shortcomings of existing rank-structured representations in applications

A potential framework: GIRS matrices and their representations

GIRS representations on acyclic graphs: tree quasi-separable matrices

Conclusions & future work

53
The state of affairs: acyclic graph-partitioned matrices

(Venn diagram: HSS, SSS ⊂ TQS ⊂ general GIRS representations)

GIRS conjecture: solved and true!
Construction: a TQS realization is possible in a finite number of low-rank factorizations.
Special realizations: a special TQS realization always exists that decouples dynamics into an explicit flow.
Algebraic properties: closed under sums, products, and inverses.
Fast solvers: chordal structure ensures a good elimination order.

54
The state of affairs: general graph-partitioned matrices

(Venn diagram: HSS, SSS ⊂ TQS ⊂ general GIRS representations)

GIRS conjecture: yet to be answered!
Construction: no general algorithm for constructing realizations.
Special realizations: not known when realizations exist that simplify the dynamics.
Algebraic properties: closure under sums, products, and inverses.
Fast solvers: contingent on the existence of good elimination orders.

55
Future work

1. Develop formulas, factorization algorithms, software for TQS, e.g.:


Inner-outer
(Pseudo-)inverse
LU / Cholesky
ULV
2. Applications of TQS, e.g.:
Exterior Helmholtz problems on “branchy” domains
Distributed control on acyclic graphs
3. Theoretical work: proving GIRS conjecture for cycle graphs?
4. Construction of more general GIRS representations using optimization-based
techniques?

56
Special thanks to collaborators

Shivkumar Chandrasekaran
Patrick Dewilde
Ethan Epperly
Vamshi C. Madala
Lieven De Lathauwer

57
