
ORIE 6334 Spectral Graph Theory August 30, 2016

Lecture 3
Lecturer: David P. Williamson Scribe: Seung Won (Wilson) Yoo

1 Eigenvalue Identities
We first present a few useful eigenvalue identities. Let A ∈ R^{n×n} be a symmetric matrix with real eigenvalues λ1 ≥ λ2 ≥ · · · ≥ λn and corresponding eigenvectors x1, x2, . . . , xn such that the xi are orthonormal.

Lemma 1 The eigenvectors of A^k are x1, . . . , xn with corresponding eigenvalues λ1^k, . . . , λn^k.

Proof: Let X = [x1 x2 · · · xn] be the matrix whose columns are the eigenvectors, and let D = diag(λ1, . . . , λn). Then

AX = XD

since the columns of X are the eigenvectors. Note that X^T X = I as the columns are orthonormal, so X^T = X^{-1}. This implies A = XDX^{-1} (right-multiplying AX = XD by X^{-1}). Then A^k = (XDX^{-1})^k = XD^k X^{-1}, where D^k = diag(λ1^k, . . . , λn^k). It follows that A^k X = XD^k; therefore, the eigenvectors of A^k are x1, . . . , xn with corresponding eigenvalues λ1^k, . . . , λn^k. □
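
To see the lemma concretely, here is a minimal numerical sketch in Python/numpy; the matrix A and the power k below are arbitrary choices made only for illustration.

import numpy as np

# A small symmetric matrix (the adjacency matrix of the triangle), chosen only for illustration.
A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])
k = 3

lam, X = np.linalg.eigh(A)              # columns of X are orthonormal eigenvectors of A
Ak = np.linalg.matrix_power(A, k)

# Lemma 1: A^k X = X diag(lam^k), i.e. the same eigenvectors, with eigenvalues raised to the k-th power.
assert np.allclose(Ak @ X, X @ np.diag(lam ** k))
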
For our next few identities, we need the following fact that we present without proof.

Fact 1 det(AB) = det(A) det(B)

An easy corollary of this fact is that det(A^{-1}) = 1/det(A), since

det(A) det(A^{-1}) = det(AA^{-1}) = det(I) = 1.


This lecture was drawn from Lau's 2015 lecture notes, Lecture 1: https://cs.uwaterloo.ca/~lapchi/cs798/notes/L01.pdf and Spielman's 2012 lecture notes, Lecture 3: http://www.cs.yale.edu/homes/spielman/561/2012/lect03-12.pdf.

Lemma 2 det(A) = ∏_{i=1}^n λi.

Proof:

det(A) = det(XDX^{-1})
       = det(X) det(D) det(X^{-1})
       = det(D)
       = ∏_{i=1}^n λi.

The third equality uses the corollary above (det(X) det(X^{-1}) = 1), and the last uses that D is diagonal. □
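
As a quick sanity check of Lemma 2 (and of the corollary of Fact 1), the following numpy sketch compares det(A) with the product of the eigenvalues; the matrix below is an arbitrary invertible symmetric example.

import numpy as np

A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])            # arbitrary symmetric, invertible example

lam = np.linalg.eigvalsh(A)             # real eigenvalues of the symmetric matrix A

# Lemma 2: det(A) equals the product of the eigenvalues.
assert np.isclose(np.linalg.det(A), np.prod(lam))

# Corollary of Fact 1: det(A^{-1}) = 1 / det(A).
assert np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0 / np.linalg.det(A))
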
Recall that in the first lecture, we defined the trace of A to be Tr(A) = Σ_{i=1}^n aii. We used the following without proof in the first lecture, and now we can prove it.

Lemma 3 Tr(A) = Σ_{i=1}^n λi.

Proof: Consider the characteristic polynomial of A, which we defined in the first lecture to be det(λI − A); it is a degree n polynomial in λ. Then we can rewrite it as:

det(λI − A) = det(λXX^T − XDX^T)
            = det(X(λI − D)X^T)
            = det(X) det(λI − D) det(X^T)
            = det(λI − D)
            = ∏_{i=1}^n (λ − λi).

We see that indeed the eigenvalues are precisely the roots of the polynomial. Here, the coefficient of λ^{n−1} is exactly −Σ_{i=1}^n λi.
Now, recall that the determinant of a matrix Z can be written as a sum over permutations σ ∈ Sn:

det(Z) = Σ_{σ∈Sn} sgn(σ) ∏_{i=1}^n z_{iσ(i)}.

Applied to Z = λI − A, the only permutation that can produce a λ^{n−1} term is the identity, so the only term in the sum with a λ^{n−1} term is ∏_{i=1}^n (λ − aii). The coefficient of λ^{n−1} here is −Σ_{i=1}^n aii. Then we are done, with

Tr(A) = Σ_{i=1}^n aii = Σ_{i=1}^n λi.  □
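
Similarly, a minimal numerical check of Lemma 3, again on an arbitrary small symmetric matrix:

import numpy as np

A = np.array([[1., 2., 0.],
              [2., 0., 3.],
              [0., 3., 1.]])            # arbitrary symmetric example

lam = np.linalg.eigvalsh(A)

# Lemma 3: Tr(A) equals the sum of the eigenvalues.
assert np.isclose(np.trace(A), np.sum(lam))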

2 The Perron-Frobenius Theorem
Recall that for an undirected graph G, its adjacency matrix is defined as A = (aij), where aij = 1 if (i, j) ∈ E and aij = 0 otherwise.

Theorem 4 (Perron-Frobenius) Let G be a connected graph with adjacency matrix A, eigenvalues λ1 ≥ λ2 ≥ · · · ≥ λn and corresponding eigenvectors x1, . . . , xn. Then:

(i) λ1 ≥ −λn ;

(ii) λ1 > λ2 ;

(iii) There exists an eigenvector x1 > 0 (that is, every coordinate is strictly positive).

Proof: First we will prove (iii). Let x1, . . . , xn be the corresponding eigenvectors, and assume they are orthonormal. Recall that

λ1 = max_{x∈R^n} (x^T Ax)/(x^T x) = x1^T Ax1.

Define a vector y such that y(i) = |x1(i)| for all i; then y^T y = x1^T x1 = 1. We show that y is also an eigenvector corresponding to λ1. To see this, we have

λ1 = x1^T Ax1
   = Σ_{ij} aij x1(i)x1(j)
   ≤ Σ_{ij} aij |x1(i)||x1(j)|
   = Σ_{ij} aij y(i)y(j)
   = y^T Ay
   ≤ λ1.

The last inequality follows from the definition of λ1 and the fact that y has unit norm. Since the chain starts and ends with λ1, all the inequalities must be equalities; in particular y^T Ay = λ1, so y is an eigenvector of λ1.
We now argue that none of the entries of y can be zero. We have y ≥ 0 by definition and y ≠ 0 since it is an eigenvector. To show that none of the entries are zero, we use the fact that the graph is connected: if there is some j such that y(j) = 0, then since y ≠ 0 there must be an edge (i, k) ∈ E such that y(i) = 0 and y(k) ≠ 0. Then we have:
(Ay)(i) = Σ_{j:(i,j)∈E} y(j) ≥ y(k) > 0.

However, (Ay)(i) = λ1 y(i) = 0, which is a contradiction.

Now we prove (i). Let y(i) = |xn(i)| for all i. Again, we have

y^T y = xn^T xn = 1,

so the vector y has unit norm. Then

|λn| = |xn^T Axn|
     ≤ Σ_{ij} aij |xn(i)||xn(j)|
     = Σ_{ij} aij y(i)y(j)
     = y^T Ay
     ≤ λ1,

as desired.
For (ii), let y(i) = |x2(i)| for all i. Then y^T y = x2^T x2 = 1. Then we have that

λ2 = x2^T Ax2
   ≤ Σ_{ij} aij |x2(i)||x2(j)|
   = Σ_{ij} aij y(i)y(j)
   = y^T Ay
   ≤ λ1.

Now we show that somewhere along the way, the inequality is strict. Assume x1 > 0, as we may by (iii). Since ⟨x1, x2⟩ = 0, and both are nonzero, some of the entries of x2 are positive and some are negative. We split into two cases:
• Case 1: All of the entries of x2 are nonzero. Then, since G is connected, there exists an edge (i, j) ∈ E such that x2(i) < 0 and x2(j) > 0. Then x2(i)x2(j) < |x2(i)||x2(j)|, which gives us the strict inequality that we wanted. Hence λ2 < λ1.

• Case 2: x2(i) = 0 for some i. If all the inequalities were equalities, then y would be an eigenvector of λ1 with y ≥ 0. We argued above that when y ≥ 0 is an eigenvector corresponding to λ1 and G is connected, none of the entries of y can be zero. But y(i) = |x2(i)| = 0 for some i, a contradiction. So some inequality must be strict, and again λ2 < λ1.
□
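
The three statements of the theorem are easy to observe numerically. The sketch below uses an arbitrary connected, non-bipartite example (a triangle with a pendant vertex); the sign flip at the end is only because numerical eigensolvers return eigenvectors up to sign.

import numpy as np

# Adjacency matrix of a triangle on {0, 1, 2} with a pendant vertex 3 attached to vertex 2.
A = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])

lam, X = np.linalg.eigh(A)              # eigenvalues in ascending order
lam, X = lam[::-1], X[:, ::-1]          # reorder so that lam[0] >= ... >= lam[-1]

assert lam[0] >= -lam[-1]               # (i)   lambda_1 >= -lambda_n
assert lam[0] > lam[1]                  # (ii)  lambda_1 > lambda_2

x1 = X[:, 0]
x1 = x1 if x1.sum() > 0 else -x1        # fix the arbitrary sign of the eigenvector
assert np.all(x1 > 0)                   # (iii) a strictly positive top eigenvector
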

3 Bipartite Graphs
We now turn to showing how the various identities we've proven over this lecture and the last can be applied to say something about the structure of graphs. In particular, we show that the spectrum of the adjacency matrix tells us whether or not the graph is bipartite.

Lemma 5 If G is bipartite and λ is an eigenvalue of its adjacency matrix A, then so is −λ.

Proof: If G is bipartite, we can re-index the nodes such that A has the block form

A = [ 0    B ]
    [ B^T  0 ].

Let v = (x, y) be an eigenvector of A with eigenvalue λ, where x and y are indexed by the two sides of the bipartition. Then we have

Av = (By, B^T x) = λ(x, y),

so By = λx and B^T x = λy. From this,

A(x, −y) = (−By, B^T x) = (−λx, λy) = −λ(x, −y).

So −λ is an eigenvalue of A, with corresponding eigenvector (x, −y). □
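
A quick numerical illustration of Lemma 5; the 6-cycle below is an arbitrary bipartite example.

import numpy as np

n = 6
A = np.zeros((n, n))                     # adjacency matrix of the cycle C6
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

lam = np.sort(np.linalg.eigvalsh(A))

# Lemma 5: the spectrum of a bipartite graph is symmetric about 0,
# so sorting the eigenvalues and sorting their negatives give the same list.
assert np.allclose(lam, np.sort(-lam))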


We can now show that this statement can be made an "if and only if": that is, the graph G is bipartite if and only if for each eigenvalue λ there is another eigenvalue −λ.

Theorem 6 If for each eigenvalue λ ≠ 0 there is another eigenvalue λ′ = −λ, then G is bipartite.

Proof: Let k be any odd positive integer. Then

Tr(A^k) = Σ_{i=1}^n λi^k = 0,

where the first equality follows from Lemmas 1 and 3, and the second from the hypothesis: since k is odd, the terms λ^k and (−λ)^k cancel in pairs, and any zero eigenvalues contribute nothing.

It can be shown by induction that (A^k)ij is the number of walks from i to j of length exactly k (recall from the first lecture that we used that (A^2)ij = Σ_ℓ aiℓ aℓj is the number of walks of length exactly two, using an edge from i to ℓ and then from ℓ to j). Notice that if there is an odd cycle of length k, then it must be the case that (A^k)ii > 0 for each vertex i on the cycle, so that Tr(A^k) > 0. But since Tr(A^k) = 0, there are no odd cycles of length k. Since this is true for any odd positive integer k, there are no odd cycles in G, which implies that G is bipartite. □
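
The walk-counting argument in the proof can also be checked directly: for a bipartite graph the trace of every odd power of A vanishes, while a graph with an odd cycle has a positive trace for some odd power. A small sketch, with a 4-cycle and a triangle as arbitrary examples:

import numpy as np

def odd_power_traces_vanish(A, max_k=9):
    # Tr(A^k) counts closed walks of length k; this checks that it is 0 for every odd k up to max_k.
    return all(np.isclose(np.trace(np.linalg.matrix_power(A, k)), 0.0)
               for k in range(1, max_k + 1, 2))

C4 = np.array([[0, 1, 0, 1],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [1, 0, 1, 0]], dtype=float)    # the 4-cycle, bipartite
C3 = np.array([[0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]], dtype=float)       # the triangle, not bipartite

assert odd_power_traces_vanish(C4)
assert not odd_power_traces_vanish(C3)        # Tr(C3^3) = 6 > 0
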
Now we can show something even stronger than the previous statement: we only need
to look at the smallest and largest eigenvalue to know whether or not the graph is bipartite.

Theorem 7 Suppose G is connected. Then, λn = −λ1 if and only if G is bipartite.

Proof: By Perron-Frobenius, λ1 ≥ −λn, that is, λn ≥ −λ1. If G is bipartite, then by Lemma 5, −λ1 is also an eigenvalue, so λn ≤ −λ1; together these give λn = −λ1.
For the other direction, let xn be the eigenvector corresponding to λn with xn^T xn = 1. Let y(i) = |xn(i)| for all i. Again, we have y^T y = xn^T xn = 1. Also,

|λn| = |xn^T Axn|
     ≤ Σ_{ij} aij |xn(i)||xn(j)|
     = Σ_{ij} aij y(i)y(j)
     = y^T Ay
     ≤ λ1.

The assumption λn = −λ1 implies that all the inequalities are equalities. This implies that y is an eigenvector corresponding to λ1, with y ≥ 0. By our proof of the Perron-Frobenius theorem, since y ≥ 0, we have y > 0, and this implies that xn(i) ≠ 0 for all i.
Since all the inequalities are equalities, xn(i)xn(j) has the same sign whenever aij > 0. Since λn = xn^T Axn < 0, all of these products must be negative. This implies that for any edge (i, j) in the graph, either xn(i) > 0 and xn(j) < 0, or xn(i) < 0 and xn(j) > 0. This induces the bipartition

V = {i : xn(i) < 0},   W = {i : xn(i) > 0}.

□
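
The second half of the proof is constructive: when λn = −λ1, the signs of the entries of xn recover the two sides. A minimal sketch of that recovery, using the complete bipartite graph K_{2,3} as an arbitrary example:

import numpy as np

# Adjacency matrix of K_{2,3}: side {0, 1} versus side {2, 3, 4}.
A = np.zeros((5, 5))
for i in (0, 1):
    for j in (2, 3, 4):
        A[i, j] = A[j, i] = 1.0

lam, X = np.linalg.eigh(A)               # eigenvalues in ascending order
assert np.isclose(lam[0], -lam[-1])      # lambda_n = -lambda_1 for this bipartite graph

xn = X[:, 0]                             # eigenvector of the smallest eigenvalue
V = {i for i in range(5) if xn[i] < 0}
W = {i for i in range(5) if xn[i] > 0}

# Every edge should cross between the two recovered sides.
assert all((i in V) != (j in V)
           for i in range(5) for j in range(5) if A[i, j] == 1)
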
This raises an interesting research question. What happens when λ1 is close to −λn? Does this mean that the graph is "almost bipartite", in the sense that removing a small number of edges would make it bipartite? Possibly the answer to this question is already well known.

