Graph Signal Processing
4. Sampling
Definition
Given a graph G = (V, E), a graph signal is defined as a mapping from the set of
vertices to the real space of dimension equal to the cardinality of V:

f : V \to \mathbb{R}^{|V|}

Restricting the signal to a subset of vertices S \subseteq V yields the sampled signal

f : S \to \mathbb{R}^{|S|}
Motivating examples
- Problem: infer the structure of the overall field from sparse noisy samples
- Vehicular networks: dynamics of the form \dot{x}(t) = A x(t)
- Social networks
Discrete Laplacian

L = \begin{bmatrix}
 1 & -1 &  0 & \cdots & \cdots & \cdots & 0 \\
-1 &  2 & -1 & 0      & \cdots & \cdots & 0 \\
 0 & -1 &  2 & -1     & 0      & \cdots & 0 \\
\cdots & \cdots & \cdots & \cdots & \cdots & \cdots & \cdots \\
 0 & \cdots & \cdots & 0 & -1 & 2 & -1 \\
 0 & \cdots & \cdots & \cdots & 0 & -1 & 1
\end{bmatrix}
(path graph)

Toeplitz circulant form

L = \begin{bmatrix}
 2 & -1 &  0 & \cdots & \cdots & \cdots & -1 \\
-1 &  2 & -1 & 0      & \cdots & \cdots & 0 \\
 0 & -1 &  2 & -1     & 0      & \cdots & 0 \\
\cdots & \cdots & \cdots & \cdots & \cdots & \cdots & \cdots \\
 0 & \cdots & \cdots & 0 & -1 & 2 & -1 \\
-1 & \cdots & \cdots & \cdots & 0 & -1 & 2
\end{bmatrix}
(ring graph)
GFT:

\hat{x} = U^T x

Inverse GFT:

x = U \hat{x}
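The GFT pair above can be checked numerically from the Laplacian eigendecomposition. The following is a minimal sketch on a toy path graph; the graph choice and sizes are illustrative assumptions, not from the slides.

```python
import numpy as np

# Laplacian of a path graph with N nodes (illustrative toy graph).
N = 8
A = np.zeros((N, N))
for i in range(N - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Eigendecomposition L = U diag(lam) U^T; eigh returns ascending eigenvalues.
lam, U = np.linalg.eigh(L)

# GFT and inverse GFT of a random graph signal.
rng = np.random.default_rng(0)
x = rng.standard_normal(N)
x_hat = U.T @ x          # GFT
x_rec = U @ x_hat        # inverse GFT

assert np.allclose(x_rec, x)  # U is orthonormal, so recovery is exact
```

Since L is symmetric, U is orthonormal and the inverse GFT recovers x exactly.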
• For eigenvalues \lambda far from zero (high frequencies), the associated complex exponential
eigenfunctions oscillate much more rapidly.
In the graph setting, the graph Laplacian eigenvalues and eigenvectors provide a
similar notion of frequency:
[Figure: Laplacian eigenvectors plotted on the graph; eigenvectors associated with larger eigenvalues oscillate more rapidly across the vertices.]
04/11/19 Sapienza University of Rome
Graph Fourier Transform
Total variation on graphs
The total variation of a vector x on a graph can be defined as

\mathrm{TV}_x := \|x'\| = (x^T L x)^{1/2}

Substituting the inverse GFT x = U\hat{x}, we get

\mathrm{TV}_x^2 = x^T L x = \hat{x}^T \Lambda \hat{x} = \sum_{i=1}^{N} \lambda_i |\hat{x}(i)|^2

from which

\sum_{i=1}^{N} \lambda_i |\hat{x}_i|^2 = \|x'\|^2 = \mathrm{TV}_x^2
\quad \Longrightarrow \quad
|\hat{x}(i)| \le \frac{\mathrm{TV}_x}{\sqrt{\lambda_i}}
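The identity TV_x^2 = x^T L x = \sum_i \lambda_i |\hat{x}(i)|^2 can be verified numerically; this sketch uses an assumed toy path graph, not one from the slides.

```python
import numpy as np

# Path-graph Laplacian (illustrative toy graph).
N = 8
A = np.zeros((N, N))
idx = np.arange(N - 1)
A[idx, idx + 1] = A[idx + 1, idx] = 1.0
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

rng = np.random.default_rng(1)
x = rng.standard_normal(N)
x_hat = U.T @ x

tv_sq_vertex = x @ L @ x              # x^T L x, vertex-domain expression
tv_sq_freq = np.sum(lam * x_hat**2)   # sum_i lambda_i |x_hat(i)|^2
assert np.isclose(tv_sq_vertex, tv_sq_freq)
```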
Example of GFT: Spectrum Cartography in Cognitive Radio
[Figure: measured PSD (dB) over the monitored area, and behavior of the squared GFT versus the eigenvector index.]
\hat{y} = \hat{H} \hat{x} = \mathrm{diag}(\hat{h})\, \hat{x}

y = U \hat{y}

In total:

y = U\, \mathrm{diag}(\hat{h})\, U^T x
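A graph filter of the form y = U diag(\hat{h}) U^T x can be sketched as follows; the ideal low-pass response and the toy ring graph are illustrative assumptions.

```python
import numpy as np

# Low-pass graph filter on a toy ring graph: keep the first K graph frequencies.
N = 10
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

K = 3
h_hat = np.zeros(N)
h_hat[:K] = 1.0                       # ideal low-pass frequency response

rng = np.random.default_rng(2)
x = rng.standard_normal(N)
y = U @ (np.diag(h_hat) @ (U.T @ x))  # y = U diag(h_hat) U^T x

# The output is band-limited: its GFT vanishes outside the first K frequencies.
y_hat = U.T @ y
assert np.allclose(y_hat[K:], 0.0, atol=1e-8)
```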
[Figures: spatial maps over longitude (deg) and latitude (deg).]
Continuous-time signals

Time spread:

T^2 = \frac{\int_{-\infty}^{\infty} (t - t_0)^2 |x(t)|^2 \, dt}{\int_{-\infty}^{\infty} |x(t)|^2 \, dt},
\qquad
t_0 = \frac{\int_{-\infty}^{\infty} t\, |x(t)|^2 \, dt}{\int_{-\infty}^{\infty} |x(t)|^2 \, dt}

Frequency spread:

F^2 = \frac{\int_{-\infty}^{\infty} (f - f_0)^2 |X(f)|^2 \, df}{\int_{-\infty}^{\infty} |X(f)|^2 \, df},
\qquad
f_0 = \frac{\int_{-\infty}^{\infty} f\, |X(f)|^2 \, df}{\int_{-\infty}^{\infty} |X(f)|^2 \, df}

Heisenberg's principle:

T \cdot F \ge \frac{1}{4\pi}

A signal perfectly localized in time cannot be perfectly localized in frequency,
and vice versa.
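The bound T·F ≥ 1/(4π) is attained by the Gaussian pulse, a standard fact that can be checked numerically. The sketch below assumes x(t) = exp(-π t²), whose Fourier transform is again exp(-π f²), so the frequency spread equals the time spread by symmetry.

```python
import numpy as np

# Numerical check of Heisenberg's bound for the Gaussian x(t) = exp(-pi t^2).
t = np.linspace(-10.0, 10.0, 200001)
dt = t[1] - t[0]
x2 = np.exp(-2.0 * np.pi * t**2)      # |x(t)|^2

# Riemann-sum approximations of the energy and the time spread.
energy = np.sum(x2) * dt
t0 = np.sum(t * x2) * dt / energy
T = np.sqrt(np.sum((t - t0) ** 2 * x2) * dt / energy)

F = T  # X(f) = exp(-pi f^2): identical spread in frequency by symmetry
assert T * F >= 1.0 / (4.0 * np.pi) - 1e-9   # Heisenberg bound
assert abs(T * F - 1.0 / (4.0 * np.pi)) < 1e-6  # the Gaussian attains it
```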
Define the time and frequency spreads as the widths of the intervals such that a
given percentage of the energy falls within them:

\frac{\int_{t_0 - T/2}^{t_0 + T/2} |x(t)|^2 \, dt}{\int_{-\infty}^{\infty} |x(t)|^2 \, dt} = \alpha^2,
\qquad
\frac{\int_{f_0 - W/2}^{f_0 + W/2} |X(f)|^2 \, df}{\int_{-\infty}^{\infty} |X(f)|^2 \, df} = \beta^2

Not all pairs (\alpha, \beta) are admissible. The aim of the uncertainty principle is
to find the region of all admissible pairs.
The band-limiting operator is B = U \Sigma U^T, where

\Sigma_{ii} = 1, \ \text{if } i \in \mathcal{F}; \qquad \Sigma_{ii} = 0, \ \text{if } i \notin \mathcal{F}

and U is the matrix whose columns are the eigenvectors of L.
The matrices D and B are symmetric and idempotent, and thus they represent
orthogonal projectors onto the sets S and F, respectively.
We denote by \bar{D} the projector onto the complement vertex set \bar{S} and, similarly, for
the frequency domain, by \bar{B} the projector onto the complement frequency set \bar{F}.
A band-limited signal satisfies Bx = x.

Theorem: A vector x is perfectly localized over both the vertex set S and the
frequency set F if and only if

\lambda_{\max}(BDB) = 1

In such a case, x is the eigenvector associated with the unit eigenvalue.
\psi_i = \arg\max_{\psi_i} \|D \psi_i\|
subject to \ \|\psi_i\| = 1, \quad \langle \psi_i, \psi_j \rangle = 0, \ j \ne i, \quad B\psi_i = \psi_i

This is equivalent to

\psi_i = \arg\max_{\psi_i} \|DB \psi_i\|
subject to \ \|\psi_i\| = 1, \quad \langle \psi_i, \psi_j \rangle = 0, \ j \ne i

The solution is given by the eigenvectors of the operator BDB, i.e.

BDB\, \psi_i = \sigma_i^2\, \psi_i

In particular, the band-limited signal maximally concentrated on the set S
is the eigenvector associated with the maximum eigenvalue of BDB.
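The maximally concentrated band-limited vector can be computed directly as the top eigenvector of BDB. This sketch assumes a toy path graph and illustrative sets S and F; none of these choices come from the slides.

```python
import numpy as np

# Toy path graph (illustrative).
N = 12
A = np.zeros((N, N))
i = np.arange(N - 1)
A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

S = [0, 1, 2, 3]                 # vertex-limiting set (assumed)
F = [0, 1, 2]                    # frequency-limiting set (assumed)
D = np.diag([1.0 if v in S else 0.0 for v in range(N)])
Sigma = np.diag([1.0 if k in F else 0.0 for k in range(N)])
B = U @ Sigma @ U.T              # band-limiting projector

# Top eigenvector of BDB = band-limited vector maximally concentrated on S.
w, V = np.linalg.eigh(B @ D @ B)
psi = V[:, -1]

assert np.allclose(B @ psi, psi, atol=1e-6)   # psi is band-limited
assert np.isclose(psi @ D @ psi, w[-1])       # its energy on S equals lambda_max
```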
Uncertainty principle

Define the concentration measures

\frac{\|Dx\|^2}{\|x\|^2} = \alpha^2, \qquad \frac{\|Bx\|^2}{\|x\|^2} = \beta^2

Theorem: The only admissible concentration pairs (\alpha, \beta) belong to the region
specified by the following inequalities:

\cos^{-1}\alpha + \cos^{-1}\beta \ \ge\ \cos^{-1}\sigma_{\max}(BD)

\cos^{-1}\sqrt{1-\alpha^2} + \cos^{-1}\beta \ \ge\ \cos^{-1}\sigma_{\max}(B\bar{D})

\cos^{-1}\alpha + \cos^{-1}\sqrt{1-\beta^2} \ \ge\ \cos^{-1}\sigma_{\max}(\bar{B}D)

\cos^{-1}\sqrt{1-\alpha^2} + \cos^{-1}\sqrt{1-\beta^2} \ \ge\ \cos^{-1}\sigma_{\max}(\bar{B}\bar{D})
Admissible region

[Figure: admissible region in the (\alpha, \beta) plane; the boundary curves are determined by \sigma_{\max}(BD), \sigma_{\max}(B\bar{D}), \sigma_{\max}(\bar{B}D), \sigma_{\max}(\bar{B}\bar{D}).]

(a): \cos^{-1}\alpha + \cos^{-1}\beta = \cos^{-1}\sigma_{\max}(BD)

Note: the upper-right curve can shrink to the point (1, 1) in case of perfect
localization.
Sampling Theorem

Given a band-limited signal s, i.e. Bs = s, it is possible to recover s from its
sampled version r = Ds if and only if

\|\bar{D}B\| < 1

Proof:
(if part)
Denote by \hat{s} = Qr the reconstructed signal. The reconstruction error is zero if

s - Qr = s - Q(I - \bar{D})s = s - Q(I - \bar{D}B)s = 0

If \|\bar{D}B\| < 1, the matrix I - \bar{D}B is invertible; choosing Q = (I - \bar{D}B)^{-1},
the error is zero.

(only if part)
If \|\bar{D}B\| = 1, there are band-limited signals that are perfectly localized
over the complement set \bar{S}; hence the corresponding samples over S would
be null, so that no reconstruction would be possible.
Processing over Graphs
Reconstruction Algorithm #1: Alternating Projection

Note:

R := (I - \bar{D}B)^{-1} = \sum_{n=0}^{\infty} (\bar{D}B)^n

Iterative algorithm:

r^{(0)} = r
r^{(1)} = r + \bar{D}B\, r^{(0)}
\ldots
r^{(k)} = r + \bar{D}B\, r^{(k-1)}
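The iteration r(k) = r + D̄B r(k-1) can be sketched end-to-end. This is a minimal example on an assumed toy ring graph; the sets S and F are illustrative choices that satisfy the sampling condition, which the code checks explicitly.

```python
import numpy as np

# Toy ring graph (illustrative).
N = 20
A = np.zeros((N, N))
for v in range(N):
    A[v, (v + 1) % N] = A[(v + 1) % N, v] = 1.0
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

F = list(range(5))                  # signal bandwidth |F| = 5
d = np.ones(N)
d[[5, 9, 13]] = 0.0                 # unobserved vertices (complement of S)
D = np.diag(d)
Dbar = np.eye(N) - D
B = U[:, F] @ U[:, F].T             # band-limiting projector

# Sampling condition of the theorem: ||Dbar B|| < 1.
assert np.linalg.norm(Dbar @ B, 2) < 1.0

rng = np.random.default_rng(3)
s = U[:, F] @ rng.standard_normal(len(F))   # band-limited ground truth
r = D @ s                                   # observed samples

x = r.copy()
for _ in range(5000):                       # r(k) = r + Dbar B r(k-1)
    x = r + Dbar @ (B @ x)

assert np.allclose(x, s, atol=1e-6)         # exact recovery in the limit
```

The fixed point of the iteration is (I - D̄B)^{-1} r, which equals s for any band-limited s, since (I - D̄B)s = Ds = r.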
Example of Signal Reconstruction
[Figure: NMSE (dB) versus iteration index for |S| = 10, 20, 30, with bandwidth |F| = 10 and N = 50.]
Reconstruction Algorithm #2

\hat{s} = \sum_{i=1}^{|F|} \frac{1}{\sigma_i^2} \langle Dr, \psi_i \rangle\, \psi_i

where the \psi_i are solutions of

BDB\, \psi_i = \sigma_i^2\, \psi_i
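This closed-form reconstruction can be sketched on the same kind of toy setup: expand the samples on the eigenvectors of BDB with positive eigenvalues and rescale by 1/σ_i². Graph, S, and F below are illustrative assumptions.

```python
import numpy as np

# Toy ring graph (illustrative).
N = 20
A = np.zeros((N, N))
for v in range(N):
    A[v, (v + 1) % N] = A[(v + 1) % N, v] = 1.0
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

F = list(range(5))
d = np.ones(N)
d[[5, 9, 13]] = 0.0                  # complement of the sampling set S
D = np.diag(d)
B = U[:, F] @ U[:, F].T

# Eigenvectors of BDB; keep the |F| largest eigenvalues sigma_i^2 > 0.
w, V = np.linalg.eigh(B @ D @ B)
sig2, Psi = w[-len(F):], V[:, -len(F):]

rng = np.random.default_rng(4)
s = U[:, F] @ rng.standard_normal(len(F))   # band-limited ground truth
r = D @ s                                   # observed samples

# s_hat = sum_i (1/sigma_i^2) <D r, psi_i> psi_i
s_hat = Psi @ ((Psi.T @ (D @ r)) / sig2)
assert np.allclose(s_hat, s, atol=1e-8)
```

The formula works because <Ds, ψ_i> = σ_i² c_i when s = Σ c_i ψ_i, so dividing by σ_i² recovers the expansion coefficients.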
Goal: select the sampling set in order to minimize the mean square error.

[Figure: NMSE (dB) versus number of samples for different sampling-set selection strategies: Random, MaxFro, MinUniSet, MaxSigMin, MaxVol, MinPinv, Exhaustive.]
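One common family of sampling-set selection strategies is greedy: add one node at a time, each time optimizing a conditioning measure of the sampled eigenvector matrix. The sketch below greedily maximizes the minimum singular value of the selected rows of U_F; the graph, bandwidth, and scoring rule are illustrative assumptions, not necessarily the exact strategies compared on the slide.

```python
import numpy as np

def greedy_sampling(U_F, num_samples):
    """Greedily pick rows of U_F that maximize sigma_min of the sampled matrix."""
    N = U_F.shape[0]
    S = []
    for _ in range(num_samples):
        best_node, best_score = None, -np.inf
        for v in range(N):
            if v in S:
                continue
            M = U_F[S + [v], :]
            score = np.linalg.svd(M, compute_uv=False)[-1]  # sigma_min
            if score > best_score:
                best_node, best_score = v, score
        S.append(best_node)
    return sorted(S)

# Toy ring graph, bandwidth 4 (illustrative).
N = 16
A = np.zeros((N, N))
for v in range(N):
    A[v, (v + 1) % N] = A[(v + 1) % N, v] = 1.0
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)
U_F = U[:, :4]

S = greedy_sampling(U_F, 6)
# With sigma_min(D U_F) > 0, recovery of band-limited signals is possible.
assert np.linalg.svd(U_F[S, :], compute_uv=False)[-1] > 0
```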
Vehicular Network in Manhattan
[Figure: Graph Fourier Transform of the signal versus the graph frequencies, and Normalized Mean Square Deviation; bandwidth used for processing: |F| = 40.]
x̂ = U (DU)† y
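The pseudo-inverse recovery above can be sketched in a few lines: for a band-limited signal, (DU)† reduces to the pseudo-inverse of the sampled rows of U_F. The toy ring graph and the sets below are illustrative assumptions.

```python
import numpy as np

# Toy ring graph (illustrative).
N = 16
A = np.zeros((N, N))
for v in range(N):
    A[v, (v + 1) % N] = A[(v + 1) % N, v] = 1.0
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

F = list(range(4))
S = [0, 2, 5, 8, 11, 14]            # sampling set, |S| > |F|
U_F = U[:, F]

rng = np.random.default_rng(5)
x = U_F @ rng.standard_normal(len(F))   # band-limited signal
y = x[S]                                # observed samples

# Least-squares recovery: x = U_F (D U_F)^+ y, here via the sampled rows of U_F.
x_rec = U_F @ (np.linalg.pinv(U_F[S, :]) @ y)
assert np.allclose(x_rec, x, atol=1e-8)
```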
[Figure: reconstructed map over longitude and latitude (deg), and NMSE (dB) versus the cardinality of F.]
Question: Is perfect recovery possible in the presence of (very large) spiky noise?

Numerical results
[Figure: MSE (dB) versus the number of noisy samples |S|, for |F| = 5, 10, 30.]

Answer: Yes, provided that the number of samples is sufficiently larger than the
number of noisy samples.
Consensus algorithms
Consensus may be achieved by minimizing the disagreement

J(x) = \frac{1}{4} \sum_{i=1}^{N} \sum_{j \in \mathcal{N}_i} a_{ij} (x_i - x_j)^2 = \frac{1}{2}\, x^T L x

with a_{ij} \ge 0, where \mathcal{N}_i is the set of neighbors of node i.
Choosing the step size \epsilon < \frac{2}{\lambda_{\max}(L)}, the gradient iteration
x[k+1] = x[k] - \epsilon\, L x[k] = (I - \epsilon L)\, x[k] converges.
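The step-size condition can be checked by running the iteration x[k+1] = (I - εL) x[k] on a small undirected graph; for a connected symmetric graph the state converges to the average of the initial values. The toy graph below is an illustrative assumption.

```python
import numpy as np

# Toy connected graph: a 10-node ring plus one chord (illustrative).
N = 10
A = np.zeros((N, N))
for v in range(N):
    A[v, (v + 1) % N] = A[(v + 1) % N, v] = 1.0
A[0, 5] = A[5, 0] = 1.0
L = np.diag(A.sum(axis=1)) - A

eps = 1.0 / np.linalg.eigvalsh(L)[-1]     # safely below 2 / lambda_max(L)

rng = np.random.default_rng(6)
x = rng.standard_normal(N)
avg = x.mean()
for _ in range(2000):                     # x[k+1] = x[k] - eps * L x[k]
    x = x - eps * (L @ x)

assert np.allclose(x, avg, atol=1e-6)     # consensus on the initial average
```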
A digraph is strongly connected (SC) if, for every pair of nodes vi and vj,
there exists a strong path from vi to vj and vice versa.
A digraph is quasi-strongly connected (QSC) if, for every pair of nodes vi and vj ,
there exists a third node vr that can reach both nodes by a strong path
A digraph is weakly connected (WC) if any pair of distinct nodes can be joined
by a weak path
The possible forms of consensus depend on the network topology:

\gamma_i > 0, \ \text{iff } v_i \in V_i; \qquad \gamma_i = 0, \ \text{otherwise}

The zero eigenvalue of the Laplacian is simple if and only if the graph is QSC.