Graph Signal Processing

1. Signals on graphs are defined as mappings from the vertices of a graph to real values. Motivating examples include sensor networks, vehicular networks, gene regulatory networks, and social networks.
2. The Graph Fourier Transform projects a graph signal onto the eigenvectors of the graph Laplacian matrix. It provides a notion of frequency for graph signals similar to classical Fourier analysis.
3. The eigenvalues of the graph Laplacian carry information about the smoothness of the corresponding eigenvectors across the graph, with lower eigenvalues associated with smoother eigenvectors.


Summary

1.  Signals on Graphs

2.  The Graph Fourier Transform

3.  Localization Properties and Uncertainty Principle

4.  Sampling

04/11/19 Sapienza University of Rome 1


Processing over Graphs

Definition
Given a graph G(V, E), a graph signal is defined as a mapping from the set of vertices to a real vector with dimension equal to the cardinality of V:

f : V → ℝ^|V|

More generally, if we indicate by S the ensemble of subsets of V (not only pairs of vertices), we can define a hypergraph H(V, S) and a hypergraph signal as the mapping

f : S → ℝ^|S|

Example: in unsupervised learning, we have signals defined over the edges of a graph.


Processing over Graphs

Motivating examples

- Wireless sensor networks:
  - the vertices are sensors
  - the adjacency matrix describes the communication links among sensors
  - the signal is the observation collected by the sensors

Problem: infer the structure of the overall field from sparse noisy samples


Processing over Graphs

Motivating examples

- Vehicular networks:
  - the vertices are cars
  - the graph is dictated by the topology of the streets
  - the signal is the velocity associated with each car

Problem: infer/predict the traffic flow from sparse noisy samples


Processing over Graphs

Motivating examples

- Gene regulatory networks:
  - the vertices are proteins, enzymes, …
  - the adjacency matrix describes the interaction pattern
  - the signal is the concentration of a protein

ẋ(t) = A x(t)

Problem: infer the structure of the (sparse) matrix A from the evolution of the concentrations over time


Processing over Graphs

Motivating examples

- Social networks:
  - the vertices are people
  - the adjacency matrix describes social relationships
  - the signal is an opinion, e.g. the political orientation

Problem: infer the overall political orientation from sparse samples


Summary – Day 3

1.  Signals on Graphs

2.  The Graph Fourier Transform

3.  Localization Properties and Uncertainty Principle

4.  Sampling



Processing over Graphs

Graph Fourier Transform

Fourier: heat equation / diffusion equation

Continuous form (1D):

∂f(x; t)/∂t = D ∂²f(x; t)/∂x²

Discrete form:

fᵢ[n+1] − fᵢ[n] = ε (fᵢ₋₁[n] − 2 fᵢ[n] + fᵢ₊₁[n])

or, in vector form,

f[n+1] = f[n] − ε L f[n] = (I − ε L) f[n]
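As a quick numerical illustration of the discrete diffusion update above, here is a minimal numpy sketch; the path graph, the step size ε = 0.25, and the initial heat spike are assumptions made for the example:

```python
import numpy as np

def path_laplacian(n):
    """Laplacian L = D - A of a path graph with n vertices."""
    A = np.zeros((n, n))
    idx = np.arange(n - 1)
    A[idx, idx + 1] = 1.0
    A[idx + 1, idx] = 1.0
    return np.diag(A.sum(axis=1)) - A

n, eps = 20, 0.25          # eps below 2/lambda_max(L) for stability
L = path_laplacian(n)
f = np.zeros(n)
f[n // 2] = 1.0            # initial heat spike at the middle vertex
for _ in range(100):
    f = f - eps * (L @ f)  # f[n+1] = (I - eps*L) f[n]
```

Because the rows of L sum to zero, each step conserves the total heat while spreading it across neighboring vertices.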


Processing over Graphs

Discrete Laplacian (path graph):

L = ⎡  1 −1  0  ⋯  0 ⎤
    ⎢ −1  2 −1  ⋯  0 ⎥
    ⎢  ⋮        ⋱  ⋮ ⎥
    ⎢  0  ⋯ −1  2 −1 ⎥
    ⎣  0  ⋯  0 −1  1 ⎦

Toeplitz circulant form (ring graph):

L = ⎡  2 −1  0  ⋯ −1 ⎤
    ⎢ −1  2 −1  ⋯  0 ⎥
    ⎢  ⋮        ⋱  ⋮ ⎥
    ⎢  0  ⋯ −1  2 −1 ⎥
    ⎣ −1  ⋯  0 −1  2 ⎦


Processing over Graphs

A Toeplitz circulant matrix is diagonalized by the Fourier basis:

L = W Λ Wᴴ = ∑ᵢ₌₁ᴺ λᵢ wᵢ wᵢᴴ

where

W_kℓ = (1/√N) e^{j2πkℓ/N},  k, ℓ = 0, …, N−1

A Toeplitz circulant Laplacian matrix is diagonalized by the Fourier basis, and its eigenvalues are

λₘ = ∑ₙ₌₀^{N−1} L₁ₙ e^{−j2πmn/N}

The DFT is thus the projection onto the eigenvectors of (the Laplacian of) a circular graph.


Processing over Graphs

Graph Fourier Transform

L = U Λ Uᵀ = ∑ᵢ₌₁ᴺ λᵢ uᵢ uᵢᵀ

GFT: projection onto the eigenvectors of the graph Laplacian

x̂ = Uᵀ x

Inverse GFT:

x = U x̂

Note: if L is Toeplitz circulant, the GFT coincides with the standard DFT.
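As a minimal sketch, the GFT can be computed from the eigendecomposition of the Laplacian; the 4-node adjacency matrix and the test signal below are assumptions made for the example:

```python
import numpy as np

# Adjacency of a small undirected graph (an assumption made for this example)
A = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])
L = np.diag(A.sum(axis=1)) - A       # combinatorial Laplacian L = D - A

lam, U = np.linalg.eigh(L)           # eigenvalues ascending, U orthonormal
x = np.array([1.0, 2.0, 0.5, -1.0])  # a graph signal

x_hat = U.T @ x                      # GFT
x_rec = U @ x_hat                    # inverse GFT recovers x
```

Since U is orthonormal, the transform is perfectly invertible, and for a connected graph the smallest eigenvalue is 0 with a constant eigenvector.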


Processing over Graphs

In classical Fourier analysis, the eigenvalues carry a specific notion of frequency:

- for eigenvalues close to zero (low frequencies), the associated complex exponential eigenfunctions are smooth, slowly oscillating functions
- for eigenvalues far from zero (high frequencies), the associated complex exponential eigenfunctions oscillate much more rapidly

In the graph setting, the graph Laplacian eigenvalues and eigenvectors provide a similar notion of frequency:

- the graph Laplacian eigenvectors associated with low frequencies vary slowly across the graph
- the eigenvectors associated with larger eigenvalues oscillate more rapidly


Graph Fourier Transform
Can you hear the sound of a graph? Nodal modes: sign(u₂), sign(u₃), …, sign(u₉)

[Figure: eight panels showing the sign patterns of the Laplacian eigenvectors u₂, …, u₉ over a graph embedded in the plane]
Graph Fourier Transform
Total variation on graphs

The total variation of a vector x on a graph can be defined as

TVₓ := (xᵀ L x)^{1/2}

Substituting the inverse GFT x = U x̂, we get

TVₓ² = xᵀ L x = x̂ᵀ Λ x̂ = ∑ᵢ₌₁ᴺ λᵢ |x̂(i)|²

from which

λᵢ |x̂(i)|² ≤ ∑ᵢ₌₁ᴺ λᵢ |x̂(i)|² = TVₓ²

and hence

|x̂(i)| ≤ TVₓ / √λᵢ
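A quick numerical check of the definition; the small path graph and test signals are assumptions made for the example:

```python
import numpy as np

def total_variation(x, L):
    """TV_x = (x^T L x)^(1/2), a smoothness measure for graph signals."""
    return float(np.sqrt(x @ L @ x))

# Path graph on 4 vertices (an assumption made for this example)
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
L = np.diag(A.sum(axis=1)) - A

smooth = np.ones(4)                    # constant signal: zero variation
rough = np.array([1., -1., 1., -1.])   # alternating signal: each edge contributes (2)^2
```

Here xᵀLx equals the sum over edges of (xᵢ − xⱼ)², so the constant signal has TV = 0 while the alternating signal accumulates 4 per edge.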


Processing over Graphs

Example of GFT: spectrum cartography in cognitive radio

[Figure: measured power spectral density map (PSD, in dB) over a two-dimensional area (left), and the behavior of the squared GFT coefficients versus the eigenvector index (right)]


Processing over Graphs

Filtering graph signals

1. Apply the GFT:
   x̂ = Uᵀ x

2. Filter in the frequency domain:
   ŷ = diag(ĥ) x̂

3. Apply the inverse GFT:
   y = U ŷ

In total:
   y = U diag(ĥ) Uᵀ x
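The three steps above can be sketched in one function; the ring graph, the random input, and the ideal low-pass response are assumptions made for the example:

```python
import numpy as np

def graph_filter(x, U, h_hat):
    """y = U diag(h_hat) U^T x: filtering in the graph frequency domain."""
    return U @ (h_hat * (U.T @ x))

# Ring graph with 6 vertices (an assumption made for this example)
N = 6
A = np.roll(np.eye(N), 1, axis=1) + np.roll(np.eye(N), -1, axis=1)
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

x = np.random.default_rng(0).standard_normal(N)
h_hat = (lam < 0.5).astype(float)   # ideal low-pass: keep only the zero eigenvalue
y = graph_filter(x, U, h_hat)
```

With a 0/1 frequency response, the filter is an orthogonal projector (applying it twice changes nothing), and keeping only the zero eigenvalue projects every vertex onto the mean of the signal.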


Graph Fourier Transform
Examples of GFT

[Figure: a temperature field plotted versus longitude and latitude in degrees (left), and its GFT coefficients versus the eigenvector index (right)]


Graph Fourier Transform
Examples of GFT

[Figure: the original temperature field (left) and a noisy version of the same field (right), both plotted versus longitude and latitude in degrees]


Graph Fourier Transform
Examples of GFT

[Figure: the noisy temperature field (left) and its reconstruction using only 2 eigenvectors (right)]


Graph Fourier Transform
Examples of GFT

[Figure: the noisy temperature field (left) and its reconstruction using 24 eigenvectors (right)]
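The denoising idea behind these reconstructions (keep only the first K GFT coefficients) can be sketched as follows; the ring graph, the band-limited "field", and the noise level are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Ring graph with 50 vertices (an assumption made for this example)
N = 50
A = np.roll(np.eye(N), 1, axis=1) + np.roll(np.eye(N), -1, axis=1)
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

s = U[:, :3] @ np.array([5.0, 2.0, 1.0])   # smooth (band-limited) field
x = s + 0.3 * rng.standard_normal(N)       # noisy observation

def truncate_gft(x, U, K):
    """Keep only the K lowest-frequency GFT coefficients."""
    x_hat = U.T @ x
    x_hat[K:] = 0.0
    return U @ x_hat

err_band = np.linalg.norm(truncate_gft(x, U, 3) - s)   # keep the signal band only
err_full = np.linalg.norm(x - s)                       # no truncation
```

Truncation discards the noise lying outside the signal band, which is why a small number of eigenvectors can denoise a smooth field.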


Summary – Day 3

1.  Signals on Graphs

2.  The Graph Fourier Transform

3.  Localization Properties and Uncertainty Principle

4.  Sampling



Processing over Graphs

Uncertainty principle – A review

Continuous-time signals

Time spread:

ΔT² = ∫₋∞^∞ (t − t₀)² |x(t)|² dt / ∫₋∞^∞ |x(t)|² dt,  with  t₀ = ∫₋∞^∞ t |x(t)|² dt / ∫₋∞^∞ |x(t)|² dt

Frequency spread:

ΔF² = ∫₋∞^∞ (f − f₀)² |X(f)|² df / ∫₋∞^∞ |X(f)|² df,  with  f₀ = ∫₋∞^∞ f |X(f)|² df / ∫₋∞^∞ |X(f)|² df

Heisenberg's principle:

ΔT · ΔF ≥ 1/(4π)

A signal perfectly localized in time cannot be perfectly localized in frequency, and vice versa.


Processing over Graphs

Uncertainty principle – An alternative approach (Slepian, Landau, Pollak)

Define the time and frequency spreads as the lengths of the intervals within which a given percentage of the energy falls:

∫_{t₀−T/2}^{t₀+T/2} |x(t)|² dt / ∫₋∞^∞ |x(t)|² dt = α²,   ∫_{f₀−W/2}^{f₀+W/2} |X(f)|² df / ∫₋∞^∞ |X(f)|² df = β²

Not all pairs (α, β) are admissible. The aim of the uncertainty principle is to characterize the region of admissible pairs.


Processing over Graphs

Localization operators for graph signals

Given a graph G(V, E) and a subset of vertices S ⊆ V, we define a vertex-limiting operator as

D = diag(d₁₁, …, d_NN)

where dᵢᵢ = 1 if i ∈ S and dᵢᵢ = 0 if i ∉ S, and a band-limiting operator as

B = U Σ Uᵀ

where Σ is a diagonal matrix selecting the frequency indices, i.e.

Σᵢᵢ = 1 if i ∈ F,  Σᵢᵢ = 0 if i ∉ F

and U is the matrix whose columns are the eigenvectors of L.
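The two operators can be built directly from the definitions; the ring graph and the index sets S and F are assumptions made for the example:

```python
import numpy as np

# Ring graph with 8 vertices (an assumption made for this example)
N = 8
A = np.roll(np.eye(N), 1, axis=1) + np.roll(np.eye(N), -1, axis=1)
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

S = {0, 1, 2}    # vertex subset
F = {0, 1}       # frequency subset

D = np.diag([1.0 if i in S else 0.0 for i in range(N)])      # vertex-limiting
Sigma = np.diag([1.0 if i in F else 0.0 for i in range(N)])
B = U @ Sigma @ U.T                                          # band-limiting
```

Both matrices are symmetric and idempotent, i.e. orthogonal projectors, which is the key property used in the following slides.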


Processing over Graphs

Localization operators for graph signals

The matrices D and B are symmetric and idempotent, and therefore represent orthogonal projectors onto the sets S and F, respectively.

We denote by D̄ = I − D the projector onto the complement set S̄ and, similarly for the frequency domain, by B̄ = I − B the projector onto the complement set F̄.


Processing over Graphs

Perfect localization conditions

A vector x is perfectly localized over the vertex subset S if

D x = x

and perfectly band-limited over F if

B x = x

Theorem: A vector x is perfectly localized over both the vertex set S and the frequency set F if and only if

λ_max(B D B) = 1

In such a case, x is the eigenvector associated with the unit eigenvalue.

Equivalently, ‖BD‖ = 1 and ‖DB‖ = 1.


Processing over Graphs

Maximally localized signals

ψᵢ = arg max_ψ ‖D ψ‖
subject to  ‖ψ‖ = 1,  ⟨ψ, ψⱼ⟩ = 0 for j ≠ i,  B ψ = ψ

This is equivalent to

ψᵢ = arg max_ψ ‖D B ψ‖
subject to  ‖ψ‖ = 1,  ⟨ψ, ψⱼ⟩ = 0 for j ≠ i


Processing over Graphs

Maximally localized signals

The solutions are the eigenvectors of the operator B D B, i.e.

B D B ψᵢ = σᵢ² ψᵢ

In particular, the band-limited signal maximally concentrated on the vertex set S is the eigenvector associated with the maximum eigenvalue of B D B.

Equivalently, the (vertex-)limited signal maximally concentrated over a band F is the eigenvector associated with the maximum eigenvalue of D B D.


Processing over Graphs

Uncertainty principle

Define the concentration measures

‖Dx‖² / ‖x‖² = α²,  ‖Bx‖² / ‖x‖² = β²

Theorem: The only admissible concentration pairs (α, β) belong to the region specified by the following inequalities:

cos⁻¹ α + cos⁻¹ β ≥ cos⁻¹ σ_max(B D)
cos⁻¹ √(1 − α²) + cos⁻¹ β ≥ cos⁻¹ σ_max(B D̄)
cos⁻¹ α + cos⁻¹ √(1 − β²) ≥ cos⁻¹ σ_max(B̄ D)
cos⁻¹ √(1 − α²) + cos⁻¹ √(1 − β²) ≥ cos⁻¹ σ_max(B̄ D̄)


Processing over Graphs

Uncertainty principle

[Figure: the admissible region in the (α, β) plane; its boundary involves σ_max(B̄ D̄), σ_max(B D̄), and σ_max(B̄ D), and the upper-right boundary is the curve (a): cos⁻¹ α + cos⁻¹ β = cos⁻¹ σ_max(B D)]

Note: the upper-right curve can shrink to the point (1, 1) in the case of perfect localization.


Summary – Day 3

1.  Signals on Graphs

2.  The Graph Fourier Transform

3.  Localization Properties and Uncertainty Principle

4.  Sampling



Processing over Graphs

Sampling Theorem

Let us denote by r = D s the sampled signal.

Theorem: Given a band-limited signal s, i.e. satisfying

B s = s

it is possible to recover s from its sampled version r if and only if

‖D̄ B‖ < 1
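The sampling condition can be checked numerically for a given graph, band, and sampling set; the ring graph and the band of the 3 lowest frequencies are assumptions made for the example:

```python
import numpy as np

# Ring graph with 20 vertices; band of the 3 lowest frequencies (assumptions)
N = 20
A = np.roll(np.eye(N), 1, axis=1) + np.roll(np.eye(N), -1, axis=1)
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)
B = U[:, :3] @ U[:, :3].T           # band-limiting projector, |F| = 3

def recoverable(sample_set):
    """Sampling theorem check: || D_bar B ||_2 < 1."""
    D_bar = np.diag([0.0 if i in sample_set else 1.0 for i in range(N)])
    return np.linalg.norm(D_bar @ B, 2) < 1.0
```

For instance, sampling every other vertex of the ring satisfies the condition, since no nonzero signal in this band can vanish on all of those vertices.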


Processing over Graphs


Proof:

(If part)

Denote by ŝ = Q r the reconstructed signal. Since B s = s, we have r = D s = (I − D̄) s and D̄ s = D̄ B s, so the reconstruction error is zero if

s − Q r = s − Q (I − D̄) s = s − Q (I − D̄ B) s = 0

If ‖D̄ B‖ < 1, the matrix I − D̄ B is invertible; choosing Q = (I − D̄ B)⁻¹, the error is zero.

(Only if part)

If ‖D̄ B‖ = 1, there are band-limited signals that are perfectly localized over the complement set S̄; the corresponding samples over S would be null, so that no reconstruction would be possible.

Processing over Graphs


Reconstruction Algorithm #1: Alternating Projection

Note:

Q = (I − D̄ B)⁻¹ = ∑ₙ₌₀^∞ (D̄ B)ⁿ

Iterative algorithm:

r⁽⁰⁾ = r
r⁽¹⁾ = r + D̄ B r⁽⁰⁾
⋮
r⁽ᵏ⁾ = r + D̄ B r⁽ᵏ⁻¹⁾
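The iteration above can be sketched directly; the ring graph, the band of 3 frequencies, and the choice of sampling half of the vertices are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(2)

# Ring graph, |F| = 3 lowest frequencies, half of the vertices sampled (assumptions)
N = 16
A = np.roll(np.eye(N), 1, axis=1) + np.roll(np.eye(N), -1, axis=1)
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)
B = U[:, :3] @ U[:, :3].T                      # band-limiting projector

s = B @ rng.standard_normal(N)                 # band-limited signal
D = np.diag([1.0 if i < 8 else 0.0 for i in range(N)])
D_bar = np.eye(N) - D
r = D @ s                                      # observed samples

x = r.copy()
for _ in range(1000):
    x = r + D_bar @ (B @ x)                    # r(k) = r + D_bar B r(k-1)
```

Each pass contracts the reconstruction error by a factor ‖D̄B‖, so convergence is geometric whenever the sampling condition holds.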


Processing over Graphs


Example of Signal Reconstruction

[Figure: NMSE (dB) versus iteration index for the alternating-projection algorithm on a graph with N = 50 nodes, bandwidth |F| = 10, and sampling sets of size |S| = 10, 20, 30]


Processing over Graphs


Reconstruction Algorithm #2

ŝ = ∑ᵢ₌₁^{|F|} (1/σᵢ²) ⟨D r, ψᵢ⟩ ψᵢ

where the ψᵢ are the solutions of

B D B ψᵢ = σᵢ² ψᵢ

Note: this algorithm works with a finite number of steps.


Processing over Graphs

Reconstruction from noisy samples

Observation model: r = D (s + n)

Mean square error (using Reconstruction Algorithm #2):

MSE = ∑ᵢ₌₁^{|F|} (σₙ²/σᵢ⁴) trace(ψᵢᵀ D ψᵢ) = σₙ² ∑ᵢ₌₁^{|F|} 1/σᵢ²

where σₙ² is the noise variance and the σᵢ² are the eigenvalues of B D B.

Goal: select the sampling set so as to minimize the mean square error.
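One simple greedy heuristic for this goal adds, at each step, the vertex that most reduces ∑ᵢ 1/σᵢ²; this is a sketch under assumed choices (ring graph, band of 3 frequencies, a small regularizer for the singular start-up steps), not necessarily one of the exact strategies compared next:

```python
import numpy as np

def greedy_sampling(U_F, M, eps=1e-8):
    """Greedily pick M vertices, minimizing sum_i 1/sigma_i^2 (regularized),
    where the sigma_i^2 are the eigenvalues of U_F^T D U_F (i.e. of B D B)."""
    N = U_F.shape[0]
    S = []
    for _ in range(M):
        costs = np.full(N, np.inf)
        for v in range(N):
            if v in S:
                continue
            rows = U_F[S + [v], :]
            sig2 = np.linalg.eigvalsh(rows.T @ rows)
            costs[v] = np.sum(1.0 / (sig2 + eps))
        S.append(int(np.argmin(costs)))
    return S

# Ring graph with 12 vertices, band of the 3 lowest frequencies (assumptions)
N = 12
A = np.roll(np.eye(N), 1, axis=1) + np.roll(np.eye(N), -1, axis=1)
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

S = greedy_sampling(U[:, :3], M=4)
sig2 = np.linalg.eigvalsh(U[S, :3].T @ U[S, :3])  # spectrum of U_F^T D U_F
```

Once |S| ≥ |F| and the selected rows of U_F are well conditioned, all σᵢ² are bounded away from zero and the MSE stays small.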


Processing over Graphs

Sampling Strategies

[Figures illustrating alternative sampling strategies]


Sampling strategies

Not all samples are equal …

Example: IEEE 118 Bus (section of the US power grid)

[Figure: the corresponding graph]


Sampling strategies
Not all samples are equal: comparison of alternative sampling strategies

Example: average results on scale-free graphs

[Figure: NMSE (dB) versus number of samples for the strategies Random, MaxFro, MinUniSet, MaxSigMin, MaxVol, and MinPinv (plus exhaustive search in the small case); left panel: N = 30, |F| = 5; right panel: N = 200, |F| = 10]


Sampling strategies

Application: traffic prediction from smart sampling

[Figure: a vehicular network in Manhattan; the GFT of the traffic signal versus the graph frequencies; and the normalized mean square deviation versus the bandwidth used for processing, with |F| = 40]


Sampling strategies

Application: cartography of an e.m. field from sparse measurements

Approach:

Build the similarity matrix

Aᵢⱼ = e^{−(Eᵢ − Eⱼ)²/(2σ²)} · I(‖rᵢ − rⱼ‖ ≤ r₀)

Build the dictionary

L = U Λ Uᵀ

Use the signal model

y = D x = D U s

Reconstruct the overall field from the sparse measurements

x̂ = U (D U)† y


Sampling strategies

Application: cartography of an e.m. field from sparse measurements

[Figure: the reconstructed field]


Sampling

Reconstruction of non-band-limited signals

[Figure: a temperature field plotted versus longitude and latitude in degrees (left); NMSE (dB) versus the cardinality of F (right) for random sampling and three greedy strategies: Max σ_min, Min ∑ᵢ 1/σᵢ², and Max norm of vector products]


Sampling
Perfect reconstruction from sparse noisy data

Observation model: r = s + D n

If the positions of the noisy samples are known, perfect reconstruction is possible under the sampling theorem conditions.

If the positions of the noisy samples are unknown, perfect reconstruction is still possible using ℓ₁-norm reconstruction, i.e. minimizing

ŝ = arg min_{s′ ∈ B} ‖r − s′‖₁

In this case, perfect reconstruction is still possible if

|S| |F| < 1/(2α²),  with  α := max_{i ∈ V, j ∈ F} |uⱼ(i)|


Recovery from sparse noisy samples

Question: Is perfect recovery possible in the presence of (very large) spiky noise?

Numerical results

[Figure: MSE (dB) versus the number of noisy samples |S|, for bandwidths |F| = 5, 10, 30]

Answer: Yes, provided that the number of samples is sufficiently larger than the number of noisy samples.


Processing over Graphs

Consensus algorithms

Consensus may be achieved by minimizing the disagreement

J(x) = ½ ∑ᵢ₌₁ᴺ ∑_{j ∈ Nᵢ} aᵢⱼ (xᵢ − xⱼ)² = xᵀ L x

with aᵢⱼ ≥ 0 and Nᵢ the set of neighbors of node i.

Implementation via steepest descent (absorbing the constant factor of the gradient into the step size ε):

x[k+1] = x[k] − ε L x[k] := W x[k]

where x[0] = x₀ and W = I − ε L, or, in per-node form,

xᵢ[k+1] = xᵢ[k] + ε ∑_{j ∈ Nᵢ} aᵢⱼ (xⱼ[k] − xᵢ[k])
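The iteration can be simulated in a few lines; the 5-node ring, the step size, and the initial values are assumptions made for the example:

```python
import numpy as np

# Ring of 5 nodes (an assumption made for this example)
N = 5
A = np.roll(np.eye(N), 1, axis=1) + np.roll(np.eye(N), -1, axis=1)
L = np.diag(A.sum(axis=1)) - A

lam = np.linalg.eigvalsh(L)
eps = 1.0 / lam[-1]                 # safely below 2 / lambda_max(L)
W = np.eye(N) - eps * L

x = np.array([1.0, 5.0, 3.0, 2.0, 4.0])   # initial local values
for _ in range(200):
    x = W @ x                              # x[k+1] = (I - eps*L) x[k]
```

Since this graph is connected and undirected, every node converges to the average of the initial values (here 3), as stated on the next slide.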


Processing over Graphs

Consensus algorithms

Does it converge? Where?

Choosing ε < 2/λ_max(L), the eigenvalues of W are bounded between −1 and 1.

In the case of a connected undirected graph, the algorithm converges to

lim_{k→∞} x[k] = lim_{k→∞} Wᵏ x₀ = (1/N) 1 1ᵀ x₀

i.e., average consensus.


Processing over Graphs

Consensus algorithms

The convergence properties in the case of directed graphs are much more intriguing.

A digraph is strongly connected (SC) if, for every pair of nodes vᵢ and vⱼ, there exists a strong path from vᵢ to vⱼ and vice versa.

A digraph is quasi-strongly connected (QSC) if, for every pair of nodes vᵢ and vⱼ, there exists a third node vᵣ that can reach both nodes by a strong path.

A digraph is weakly connected (WC) if any pair of distinct nodes can be joined by a weak path.

A digraph is disconnected if none of the above connectivity properties holds.


Processing over Graphs

Consensus algorithms

The possible forms of consensus depend on the network topology.

[Figure: examples of a strongly connected (SC) digraph, a quasi-strongly connected (QSC) digraph, and a weakly connected (WC) digraph with a two-tree forest]

Let us denote by γ the left eigenvector associated with the null eigenvalue of L. By construction, the right eigenvector associated with the null eigenvalue of L is composed of all ones.


Processing over Graphs

Let G = {V, E} be a digraph with N nodes and Laplacian matrix L.

Assume that G is QSC with K strongly connected components (SCCs) G₁ = {V₁, E₁}, …, G_K = {V_K, E_K}, where G₁ is the root SCC (RSCC).

Then the left eigenvector γ of L associated with the zero eigenvalue has the following structure:

γᵢ > 0 if vᵢ ∈ V₁
γᵢ = 0 otherwise

The zero eigenvalue of a graph Laplacian is simple if and only if the graph is QSC.


Processing over Graphs

Consensus algorithms

Case #1: strongly connected network

- All nodes converge to the same value:

xᵢ[k] → (∑ⱼ₌₁ᴺ γⱼ xⱼ[0]) / (∑ⱼ₌₁ᴺ γⱼ),  i = 1, 2, …, N

- The final value is a weighted average, with weights depending on the graph topology.

- All nodes contribute to the final value.


Processing over Graphs

Consensus algorithms

Case #2: the network is composed of one directed spanning tree

- All nodes converge to the same value:

xᵢ[k] → x_root[0],  i = 1, 2, …, N

- Only one node contributes to the final value: the root.


References
1. D. I. Shuman, S. K. Narang, P. Frossard, A. Ortega, and P. Vandergheynst, "The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains," IEEE Signal Processing Magazine, pp. 83–98, 2013.

2. A. Sandryhaila and J. M. F. Moura, "Discrete signal processing on graphs," IEEE Transactions on Signal Processing, vol. 61, pp. 1644–1656, 2013.

3. X. Zhu and M. Rabbat, "Approximating signals supported on graphs," ICASSP 2012, pp. 3921–3924.

4. M. Püschel and J. M. F. Moura, "Algebraic signal processing theory: Foundation and 1-D time," 2008, pp. 3572–3585.

5. M. Püschel and J. M. F. Moura, "Algebraic signal processing theory: 1-D space," 2008, pp. 3586–3599.

6. A. Agaskar and Y. M. Lu, "A spectral graph uncertainty principle," IEEE Transactions on Information Theory, vol. 59, no. 7, pp. 4338–4356, 2013.

7. I. Pesenson, "Sampling in Paley-Wiener spaces on combinatorial graphs," Transactions of the American Mathematical Society, vol. 360, no. 10, pp. 5603–5627, 2008.


References
8. S. K. Narang, A. Gadde, and A. Ortega, "Signal processing techniques for interpolation in graph structured data," ICASSP 2013, pp. 5445–5449.

9. M. Tsitsvero and S. Barbarossa, "The degrees of freedom of signals on graphs," EUSIPCO 2015.

10. M. Tsitsvero, S. Barbarossa, and P. Di Lorenzo, "Uncertainty principle and sampling of signals defined on graphs," Asilomar Conf., Nov. 2015.

11. M. Tsitsvero, S. Barbarossa, and P. Di Lorenzo, "Signals on graphs: Uncertainty principle and sampling," IEEE Trans. on Signal Processing, Sep. 2016.

12. S. Sardellitti, S. Barbarossa, and P. Di Lorenzo, "On the Graph Fourier Transform for directed graphs," IEEE Journal of Selected Topics in Signal Processing, Sep. 2017, pp. 796–811.

13. P. Di Lorenzo, P. Banelli, S. Barbarossa, and S. Sardellitti, "Distributed adaptive learning of graph signals," IEEE Trans. on Signal Processing, Aug. 15, 2017, pp. 4193–4208.

14. P. Di Lorenzo, S. Barbarossa, P. Banelli, and S. Sardellitti, "Adaptive least mean squares estimation of graph signals," IEEE Trans. on Signal and Inform. Processing over Networks, Dec. 2016, pp. 555–568.