Tutorial 3
Antonio Longa¹ ²
¹ MobS Lab, Fondazione Bruno Kessler, Trento, Italy
² SML Lab, University of Trento, Italy
TABLE OF CONTENTS
01 Recap
02 Introduction
03 Graph attention layer (GAT)
04 Pros of GAT
05 Message passing implementation
06 Implement our GCNConv
07 GAT implementation
01 Recap
PROBLEMS:
■ Different sizes
■ No fixed node ordering: two isomorphic graphs 𝙂 = 𝙂’ can still have Adj(𝙂) ≠ Adj(𝙂’)
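The ordering problem can be sketched numerically: relabelling the nodes of the very same graph changes its adjacency matrix (a minimal NumPy sketch, not from the slides):

```python
import numpy as np

# The same path graph 0-1-2, relabelled by swapping nodes 0 and 1,
# yields a different adjacency matrix even though G = G'.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])             # Adj(G) for the path 0-1-2
P = np.eye(3, dtype=int)[[1, 0, 2]]   # permutation swapping labels 0 and 1
A_perm = P @ A @ P.T                  # Adj(G') for the relabelled graph
assert not np.array_equal(A, A_perm)  # same graph, different matrix
```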
COMPUTATION GRAPH
The neighbourhood of a node defines its computation graph.
[Figure: input graph and the computation graph built from the node features X_A, X_B, X_E]
Neural networks
The aggregation must be ordering invariant, e.g.:
■ Sum
■ Average
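Why sum and average qualify as aggregators can be seen in a few lines of plain Python (a sketch, not from the slides): shuffling the neighbours does not change the result.

```python
# Ordering-invariant aggregation over a list of neighbour feature vectors.
def agg_sum(features):
    return [sum(col) for col in zip(*features)]

def agg_avg(features):
    return [sum(col) / len(features) for col in zip(*features)]

neighbours = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
shuffled = [neighbours[2], neighbours[0], neighbours[1]]

assert agg_sum(neighbours) == agg_sum(shuffled)  # same result, any order
assert agg_avg(neighbours) == agg_avg(shuffled)
```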
02 Introduction
03 Graph attention layer (GAT)
2) Self-attention
3) Normalization
4) Attention mechanism
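Steps 2) to 4) can be sketched in NumPy (a hypothetical dense implementation, assuming the adjacency matrix includes self-loops; a real GAT layer works on sparse neighbourhoods):

```python
import numpy as np

def leaky_relu(z, slope=0.2):
    return np.where(z > 0, z, slope * z)

def gat_layer(H, W, a, adj):
    """Single-head GAT layer on a dense adjacency with self-loops."""
    WH = H @ W                                   # shared linear transformation
    F = WH.shape[1]
    # 2) self-attention scores: e_ij = LeakyReLU(a^T [WH_i || WH_j])
    e = leaky_relu((WH @ a[:F])[:, None] + (WH @ a[F:])[None, :])
    # 3) normalization: softmax of e_ij over the neighbours j of i
    e = np.where(adj > 0, e, -np.inf)            # mask non-neighbours
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)
    # 4) attention-weighted sum of neighbour features
    return alpha, alpha @ WH

rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))                      # 4 nodes, 3 input features
W = rng.normal(size=(3, 2))                      # to 2 output features
a = rng.normal(size=4)                           # attention vector, length 2F
adj = np.array([[1, 1, 0, 0],
                [1, 1, 1, 0],
                [0, 1, 1, 1],
                [0, 0, 1, 1]])                   # includes self-loops
alpha, H_new = gat_layer(H, W, a, adj)
```

Each row of `alpha` sums to one: the softmax distributes each node's attention over its neighbourhood.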
5) Use it :)
6) Multi-head attention
■ Concatenation (typically used in hidden layers)
■ Average (typically used in the final layer)
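The two ways of combining the K heads can be sketched as follows (hypothetical head outputs; in a real GAT each head has its own W and attention vector):

```python
import numpy as np

K, N, F = 3, 5, 8                               # heads, nodes, features per head
heads = [np.random.rand(N, F) for _ in range(K)]

hidden = np.concatenate(heads, axis=1)          # concatenation: shape (N, K*F)
final = np.stack(heads).mean(axis=0)            # average:       shape (N, F)

assert hidden.shape == (N, K * F)
assert final.shape == (N, F)
```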
05 Message passing implementation

The general message passing scheme:

x_i^(k) = γ^(k)( x_i^(k-1), □_{j ∈ N(i)} φ^(k)( x_i^(k-1), x_j^(k-1), e_{j,i} ) )

where:
■ x_i^(k) is the feature representation of node i at the k-th layer
■ φ (message()) and γ (update()) are differentiable functions, e.g. MLPs
■ □ is a differentiable, ordering-invariant aggregation, e.g. sum, mean or max

PyTorch Geometric provides the MessagePassing base class.

METHODS:
■ message(): builds the messages (φ)
■ update(): updates the node embeddings (γ)
■ propagate(): propagates the messages

HOW TO USE IT?
Subclass MessagePassing, give the layer a name, and implement the methods above.
In steps:
1. Add self-loops to the adjacency matrix
2. Apply a linear transformation to the node feature matrix
3. Compute normalization coefficients
4. Normalize node features
5. Sum up neighbouring node features
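Under the standard GCN normalization, the five steps above amount to computing D̂^(-1/2) (A + I) D̂^(-1/2) X W. A dense NumPy sketch (a hypothetical helper, not the PyG implementation):

```python
import numpy as np

def gcn_layer(A, X, W):
    """Dense sketch of the five GCN steps."""
    A_hat = A + np.eye(A.shape[0])                  # 1) add self-loops
    XW = X @ W                                      # 2) linear transformation
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))   # 3) normalization coefficients
    norm = d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]
    return norm @ XW                                # 4)-5) normalize and sum

A = np.array([[0, 1], [1, 0]], dtype=float)         # one edge between two nodes
out = gcn_layer(A, np.eye(2), np.eye(2))
assert np.allclose(out, 0.5 * np.ones((2, 2)))
```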
06 Implement our GCNConv
Simple example
In steps:
1. Add self-loops to the adjacency matrix (forward() method)
2. Apply a linear transformation to the node feature matrix (forward() method)
3. Compute normalization coefficients (forward() method)
4. Normalize node features (message() method)
5. Sum up neighbouring node features (aggregation in propagate(), aggr='add')
GCNConv inherits from MessagePassing
Jupyter notebook