MLC 04 Graph Methods: Ranking, Communities, Link Prediction (SoSe 2023)
1 Overview
2 Overview: Graph/Network Methods
3 Ranking on Networks/Graphs
4 Community Detection
5 Link Prediction
6 Summary
Machine Learning for Complex Data, Prof. Dr. M. Atzmüller, Osnabrück University
• Ranking:
• Importance/impact/influence of nodes
• Node-centered criterion
• Centralities
• PageRank algorithm
• Community Detection
• Detecting groups/clusters etc.
• Different criteria; nodes and edges
• Basically: connection structure
• Extensions: Labels of nodes and/or edges
• Compare to machine learning – clustering algorithms
• Link Prediction
• Consider link structure
• Structural: Hidden/missing links in a network
• Temporal: Predicting new links to appear in a network
Basic topological
features
• Low Density
• Small Diameter
• Scale-free
• High Clustering
Coefficient
Notation
• A_G: adjacency matrix, with a_ij ≠ 0 iff (v_i, v_j) ∈ E, and a_ij = 0 otherwise
• n = |V|
• m = |E|; often m ∼ n
• Γ(v): neighbors of node v, Γ(v) = {x ∈ V : (x, v) ∈ E}
• Node degree: d(v) = |Γ(v)|
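The notation maps directly onto a small adjacency-list representation; a minimal sketch (the example edge set is illustrative, not from the slides):

```python
# Minimal illustration of the notation: V, E, n, m, Γ(v), d(v).
edges = [(1, 2), (1, 3), (2, 3), (3, 4)]          # example edge set E

nodes = sorted({v for e in edges for v in e})      # V
adj = {v: set() for v in nodes}                    # Γ(·) as a dict of sets
for u, v in edges:                                 # undirected: store both ways
    adj[u].add(v)
    adj[v].add(u)

n = len(nodes)       # n = |V|
m = len(edges)       # m = |E|

def neighbors(v):    # Γ(v)
    return adj[v]

def degree(v):       # d(v) = |Γ(v)|
    return len(adj[v])
```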
Reprise: Definitions I
Reprise: Definitions II
Centralities
Node centralities
Node Centrality I
Node Centrality II
• The closeness centrality clos considers the lengths of these shortest paths: the shorter a vertex's shortest-path distances to all other reachable nodes, the higher it ranks:

  clos(v) = 1 / ( Σ_{t ∈ V\{v}} d_G(v, t) )
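For unweighted graphs the distances d_G(v, t) come from a BFS, so closeness is only a few lines; a sketch (the small path graph is illustrative):

```python
from collections import deque

def closeness(adj, v):
    """Closeness centrality: 1 / sum of shortest-path distances from v
    to all other reachable nodes (unweighted graph, distances via BFS)."""
    dist = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    total = sum(d for node, d in dist.items() if node != v)
    return 1.0 / total if total > 0 else 0.0

# Path graph 1-2-3: the middle node is closest to all others.
adj = {1: {2}, 2: {1, 3}, 3: {2}}
```

On the path graph, the middle node scores 1/(1+1) = 0.5 and the endpoints 1/(1+2) = 1/3, matching the intuition that central nodes rank higher.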
Ranking
• Estimate: importance of a node
• In complex (social) systems also: Influence of a node
• Centrality-based: different criteria – e.g., degree-based,
path-based (relating e.g., to degree, betweenness, closeness,
eigenvector centrality)
• Now: PageRank:
• Considering link structure/paths
• Very many applications in graph-based machine learning
• Example: Optimizing network structure in graph-neural
networks
Ranking Methods
• Two independently developed algorithms
– PageRank (Brin & Page)
– HITS: Hypertext Induced Topic Search (Kleinberg)
• Basic idea:
– Popularity
– Link-based
– Intuitively: Web pages are popular, if many popular
pages link to them
– "PageRank is a global ranking of all web pages, regardless of their content, based solely on their location in the Web's graph structure."
[Page et al. 1998]
Popularity-based Ranking
• Hubs and Authorities
– Hubs link to highly rated nodes (collecting links)
– Authorities are linked to (as being relevant) from
other popular nodes
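The hub/authority intuition is made concrete by the HITS iteration: authority scores sum the hub scores of in-linking nodes, hub scores sum the authority scores of linked-to nodes, with normalization each round. A sketch on a toy directed graph (node names are illustrative):

```python
import math

def hits(out_links, iterations=50):
    """HITS: hubs point to good authorities; authorities are pointed to
    by good hubs. Scores are L2-normalized after each update."""
    nodes = list(out_links)
    hub = {v: 1.0 for v in nodes}
    auth = {v: 1.0 for v in nodes}
    for _ in range(iterations):
        # authority score: sum of hub scores of nodes linking in
        auth = {v: sum(hub[u] for u in nodes if v in out_links[u])
                for v in nodes}
        norm = math.sqrt(sum(x * x for x in auth.values())) or 1.0
        auth = {v: x / norm for v, x in auth.items()}
        # hub score: sum of authority scores of nodes linked to
        hub = {v: sum(auth[w] for w in out_links[v]) for v in nodes}
        norm = math.sqrt(sum(x * x for x in hub.values())) or 1.0
        hub = {v: x / norm for v, x in hub.items()}
    return hub, auth

# 'a' and 'b' both link to 'c': a and b act as hubs, c as the authority.
out_links = {'a': {'c'}, 'b': {'c'}, 'c': set()}
```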
PageRank
Web Science lecture
Starting Problems
• Sinks & cycles
  – Some pages get fully ranked, some get no "ranking" contribution
  – Cycles "reverse" the evaluation
  – Some nodes have no outgoing edges → dangling nodes
• How many iterations?
  – Does the process converge?
  – Does it converge to a single vector?
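Damping (random teleportation) plus explicit treatment of dangling nodes addresses exactly these starting problems and guarantees convergence to a unique vector. A power-iteration sketch (the toy graph is illustrative):

```python
def pagerank(out_links, damping=0.85, iterations=100):
    """Power-iteration PageRank with damping; the rank mass of dangling
    nodes (no outgoing links) is redistributed uniformly, so sinks and
    cycles no longer trap or distort the rank."""
    nodes = list(out_links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iterations):
        dangling = sum(rank[v] for v in nodes if not out_links[v])
        new = {v: (1.0 - damping) / n + damping * dangling / n
               for v in nodes}
        for u in nodes:
            for v in out_links[u]:
                new[v] += damping * rank[u] / len(out_links[u])
        rank = new
    return rank

# 'c' is a dangling node (no outgoing links); iteration still converges.
out_links = {'a': {'b', 'c'}, 'b': {'c'}, 'c': set()}
```

Note that the total rank mass stays 1 in every iteration, which is a convenient sanity check during debugging.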
PageRank: Result
Community Detection
Community detection
Overview
• Given a network/graph, find "modules"
  – Single network
  – Multiplex networks
  – Attributed networks
• Community structures
  – Graph clustering / disjoint communities
  – Hierarchical organization
  – Overlapping communities
• Questions:
  – What is "a community"?
  – What are "good" communities?
  – How do we evaluate these?
Community detection
Definitions
• A dense subgraph loosely coupled to other modules in the network
• A community is a set of nodes seen as "one" by nodes outside of the community
• A subgraph where almost all nodes are linked to other nodes in the community
• ...
CPM Example
Cliques of size 3:
{1, 2, 3}, {1, 3, 4}, {4, 5, 6},
{5, 6, 7}, {5, 6, 8}, {5, 7, 8},
{6, 7, 8}
Communities:
{1, 2, 3, 4}
{4, 5, 6, 7, 8}
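The example can be reproduced with a small k-clique-percolation sketch for k = 3: find all triangles, connect triangles sharing k − 1 = 2 nodes, and merge each connected component of triangles into one community. The edge list below is reconstructed from the listed 3-cliques and is an assumption:

```python
from itertools import combinations

def cpm_k3(edges):
    """Clique Percolation Method for k = 3: triangles sharing an edge
    percolate into the same community."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    triangles = [frozenset(t) for t in combinations(sorted(adj), 3)
                 if all(b in adj[a] for a, b in combinations(t, 2))]
    # union-find over triangles
    parent = list(range(len(triangles)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i, j in combinations(range(len(triangles)), 2):
        if len(triangles[i] & triangles[j]) == 2:   # share k-1 nodes
            parent[find(i)] = find(j)
    comms = {}
    for i, t in enumerate(triangles):
        comms.setdefault(find(i), set()).update(t)
    return {frozenset(c) for c in comms.values()}

# Edge list reconstructed from the slide's 3-cliques (an assumption).
edges = [(1, 2), (1, 3), (2, 3), (1, 4), (3, 4), (4, 5), (4, 6),
         (5, 6), (5, 7), (6, 7), (5, 8), (6, 8), (7, 8)]
```

Note that node 4 belongs to both communities: CPM naturally yields overlapping communities.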
Vertex Similarity
• Jaccard similarity: σ_Jaccard(u, v) = |Γ(u) ∩ Γ(v)| / |Γ(u) ∪ Γ(v)|
• Cosine similarity: σ_cosine(u, v) = |Γ(u) ∩ Γ(v)| / √(d(u) · d(v))
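Both measures compare the neighborhoods of two vertices; a sketch (the small example graph is illustrative):

```python
import math

def jaccard(adj, u, v):
    """Jaccard similarity of the neighborhoods of u and v."""
    inter = adj[u] & adj[v]
    union = adj[u] | adj[v]
    return len(inter) / len(union) if union else 0.0

def cosine(adj, u, v):
    """Cosine similarity of the neighborhoods of u and v."""
    inter = adj[u] & adj[v]
    denom = math.sqrt(len(adj[u]) * len(adj[v]))
    return len(inter) / denom if denom else 0.0

# Nodes 1 and 5 share neighbor 3; node 1 also has 2, node 5 also has 4.
adj = {1: {2, 3}, 5: {3, 4}, 2: {1}, 3: {1, 5}, 4: {5}}
```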
Block Models
Two communities:
{1, 2, 3, 4} and {5, 6, 7, 8, 9}
Cut
• Most interactions are within groups, whereas interactions between groups are few
• Community detection → minimum cut problem
• Cut: a partition of the vertices of a graph into two disjoint sets
• Minimum cut problem: find a graph partition such that the number of edges between the two sets is minimized
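For intuition, the minimum cut of a small graph can be found by brute force over all bipartitions (exponential in n, so this is illustration only; practical solvers use max-flow or Stoer-Wagner):

```python
from itertools import combinations

def min_cut(nodes, edges):
    """Brute-force minimum cut: try every bipartition of the vertices
    and count crossing edges (feasible only for small graphs)."""
    nodes = list(nodes)
    best_size, best_part = None, None
    for r in range(1, len(nodes) // 2 + 1):
        for side in combinations(nodes, r):
            s = set(side)
            crossing = sum(1 for u, v in edges if (u in s) != (v in s))
            if best_size is None or crossing < best_size:
                best_size, best_part = crossing, s
    return best_size, best_part

# Two triangles joined by a single bridge edge (2, 3).
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]
```

The minimum cut here severs only the bridge, recovering the two triangles as groups.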
Modularity Maximization
• Modularity measures the strength of a community partition by taking into account the degree distribution
• Given a network with m edges, the expected number of edges between two nodes with degrees d_i and d_j is d_i · d_j / (2m)
  – Example (m = 14): the expected number of edges between nodes 1 and 2, with d_1 = 3 and d_2 = 2, is 3 · 2 / (2 · 14) = 3/14
• Strength of a community C: Σ_{i,j ∈ C} (A_ij − d_i d_j / (2m))
• Modularity: Q = (1/(2m)) Σ_C Σ_{i,j ∈ C} (A_ij − d_i d_j / (2m))
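Modularity can be implemented directly from this definition, summing A_ij − d_i d_j/(2m) over all same-community node pairs (the toy graph is illustrative):

```python
def modularity(adj, communities):
    """Q = (1/2m) * sum over same-community pairs (i, j) of
    (A_ij - d_i * d_j / (2m)), for an undirected graph."""
    two_m = sum(len(neigh) for neigh in adj.values())  # 2m
    q = 0.0
    for comm in communities:
        for i in comm:
            for j in comm:
                a_ij = 1.0 if j in adj[i] else 0.0
                q += a_ij - len(adj[i]) * len(adj[j]) / two_m
    return q / two_m

# Two triangles joined by one bridge edge; the natural split scores well.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
       3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
```

Putting every node in one community always yields Q = 0, which is a useful sanity check for any implementation.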
Modularity Matrix
• Modularity matrix: B = A − (1/(2m)) d d^T, i.e., B_ij = A_ij − d_i d_j / (2m)
Example: two communities, {1, 2, 3, 4} and {5, 6, 7, 8, 9}, obtained by applying k-means to the modularity matrix.
• Representative approaches:
  – Divisive hierarchical clustering
  – Agglomerative hierarchical clustering
Edge Betweenness
• The strength of a tie can be measured by edge betweenness
• Edge betweenness: the number of shortest paths that pass through the edge
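Edge betweenness can be computed with BFS-based shortest-path counting plus dependency accumulation, in the style of Brandes' algorithm; a sketch for unweighted, undirected graphs:

```python
from collections import deque

def edge_betweenness(adj):
    """For every source: BFS counts shortest paths (sigma) and records
    predecessors; pair dependencies are then pushed back along the
    predecessor edges. Undirected totals are halved."""
    bt = {}
    for s in adj:
        dist = {s: 0}
        sigma = {v: 0.0 for v in adj}
        sigma[s] = 1.0
        preds = {v: [] for v in adj}
        order = []
        q = deque([s])
        while q:
            u = q.popleft()
            order.append(u)
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
                if dist.get(w) == dist[u] + 1:
                    sigma[w] += sigma[u]
                    preds[w].append(u)
        # dependency accumulation in reverse BFS order
        delta = {v: 0.0 for v in adj}
        for w in reversed(order):
            for v in preds[w]:
                c = sigma[v] / sigma[w] * (1.0 + delta[w])
                edge = frozenset((v, w))
                bt[edge] = bt.get(edge, 0.0) + c
                delta[v] += c
    return {e: val / 2.0 for e, val in bt.items()}  # undirected

# Path graph 1-2-3: each edge lies on two of the three shortest paths.
adj = {1: {2}, 2: {1, 3}, 3: {2}}
```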
IAODF(C) := 1 − (1/n_C) · Σ_{u∈C} ( d̄_C(u) / d(u) )    (1)
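The inverse Average-ODF of Equation (1) translates directly into code; here d̄_C(u) counts the edges of u that leave the community C (the toy graph is illustrative):

```python
def iaodf(adj, community):
    """Inverse Average Out-Degree Fraction: 1 minus the average, over
    nodes u in C, of the fraction of u's edges leaving C."""
    total = 0.0
    for u in community:
        out = sum(1 for v in adj[u] if v not in community)  # d̄_C(u)
        total += out / len(adj[u])
    return 1.0 - total / len(community)

# Two triangles joined by a bridge: in {0, 1, 2} only one of node 2's
# three edges leaves the community.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
       3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
```

For the community {0, 1, 2} the average out-degree fraction is (0 + 0 + 1/3)/3 = 1/9, so IAODF = 8/9.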
Knowledge-Based Systems (KBS), Prof. Dr. M. Atzmüller, Osnabrück University
Web Mining,
Computer, Java
Evaluation: Datasets
■ BibSonomy dump (until January 2010)
■ 175,521 tags, 5,579 users, 467,291 resources,
2,120,322 tag assignments, 700 friendship links
■ Friend, Click, and Visit graphs
■ Delicious (HetRec workshop): 1,861 users,
7,664 bi-directional links, 53,388 tags
■ Last.fm (HetRec workshop): 1,892 users,
12,717 bi-directional links, 11,946 tags
Here, we assume that the values δ̄_C(i), i = 1, …, n_C, and δ̄_C′(i), i = 1, …, n_C′, are the inter-degrees of the nodes in C and C′, respectively, in ascending order, such that δ̄_C(i), i = 1, …, τ_n, denote the minimal τ_n inter-degrees with respect to C.
Proposition 2
An optimistic estimate for SIDX(C) is given by

oe(SIDX(C)) := 1 − (n(n−1)/(2m)) · max{ (Σ_{i=1}^{τ_n} δ̄_C(i)) / p(C),  min_{t=τ_n}^{n_C} { (Σ_{i=1}^{t} δ̄_C(i)) / (t(n−t)) } },

where p(C) := n²/4, if n_C ≥ n/2, and p(C) := n_C(n − n_C) otherwise.
Proof.
For a subcommunity C′ ⊆ C with |C′| ≥ τ_n we have

SIDX(C′) = 1 − (n(n−1)/(2m)) · m̄_C′ / (n_C′(n − n_C′))
        ≤ 1 − (n(n−1)/(2m)) · (Σ_{i=1}^{n_C′} δ̄_C(i)) / (n_C′(n − n_C′))    (3)
        ≤ 1 − (n(n−1)/(2m)) · (Σ_{i=1}^{τ_n} δ̄_C(i)) / max_{t=τ_n}^{n_C} {t(n−t)}.    (4)

From (3) it is clear that 1 − (n(n−1)/(2m)) · min_{t=τ_n}^{n_C} { (Σ_{i=1}^{t} δ̄_C(i)) / (t(n−t)) } is an optimistic estimate for SIDX(C). On the other hand, we have max_{t=τ_n}^{n_C} {t(n−t)} = p(C), since t(n−t) has its maximum at t = n/2. Together with (4) we obtain 1 − (n(n−1)/(2m)) · (Σ_{i=1}^{τ_n} δ̄_C(i)) / p(C) as another optimistic estimate. ∎
Proposition 3
For the inverse Average-ODF let d̃_C(u) := d̄_C(u) / d(u), and let δ̃_C(i), i = 1, …, n_C, be these ratios for all nodes in C in ascending order. Then

oe(IAODF(C)) := 1 − (1/τ_n) Σ_{i=1}^{τ_n} δ̃_C(i).
Proof.
For a subcommunity C′ ⊆ C with |C′| ≥ τ_n we have

IAODF(C′) = 1 − (1/n_C′) Σ_{u∈C′} |{ {u,v} ∈ E : v ∈ V \ C′ }| / d(u)
         ≤ 1 − (1/n_C′) Σ_{u∈C′} |{ {u,v} ∈ E : v ∈ V \ C }| / d(u)
         = 1 − (1/n_C′) Σ_{u∈C′} d̃_C(u)
         ≤ 1 − (1/n_C′) Σ_{i=1}^{n_C′} δ̃_C(i)
         ≤ 1 − (1/τ_n) Σ_{i=1}^{τ_n} δ̃_C(i). ∎
Proposition 4
An optimistic estimate for the local modularity contribution can be derived based only on the number of edges m_C within the community:

oe(MODL(C)) = 0.25, if m_C ≥ m/2, and oe(MODL(C)) = m_C/m − m_C²/m² otherwise.
MODL(C) = m_C/m − Σ_{u,v∈C} d(u) d(v) / (4m²)
        = m_C/m − (1/(4m²)) Σ_{u∈C} Σ_{v∈C} d(u) d(v)
        = m_C/m − (1/(4m²)) Σ_{u∈C} d(u) · (2m_C + m̄_C)
        = m_C/m − (1/(4m²)) (2m_C + m̄_C)²
        ≤ m_C/m − m_C²/m²
        = oe(MODL(C)).
NMI
Notation:
• π_a, π_b denote different partitions
• k^(a): number of communities in partition a
• k^(b): number of communities in partition b
• n_h^(a): number of nodes in partition a assigned to the h-th community
• n_l^(b): number of nodes in partition b assigned to the l-th community
• n_{h,l}: number of nodes assigned to the h-th community in partition a and to the l-th community in partition b

NMI Example
• Partition a: [1, 1, 1, 2, 2, 2], i.e., communities {1, 2, 3} and {4, 5, 6}
• Partition b: communities {1, 2, 3}, {4, 5}, and {6}
• NMI(π_a, π_b) = 0.8278
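A sketch of NMI for partitions given as label lists, using the √(H(a)·H(b)) normalization (one of several common variants), which reproduces the 0.8278 of the example:

```python
import math
from collections import Counter

def nmi(a, b):
    """Normalized mutual information of two partitions (label lists),
    normalized by sqrt(H(a) * H(b))."""
    n = len(a)
    ca, cb = Counter(a), Counter(b)
    joint = Counter(zip(a, b))
    mi = sum(c / n * math.log(n * c / (ca[h] * cb[l]))
             for (h, l), c in joint.items())
    ha = -sum(c / n * math.log(c / n) for c in ca.values())
    hb = -sum(c / n * math.log(c / n) for c in cb.values())
    return mi / math.sqrt(ha * hb)

# The example's partitions: {1,2,3},{4,5,6} vs. {1,2,3},{4,5},{6}.
part_a = [1, 1, 1, 2, 2, 2]
part_b = [1, 1, 1, 2, 2, 3]
```

Identical partitions score 1; a different normalization (e.g. 2·MI/(H(a)+H(b))) would give a slightly different value here.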
Accuracy Example
• Ground truth: {1, 2, 3} and {4, 5, 6}
• Clustering result: {1, 2}, {4, 5, 6}, and {3}

Pairwise agreement counts over all 15 node pairs:

                               Ground Truth
                               C(vi) = C(vj)    C(vi) != C(vj)
Clustering    C(vi) = C(vj)         4                 0
Result        C(vi) != C(vj)        2                 9

Accuracy = (4 + 9) / 15 ≈ 0.87
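Pairwise clustering accuracy counts the node pairs on which ground truth and clustering agree (both "same community" or both "different"); a sketch using the example's partitions (community labels are arbitrary placeholders):

```python
from itertools import combinations

def pairwise_accuracy(truth, result):
    """Fraction of node pairs on which ground truth and clustering
    agree: both place the pair together, or both separate it."""
    nodes = list(truth)
    agree = sum(1 for u, v in combinations(nodes, 2)
                if (truth[u] == truth[v]) == (result[u] == result[v]))
    return agree / (len(nodes) * (len(nodes) - 1) / 2)

# Ground truth {1,2,3},{4,5,6}; clustering result {1,2},{3},{4,5,6}.
truth = {1: 'a', 2: 'a', 3: 'a', 4: 'b', 5: 'b', 6: 'b'}
result = {1: 'x', 2: 'x', 3: 'z', 4: 'y', 5: 'y', 6: 'y'}
```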
Community Assessment
w/ Evidence Networks
• Evaluation & comparison of communities
• Are the discovered communities meaningful?
• Hard problem!
– Gold standard → costly!
– User study → costly!
• Approach in [Mitzlaff et al. 2011]
– Cost-effective
– Secondary data
– Relative ranking of communities
Friend Graph: user v is a friend of user u
Click Graph: user u clicked on a post of user v
Visit Graph: user u looked at ...
Evaluation: Paradigm I
[Mitzlaff et al. 2011]
I) Given:
II) Evaluation:
Map users to nodes in evidence network and
calculate quality measure (e.g., modularity)
Evaluation: Paradigm II
[Mitzlaff et al. 2011]
Experiments: Description
[Mitzlaff et al. 2011]
Experiments: Results
[Mitzlaff et al. 2011]
→ Quality measure decreases as expected for all considered initial clusterings
(Comparison: initial clustering vs. random clustering)
→ Ranking is similar for different evidence networks
Experiments: Results
[Mitzlaff et al. 2011]
→ Ranking varies slightly for the top positions; the degree of explicitness correlates with ranking consistency
Experiments: Results
[Mitzlaff et al. 2011]
• Modularity and conductance values correlate highly between the Friend Graph and each of the other evidence networks (Friend Graph as reference)

Evidence Network   Modularity     Conductance
Follower Graph     0.86 ± 0.17    0.89 ± 0.28
Group Graph        0.91 ± 0.13    1.00 ± 0.01
Copy Graph         0.82 ± 0.17    0.99 ± 0.03
Click Graph        0.80 ± 0.17    0.99 ± 0.04
Visit Graph        0.72 ± 0.25    0.97 ± 0.06
Examples: an animal community and a health community
→ Community detection
→ Personalization
→ User recommendation
→ Necessary: descriptive method (see descriptive community detection)
→ Goal: personalized view on the application
Link Prediction
• Structural: find hidden/missing links in a network
  – Example: missing links in Wikipedia
• Temporal: predict new links that appear at time t_p, based on the network state at time t < t_p
Link Prediction
Unsupervised ...
Table 7 (excerpt): overview of all features, with formal definition and mean values of user pairs with (∅Val_w) and without (∅Val_wo) trading interactions [Eberhard, Trattner & Atzmueller, 2018]
Some more details [Yang, Lichtenwalter & Chawla, 2015]

Fig. 1: Link prediction and evaluation. The black color indicates snapshots of the network from which link prediction features are calculated (feature network). The gray color indicates snapshots of the network from which the link prediction instances are labeled (label network). We can observe all links at or before time t, and we aim to predict future links that will occur at time t + 1.

... network snapshots based on particular segments of data. Comparisons among predictors require that evaluation encompasses precisely the same set of instances, whether the predictor is unsupervised or supervised. We construct four network snapshots:

– Training features: data from some period in the past, G_{t−x} up to G_{t−1}, from which we derive feature vectors for training data.
– Training labels: data from G_t, the last training-observable period, from which we derive class labels (whether the link forms or not) for the training feature vectors.
– Testing features: data from some period in the past up to G_t, from which we derive feature vectors for testing data. Sometimes it may be ideal to maintain the window size that we use for the training feature vector, so we commence the snapshot at G_{t−x+1}. In other cases, we might want to be sure not to ignore effects of previously existing links, so we commence the snapshot at G_{t−x}.
– Testing labels: data from G_{t+1}, from which we derive class labels for the testing feature vector. These data are strictly excluded from inclusion in any training data.

A classifier is constructed from the training data and evaluated on the testing data. There are always strictly divided training and testing sets, because G_{t+1} is never observable in training.
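A minimal sketch of the unsupervised side of this protocol: score candidate pairs on the feature network and check them against the label network. Common neighbors serves as the scorer; the edge data and variable names are illustrative assumptions, not from the paper:

```python
from itertools import combinations

def build_adj(edges):
    """Undirected adjacency sets from an edge list."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return adj

def common_neighbors_scores(adj, candidates):
    """Unsupervised link prediction: score each candidate node pair by
    the number of common neighbors in the feature network."""
    return {pair: len(adj[pair[0]] & adj[pair[1]]) for pair in candidates}

# Feature network: links observed at or before time t (illustrative).
feature_edges = [(1, 2), (2, 3), (1, 4), (4, 3), (1, 5), (5, 3)]
# Label network: links that actually appear at t + 1.
label_edges = {frozenset((1, 3))}

adj = build_adj(feature_edges)
candidates = [p for p in combinations(sorted(adj), 2)
              if p[1] not in adj[p[0]]]          # non-edges at time t
scores = common_neighbors_scores(adj, candidates)
top = max(scores, key=scores.get)               # best-scoring candidate
```

Here the top-scored non-edge (1, 3) is exactly the link that forms at t + 1; evaluation only ever compares scores against labels from the strictly later snapshot.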
[Scholz et al., AAAI ICWSM 2013]
Hybrid Rooted PageRank
• Novel extension of the Rooted PageRank algorithm for multiplex networks
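The multiplex hybrid itself is not spelled out on the slide; as a sketch of the plain rooted (personalized) PageRank it extends: the random walk teleports back to a root node instead of to a uniform distribution, ranking all other nodes by proximity to the root (the toy graph is illustrative):

```python
def rooted_pagerank(adj, root, damping=0.85, iterations=100):
    """Rooted (personalized) PageRank on an undirected graph: teleport
    mass goes to the root, so scores rank nodes by proximity to it;
    such scores are commonly used for link prediction."""
    rank = {v: 0.0 for v in adj}
    rank[root] = 1.0
    for _ in range(iterations):
        new = {v: 0.0 for v in adj}
        new[root] += 1.0 - damping              # teleport to the root
        for u in adj:
            if adj[u]:
                share = damping * rank[u] / len(adj[u])
                for v in adj[u]:
                    new[v] += share
            else:
                new[root] += damping * rank[u]  # dangling mass to root
        rank = new
    return rank

# Toy graph: node 2 and 3 are adjacent to the root, 4 is two hops away.
adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
```

Nodes closer to the root receive more mass, which is exactly the ranking a rooted-PageRank link predictor exploits.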
[Scholz et al., AAAI ICWSM 2013]
Link Prediction - Hybrid Rooted PageRank
Context: contact network at an academic conference
• Co-author network (DBLP)
In addition:
• Encounter network (close-by)
• Paper similarity network
Summary