
A Practical Guide to Persistent Homology
(Dionysus edition)

Dmitriy Morozov
Lawrence Berkeley National Lab

Code snippets available at: http://hg.mrzv.org/Dionysus-tutorial

from dionysus import *
from dionysus.viewer import *
from readers import *
Dionysus
C++ library
Implements various algorithms that I've found interesting over the years:
ordinary persistence
vineyards
image persistence
zigzag persistence
persistent cohomology
circular coordinates
alpha shapes
Vietoris-Rips complexes
bottleneck and Wasserstein distances between diagrams
To make life easier, added Python bindings.
This talk is exclusively in Python.
Python
Good news: You already know Python! It's just like pseudo-code in
your papers, but cleaner. ;-)

Lists and list comprehensions
lst1 = [1,3,5,7,9,11,13]
lst2 = [i for i in lst1 if i < 9]
print lst2            # [1, 3, 5, 7]

Functions
def pow(x):
    def f(y):
        return y**x
    return f
square = pow(2)       # pow returns a closure

Loops and conditionals
for i in lst1:
    if i % 3 == 0 and i > 5:
        print square(i)

Lots of extra functionality in modules
from math import sqrt
from dionysus import *
Persistent Homology
Over a decade old now. Introduced as a way to detect prominent topological
features in point clouds. Since then evolved into a rich theory with many
applications.

What is the homology of this point cloud?

Squint our eyes: there is no natural fixed scale, hence persistent homology.

Eye Squinting
P = point set in R^n,   P_r = ⋃_{p ∈ P} B_r(p)

0 → H(P_{r_1}) → H(P_{r_2}) → … → H(R^n)

[Figure: 1-dimensional persistence diagram (Dgm_1) of 10 points, birth vs. death axes]

Squinting our eyes gives us a continuous function. Algorithms work with
(discrete) simplicial complexes.

Simplices and Complexes
(Geometric) k-simplex: convex hull of (k + 1) points.
(Abstract) k-simplex: subset of (k + 1) elements of a universal set.

Boundary: ∂[v_0, …, v_k] = Σ_i (−1)^i [v_0, …, v̂_i, …, v_k]   (v̂_i means v_i is omitted)

s = Simplex([0,1,2])
print "Dimension:", s.dimension()     # Dimension: 2
print "Vertices:"                     # 0, 1, 2
for v in s.vertices:
    print v
print "Boundary:"                     # <1, 2>, <0, 2>, <0, 1>
for sb in s.boundary:
    print sb

Simplicial complex: collection of simplices closed under the face relation.

[Figure: the complex built below, and an example that is not a simplicial complex]

complex = [Simplex(vertices) for vertices in
           [[0], [1], [2], [3], [4], [5],
            [0,1], [0,2], [1,2], [0,1,2],
            [1,3], [2,4], [3,4]]]

simplex9 = Simplex([0,1,2,3,4,5,6,7,8,9])
sphere8 = closure([simplex9], 8)
print len(sphere8)    # 1022
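
As a quick sanity check (not from the slides), the Euler characteristic of the
8-sphere built above can be computed directly from the closure; this minimal
sketch only assumes the Simplex.dimension() call used later in the tutorial.

# Euler characteristic of sphere8: alternating sum over simplex dimensions.
# For an 8-sphere it should come out to 2.
chi = sum((-1)**s.dimension() for s in sphere8)
print chi    # expect 2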
Homology
k-chain = formal sum of k-simplices (over Z_2: a set of simplices)
k-cycle = chain without a boundary
k-boundary = boundary of a (k + 1)-dimensional chain

Z = cycle group
B = boundary group
H = Z/B

Two cycles are homologous if they differ by a boundary.
Homology: count cycles up to differences by boundaries.
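
To make the boundary/cycle distinction concrete, here is a minimal sketch (not
from the slides; it assumes only the Simplex boundary iteration shown earlier)
checking that over Z_2 the boundary of a boundary cancels, so every boundary is
a cycle.

# Over Z2 the boundary of a boundary is zero: every endpoint of the triangle's
# boundary edges appears an even number of times and cancels.
from collections import Counter
from dionysus import Simplex

s = Simplex([0,1,2])                       # a triangle
counts = Counter()
for edge in s.boundary:                    # edges <1, 2>, <0, 2>, <0, 1>
    for vertex in edge.boundary:           # endpoints of each edge
        counts[tuple(vertex.vertices)] += 1
print all(c % 2 == 0 for c in counts.itervalues())    # True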
Homology in Dionysus
Dionysus doesn't compute homology directly, but we can get it as a by-product
of persistent homology.

# 03-complex.py
complex = sphere8
f = Filtration(complex, dim_cmp)
p = StaticPersistence(f)
p.pair_simplices()
dgms = init_diagrams(p, f, lambda s: 0)

for i, dgm in enumerate(dgms):
    print "Dimension:", i
    print dgm

# Output: dimension 0 contains a single point (0, inf), dimensions 1-7 are
# empty, and dimension 8 contains a single point (0, inf); the homology of an
# 8-sphere.
Persistent Homology (pipeline)
Filtration of a simplicial complex:

K_1 ⊆ K_2 ⊆ … ⊆ K_n

(w.l.o.g. assume K_{i+1} = K_i + σ_i),
so, really, an ordering of simplices.

[Figure: six stages of a filtration of the triangle boundary on vertices 0, 1, 2]
# 04-1-filtration.py
simplices = [([0], 1), ([1], 2), ([0,1], 3), ([2], 4),
             ([1,2], 5), ([0,2], 6)]
f = Filtration()
for vertices, time in simplices:
    f.append(Simplex(vertices, time))
f.sort(dim_data_cmp)
for s in f:
    print s, s.data    # s.data is the time

H(K_1) → H(K_2) → … → H(K_n)

[Figure: the barcodes of H_0 and H_1 for the filtration above]

# 04-2-persistence.py
p = StaticPersistence(f)
p.pair_simplices()
dgms = init_diagrams(p, f)
for i, dgm in enumerate(dgms):
    print "Dimension:", i
    print dgm
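
For the six-simplex filtration above, the pairing is easy to work out by hand;
the comment below records the expected diagram (the exact formatting of
Dionysus's output may differ).

# Expected diagram for the filtration [0]@1, [1]@2, [0,1]@3, [2]@4, [1,2]@5, [0,2]@6:
# Dimension: 0
#   (1, inf)   component of vertex 0, never dies
#   (2, 3)     component of vertex 1, merged by edge [0,1]
#   (4, 5)     component of vertex 2, merged by edge [1,2]
# Dimension: 1
#   (6, inf)   cycle created by edge [0,2]; the triangle [0,1,2] is never added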
Filtrations: α-shapes
K_r = Nrv{ B_r(u) ∩ Vor(u) }
K_r ≃ ⋃_{p ∈ P} B_r(p)
K_{r_1} ⊆ K_{r_2} ⊆ … ⊆ K_r ⊆ …
r_σ = min_{x ∈ Vor σ} d_P(x)

from math import sqrt

points = read_points('data/trefoil.pts')
f = Filtration()
fill_alpha_complex(points, f)
# an alpha shape is a one-liner thanks to list comprehensions
show_complex(points, [s for s in f if sqrt(s.data[0]) < 1])

fill_alpha_complex fills f with all the simplices of the Delaunay triangulation
(thanks to CGAL's Delaunay package). The data field of each simplex σ is set to
the pair (r_σ², Vor σ ∩ σ ≠ ∅).
# 05-alpha-shapes.py
f.sort(dim_data_cmp)
p = StaticPersistence(f)
p.pair_simplices()
dgms = init_diagrams(p, f, lambda s: sqrt(s.data[0]))
show_diagram(dgms)
Filtrations: Vietoris-Rips
VR(r) = { σ ⊆ P | |u − v| < r  ∀ u, v ∈ σ }
(clique complex of the r-nearest-neighbor graph)

NB: only pairwise distances matter

# 06-rips.py
points = read_points('data/trefoil.pts')
distances = PairwiseDistances(points)
distances = ExplicitDistances(distances)
rips = Rips(distances)
f = Filtration()
rips.generate(2, 1.7, f.append)    # 2 = skeleton dimension, 1.7 = distance cutoff
print "Number of simplices:", len(f)

show_complex(points, f)
show_complex(points, [s for s in f if rips.eval(s) < 1.6])

f.sort(rips.cmp)
p = StaticPersistence(f)
p.pair_simplices()
dgms = init_diagrams(p, f, rips.eval)
show_diagram(dgms[:2])
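
Rips complexes grow very quickly with the number of points, so a common trick
is to subsample before generating the complex. The following minimal sketch is
not from the slides; it reuses only the calls shown above and assumes that
read_points returns an ordinary Python list.

# Build the Rips filtration on every other point of the trefoil sample.
points = read_points('data/trefoil.pts')
subsample = points[::2]                    # naive subsampling
distances = ExplicitDistances(PairwiseDistances(subsample))
rips = Rips(distances)
f = Filtration()
rips.generate(2, 1.7, f.append)            # 2-skeleton, distance cutoff 1.7
print "Number of simplices:", len(f)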
Filtrations: Lower-Star
f : Vrt K → R
f : |K| → R, linearly interpolated
|K|_a = f⁻¹(−∞, a]

Interested in the filtration:
|K|_{a_1} ⊆ |K|_{a_2} ⊆ … ⊆ |K|_{a_n}

K_a = { σ ∈ K | max_{v ∈ σ} f(v) ≤ a }
(changes only as a passes vertex values)

|K|_a ≃ K_a

So, instead, we can compute:
K_{a_1} ⊆ K_{a_2} ⊆ … ⊆ K_{a_n}

Filtrations: Lower-Star

# 07-ls-filtration.py
elephant_points, elephant_complex = read_off('data/cgal/elephant.off')
elephant_complex = closure(elephant_complex, 2)
show_complex(elephant_points, elephant_complex)

def projection(points, axis = 1):    # projection onto a coordinate axis
    def value(v):
        return points[v][axis]
    return value
value = projection(elephant_points, 1)

def max_vertex_compare(value):
    def max_vertex(s):
        return max(value(v) for v in s.vertices)
    def compare(s1, s2):
        return cmp(s1.dimension(), s2.dimension()) or \
               cmp(max_vertex(s1), max_vertex(s2))
    return compare

f = Filtration(elephant_complex, max_vertex_compare(value))
p = DynamicPersistenceChains(f)
p.pair_simplices()
dgms = init_diagrams(p, f, lambda s: max(value(v) for v in s.vertices))
show_diagrams(dgms)
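
The same lower-star pipeline works with any function on the vertices. As a
small variation (not from the slides), here is a sketch that replaces the
coordinate projection with the Euclidean distance from a chosen base vertex; it
reuses elephant_points, elephant_complex, and max_vertex_compare from above.

from math import sqrt

def distance_from(points, base):           # vertex function: distance from 'base'
    def value(v):
        return sqrt(sum((points[v][i] - base[i])**2 for i in range(3)))  # assumes 3D coordinates
    return value

value = distance_from(elephant_points, elephant_points[0])
f = Filtration(elephant_complex, max_vertex_compare(value))
p = DynamicPersistenceChains(f)
p.pair_simplices()
dgms = init_diagrams(p, f, lambda s: max(value(v) for v in s.vertices))
show_diagrams(dgms)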
Extended Persistence
Extended persistence was introduced as a way to measure the essential
persistence classes:

H(X_{a_1}) → H(X_{a_2}) → … → H(X_{a_n}) = H(X)

H(X, X_{a_1}) ← H(X, X_{a_2}) ← … ← H(X, X_{a_n}) ← H(X, ∅)

H(X, Y) ≅ H(X ∪ (ω ∗ Y), ω)    (cone Y off to a dummy vertex ω)

execfile('08-extended-persistence.py')
Persistent Homology
Filtration ⇒ D, the ordered boundary matrix (indexed by simplices):
D[i, j] = coefficient of σ_i in the boundary of σ_j
Persistence decomposition: R = DV, where R is reduced, meaning lowest ones
lie in unique rows, and V is upper-triangular.

[Figure: R = D·V; columns of R are boundaries/cycles, columns of V are the corresponding chains]
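
As an illustration of the decomposition (not Dionysus's internal code), here is
a minimal sketch of the standard column reduction over Z_2; columns are
represented as Python sets of row indices, and the example matrix is the
triangle-boundary filtration from 04-1-filtration.py.

# Standard persistence reduction over Z2 (illustrative sketch only).
def reduce_matrix(D):
    R = [set(col) for col in D]            # working copy of the boundary matrix
    lowest = {}                            # lowest row index -> owning column
    pairs = []
    for j in range(len(R)):
        while R[j] and max(R[j]) in lowest:
            R[j] ^= R[lowest[max(R[j])]]   # add the conflicting column (mod 2)
        if R[j]:
            lowest[max(R[j])] = j
            pairs.append((max(R[j]), j))   # (creator index, destroyer index)
    return R, pairs

# Simplices in filtration order: [0], [1], [0,1], [2], [1,2], [0,2]
D = [[], [], [0, 1], [], [1, 3], [0, 3]]
R, pairs = reduce_matrix(D)
print pairs    # [(1, 2), (3, 4)]: [1] killed by [0,1], [2] killed by [1,2]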
StaticPersistence computes just R, enough for the pairing.
Iterating over StaticPersistence, we can access columns of R through the cycle
attribute. (Also pair(), sign(), unpaired().)

smap = p.make_simplex_map(f)
for i in p:
    if not i.sign():
        print [smap[j] for j in i.cycle]
DynamicPersistenceChains computes matrices R and V.
Access columns of V through chain. (E.g., this gives access to the infinitely
persistent classes.)

# 08-cycle-chain.py
while True:
    pt = show_diagram(dgms)
    if not pt: break
    print pt
    i = pt[2]
    smap = p.make_simplex_map(f)
    chain = [smap[ii] for ii in i.chain]
    pair_cycle = [smap[ii] for ii in i.pair().cycle]
    pair_chain = [smap[ii] for ii in i.pair().chain]
    show_complex(elephant_points, subcomplex = chain)
    show_complex(elephant_points, subcomplex = pair_cycle + pair_chain)
Diagrams, Stability, and Distances
Bottleneck distance:

W_∞(Dgm(f), Dgm(g)) = inf_γ sup_x ‖x − γ(x)‖_∞

(the infimum is over bijections γ between the diagrams)

bottleneck_distance(dgm1, dgm2)

Stability Theorem:

W_∞(Dgm(f), Dgm(g)) ≤ ‖f − g‖_∞

Wasserstein distance (more sensitive to the entire diagram):

W_q^q(Dgm(f), Dgm(g)) = inf_γ Σ_x ‖x − γ(x)‖_∞^q

wasserstein_distance(dgm1, dgm2, q)

Wasserstein Stability Theorem: For Lipschitz functions f and g,
under some technical conditions on the domain,

W_q(Dgm(f), Dgm(g)) ≤ C · ‖f − g‖_∞^{1 − k/q}

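Putting the two distance functions to work is a one-liner once we have
diagrams. The following is a minimal sketch, not from the slides: it repeats
the alpha-shape pipeline from 05-alpha-shapes.py for two point sets and
compares their 1-dimensional diagrams; 'data/trefoil-noisy.pts' is a
hypothetical second sample.

from math import sqrt

def alpha_diagrams(filename):
    points = read_points(filename)
    f = Filtration()
    fill_alpha_complex(points, f)
    f.sort(dim_data_cmp)
    p = StaticPersistence(f)
    p.pair_simplices()
    return init_diagrams(p, f, lambda s: sqrt(s.data[0]))

dgms1 = alpha_diagrams('data/trefoil.pts')
dgms2 = alpha_diagrams('data/trefoil-noisy.pts')    # hypothetical noisy resample

print "Bottleneck: ", bottleneck_distance(dgms1[1], dgms2[1])
print "Wasserstein:", wasserstein_distance(dgms1[1], dgms2[1], 2)    # q = 2
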
Circle-Valued Coordinates
How to get a tangible feel for the topological features that we find?

H¹(X; Z) ≅ [X, S¹]

Start with the canonical isomorphism between 1-dimensional cohomology classes
and homotopy classes of maps into a circle.

Maps into circles, natural for:
Phase coordinates for waves
Angle coordinates for directions
Periodic data

Algorithm:
1. Compute persistent cohomology classes
2. Turn each representative cocycle z into a map X → S¹
   [Figure: vertices map to 0; each edge e winds around the circle with the
   degree given by z(e), e.g. +2 or −1]
3. Smooth that map (minimize variation across edges), staying within the same
   cohomology/homotopy class (equivalently, find the harmonic cocycle)
Persistent Cohomology in Dionysus

# 10-circular.py
points = read_points('data/annulus.pts')

from math import sqrt

f = Filtration()
fill_alpha_complex(points, f)
f.sort(dim_data_cmp)

p = StaticCohomologyPersistence(f, prime = 11)
p.pair_simplices()
dgms = init_diagrams(p, f, lambda s: sqrt(s.data[0]), lambda n: n.cocycle)

while True:
    pt = show_diagram(dgms)
    if not pt: break
    rf = Filtration((s for s in f if sqrt(s.data[0]) <= (pt[0] + pt[1])/2))
    values = circular.smooth(rf, pt[2])
    cocycle = [rf[i] for (c,i) in pt[2] if i < len(rf)]
    show_complex(points, subcomplex = cocycle)
    show_complex(points, values = values)
Image Persistence
Noisy domains: instead of f : X → R, we have a function f : P → R,
with P a sample of X.
For suitably chosen parameters α and β:

H(K^α_{a_1}) → H(K^α_{a_2}) → … → H(K^α_{a_n})
      ↓              ↓                   ↓
H(K^β_{a_1}) → H(K^β_{a_2}) → … → H(K^β_{a_n})

K^β_a = alpha shape or Vietoris-Rips complex with parameter β, built on f⁻¹(−∞, a]

# assume parallel lists: points and values
f = Filtration()
fill_alpha_complex(points, f)
# use persistence of f to choose alpha and beta

f = Filtration([s for s in f if sqrt(s.data[0]) <= beta])
f.sort(max_vertex_compare(values))
p = ImagePersistence(f, lambda s: sqrt(s.data[0]) <= alpha)
p.pair_simplices()
dgms = init_diagrams(p, f, lambda s: max(values(v) for v in s.vertices))
show_diagrams(dgms)
Conclusions
Persistence is easy to use. Dionysus can help you try out new ideas.

Practice reinforces theory. For example, the persistent cohomology algorithm
is, in practice, the fastest way I know to compute persistence diagrams.
(This realization is a pure accident of experimental work with circular
coordinates.) Studying why this is the case has led to "Dualities in
Persistent (Co)Homology".

Python bindings were one of the best decisions. (Hint, hint, CGAL.)
However, they are sometimes much slower than the C++ counterparts. A lot of
the common functionality is available as examples in C++; don't overlook them.

Dionysus includes significant chunks of open-source code by the following
people (many thanks to them):
Jeffrey Kline (LSQR port to Python)
Bernd Gaertner (implementation of the Miniball algorithm, used for Cech complexes)
John Weaver (Hungarian algorithm, used for Wasserstein distances)
Arne Schmitz (PyGLWidget.py)
Thank you for your time and attention!