
Non-linear dimensionality reduction

Isomap (Isometric Mapping)

Isomap is a manifold learning algorithm that tries to preserve the geodesic distances between samples
while reducing the dimensionality.

Here, the data forms a spiral shape (a nonlinear manifold). The red line traces the geodesic distance
between x1 & x2, while the blue line represents the Euclidean distance.

Euclidean distance isn’t preferred over geodesic distance here, as it doesn’t really give an idea of how
far apart x1 & x2 are in a nonlinear space.

So, how do we calculate the geodesic distance between x1 & x2?

1. Run KNN (using Euclidean distance) to form a neighborhood graph/adjacency matrix.

If we figure out the ’n’ nearest neighbors of every point & form a ‘neighborhood’ graph/adjacency matrix,
this graph can then be used for measuring the geodesic distance between any 2 points easily.
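A minimal sketch of this step using scikit-learn; the swiss-roll dataset and n_neighbors=10 are illustrative choices, not prescribed by the algorithm:

```python
from sklearn.datasets import make_swiss_roll
from sklearn.neighbors import kneighbors_graph

# Toy nonlinear data: 500 points sampled from a 3-D "swiss roll" manifold.
X, _ = make_swiss_roll(n_samples=500, random_state=0)

# Weighted adjacency matrix: each point is connected to its 10 nearest
# neighbors, with the Euclidean distance stored as the edge weight.
graph = kneighbors_graph(X, n_neighbors=10, mode="distance")
```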

2. Calculate the geodesic distance using the above graph.

Applying any shortest-path algorithm (like Dijkstra’s) to the above weighted graph gives us the geodesic
distance between any two points.

Consider x1 & x2 above. The shortest path between them passes through the points falling on the red
spiral between x1 & x2. The shortest distance then becomes the sum of the edge weights along this
path, hence giving us the (approximate) geodesic distance between x1 & x2.
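Continuing the sketch (reusing `graph` from step 1), SciPy’s shortest-path routine gives us these approximate geodesic distances:

```python
from scipy.sparse.csgraph import shortest_path

# All-pairs shortest paths on the weighted kNN graph, method="D" = Dijkstra.
# directed=False lets an edge be traversed in either direction, since kNN
# relationships are not always symmetric.
geodesic = shortest_path(graph, method="D", directed=False)

# Note: if n_neighbors was too small, the graph may be disconnected and
# some entries of `geodesic` will be infinite.
```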

3. Form a dissimilarity matrix from the geodesic distances calculated above.

4. Square the dissimilarity matrix & double-center it.
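In code (continuing the sketch), the geodesic matrix from step 2 already serves as the dissimilarity matrix; squaring & double-centering it is the classical MDS step:

```python
import numpy as np

n = geodesic.shape[0]
D2 = geodesic ** 2                    # element-wise squared dissimilarities

# Double-centering: J = I - (1/n) * 11^T subtracts row and column means.
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ D2 @ J                 # centered Gram (inner-product) matrix
```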


5. Eigendecomposition & choosing the top ‘k’ eigenvectors. This is similar to what we do in PCA
after calculating the correlation matrix.
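The final step of the sketch: eigendecompose B and keep the top-k eigenpairs, scaling each eigenvector by the square root of its eigenvalue to get the low-dimensional coordinates (k=2 here is an arbitrary choice):

```python
# Symmetric eigendecomposition; eigh returns eigenvalues in ascending order.
eigvals, eigvecs = np.linalg.eigh(B)

k = 2
idx = np.argsort(eigvals)[::-1][:k]   # indices of the k largest eigenvalues

# Clip guards against tiny negative eigenvalues, which can appear because
# geodesic distances are not exactly Euclidean.
embedding = eigvecs[:, idx] * np.sqrt(np.clip(eigvals[idx], 0, None))
```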

As always, a few drawbacks exist.

This version of Isomap is computationally heavy, though other versions exist that are comparatively
lighter (e.g., Landmark Isomap).

Parameter tuning for KNN is important, as a wrong selection of ’n’ can be devastating: too small a value
can leave the graph disconnected, while too large a value can “short-circuit” across the manifold. We can
use radius nearest neighbors as well for forming the neighborhood graph.
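For completeness, here is the off-the-shelf route via scikit-learn, which wraps all of the steps above; the parameter values are illustrative:

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=500, random_state=0)

# n_neighbors is the 'n' discussed above and is the main knob to tune.
# (Newer scikit-learn versions also accept a `radius` argument for
# radius-based neighborhoods instead of kNN.)
iso = Isomap(n_neighbors=10, n_components=2)
X_2d = iso.fit_transform(X)
```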
