An Introduction To Locally Linear Embedding: L. K. Saul S. T. Roweis
Outline
• Introduction
• Algorithm
• Part 1: Constrained Least Squares Problem
• Part 2: Eigenvalue Problem
• Examples
• Discussion
• LLE from pairwise distances
• Conclusion
Reconstruction errors:

  E(W) = Σ_i | X_i - Σ_j W_ij X_j |^2
• The weights Wij represent the contribution of the j-th data point
to the i-th reconstruction.
• The weights are determined under the constraints:
• W_ij = 0 if X_j does not belong to the set of neighbors of X_i.
• Σ_j W_ij = 1, i.e., the rows of W sum to 1.
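As a concrete sketch, the sparsity pattern of W can be fixed by finding each point's K nearest neighbors. The function name below is illustrative; this brute-force NumPy version is only for small data sets, where a real implementation would use a KD-tree or ball tree:

```python
import numpy as np

def knn_indices(X, K):
    """For each row of X (one data point per row), return the indices of
    its K nearest neighbors by Euclidean distance, excluding itself."""
    # Pairwise squared distances via |a-b|^2 = |a|^2 + |b|^2 - 2 a.b
    sq = (X ** 2).sum(axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    np.fill_diagonal(D, np.inf)          # a point is not its own neighbor
    return np.argsort(D, axis=1)[:, :K]  # N x K array of neighbor indices
```

W_ij is then allowed to be nonzero only where j appears in row i of this index array.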
For a single data point x with neighbors η_j, the local reconstruction error is

  ε = | x - Σ_j w_j η_j |^2 = | Σ_j w_j (x - η_j) |^2   (using Σ_j w_j = 1)
    = Σ_jk w_j w_k C_jk,  where  C_jk = (x - η_j)^T (x - η_k).

Constrained least squares problem:

  min_w Σ_jk w_j w_k C_jk   s.t.   Σ_j w_j = 1

Lagrangian:

  L = Σ_jk w_j w_k C_jk - λ (Σ_j w_j - 1)

  ∂L/∂w_j = 0  (1)   and   ∂L/∂λ = 0  (2) (constraint)

(1) ⟹ 2 Σ_k C_jk w_k - λ = 0  ⟹  w = (λ/2) C^{-1} e,  with e = (1, ..., 1)^T

(2) ⟹ w^T e = 1  ⟹  (λ/2) e^T C^{-1} e = 1  ⟹  λ/2 = 1 / (e^T C^{-1} e)

so that

  w = C^{-1} e / (e^T C^{-1} e),

or in componentwise form:  w_j = Σ_k (C^{-1})_jk / Σ_lm (C^{-1})_lm
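The closed-form solution can be sketched in NumPy. The function name and the regularization term are additions of this sketch: the local Gram matrix C is singular whenever the number of neighbors exceeds the input dimension, so a small ridge is commonly added to its diagonal.

```python
import numpy as np

def reconstruction_weights(x, neighbors, reg=1e-3):
    """Solve the constrained least-squares problem in closed form:
    w = C^{-1} e / (e^T C^{-1} e), with C_jk = (x - eta_j).(x - eta_k).
    `neighbors` is a K x D array with one neighbor eta_j per row."""
    diff = x - neighbors                  # K x D matrix of (x - eta_j)
    C = diff @ diff.T                     # local Gram matrix C_jk
    C += reg * np.trace(C) * np.eye(len(C))   # ridge: keeps C invertible
    w = np.linalg.solve(C, np.ones(len(C)))   # solve C w = e, not C^{-1}
    return w / w.sum()                    # enforce sum_j w_j = 1
```

Solving C w = e and normalizing gives the same answer as the explicit formula, since the Lagrange multiplier only sets the overall scale.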
The optimal weights are invariant to translations of the data:

  Σ_i | (X_i - t) - Σ_j W_ij (X_j - t) |^2 = Σ_i | X_i - Σ_j W_ij X_j |^2,

because t - Σ_j W_ij t = 0, since Σ_j W_ij = 1.
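This invariance is easy to check numerically with an arbitrary weight matrix whose rows sum to 1:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))          # 10 points in D = 3 dimensions
W = rng.random((10, 10))
np.fill_diagonal(W, 0.0)              # no self-reconstruction
W /= W.sum(axis=1, keepdims=True)     # each row sums to 1

def error(X, W):
    return np.sum((X - W @ X) ** 2)   # sum_i |X_i - sum_j W_ij X_j|^2

t = rng.normal(size=3)                # arbitrary translation
assert np.isclose(error(X, W), error(X - t, W))
```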
The same weights Wij that reconstruct the i-th data point in
D dimensions should also reconstruct its embedded
manifold coordinates in d dimensions (d<D).
Cost function:

  Φ(Y) = Σ_i | Y_i - Σ_j W_ij Y_j |^2 = Σ_ij M_ij (Y_i^T Y_j),

where

  M_ij = δ_ij - W_ij - W_ji + Σ_k W_ki W_kj
Constraints:
1. Coordinates centered at the origin:  Σ_i Y_i = 0
2. The embedding vectors have unit covariance:  (1/N) Σ_i Y_i Y_i^T = I
   (Reconstruction errors for different coordinates should be measured on the same scale.)
  min_Y tr( Y^T (I - W)^T (I - W) Y )

Y: bottom eigenvectors of M = (I - W)^T (I - W)
(The very bottom eigenvector, constant with eigenvalue 0, is discarded — this enforces the centering constraint — and the next d eigenvectors give the embedding.)
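A minimal sketch of this eigenvalue step, assuming the weight matrix W is already computed and stored as a dense N x N array (the function name is illustrative):

```python
import numpy as np

def embed(W, d):
    """Return the N x d embedding whose columns are the bottom
    non-constant eigenvectors of M = (I - W)^T (I - W)."""
    N = W.shape[0]
    M = (np.eye(N) - W).T @ (np.eye(N) - W)
    vals, vecs = np.linalg.eigh(M)   # eigh sorts eigenvalues ascending
    # Skip eigenvector 0 (the constant vector, eigenvalue ~ 0); dropping
    # it is equivalent to the centering constraint sum_i Y_i = 0.
    return vecs[:, 1:d + 1]
```

Because the remaining eigenvectors are orthogonal to the constant one, each embedding coordinate automatically sums to zero.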
  w_j = Σ_k (C^{-1})_jk / Σ_lm (C^{-1})_lm
• For each data point, the user only needs to supply its NNs
and the submatrix of pairwise distances between those
neighbors.
• Is it possible to recover manifold structure with less
information?
• The answer is NO.
• This embedding fails to preserve the
underlying structure of the original
manifold.
[Figure: two-dimensional embeddings computed by PCA vs. LLE]
• 20 x 28 grayscale images.
• Two-dimensional
embeddings of faces.
• K = 12 nearest neighbors.
• The coordinates of the embeddings are related to meaningful attributes, such as the pose and expression of a human face.
• The coordinates of
the embeddings are
related to meaningful
attributes, such as the
semantic associations
of words.
  min tr( Y (I - W)^T (I - W) Y^T )   over Y ∈ R^{d×N} with Y Y^T = I

Y^T: bottom eigenvectors of M = (I - W)^T (I - W)
• Note that Σ_i C_ij = 0, since the x_i's are centered.
• Writing D_ij = |x_i - x_j|^2 = C_ii + C_jj - 2 C_ij and averaging over i, over j, and over i,j yields:

  D_j ≡ (1/N) Σ_i D_ij = tr(C)/N + C_jj  ⟹  C_jj = D_j - tr(C)/N
  D_i ≡ (1/N) Σ_j D_ij = C_ii + tr(C)/N  ⟹  C_ii = D_i - tr(C)/N
  D_0 ≡ (1/N^2) Σ_ij D_ij = 2 tr(C)/N

so that

  C_ij = (1/2) (D_i + D_j - D_ij - D_0)
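This recovery of the Gram matrix from squared pairwise distances (the classical MDS double-centering step) can be sketched as:

```python
import numpy as np

def gram_from_distances(D):
    """Recover the Gram matrix C_ij = x_i . x_j of centered points from
    the squared pairwise distances D_ij = |x_i - x_j|^2, using
    C_ij = (D_i + D_j - D_ij - D_0) / 2."""
    Di = D.mean(axis=1)    # row averages D_i
    Dj = D.mean(axis=0)    # column averages D_j
    D0 = D.mean()          # grand average D_0
    return 0.5 * (Di[:, None] + Dj[None, :] - D - D0)
```

This is exact when the x_i are centered; otherwise it returns the Gram matrix of the centered copy of the data, which is all LLE needs.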