Singular Value Decomposition

Singular value decomposition (SVD) is a matrix decomposition technique that can be used for dimensionality reduction. It decomposes a matrix into three component matrices, one of which contains the singular values in descending order. SVD has applications in information retrieval and latent semantic analysis, where it can reduce the dimensionality of document-term matrices to address issues like synonyms. It maps words with similar meanings to the same compressed space.


SAHAS R.

PARAB
22MDT1081

APPLICATION OF SVD IN AI, ML OR DEEP LEARNING


Singular Value Decomposition (SVD) is a decomposition method for splitting an arbitrary matrix A with m rows and n columns (and rank r, i.e., A has r linearly independent columns) into a product of three matrices:

A = UΣVᵀ

where:

• Σ (Sigma) is a diagonal matrix of dimension r ∗ r whose diagonal entries are non-negative and sorted in descending order. All elements not on the main diagonal are zero, and the diagonal elements of Σ are called singular values. Another common notation for this matrix is S.
• U is a matrix with dimension m ∗ r and V is a matrix with dimension n ∗ r; both have orthonormal columns (when square, they are orthogonal matrices).
	o An orthogonal matrix is a square matrix whose columns are mutually perpendicular, i.e., the dot product of any two distinct columns is 0. Given an orthogonal matrix Q, QᵀQ = QQᵀ = I and Qᵀ = Q⁻¹.
	o An orthonormal matrix is an orthogonal matrix whose columns are also unit vectors.

[Figure: a classic visual representation of SVD.]
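As a minimal sketch (assuming Python with NumPy, which the original does not specify), the decomposition and the properties listed above can be checked directly; the matrix A below is an arbitrary example.

```python
import numpy as np

# An arbitrary 4x3 example matrix (m = 4 rows, n = 3 columns, rank r = 3).
A = np.array([[1., 0., 0.],
              [0., 2., 0.],
              [0., 0., 3.],
              [1., 1., 1.]])

# Thin SVD: U is m x r, s holds the singular values, Vt is V transposed.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Singular values are non-negative and sorted in descending order.
assert np.all(s >= 0) and np.all(s[:-1] >= s[1:])

# Columns of U and V are orthonormal: U^T U = I and V^T V = I.
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(Vt @ Vt.T, np.eye(3))

# Reconstruction: A = U Σ V^T, with Σ = diag(s).
assert np.allclose(U @ np.diag(s) @ Vt, A)
```

`full_matrices=False` gives the thin (economy) SVD, which keeps only the r columns of U and V that correspond to the singular values.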


Applications: Information Retrieval

Singular Value Decomposition has also been widely used in information retrieval, where it is known as Latent Semantic Analysis (LSA) or Latent Semantic Indexing (LSI). As we will soon see, this idea is very similar to topic models. The basic problem of information retrieval is: given some search terms, retrieve all documents containing those search terms, or, perhaps more usefully, return documents whose content is semantically related to the search terms. For example, if one of the search terms is "automobile", documents containing the term "cars" can also be returned.

One approach to this problem is as follows. Given a document collection, we convert the plain text into a document-term matrix with one row per document and one column per word. We then convert the search terms into a vector in the same space and retrieve the document vectors that are close to the search vector. There are some problems with this vector-based retrieval.

• First of all, the space has a very high dimension. For example, a typical document collection can easily involve more than 100,000 distinct words, even after stemming (i.e., treating "jump", "jumping", and "jumped" as the same word). This creates distance-measurement problems due to the curse of dimensionality.
• Second, it treats each word as independent, while in languages like English the same word can mean two different things ("left" as the past tense of "leave" versus "left" as a direction), and two different words can mean the same thing ("car" and "automobile").



By applying SVD, we can reduce the dimensionality to speed up search, and words with similar meanings are mapped to the same truncated space.
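A small sketch of this idea in Python with NumPy (the vocabulary, documents, and counts below are made-up toy data, not taken from the original): after truncating the SVD to k dimensions, semantically similar words such as "car" and "automobile" land close together in the latent space.

```python
import numpy as np

# Made-up toy document-term matrix: rows = documents, columns = words.
vocab = ["car", "automobile", "engine", "banana", "fruit"]
A = np.array([
    [2., 1., 1., 0., 0.],   # a document about cars
    [1., 2., 1., 0., 0.],   # another document about cars
    [0., 0., 0., 2., 1.],   # a document about fruit
    [0., 0., 0., 1., 2.],   # another document about fruit
])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the k largest singular values (truncated SVD).
k = 2
word_vecs = Vt[:k].T * s[:k]   # each word as a point in the k-dim latent space

def cos(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

idx = {w: j for j, w in enumerate(vocab)}

# "car" and "automobile" end up close together in the latent space
# (cosine near 1), while "car" and "banana" do not (cosine near 0).
print(cos(word_vecs[idx["car"]], word_vecs[idx["automobile"]]))
print(cos(word_vecs[idx["car"]], word_vecs[idx["banana"]]))
```

Because "car" and "automobile" co-occur in the same documents, the truncated SVD folds them into the same latent dimension even though they never share a column.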



After applying LSA, we can use the compressed features to see which documents are most similar to a particular document, for example by computing the pairwise cosine similarity of all the documents.
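The original code chunk is not reproduced in this copy; the following is a minimal sketch of such a computation, assuming Python with NumPy and a made-up toy document-term matrix.

```python
import numpy as np

# Made-up toy document-term matrix: rows = documents, columns = terms.
A = np.array([
    [2., 1., 1., 0., 0.],   # a document about cars
    [1., 2., 1., 0., 0.],   # another document about cars
    [0., 0., 0., 2., 1.],   # a document about fruit
    [0., 0., 0., 1., 2.],   # another document about fruit
])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Compressed document features: each document projected onto the
# top-k latent (LSA) dimensions.
k = 2
docs = U[:, :k] * s[:k]

# Pairwise cosine similarity of all the documents.
unit = docs / np.linalg.norm(docs, axis=1, keepdims=True)
sim = unit @ unit.T

print(np.round(sim, 2))
```

The two car documents come out with cosine similarity close to 1, while a car document and a fruit document come out close to 0.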
