
Singular Value Decomposition

Ritu Raj Singh


November 2024

Introduction
Singular Value Decomposition (SVD) is a powerful mathematical technique used in linear algebra for factorizing
a matrix into three distinct matrices. It has numerous applications across fields such as signal processing,
machine learning, data compression, and recommendation systems.

• SVD Formula:
For a given matrix A of size m × n, SVD decomposes it into three matrices (a short NumPy sketch follows this list):

A = UΣV^T

Where:
1. U: an m × m orthogonal matrix whose columns are the left singular vectors of A.
2. Σ: an m × n diagonal matrix whose diagonal elements (σ_1, σ_2, σ_3, ...) are the singular values of A, sorted in descending order. These values represent the “strength” of each singular vector.
3. V^T: the transpose of an n × n orthogonal matrix V, whose columns are the right singular vectors of A.
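
The sketch below illustrates this decomposition with NumPy; the example matrix A and the use of numpy.linalg.svd are illustrative choices, not part of the original text.

```python
import numpy as np

# Illustrative example matrix (m = 4, n = 3); any real matrix works.
A = np.array([[ 3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0],
              [ 2.0, 0.0, 1.0],
              [ 1.0, 1.0, 0.0]])
m, n = A.shape

# Full SVD: U is m x m, Vt is n x n, s holds the singular values
# (length min(m, n)) in descending order.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
print(U.shape, s.shape, Vt.shape)   # (4, 4) (3,) (3, 3)

# Rebuild the m x n diagonal matrix Sigma and reconstruct A = U Sigma V^T.
Sigma = np.zeros((m, n))
k = len(s)
Sigma[:k, :k] = np.diag(s)
A_reconstructed = U @ Sigma @ Vt
assert np.allclose(A, A_reconstructed)
```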

Key Properties
• The rank of A is equal to the number of non-zero singular values.
• U and V are orthogonal: U^T U = I and V^T V = I, where I is the identity matrix (checked numerically in the sketch below).
• Truncating Σ to its k largest singular values yields the best rank-k approximation of A (in both the Frobenius and spectral norms, by the Eckart–Young theorem).
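
A quick numerical check of the first two properties; the rank-2 test matrix built from an outer product is an assumption made for illustration only.

```python
import numpy as np

# Illustrative matrix with known rank 2 (product of a 5x2 and a 2x4 matrix).
rng = np.random.default_rng(1)
A = rng.normal(size=(5, 2)) @ rng.normal(size=(2, 4))

U, s, Vt = np.linalg.svd(A, full_matrices=True)

# The rank of A equals the number of singular values above a small tolerance.
tol = max(A.shape) * np.finfo(A.dtype).eps * s.max()
assert int(np.sum(s > tol)) == np.linalg.matrix_rank(A) == 2

# U and V are orthogonal: U^T U = I and V^T V = I.
assert np.allclose(U.T @ U, np.eye(U.shape[0]))
assert np.allclose(Vt @ Vt.T, np.eye(Vt.shape[0]))
```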

Applications
1. Dimensionality Reduction: By keeping only the largest singular values and corresponding singular vectors, SVD reduces the dimensionality of data; this underlies techniques such as PCA (Principal Component Analysis). A small sketch of this idea appears after the list.

2. Recommendation Systems: SVD is employed in collaborative filtering to predict user preferences by decom-
posing the user-item interaction matrix.

3. Image Compression: By approximating an image matrix with fewer singular values, SVD can significantly
reduce the storage space required for images.

4. Noise Reduction: Removing smaller singular values helps filter out noise from data.
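
As a concrete illustration of application 1, here is a minimal PCA-style dimensionality-reduction sketch using SVD; the synthetic data X and the choice k = 2 are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))          # 100 samples, 5 features (synthetic data)

# Center the data, then take the SVD of the centered matrix.
X_centered = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(X_centered, full_matrices=False)

# Keep the k directions with the largest singular values and project onto them.
k = 2
X_reduced = X_centered @ Vt[:k].T      # shape (100, k): reduced representation
explained = (s[:k] ** 2).sum() / (s ** 2).sum()
print(f"variance explained by top {k} components: {explained:.2%}")
```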

Example: Reconstructing a Matrix using SVD


1. Perform SVD on A to get U, Σ, and V^T.
2. Use the top k singular values (and corresponding vectors) to approximate A as:

A_k = U_k Σ_k V_k^T

where A_k is the rank-k approximation of A.
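
A minimal NumPy sketch of these two steps follows; the example matrix A and the choice k = 2 are illustrative assumptions, not taken from the original text.

```python
import numpy as np

A = np.array([[ 3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0],
              [ 2.0, 0.0, 1.0],
              [ 1.0, 1.0, 0.0]])

# Step 1: SVD of A (thin form is enough for a truncated reconstruction).
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Step 2: keep the top k singular values/vectors and form A_k = U_k Sigma_k V_k^T.
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The relative error shrinks as k grows; at k = min(m, n) the reconstruction is exact.
rel_err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(f"rank-{k} relative Frobenius error: {rel_err:.3f}")
```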
