Part 1
Introduction
Singular Value Decomposition (SVD) is a powerful mathematical technique used in linear algebra for factorizing
a matrix into three distinct matrices. It has numerous applications across fields such as signal processing,
machine learning, data compression, and recommendation systems.
• **SVD Formula:**
For a given matrix A of size m × n, SVD decomposes it into three matrices:
A = U Σ V^T
Where:
1. U : An m × m orthogonal matrix whose columns are the left singular vectors of A.
2. Σ: An m × n diagonal matrix whose diagonal entries (σ1, σ2, σ3, . . .) are the singular values of A, sorted in descending order. These values represent the "strength" of each singular direction.
3. V^T : The transpose of an n × n orthogonal matrix V, whose columns are the right singular vectors of A.
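The decomposition above can be sketched numerically with NumPy; the small 3 × 2 matrix below is a made-up example, not from the text:

```python
import numpy as np

# Illustrative sketch: compute the full SVD of a small matrix.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])  # m = 3, n = 2

# full_matrices=True returns U (m x m), s (min(m, n) singular values,
# already sorted in descending order), and Vt = V^T (n x n).
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Embed the singular values into an m x n diagonal matrix Sigma.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)

# The product U Sigma V^T reconstructs A up to floating-point error.
print(np.allclose(A, U @ Sigma @ Vt))  # True
```

Note that `np.linalg.svd` returns V^T directly (here named `Vt`), matching the A = U Σ V^T convention.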
Key Properties
• The rank of A is equal to the number of non-zero singular values.
• U and V are orthogonal: U T U = I and V T V = I, where I is the identity matrix.
• SVD provides the best low-rank approximation of A by truncating the singular values in Σ.
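These properties can be checked numerically; the rank-deficient 3 × 3 matrix below is a hypothetical example chosen so that one row is a multiple of another:

```python
import numpy as np

# Sketch: verify the key properties on a rank-2 matrix.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # 2x the first row, so rank(A) = 2
              [0.0, 1.0, 1.0]])

U, s, Vt = np.linalg.svd(A)

# Rank equals the number of singular values above a small tolerance.
rank = int(np.sum(s > 1e-10))
print(rank, np.linalg.matrix_rank(A))  # 2 2

# U and V are orthogonal: U^T U = I and V^T V = I.
print(np.allclose(U.T @ U, np.eye(3)))   # True
print(np.allclose(Vt @ Vt.T, np.eye(3)))  # True
```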
Applications
1. Dimensionality Reduction: By keeping only the largest singular values and corresponding singular vectors,
SVD reduces the dimensionality of data, often used in techniques like PCA (Principal Component Analysis).
2. Recommendation Systems: SVD is employed in collaborative filtering to predict user preferences by decom-
posing the user-item interaction matrix.
3. Image Compression: By approximating an image matrix with fewer singular values, SVD can significantly
reduce the storage space required for images.
4. Noise Reduction: Removing smaller singular values helps filter out noise from data.
Truncating to the k largest singular values and their corresponding singular vectors gives the best rank-k approximation of A:
A_k = U_k Σ_k V_k^T
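The truncated approximation underlying applications 1, 3, and 4 can be sketched as follows; the random 4 × 4 matrix is an arbitrary stand-in for real data:

```python
import numpy as np

# Sketch: rank-k approximation A_k = U_k Sigma_k V_k^T.
A = np.random.default_rng(0).normal(size=(4, 4))
U, s, Vt = np.linalg.svd(A)

k = 2  # keep only the two largest singular values
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# By the Eckart-Young theorem, the spectral-norm error of the best
# rank-k approximation equals the first discarded singular value.
err = np.linalg.norm(A - A_k, ord=2)
print(np.isclose(err, s[k]))  # True
```

Storing U_k, Σ_k, and V_k requires only k(m + n + 1) numbers instead of mn, which is the source of the compression gains mentioned above.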