
Image Compression: Singular Value Decomposition

Sumaiya Ahmed Rani, 210041223


Computer Science and Engineering
Islamic University of Technology

Abstract

Image compression is a crucial aspect of digital data storage and transmission, aiming to reduce the size of images while preserving essential visual information. Images can be represented by a large m by n matrix. Using matrices and their various properties, we can manipulate images in many useful ways.

Singular Value Decomposition (SVD) is a powerful technique in the field of image compression. SVD decomposes any matrix into simpler rank-one pieces and orders those pieces by importance, so that the image can be represented with a smaller set of values that requires much less storage space.

Geometric Interpretation of Singular Values

Applying a matrix A to a vector x can be visualized as performing a rotation (Vᵀ), a scaling (Σ) and another rotation (U) on the vector x. Geometrically, V and U act as rotational transformations, and Σ acts as a scaling transformation. In other words, every linear transformation comprises a rotation, then a scaling, then another rotation.

Data Analysis

K      Image Size (kB)   Error (MSE)   Compression Ratio
5      138               0.0175        0.6145
25     218               0.0095        0.3911
45     251               0.0071        0.2989
65     275               0.0058        0.2318
85     293               0.0486        0.1816
105    309               0.0042        0.1369
125    322               0.0036        0.1006
145    333               0.0031        0.0698
165    342               0.0028        0.0447
185    349               0.0023        0.0251
205    355               0.0021        0.083

We can see that as K increases, so does the image size, while both the error magnitude and the compression ratio decrease.

Approximating the Compressed Image

Any image can be represented as a matrix of pixels. For simplicity we can choose an image matrix containing a simple diamond shape, where the pixels belonging to the shape are set to 1 and all other values are set to 0.
0 0 1 1 0 0
0 1 1 1 1 0
1 1 1 1 1 1
1 1 1 1 1 1
0 1 1 1 1 0
0 0 1 1 0 0

The m by n image bit matrix can be refactored into the matrices U, Σ and Vᵀ.

Application in Image Compression

Singular Value Decomposition (SVD) is a mathematical technique that can be used for image compression. The basic idea is to represent an image as a sum of simpler components, allowing the less significant components to be removed to achieve compression.

Eigenvalues and Eigenvectors

Most vectors change direction when multiplied by a matrix A. Eigenvectors are the special vectors that do not change direction when multiplied by A. Multiplying an eigenvector x by A gives a vector Ax that is a number λ times the original x. The basic equation is

Ax = λx
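The eigenvalue equation can be verified numerically. A minimal NumPy sketch (the matrix A here is an illustrative example, not data from this poster):

```python
import numpy as np

# An illustrative symmetric matrix (chosen only for this example)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors as columns
eigvals, eigvecs = np.linalg.eig(A)

# Check Ax = lambda * x for every eigenpair
for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ x, lam * x)

# The eigenvalues of this particular matrix are 1 and 3
assert np.allclose(sorted(eigvals), [1.0, 3.0])
```

The same machinery underlies SVD: the singular values of A are the square roots of the eigenvalues of AᵀA.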


The number λ is an eigenvalue of A. The eigenvalue λ tells whether the special vector x is stretched, shrunk, reversed or left unchanged when it is multiplied by A. From the equation we can see that A − λI is singular, so we first find the λ values by solving det(A − λI) = 0 and then determine the eigenvectors from the corresponding eigenvalues.

Image Representation

Represent the image matrix as the product of three matrices using SVD: A = UΣVᵀ, where A is the original image matrix, U is a matrix of left singular vectors, Σ is a diagonal matrix of singular values, and Vᵀ is the transpose of a matrix of right singular vectors.

Figure 1. Schematic representation of the singular value decomposition. The matrix Data is being decomposed here. The light gray areas show the original data, and the dark gray areas show the only data used in the matrix approximation.

In order to bring about compression, the dimensions of the diagonal matrix are reduced to Σp×q, where p ≤ m and q ≤ n. After applying SVD, only some singular values from the matrix Σ are kept, while the lower singular values are removed. This can be done because the singular values are arranged in descending order: the first singular value contains more information than the following singular values, which carry decreasing amounts of image information. So the lower singular values, which contain less important information, can be discarded.

The number of non-zero singular values in the diagonal matrix Σ specifies the rank of the image matrix A. If the singular values after a certain rank are negligibly small, they can be treated as redundant and removed.
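The rank and singular values of the diamond image above can be inspected directly. A NumPy sketch (variable names are ours):

```python
import numpy as np

# The 6x6 diamond test image from the text
diamond = np.array([
    [0, 0, 1, 1, 0, 0],
    [0, 1, 1, 1, 1, 0],
    [1, 1, 1, 1, 1, 1],
    [1, 1, 1, 1, 1, 1],
    [0, 1, 1, 1, 1, 0],
    [0, 0, 1, 1, 0, 0],
], dtype=float)

# Full SVD: diamond = U @ diag(s) @ Vt, singular values in descending order
U, s, Vt = np.linalg.svd(diamond)

# Singular values below a small tolerance are numerically zero,
# so the rank is smaller than the matrix dimension
rank = int(np.sum(s > 1e-10))
assert rank == np.linalg.matrix_rank(diamond)
assert rank < min(diamond.shape)

# Reconstructing with only the non-zero singular values is exact
A_r = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]
assert np.allclose(A_r, diamond)
```

Because the diamond has repeated rows, only a few singular values are non-zero, and the image can be stored exactly with far fewer numbers than the 36 original pixels.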

Singular Value Decomposition

SVD states that any matrix A can be factorized into three matrices as

A = UΣVᵀ

where U and V are orthogonal matrices whose columns are orthonormal eigenvectors of the special matrices AAᵀ and AᵀA respectively, and Σ is a diagonal matrix whose r diagonal elements are the square roots of the positive eigenvalues of AAᵀ and AᵀA. These are called the singular values:

diag(√λ1, √λ2, √λ3) = diag(σ1, σ2, σ3)

We order the eigenvectors so that those with higher eigenvalues come before those with smaller ones:

σ1 ≥ σ2 ≥ σ3

There are a few steps to calculating these matrices.

Step 1: Calculate the matrix AᵀA.
Step 2: Determine the eigenvalues and eigenvectors of AᵀA.
Step 3: Form the matrix Vᵀ by normalizing the eigenvectors of AᵀA. In general, we compute the eigenvectors by using the matrix AᵀA − λI and simplifying it for each eigenvalue.
Step 4: Determine the matrix Σ from the eigenvalues of AᵀA and AAᵀ, where σ = √λ.
Step 5: Calculate the eigenvalues and eigenvectors of AAᵀ.
Step 6: Form the matrix U by normalizing the eigenvectors of AAᵀ.
Step 7: Rewrite A as UΣVᵀ.

Rank Reduction

The singular values in Σ represent the importance of the corresponding singular vectors. Higher singular values contribute more to the image reconstruction. By keeping only the top K singular values and their corresponding vectors, where K is much smaller than the original image size, we reduce the rank of the approximation, and as a result the overall storage size is also reduced.

Compression and Compression Ratio

Discard the less significant singular values and their corresponding vectors. The resulting matrix Aₖ = UₖΣₖVₖᵀ represents a compressed version of the original image. The compression ratio is determined by the choice of K. A smaller K results in higher compression but lower image quality, while a larger K retains more detail but compresses less.

Typically, each pixel of an image consists of 3 bytes, for the red, green and blue components. To store the image efficiently we need to encode three matrices R, G and B, one for each color component. In general it is very likely that the image matrices are full-rank, so for a random image of width w and height h the matrix will almost certainly have rank min(w, h). SVD allows us to take an arbitrary matrix and write it as a sum of rank-1 matrices. The original decomposition was

A = σ1 u1 v1ᵀ + σ2 u2 v2ᵀ + ... + σr ur vrᵀ

We can approximate A with the first k < r components:

A ≈ σ1 u1 v1ᵀ + σ2 u2 v2ᵀ + ... + σk uk vkᵀ

This approximation is good because

σ1 ≥ σ2 ≥ ... ≥ σr ≥ 0

so the first singular values contribute more than the following ones.

The key advantage of SVD in image compression is that it allows a flexible trade-off between compression and image quality: it captures the most important features of the image while discarding the less significant ones.
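The rank-K approximation and the metrics reported in the Data Analysis table (MSE and a storage-based compression ratio) can be sketched as follows; the ratio formula used here is one common convention and may differ from the one used to produce the table:

```python
import numpy as np

def compress_svd(A, k):
    """Rank-k SVD approximation of a 2-D image matrix A."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

def mse(A, B):
    """Mean squared error between the original and the approximation."""
    return np.mean((A - B) ** 2)

def storage_ratio(A, k):
    """Values stored by rank-k SVD relative to storing A directly:
    k columns of U, k singular values and k rows of Vt."""
    m, n = A.shape
    return k * (m + n + 1) / (m * n)

# Demo on a random "image"; with k = min(m, n) the reconstruction is exact
rng = np.random.default_rng(0)
A = rng.random((64, 48))
assert np.allclose(compress_svd(A, 48), A)

# A lower rank gives a smaller representation at the cost of some error
A10 = compress_svd(A, 10)
assert mse(A, A10) > 0
assert storage_ratio(A, 10) < 1.0
```

For a color image, the same function would be applied to each of the R, G and B matrices separately.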

In full matrix form, the decomposition is

Am×n = Um×m Σm×n Vᵀn×n

This decomposition provides a broken-down form of the matrix A that isolates the most important components of the original matrix. It yields a modification of the original matrix A whose components are smaller in size, thus reducing the memory required to store the information.

References

1. https://www.lagrange.edu/academics/undergraduate/undergraduate-research/citations/18-Citations2020.Compton.pdf
2. https://www.ripublication.com/irph/ijert_spl17/ijertv10n1spl_94.pdf
3. https://youtu.be/DG7YTlGnCEo
4. https://timbaumann.info/svd-image-compression-demo/
5. https://zerobone.net/blog/cs/svd-image-compression/
