
Al-Mustaqbal University
College of Science
Department of Intelligent Medical Systems

Lecture Seven
Matrix Decomposition Methods in Image Processing

Subject: DSP
Stage: Third
Instructor: Asst. Lect. Reyam Thaer Ahmed
1. Some Types of Matrix Decomposition Methods
2. The Effect of the Matrix Decomposition Methods on Images
a. The Effect of SVD Method on image
b. The Effect of Hessenberg Decomposition Method on image
c. The Effect of QR Decomposition Method on image
d. The Effect of LU Decomposition Method on image

3. Singular Value Decomposition (SVD) and Image Processing


4. Singular Value Decomposition Method Mathematically
5. Computing the SVD by Hand
6. Theorem
7. Some Singular Value Decomposition (SVD) Properties in DIP
Matrix Decomposition Methods in Image Processing

In linear algebra, a matrix decomposition or matrix factorization is a factorization of a matrix into a product of matrices.
There are many different matrix decompositions; each finds use among a
particular class of problems.

1. Some Types of Matrix Decomposition Methods

a. Decompositions based on eigenvalues and related concepts


a.1 Singular value decomposition
a.2 Hessenberg decomposition
a.3 Eigen decomposition
a.4 Jordan decomposition
a.5 Schur decomposition
a.6 Real Schur decomposition
a.7 QZ decomposition
a.8 Takagi's factorization
a.9 Scale-invariant decompositions

b. Decompositions related to solving systems of linear equations


b.1 QR decomposition
b.2 LU decomposition
b.3 Cholesky decomposition
b.4 LU reduction
b.5 Block LU decomposition
b.6 Rank factorization
b.7 RRQR factorization
b.8 Interpolative decomposition
c. Other decompositions
c.1 Polar decomposition
c.2 Algebraic polar decomposition
c.3 Mostow's decomposition
c.4 Sinkhorn normal form
c.5 Sectoral decomposition
c.6 Williamson's normal form
Matrix decomposition methods from linear algebra have found numerous applications in image processing, so it is natural to investigate them in this setting. The following is a list of further applications of matrix decomposition methods:
 Data Compression
 CT Scan Reconstruction
 Linear Regression (Least Square Systems)
 Spectral Clustering
 Moore-Penrose Pseudo Inverse
 Signal Estimation Theory
 Derivation of the Recursive Least-Squares Filter
 Sensor Array Signal Processing
In addition, in machine learning and statistics we often have to deal with structured data, which is generally represented as a table of rows and columns, i.e. a matrix. Many problems in machine learning can be solved using matrix algebra and vector calculus. Typical applications include background removal, topic modeling, recommendations using collaborative filtering, and eigenfaces.

2. The Effect of the Matrix Decomposition Methods on Images:

A multitude of matrix decomposition techniques stemming from linear algebra have been applied to image processing. When the decomposition methods are performed on the whole image, the results are as follows.

a- The Effect of SVD Method on image

I = U S Vᵀ

[Figure: the original image shown beside its U-matrix, S-matrix, and Vᵀ-matrix]

[Flowchart: the original image is fed into the SVD, which produces the factors U, S, and Vᵀ]
b- The Effect of Hessenberg Decomposition Method on image

A = P H Pᵀ

[Figure: the original image (gray) shown beside its H-matrix and P-matrix]

[Flowchart: the original image is fed into the Hessenberg decomposition method (HDM), which produces P and H]

c- The Effect of QR Decomposition Method on image

A = Q R

[Figure: the original image (gray) shown beside its R-matrix and Q-matrix]

[Flowchart: the original image is fed into the QR matrix decomposition method, which produces Q and R]

d- The Effect of LU Decomposition Method on image

A = L U

[Figure: the original image (gray) shown beside its U-matrix and L-matrix]

[Flowchart: the original image is fed into the LU matrix decomposition method, which produces L and U]
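A sketch of the three remaining decompositions applied to the same image matrix, assuming SciPy is installed and that I is the grayscale matrix from the SVD sketch above (Hessenberg decomposition requires a square matrix, hence the crop):

import numpy as np
from scipy.linalg import hessenberg, qr, lu

n = min(I.shape)
A = I[:n, :n]                        # crop to a square matrix

H, P = hessenberg(A, calc_q=True)    # A = P @ H @ P.T
Q, R = qr(A)                         # A = Q @ R
Pm, Lm, Um = lu(A)                   # A = Pm @ Lm @ Um, Pm a permutation

# Each factorization reproduces A up to floating-point rounding.
print(np.allclose(A, P @ H @ P.T))
print(np.allclose(A, Q @ R))
print(np.allclose(A, Pm @ Lm @ Um))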

3. The Singular Value Decomposition (SVD) and Image Processing

Singular Value Decomposition (SVD) has recently emerged as a new paradigm for processing different types of images. SVD is an attractive algebraic transform for image processing applications.
In linear algebra, the SVD is a factorization of a rectangular real or complex matrix, analogous to the diagonalization of symmetric or Hermitian square matrices using a basis of eigenvectors. SVD is a stable and effective method to split the system into a set of linearly independent components, each of them bearing its own energy contribution.
In digital image processing, image features are divided into four groups: visual features, statistical pixel features, transform coefficient features, and algebraic features. The SVD technique can be considered an algebraic feature. Algebraic features usually represent intrinsic properties of the image.
The SVD method transforms a matrix A into the product U S Vᵀ, which allows us to refactor a digital image into three matrices. Using the singular values of such a refactoring allows us to represent the image with a smaller set of values, which can preserve useful features of the original image while using less storage space in memory, thereby achieving image compression.

The objective of this section is to apply the linear-algebra technique of Singular Value Decomposition (SVD) to mid-level image processing, such as image compression and recognition. The method factors a matrix A into three new matrices U, S, and V such that A = U S Vᵀ, where U and V are orthogonal matrices and S is a diagonal matrix.

4. Singular Value Decomposition Method Mathematically:

Let A be any m×n matrix. Then there are orthogonal matrices U and V, and a diagonal matrix S, such that A = U S Vᵀ.
5. Computing the SVD by Hand:

We now list a simplistic algorithm for computing the SVD of a matrix A. It can be used fairly easily for manual computation of small examples. For a given m×n matrix A, the procedure is as follows (a numerical check appears after the example):

1. Form the n×n symmetric matrix AᵀA.
2. Find the eigenvalues λ₁ ≥ λ₂ ≥ … ≥ λₙ ≥ 0 of AᵀA and an orthonormal set of corresponding eigenvectors v₁, …, vₙ; these eigenvectors are the columns of V.
3. The singular values are σⱼ = √λⱼ, placed in descending order on the diagonal of S.
4. For each σⱼ > 0, compute uⱼ = (1/σⱼ) A vⱼ; these vectors are the columns of U (extended to an orthonormal basis if necessary).

[Worked example: the original slides show a small matrix together with its resulting SVD form.]
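The same procedure can be verified numerically. A minimal sketch, using an arbitrary invertible 2×2 matrix as a stand-in for the worked example (which is not reproduced here):

import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])           # arbitrary small example

# Steps 1-2: eigen-decomposition of A^T A gives V and the eigenvalues.
lam, V = np.linalg.eigh(A.T @ A)     # eigh returns ascending order
order = np.argsort(lam)[::-1]        # reorder to descending
lam, V = lam[order], V[:, order]

# Step 3: the singular values are the square roots of the eigenvalues.
sigma = np.sqrt(lam)

# Step 4: u_j = A v_j / sigma_j (valid here since all sigma_j > 0).
U = (A @ V) / sigma

print(np.allclose(A, U @ np.diag(sigma) @ V.T))   # expected: True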
6. Theorem:
Any m×n real matrix A can be factored into a product of the form U S Vᵀ, called the SVD of A, where U and V are orthogonal matrices and S is an m×n diagonal matrix whose diagonal entries, called the singular values of A, are all real and satisfy the following:

σ₁ ≥ σ₂ ≥ … ≥ σₖ ≥ 0,   where k = min(m, n)

Let σⱼ denote the jth singular value along the diagonal of S for j = 1, …, k. If uⱼ and vⱼ represent the jth column vectors of U and V, respectively, then A can be written as

A = σ₁u₁v₁ᵀ + σ₂u₂v₂ᵀ + ⋯ + σₖuₖvₖᵀ    (Complete Form) (1)
We can approximate 𝐴 by matrices of lower rank by truncating the expansion
(1). Most of the information contained in 𝐴 will be reproduced using relatively
few terms of the expansion (1). We expect a matrix of the form

Aᵣ = σ₁u₁v₁ᵀ + σ₂u₂v₂ᵀ + ⋯ + σᵣuᵣvᵣᵀ    (Truncated Form)
to adequately represent the original image given by A even if r is much smaller than k (where r is the number of the largest singular values of A that are kept), because we are using the largest singular values first. If σᵣ > 0, then Aᵣ is a rank-r approximation to A. Students can reconstruct the images using the SVD with different ranks r. The total storage for Aᵣ will be

Tₛ(Aᵣ) = r(m + n + 1)
The integer r can be chosen considerably smaller than n, and the digital image corresponding to Aᵣ will still be very close to the original image. However, different choices of r give different corresponding images and different storage requirements. For typical choices of r, the storage required for Aᵣ will be less than 20 percent of that of the full image.
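A sketch of the truncated expansion and its storage count, reusing the factors U, s, Vt from the image SVD computed earlier; the rank r = 30 is an arbitrary choice for illustration:

import numpy as np

r = 30                                   # arbitrary truncation rank
A_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]   # sum of r rank-one terms

m, n = A_r.shape
print("full image storage:", m * n)            # m*n entries
print("truncated storage :", r * (m + n + 1))  # T_s(A_r) = r(m+n+1)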

Using the command subplot, they can plot all these approximations along with
the original image in the same window for easy comparison.

Moreover, they can compute the error between the original image and its approximations. One way of doing this is through the Frobenius norm of a matrix, which is defined as

‖A‖_F = √( Σᵢ Σⱼ |aᵢⱼ|² )

Let A_c represent a compressed version of the image A. We define the relative error as

E = ‖A − A_c‖_F / ‖A‖_F

Students can compute the relative error in the Frobenius norm of the image A at different ranks and check whether the results of the norm roughly agree with the error based on visual perception. They can investigate, for example, how large the rank needs to be so that the relative error (in the Frobenius norm) is less than 5%.
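Both tasks fit in a few lines. A sketch, again assuming the factors U, s, Vt of the image I from earlier (the list of ranks is an arbitrary choice):

import numpy as np
import matplotlib.pyplot as plt

ranks = [10, 30, 70, 100]
for k, r in enumerate(ranks):
    A_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
    # Relative error in the Frobenius norm.
    err = np.linalg.norm(I - A_r, "fro") / np.linalg.norm(I, "fro")
    plt.subplot(1, len(ranks), k + 1)
    plt.imshow(A_r, cmap="gray")
    plt.title(f"r = {r}, err = {err:.1%}")
plt.show()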
[Figure: grayscale original image of size 497×498 and its low-rank approximations]

As we can see, at the 10th iteration the core of singular values is a 10×10 matrix containing only 100 entries; from the 30th iteration onward we get an image close to the original; and the 70th iteration, i.e. a 70×70 matrix with 4,900 entries, is still a dramatic reduction of the original 497×498 matrix with 247,506 entries. So there is no need to go up to the 100th iteration.

In the following examples, we will show how the SVD works in several
applications in DIP.

7. Some Singular Value Decomposition (SVD) Properties in DIP:

The main properties of the SVD are:

a- The singular values σ₁, σ₂, …, σₙ are unique, but the matrices U and V are not unique.
b- The SVD method is a robust and reliable orthogonal matrix decomposition method.

For conceptual and stability reasons, the SVD has become more and more popular in the signal processing area. SVD is an attractive algebraic transform for image processing and has prominent properties in imaging. Although some SVD properties are fully utilized in image processing, others still need more investigation.

c- The SVD packs the maximum signal energy into as few coefficients as possible. It has the ability to adapt to the variations in local statistics of an image. However, since the SVD is an image-adaptive transform, the transform itself (the matrices U and V) needs to be stored alongside the coefficients in order to recover the data.

d- The SVD method decomposes a matrix into orthogonal components with which optimal sub-rank approximations may be obtained. The largest object components in an image found using the SVD generally correspond to eigenimages associated with the largest singular values, while image noise corresponds to eigenimages associated with the smallest singular values. The SVD is used to approximate the matrix by decomposing the data into an optimal estimate of the signal component and the noise component. This is one of the most important properties of the SVD in noise filtering, compression, and forensics, where it can also be treated as adding noise in a properly detectable way.
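A sketch of this noise-filtering property, assuming the grayscale image I from earlier; the noise level (10% of the image's standard deviation) and the cut-off rank 50 are arbitrary illustrative choices:

import numpy as np

rng = np.random.default_rng(0)
noisy = I + 0.1 * I.std() * rng.standard_normal(I.shape)

# Keep only the dominant eigenimages of the noisy image.
Un, sn, Vtn = np.linalg.svd(noisy, full_matrices=False)
denoised = Un[:, :50] @ np.diag(sn[:50]) @ Vtn[:50, :]

# The truncated reconstruction should sit closer to the clean image.
print(np.linalg.norm(I - denoised, "fro"))
print(np.linalg.norm(I - noisy, "fro"))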

Study Year: 2024-2025

You might also like