
A project

On
“EIGENVALUES & EIGENVECTORS
AND ITS APPLICATION”

Submitted By
MADHURI GARNAIK

Exam Roll No: 22SMTH-030

Guided By

Mr. SANDIP ROUT

PG DEPARTMENT OF MATHEMATICS
GOVERNMENT (AUTONOMOUS) COLLEGE, ANGUL

ODISHA - 759143

GOVERNMENT (AUTONOMOUS) COLLEGE,
ANGUL

Certificate
This is to certify that the project work entitled “EIGENVALUE,
EIGENVECTOR & ITS APPLICATION”, submitted to the Post
Graduate Department of Mathematics, Government (Autonomous)
College, Angul, in partial fulfilment of the requirements for the course
Master of Science in Mathematics, is work done by Ms.
MADHURI GARNAIK under my supervision and guidance, and
that this work has not been submitted elsewhere for the award of any
degree.

Place: Angul
Date:                                                    Mr. Sandip Rout
                                      Assistant Professor of Mathematics
                                    Govt. (Autonomous) College, Angul

Acknowledgement
I sincerely acknowledge my true sense of obligation to Mr. Sandip Rout,
Department of Mathematics, Govt. (Autonomous) College, Angul, who
provided me with persistent encouragement as well as assistance to finish
my dissertation. Finally, I must thank my parents, friends, and teachers
whose blessings helped me complete this project work.

Date: Name: Madhuri Garnaik

Roll No. 21SMTH030

Government (Autonomous) College, Angul

Angul – 759143, Orissa, India

Government (Autonomous) College,
Angul
PG Department of Mathematics

Declaration

I, MADHURI GARNAIK, do hereby declare that my project
“Eigenvalue, Eigenvector & its Application” is my original work. It
has not been submitted to any other institution or published at any time
before for any purpose whatsoever. This declaration is made to fulfil the
requirement for the award of B.Sc. Mathematics of Government
(Autonomous) College, Angul.

Date:                                    Name: Madhuri Garnaik

Place: Angul                             3rd Year B.Sc. Mathematics

Exam Roll No: 22SMTH-030

ABSTRACT

Eigenvalues and eigenvectors, fundamental concepts in linear algebra, play a
pivotal role in understanding the behavior of linear transformations and
matrices. This project delves into the theoretical underpinnings of
eigenvalues and eigenvectors, elucidating their definitions, properties, and
significance in various mathematical contexts. Furthermore, computational
techniques for computing eigenvalues and eigenvectors, such as the power
method and the QR algorithm, are explored in detail. Practical applications of
eigenvalues and eigenvectors are also investigated, showcasing their utility
in diverse fields. Examples include their use in analyzing dynamical systems
to predict long-term behavior, in image processing for feature extraction and
compression, and in quantum mechanics to describe the behavior of
quantum states. Through these examples, the project demonstrates the
indispensable role of eigenvalues and eigenvectors in solving real-world
problems across disciplines. By elucidating both the theoretical foundations
and practical applications, this project aims to provide a comprehensive
understanding of eigenvalues and eigenvectors, empowering learners to
leverage these concepts effectively in their own mathematical and scientific
endeavors.

CONTENTS
1. Introduction

   1.1 Definition of eigenvalues and eigenvectors

   1.2 Importance of eigenvalues and eigenvectors in mathematics and science

   1.3 Brief overview of applications of eigenvalues and eigenvectors

2. Mathematical Background

   2.1 Vector

   2.2 Linear combination

   2.3 Linearly dependent & independent vectors

   2.4 Test for linear independence (with examples)

3. Methods for finding eigenvalues and eigenvectors

4. Properties of eigenvalues and eigenvectors

   4.1 Properties of eigenvectors with examples

   4.2 Properties of eigenvalues with examples

5. Applications

   5.1 Vibration Analysis

   5.2 Image Compression

   5.3 Quantum Mechanics

   5.4 Machine Learning

   5.5 Medical Imaging

   5.6 Financial Modeling

   5.7 Climate Modeling

6. Conclusion

7. References

1. INTRODUCTION
Eigenvalues and eigenvectors are fundamental concepts in linear
algebra, playing a crucial role in understanding linear transformations and
their applications.

1.1 Definition of Eigenvalues and Eigenvectors

• Eigenvalue: An eigenvalue is a scalar value, λ, that represents how
much a linear transformation changes a vector. In other words, it is a
measure of how much a vector is stretched or shrunk by the linear
transformation.

• Eigenvector: An eigenvector is a non-zero vector, v, that, when the
linear transformation is applied to it, results in a scaled version of the
same vector (Av = λv). In other words, an eigenvector is a direction in
which the linear transformation stretches or shrinks vectors.
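The defining relation Av = λv is easy to check numerically. The sketch below (a NumPy illustration, not part of the original project) verifies it for a small matrix whose eigenpairs are obvious:

```python
import numpy as np

# A small matrix whose eigenpairs are easy to check by hand.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column v of `eigenvectors` satisfies A v = lambda v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(sorted(eigenvalues.tolist()))  # [2.0, 3.0]
```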
1.2 Importance of Eigenvalues and Eigenvectors in
Mathematics and Science

• Linear Algebra: Eigenvalues and eigenvectors help diagonalize
matrices, solve systems of linear equations, and analyze linear
transformations.

• Differential Equations: Eigenvalues and eigenvectors are used to
solve differential equations, particularly those with periodic or
oscillatory solutions.

• Physics: Eigenvalues and eigenvectors describe energy states in
quantum mechanics and analyze vibration modes in mechanical
systems.

• Engineering: Eigenvalues and eigenvectors are applied in control
theory, signal processing, and image compression.

• Data Analysis: Eigenvalues and eigenvectors are used in principal
component analysis (PCA), factor analysis, and clustering algorithms.

1.3 Brief Overview of Applications

• Vibration Analysis: Eigenvalues help analyze vibration modes in
mechanical systems.

• Image Compression: Eigenvalues aid in image compression (e.g.,
JPEG).

• Quantum Mechanics: Eigenvalues describe energy states in quantum
systems.

• Machine Learning: Eigenvalues optimize neural networks and
clustering algorithms.

• Medical Imaging: Eigenvectors help reconstruct images in MRI.

• Financial Modeling: Eigenvalues analyze risk and optimize portfolios.

• Climate Modeling: Eigenvectors predict climate patterns.

2. MATHEMATICAL BACKGROUND

2.1 Vector: The vector 2i + 3j - 4k can be regarded as the triplet (2, 3, -4).

 Definition: An ordered n-tuple (x1, x2, …, xn) of numbers x1, x2, …, xn is
called an n-dimensional vector.

For example, the triplet (2, 3, -4) is a 3-dimensional vector and (1, 0, -2, 3)
is a 4-dimensional vector. A row matrix is also called a row vector and a
column matrix is called a column vector.

 Definition: If X1 = (a1, a2, …, an) and X2 = (b1, b2, …, bn) are two
n-dimensional vectors, then their sum and scalar multiples are
X1 + X2 = (a1 + b1, a2 + b2, …, an + bn) and αX1 = (αa1, αa2, …, αan),
which are n-dimensional vectors. X1 = X2 if and only if a1 = b1, a2 = b2,
…, an = bn.

2.2 Definition - Linear Combination: If X1, X2, …, Xr are r vectors of
n dimensions and α1, α2, …, αr are numbers, then the vector
α1X1 + α2X2 + … + αrXr is called a linear combination of the vectors
X1, X2, …, Xr.

2.3 Definition - Linearly dependent and independent vectors

(a) The set of vectors X1, X2, …, Xr is said to be linearly dependent if
there exist numbers α1, α2, …, αr, not all zero, such that
α1X1 + α2X2 + … + αrXr = 0.

(b) The set of vectors X1, X2, …, Xr is said to be linearly independent if
any relation of the form α1X1 + α2X2 + … + αrXr = 0 ⇒ α1 = 0, α2 = 0,
…, αr = 0.

Note: (i) If X1, X2, …, Xr are linearly dependent, then some vector is a
linear combination of the others.

(ii) In a plane (2-dimensional space), non-collinear vectors are linearly
independent whereas collinear vectors are dependent. In 3-dimensional
space, non-coplanar vectors are linearly independent whereas coplanar
vectors are dependent.

Example: i = (1, 0, 0), j = (0, 1, 0), k = (0, 0, 1) are linearly independent
vectors.

(iii) Any set of vectors containing the zero vector 0 is a linearly dependent
set.

(iv) The rank of an m × n matrix A is equal to the maximum number of
independent column vectors (or row vectors) of A.

2.4 A useful result to test linear independence: Let X1, X2, …, Xn be n
vectors of n-dimensional space. Let A be the matrix having these n vectors
as columns (or rows). Then A is a square matrix of order n. If |A| ≠ 0, then
X1, X2, …, Xn are linearly independent. If |A| = 0, then X1, X2, …, Xn are
linearly dependent.

EXAMPLE 1

Show that the vectors (1, 2, 3), (3, -2, 1), (1, -6, -5) are linearly dependent.

Solution.

Let A =
[ 1   3   1 ]
[ 2  -2  -6 ]
[ 3   1  -5 ]
with the vectors as columns.

Then |A| = 1·(10 + 6) - 3·(-10 + 18) + 1·(2 + 6) = 16 - 24 + 8 = 0

∴ the vectors (1, 2, 3), (3, -2, 1) and (1, -6, -5) are linearly dependent.
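The determinant test above can be reproduced with NumPy (an illustrative sketch, not part of the original project):

```python
import numpy as np

# The vectors of Example 1 placed as the columns of A.
A = np.array([[1,  3,  1],
              [2, -2, -6],
              [3,  1, -5]], dtype=float)

det = np.linalg.det(A)
print(abs(round(det, 10)))  # 0.0, so the columns are linearly dependent
```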

EXAMPLE 2

Show that the vectors X1 = (1, 2, -3, 4), X2 = (3, -1, 2, 1),
X3 = (1, -5, 8, -7) are linearly dependent and find the relation between
them.

Solution. Let
A =
[ 1   2  -3   4 ]
[ 3  -1   2   1 ]
[ 1  -5   8  -7 ]
with the vectors as rows.

We shall use elementary row operations.

∴ A ~
[ 1   2  -3    4 ]
[ 0  -7  11  -11 ]   R2 → R2 - 3R1 = R2′, R3 → R3 - R1 = R3′
[ 0  -7  11  -11 ]

~
[ 1   2  -3    4 ]
[ 0  -7  11  -11 ]   R3 → R3′ - R2′ = R3″
[ 0   0   0    0 ]

Since the maximum number of non-zero rows is 2, which is less than the
number of vectors, the given vectors are linearly dependent.

The relation between them is obtained as below.

R3″ = 0
⇒ R3′ - R2′ = 0
⇒ (R3 - R1) - (R2 - 3R1) = 0
⇒ R3 - R2 + 2R1 = 0

Since the rows are vectors, we get X3 - X2 + 2X1 = 0, which is the relation
between the vectors.

Note: The rows of the matrix are the given vectors, so only row operations
must be used to find the relationship between the vectors.
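The same conclusion follows from the rank of the matrix, which NumPy can compute directly (a sketch, not part of the original project), and the relation found by row reduction can be checked as well:

```python
import numpy as np

X1 = np.array([1, 2, -3, 4])
X2 = np.array([3, -1, 2, 1])
X3 = np.array([1, -5, 8, -7])

A = np.vstack([X1, X2, X3]).astype(float)

# Rank 2 < 3 vectors, so the vectors are linearly dependent.
rank = np.linalg.matrix_rank(A)
print(rank)  # 2

# Verify the relation found by row reduction: X3 - X2 + 2*X1 = 0.
print((X3 - X2 + 2 * X1).tolist())  # [0, 0, 0, 0]
```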

3. Methods for finding eigenvalues and eigenvectors

Let A be a square matrix of order n. A number λ is called an eigenvalue of
A if there exists a non-zero column matrix X such that AX = λX. Then X is
called an eigenvector of A corresponding to λ.

If A = [aij]n×n and X = (x1, x2, …, xn)T, then AX = λX ⇒ (A - λI)X = 0.

This represents a system of linear homogeneous equations in x1, x2, …, xn.
Since X ≠ 0, at least one of the xi ≠ 0. Hence, the homogeneous system has
non-trivial solutions.

∴ the determinant of coefficients |A - λI| = 0.

This equation is called the characteristic equation of A.

The determinant |A - λI|, on expansion, will be an nth-degree polynomial
in λ and is known as the characteristic polynomial of A.

The roots of the characteristic equation are the eigenvalues of A.

Definition: Characteristic Equation and Characteristic Polynomial

If λ is a characteristic root of a square matrix A, then |A - λI| = 0 is called
the characteristic equation of A.

The polynomial |A - λI| in λ is called the characteristic polynomial of A.

Note

(1) The word ‘eigen’ is German, meaning ‘characteristic’ or ‘proper’. So,
an eigenvalue is also known as a characteristic root or proper value.
Sometimes it is also known as a latent root.

(2) If A =
[ a11  a12 ]
[ a21  a22 ]
and I =
[ 1  0 ]
[ 0  1 ],
then the characteristic equation of A is |A - λI| = 0

⇒ | a11 - λ    a12     |
  | a21        a22 - λ | = 0

⇒ (a11 - λ)(a22 - λ) - a21a12 = 0
⇒ a11a22 - (a11 + a22)λ + λ² - a21a12 = 0
⇒ λ² - (a11 + a22)λ + (a11a22 - a21a12) = 0
⇒ λ² - S1λ + S2 = 0

where S1 = a11 + a22 = sum of the diagonal elements of A, and
S2 = a11a22 - a21a12 = |A|.

(3) If A =
[ a11  a12  a13 ]
[ a21  a22  a23 ]
[ a31  a32  a33 ]
and I is the 3 × 3 identity matrix, then the characteristic equation of A is
|A - λI| = 0

⇒ | a11 - λ    a12        a13     |
  | a21        a22 - λ    a23     |
  | a31        a32        a33 - λ | = 0

Expanding this determinant, we get λ³ - S1λ² + S2λ - S3 = 0,

where S1 = sum of the diagonal elements of A,

S2 = sum of the minors of the elements of the main diagonal,

S3 = |A|.

We will use this formula in problems.
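As a numerical cross-check (a NumPy sketch on an example matrix of my choosing, not from the original project), `np.poly` returns the coefficients of det(λI − A), which for a 3 × 3 matrix should be exactly [1, −S1, S2, −S3]:

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

S1 = np.trace(A)
# S2: sum of the 2x2 minors of the main-diagonal elements.
S2 = sum(np.linalg.det(np.delete(np.delete(A, i, axis=0), i, axis=1))
         for i in range(3))
S3 = np.linalg.det(A)

coeffs = np.poly(A)  # coefficients of det(lambda*I - A), highest power first
print(np.round(coeffs, 6).tolist())  # [1.0, -7.0, 14.0, -8.0]
assert np.allclose(coeffs, [1, -S1, S2, -S3])
```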

Definition: The set of all distinct eigenvalues of the square matrix A is
called the spectrum of A. The largest of the absolute values of the
eigenvalues of A is called the spectral radius of A. The set of all
eigenvectors corresponding to an eigenvalue λ of A, together with the zero
vector, forms a vector space which is called the eigenspace of A
corresponding to λ.

4. Properties of eigenvalues and eigenvectors

4.1 Properties of Eigenvectors
Theorem 4.1.1

(1) An eigenvector corresponding to an eigenvalue is not unique.

(2) Eigenvectors corresponding to different eigenvalues are linearly
independent.

Proof

(1) Let λ be an eigenvalue of a square matrix A of order n, and let X be an
eigenvector corresponding to λ. Then AX = λX.

Multiplying by a constant C ≠ 0,

C(AX) = C(λX) ⇒ A(CX) = λ(CX)

Since C ≠ 0 and X ≠ 0, we have CX ≠ 0.

∴ CX is an eigenvector corresponding to λ for any C ≠ 0. Hence, the
eigenvector is not unique for the eigenvalue λ.

(2) Let λ1, λ2 be two different eigenvalues of A, and let X1, X2 be the
corresponding eigenvectors.

∴ AX1 = λ1X1 …(1)
and AX2 = λ2X2 …(2)

We have to prove X1 and X2 are linearly independent.

Suppose α1X1 + α2X2 = 0 …(3)

Then A(α1X1 + α2X2) = 0
⇒ α1(AX1) + α2(AX2) = 0
⇒ α1(λ1X1) + α2(λ2X2) = 0
⇒ (α1λ1)X1 + (α2λ2)X2 = 0 …(4)

Multiplying (3) by λ1, we get

(α1λ1)X1 + (α2λ1)X2 = 0 …(5)

(4) - (5) ⇒ α2(λ2 - λ1)X2 = 0 …(6)

Since λ1 ≠ λ2 ⇒ λ2 - λ1 ≠ 0, and X2 ≠ 0,

∴ (6) ⇒ α2 = 0

∴ (3) ⇒ α1X1 = 0

⇒ α1 = 0, since X1 ≠ 0.

Thus, α1X1 + α2X2 = 0 ⇒ α1 = 0 and α2 = 0.

∴ X1 and X2 are linearly independent.

Note

(1) If all the n eigenvalues λ1, λ2, …, λn of A are different, then the
corresponding eigenvectors X1, X2, …, Xn are linearly independent.

(2) A given eigenvector of A corresponds to only one eigenvalue of A.

(3) Eigenvectors corresponding to equal eigenvalues may be linearly
independent or dependent.

EXAMPLE 1

Find the eigenvalues and eigenvectors of the matrix
[ 4  1 ]
[ 3  2 ]

Solution:

Let A =
[ 4  1 ]
[ 3  2 ]

The characteristic equation of A is |A - λI| = 0

⇒ | 4 - λ    1     |
  | 3        2 - λ | = 0 ⇒ λ² - S1λ + S2 = 0

where S1 = sum of the diagonal elements of A = 4 + 2 = 6,

S2 = |A| = 8 - 3 = 5.

∴ the characteristic equation is λ² - 6λ + 5 = 0 ⇒ (λ - 1)(λ - 5) = 0
⇒ λ = 1, 5,

which are the eigenvalues of A.

To find the eigenvectors:

Let X = (x1, x2)T be an eigenvector of A corresponding to λ.

Then (A - λI)X = 0 ⇒
[ 4 - λ    1     ] [ x1 ]   [ 0 ]
[ 3        2 - λ ] [ x2 ] = [ 0 ]

⇒ (4 - λ)x1 + x2 = 0
   3x1 + (2 - λ)x2 = 0   ……(I)

Case (i): If λ = 1, then equations (I) become

3x1 + x2 = 0 and 3x1 + x2 = 0 ∴ x2 = -3x1

Choosing x1 = 1, we get x2 = -3. ∴ an eigenvector is X1 = (1, -3)T.

Case (ii): If λ = 5, then equations (I) become

-x1 + x2 = 0 and 3x1 - 3x2 = 0 ∴ x1 = x2

Choosing x1 = 1, we get x2 = 1. ∴ an eigenvector is X2 = (1, 1)T.

Thus, the eigenvalues of A are 1 and 5, and the corresponding eigenvectors
are (1, -3)T and (1, 1)T.

Note

In case (i) we have only one equation 3x1 + x2 = 0 to solve for x1 and x2.
So, we have an infinite number of solutions x1 = k, x2 = -3k, for any k ≠ 0.
We have chosen the simplest solution. In fact, (k, -3k)T = k(1, -3)T is an
eigenvector for λ = 1 for any k ≠ 0. So, for λ = 1 there are many
eigenvectors. This verifies property (1).
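The hand computation can be confirmed with NumPy (an illustrative sketch, not part of the original project):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [3.0, 2.0]])

eigenvalues = np.linalg.eigvals(A)
print(sorted(np.round(eigenvalues, 6).tolist()))  # [1.0, 5.0]

# Check the hand-computed eigenvectors (any non-zero multiple also works).
assert np.allclose(A @ np.array([1.0, -3.0]), 1 * np.array([1.0, -3.0]))
assert np.allclose(A @ np.array([1.0, 1.0]), 5 * np.array([1.0, 1.0]))
```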

Example 2

Find the eigenvalues and eigenvectors of the matrix
[ 3  -4   4 ]
[ 1  -2   4 ]
[ 1  -1   3 ]

Solution.

Let A =
[ 3  -4   4 ]
[ 1  -2   4 ]
[ 1  -1   3 ]

The characteristic equation of A is |A - λI| = 0

⇒ | 3 - λ   -4       4     |
  | 1       -2 - λ   4     |
  | 1       -1       3 - λ | = 0 ⇒ λ³ - S1λ² + S2λ - S3 = 0

where S1 = sum of the main diagonal elements of A = 3 + (-2) + 3 = 4,

S2 = sum of the minors of the diagonal elements of A
   = | -2  4 |   | 3  4 |   | 3  -4 |
     | -1  3 | + | 1  3 | + | 1  -2 |
   = (-6 + 4) + (9 - 4) + (-6 + 4) = -2 + 5 - 2 = 1,

and S3 = |A| = 3(-6 + 4) + 4(3 - 4) + 4(-1 + 2) = -6 - 4 + 4 = -6.

∴ the characteristic equation is λ³ - 4λ² + λ + 6 = 0.

We choose integer factors of the constant term 6 for a trial solution and
find that λ = -1 is a root. To find the other roots we perform synthetic
division. The other roots are given by

λ² - 5λ + 6 = 0 ⇒ (λ - 2)(λ - 3) = 0 ⇒ λ = 2 or 3

∴ the eigenvalues are λ = -1, 2, 3 [distinct roots].

To find the eigenvectors:

Let X = (x1, x2, x3)T be an eigenvector corresponding to the eigenvalue λ.

Then (A - λI)X = 0 ⇒
[ 3 - λ   -4       4     ] [ x1 ]   [ 0 ]
[ 1       -2 - λ   4     ] [ x2 ] = [ 0 ]
[ 1       -1       3 - λ ] [ x3 ]   [ 0 ]

⇒ (3 - λ)x1 - 4x2 + 4x3 = 0
   x1 - (2 + λ)x2 + 4x3 = 0   ……(I)
   x1 - x2 + (3 - λ)x3 = 0

Case (i): If λ = -1, then the equations (I) become

4x1 - 4x2 + 4x3 = 0 ⇒ x1 - x2 + x3 = 0

x1 - x2 + 4x3 = 0 and x1 - x2 + 4x3 = 0

The distinct equations are x1 - x2 + x3 = 0 and x1 - x2 + 4x3 = 0.

By the rule of cross-multiplication, we get

x1/(-4 + 1) = x2/(1 - 4) = x3/(-1 + 1)

⇒ x1/(-3) = x2/(-3) = x3/0 ⇒ x1/1 = x2/1 = x3/0

Choosing x1 = 1, x2 = 1, x3 = 0, we get an eigenvector X1 = (1, 1, 0)T.

Case (ii): If λ = 2, then equations (I) become

x1 - 4x2 + 4x3 = 0, x1 - 4x2 + 4x3 = 0 and x1 - x2 + x3 = 0

∴ the distinct equations are x1 - 4x2 + 4x3 = 0 and x1 - x2 + x3 = 0.

By the rule of cross-multiplication, we get

x1/(-4 + 4) = x2/(4 - 1) = x3/(-1 + 4)

⇒ x1/0 = x2/3 = x3/3 ⇒ x1/0 = x2/1 = x3/1

Choosing x1 = 0, x2 = 1, x3 = 1, we get an eigenvector X2 = (0, 1, 1)T.

Case (iii): If λ = 3, then equations (I) become

0x1 - 4x2 + 4x3 = 0 ⇒ 0x1 - x2 + x3 = 0

x1 - 5x2 + 4x3 = 0 and x1 - x2 + 0x3 = 0

The equations are different, but only two of them are independent, so we
can choose any two of them to solve. From the first two equations, we get

x1/(-4 + 5) = x2/(1 - 0) = x3/(0 + 1) ⇒ x1/1 = x2/1 = x3/1

Choosing x1 = 1, x2 = 1, x3 = 1, we get an eigenvector X3 = (1, 1, 1)T.

Thus, the eigenvalues of A are -1, 2, 3 and the corresponding eigenvectors
are

X1 = (1, 1, 0)T, X2 = (0, 1, 1)T, X3 = (1, 1, 1)T.

Note

(1) We are using the following integer root theorem for the trial solution:
“For an equation of the form xⁿ + a(n-1)x^(n-1) + … + a1x + a0 = 0 with
integer coefficients ai, any rational root is an integer and is a factor of the
constant term a0.” So it is enough to try factors of the constant term for
integer solutions. If there is no integer solution, then the real roots must be
irrational.

(2) In the above problem the eigenvalues -1, 2, 3 are distinct, so by
property (2) the eigenvectors are linearly independent. We verify this:

Consider B =
[ 1  1  0 ]
[ 0  1  1 ]
[ 1  1  1 ]
with the eigenvectors as rows.

Then |B| = 1·(1 - 1) - 1·(0 - 1) + 0 = 1 ≠ 0

∴ X1, X2, X3 are linearly independent.
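Again, the result can be confirmed with NumPy (an illustrative sketch, not part of the original project):

```python
import numpy as np

A = np.array([[3.0, -4.0, 4.0],
              [1.0, -2.0, 4.0],
              [1.0, -1.0, 3.0]])

eigenvalues = np.linalg.eigvals(A)
print(sorted(np.round(eigenvalues.real, 6).tolist()))  # [-1.0, 2.0, 3.0]

# Verify the hand-computed eigenvectors.
for lam, v in [(-1, [1, 1, 0]), (2, [0, 1, 1]), (3, [1, 1, 1])]:
    v = np.array(v, dtype=float)
    assert np.allclose(A @ v, lam * v)
```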

4.2 Properties of Eigenvalues

4.2.1 A square matrix A and its transpose AT have the same eigenvalues.

Proof: The eigenvalues of A are the roots of its characteristic equation

|A - λI| = 0 …(1)

We know (A - λI)T = AT - (λI)T [∵ (A + B)T = AT + BT]

= AT - λIT = AT - λI [∵ IT = I]

∴ |(A - λI)T| = |AT - λI| …(2)

For any square matrix B, |BT| = |B|.

∴ |(A - λI)T| = |A - λI| …(3)

From (2) and (3), |A - λI| = |AT - λI|.

This shows that the characteristic polynomials of A and AT are the same.
Hence, A and AT have the same characteristic equation (1).

∴ A and AT have the same eigenvalues.

4.2.2 The sum of the eigenvalues of a square matrix A is equal to the sum
of the elements on its main diagonal.

Proof

Let A be a square matrix of order n. Then the characteristic equation of A
is |A - λI| = 0

⇒ λ^n - S1λ^(n-1) + S2λ^(n-2) - … + (-1)^n Sn = 0 …(1)

where S1 = sum of the diagonal elements of A.

If λ1, λ2, …, λn are the roots of (1), then λ1, λ2, …, λn are the eigenvalues
of A.

From the theory of equations, the sum of the roots of (1) is

-(coefficient of λ^(n-1)) / (coefficient of λ^n)

⇒ λ1 + λ2 + … + λn = -(-S1) = S1

∴ the sum of the eigenvalues = λ1 + λ2 + … + λn = S1
  = sum of the diagonal elements of the matrix A.

Note: The sum of the diagonal elements of A is called the trace of A.

∴ sum of the eigenvalues = trace of A.
4.2.3 The product of the eigenvalues of a square matrix A is equal to |A|.

Proof: Let A be a square matrix of order n. Then its characteristic equation
is |A - λI| = 0

⇒ λ^n - S1λ^(n-1) + S2λ^(n-2) - … + (-1)^n Sn = 0 …(1)

where Sn = |A|.

If λ1, λ2, …, λn are the n roots of (1), then from the theory of equations,

the product of the roots = (-1)^n · (constant term) / (coefficient of λ^n)

⇒ λ1λ2…λn = (-1)^n(-1)^n Sn = (-1)^(2n) Sn = Sn = |A|

∴ the product of the eigenvalues = λ1λ2…λn = Sn = |A|.

Note: If at least one eigenvalue is 0, then |A| = 0, so A is a singular matrix.
If all the eigenvalues are non-zero, then |A| ≠ 0.

∴ A is non-singular if all its eigenvalues are non-zero.

4.2.4 If λ1, λ2, …, λn are the non-zero eigenvalues of a square matrix A of
order n, then 1/λ1, 1/λ2, …, 1/λn are the eigenvalues of A⁻¹.

Proof

Let λ be any non-zero eigenvalue of A. Then there exists a non-zero
column matrix X such that AX = λX. Since all the eigenvalues are
non-zero, A is non-singular.

∴ A⁻¹ exists.

∴ A⁻¹(AX) = A⁻¹(λX)

⇒ (A⁻¹A)X = λ(A⁻¹X)

⇒ IX = λ(A⁻¹X)

⇒ X = λ(A⁻¹X) ⇒ A⁻¹X = (1/λ)X [∵ λ ≠ 0]

So 1/λ is an eigenvalue of A⁻¹.

This is true for all the eigenvalues of A.

∴ 1/λ1, 1/λ2, …, 1/λn are the eigenvalues of A⁻¹.

Note that the eigenvector of A⁻¹ corresponding to 1/λ is also X.

4.2.5 If λ1, λ2, …, λn are the eigenvalues of A, then

(i) λ1 - K, λ2 - K, …, λn - K are the eigenvalues of A - KI.

(ii) α0λ1² + α1λ1 + α2, α0λ2² + α1λ2 + α2, …, α0λn² + α1λn + α2 are the
eigenvalues of α0A² + α1A + α2I.

Proof

(i) Let λ be any eigenvalue of A. Then AX = λX …(1), where X ≠ 0 is a
column matrix.

∴ AX - KX = λX - KX

⇒ (A - KI)X = (λ - K)X

∴ λ - K is an eigenvalue of A - KI.

This is true for all eigenvalues of A.

∴ λ1 - K, λ2 - K, …, λn - K are the eigenvalues of A - KI.

(ii) We have AX = λX and A²X = λ²X.

∴ α0(A²X) = α0(λ²X) and α1(AX) = α1(λX)

∴ α0(A²X) + α1(AX) = α0(λ²X) + α1(λX)

Adding α2X on both sides, we get

α0(A²X) + α1(AX) + α2X = α0(λ²X) + α1(λX) + α2X
⇒ (α0A² + α1A + α2I)X = (α0λ² + α1λ + α2)X

This means α0λ² + α1λ + α2 is an eigenvalue of α0A² + α1A + α2I.

This is true for all eigenvalues of A.

∴ α0λ1² + α1λ1 + α2, α0λ2² + α1λ2 + α2, …, α0λn² + α1λn + α2 are the
eigenvalues of α0A² + α1A + α2I.

Note

1. The eigenvalues of the unit matrix
[ 1  0  0 ]
[ 0  1  0 ]
[ 0  0  1 ]
are 1, 1, 1, and the corresponding eigenvectors are (1, 0, 0)T, (0, 1, 0)T,
(0, 0, 1)T, which are independent.

2. The eigenvalues of a triangular matrix
[ λ1  a12  a13 ]
[ 0   λ2   a23 ]
[ 0   0    λ3  ]
are the main diagonal elements λ1, λ2, λ3.

3. If λ is an eigenvalue of A, then AX = λX. We have seen A²X = λ²X, …,
AᵐX = λᵐX. Thus, the eigenvalues of A, A², …, Aᵐ are λ, λ², …, λᵐ,
which are in general different, but they all have the same eigenvector X.
Similarly, λ and α0λ² + α1λ + α2 are eigenvalues of A and
α0A² + α1A + α2I, but they have the same eigenvector X.

Example 1

Find the sum and product of the eigenvalues of the matrix
[  1   2  -2 ]
[  1   0   3 ]
[ -2  -1  -3 ]

Solution.

Let A =
[  1   2  -2 ]
[  1   0   3 ]
[ -2  -1  -3 ]

Sum of the eigenvalues = sum of the elements on the main diagonal
= 1 + 0 + (-3) = -2

Product of the eigenvalues = |A| = 1(0 + 3) - 2(-3 + 6) - 2(-1 - 0)
= 3 - 6 + 2 = -1
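Both properties are easy to verify numerically (a NumPy sketch, not part of the original project):

```python
import numpy as np

A = np.array([[1.0, 2.0, -2.0],
              [1.0, 0.0, 3.0],
              [-2.0, -1.0, -3.0]])

eigenvalues = np.linalg.eigvals(A)

# Sum of eigenvalues = trace; product of eigenvalues = determinant.
print(round(eigenvalues.sum().real, 6))     # -2.0
print(round(np.prod(eigenvalues).real, 6))  # -1.0
```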

Example 2

If 2 and 3 are eigenvalues of A =
[  3  10   5 ]
[ -2  -3  -4 ]
[  3   5   7 ],
find the eigenvalues of A⁻¹ and A³.

Solution.

Given that 2 and 3 are two eigenvalues of A, let λ be the third eigenvalue.

We know that the sum of the eigenvalues = sum of the diagonal elements:

2 + 3 + λ = 3 + (-3) + 7 ⇒ λ = 2

So the eigenvalues of A are 2, 2, 3.

∴ the eigenvalues of A⁻¹ are 1/2, 1/2, 1/3, and the eigenvalues of A³ are
2³, 2³, 3³, i.e. 8, 8, 27.
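Numerically (a NumPy sketch, not from the original project; note that repeated eigenvalues are computed only approximately):

```python
import numpy as np

A = np.array([[3.0, 10.0, 5.0],
              [-2.0, -3.0, -4.0],
              [3.0, 5.0, 7.0]])

eig_A = np.linalg.eigvals(A)
eig_inv = np.linalg.eigvals(np.linalg.inv(A))
eig_cube = np.linalg.eigvals(np.linalg.matrix_power(A, 3))

print(sorted(np.round(eig_A.real, 4).tolist()))     # ≈ [2.0, 2.0, 3.0]
print(sorted(np.round(eig_inv.real, 4).tolist()))   # ≈ [0.3333, 0.5, 0.5]
print(sorted(np.round(eig_cube.real, 4).tolist()))  # ≈ [8.0, 8.0, 27.0]
```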

5. APPLICATIONS

5.1 Vibration Analysis

Eigenvalues play a crucial role in vibration analysis!

 What are Eigenvalues in Vibration Analysis?

In vibration analysis, eigenvalues represent the natural frequencies of a
system. They are scalar values that describe the amount of vibration or
oscillation of a system at a particular frequency.

 Importance of Eigenvalues in Vibration Analysis

1. _Natural Frequencies_: Eigenvalues help identify the natural frequencies
of a system, which is essential for understanding its vibration behavior.

2. _Mode Shapes_: The eigenvectors associated with each eigenvalue give
the mode shapes of a system, which describe the vibration patterns of the
system at different frequencies.

3. _Resonance_: Eigenvalues help identify resonance frequencies, where
the system's vibration amplitude is maximum.

4. _Stability_: Eigenvalues can indicate the stability of a system; in a
dynamical model, eigenvalues with positive real parts indicate instability.

 Types of Eigenvalues in Vibration Analysis

1. _Real Eigenvalues_: Represent the natural frequencies of a system.

2. _Complex Eigenvalues_: Represent the damped natural frequencies of a
system.

3. _Imaginary Eigenvalues_: Represent the undamped natural frequencies
of a system.

 Applications of Eigenvalues in Vibration Analysis

1. _Structural Dynamics_: Eigenvalues are used to analyze the vibration
behavior of buildings, bridges, and other structures.

2. _Mechanical Systems_: Eigenvalues are used to analyze the vibration
behavior of mechanical systems, such as engines, gearboxes, and bearings.

3. _Aerospace Engineering_: Eigenvalues are used to analyze the vibration
behavior of aircraft and spacecraft.

 Software Tools for Eigenvalue Analysis

1. _MATLAB_: Provides built-in functions for eigenvalue analysis, such as
`eig()` and `eigs()`.

2. _ANSYS_: Offers advanced eigenvalue analysis capabilities for
structural and mechanical systems.

3. _ABAQUS_: Provides eigenvalue analysis capabilities for structural and
mechanical systems.

By analyzing eigenvalues, engineers can gain valuable insights into the
vibration behavior of complex systems, enabling them to design and
optimize systems for improved performance, safety, and reliability.
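As a tiny illustration (a hypothetical two-mass, three-spring chain with unit masses and unit stiffnesses; a NumPy sketch, not from the original project), the squared natural frequencies ω² of the undamped system M x″ + K x = 0 are the eigenvalues of M⁻¹K, and the eigenvectors are the mode shapes:

```python
import numpy as np

M = np.eye(2)                  # mass matrix (m1 = m2 = 1)
K = np.array([[2.0, -1.0],
              [-1.0, 2.0]])    # stiffness matrix (each spring k = 1)

# omega^2 are the eigenvalues of M^-1 K; columns of `modes` are mode shapes.
omega_sq, modes = np.linalg.eig(np.linalg.solve(M, K))
omega = np.sqrt(np.sort(omega_sq.real))
print(np.round(omega, 4).tolist())  # [1.0, 1.7321], i.e. 1 and sqrt(3)
```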

5.2 Image Compression

Eigenvalues play a significant role in image compression!

 What is Image Compression?

Image compression reduces the size of an image file while maintaining its
quality. This is achieved by representing the image data in a more compact
form.

 Role of Eigenvalues in Image Compression

1. _Principal Component Analysis (PCA)_: Eigenvalues are used in PCA
to identify the most important features of an image. PCA transforms the
image data into a new coordinate system, where the axes represent the
directions of maximum variance.

2. _Singular Value Decomposition (SVD)_: Singular values (the square
roots of the eigenvalues of AᵀA) are used in SVD to decompose an image
into its constituent parts. SVD represents an image as the product of three
matrices: U, Σ, and V.

3. _Karhunen-Loève Transform (KLT)_: Eigenvalues are used in the KLT
to transform an image into a new coordinate system. The KLT is closely
related to PCA.

 How Eigenvalues Help in Image Compression

1. _Dimensionality Reduction_: Eigenvalues help reduce the
dimensionality of an image by retaining only the most important features.

2. _Energy Compaction_: Eigenvalues help compact the energy of an
image into a smaller number of coefficients.

3. _Lossy Compression_: Eigenvalues enable lossy compression by
discarding the least important coefficients.

 Image Compression Algorithms Using Eigenvalues

1. _JPEG_: Uses the DCT (Discrete Cosine Transform), which is related to
eigenvalue decomposition.

2. _SVD-based Compression_: Uses SVD to decompose an image and
retain only the most important singular values.

3. _PCA-based Compression_: Uses PCA to transform an image and retain
only the most important principal components.

 Benefits of Using Eigenvalues in Image Compression

1. _Improved Compression Ratio_: Eigenvalues help achieve a better
compression ratio by retaining only the most important features.

2. _Reduced Computational Complexity_: Eigenvalues enable faster
compression and decompression by reducing the dimensionality of the
image data.

3. _Improved Image Quality_: Eigenvalues help preserve the most
important features of an image, resulting in better visual quality.
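The SVD-based idea can be sketched in a few lines (an illustrative NumPy example on a synthetic 8 × 8 "image", not part of the original project):

```python
import numpy as np

# A toy grayscale "image": a smooth gradient plus a little noise.
rng = np.random.default_rng(0)
image = np.outer(np.arange(8.0), np.arange(8.0)) \
        + 0.01 * rng.standard_normal((8, 8))

# Lossy compression: keep only the k largest singular values.
U, s, Vt = np.linalg.svd(image, full_matrices=False)
k = 2
compressed = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The rank-k approximation stores far fewer numbers than the full image,
# yet reproduces it closely because the energy is concentrated in the
# first few singular values.
error = np.linalg.norm(image - compressed) / np.linalg.norm(image)
print(error < 0.01)  # True for this nearly rank-1 image
```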

5.3 Quantum Mechanics

Eigenvalues play a crucial role in quantum mechanics!

 What are Eigenvalues in Quantum Mechanics?

In quantum mechanics, eigenvalues represent the possible values of
physical observables, such as energy, momentum, and spin.

 Importance of Eigenvalues in Quantum Mechanics

1. _Energy Levels_: Eigenvalues determine the energy levels of a quantum
system, which are essential for understanding the behavior of atoms and
molecules.

2. _Quantization_: Eigenvalues introduce quantization, the fundamental
concept that physical observables can only take on specific discrete values.

3. _Wave Functions_: Eigenvalues are used to determine the wave
functions of a quantum system, which describe the probability of finding a
particle in a particular state.

 Types of Eigenvalues in Quantum Mechanics

1. _Discrete Eigenvalues_: Represent the energy levels of a bound system,
such as an atom or molecule.

2. _Continuous Eigenvalues_: Represent the energy levels of an unbound
system, such as a free particle.

3. _Degenerate Eigenvalues_: Occur when multiple eigenstates have the
same eigenvalue, leading to degeneracy.

 Applications of Eigenvalues in Quantum Mechanics

1. _Schrödinger Equation_: Eigenvalues are used to solve the
time-independent Schrödinger equation, which describes the behavior of a
quantum system.

2. _Quantum Field Theory_: Eigenvalues play a crucial role in quantum
field theory, which describes the behavior of fundamental particles and
forces.

3. _Quantum Computing_: Eigenvalues are used in quantum computing to
understand the behavior of quantum bits (qubits) and quantum gates.

 Mathematical Representation

The eigenvalue equation in quantum mechanics is typically written as

Hψ = Eψ

where H is the Hamiltonian operator, E is the eigenvalue (energy), and ψ is
the eigenstate (wave function).

 Key Concepts

1. _Hermitian Operators_: The eigenvalues of Hermitian operators, which
represent physical observables, are always real.

2. _Orthogonality_: Eigenstates corresponding to distinct eigenvalues are
orthogonal to each other.

3. _Completeness_: The eigenstates form a complete basis, which allows
us to expand any wave function in terms of the eigenstates.

Eigenvalues are a fundamental concept in quantum mechanics, and their
properties and applications continue to shape our understanding of the
quantum world.
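A standard illustration of Hψ = Eψ (a NumPy sketch, not from the original project) is the particle in a box: discretizing H = −d²/dx² on a grid gives a real symmetric (Hermitian) matrix whose lowest eigenvalues approximate the quantized energies Eₖ = (kπ)², in units where ħ = 2m = L = 1:

```python
import numpy as np

# Finite-difference Hamiltonian H = -d^2/dx^2 on [0, 1], n interior points.
n = 200
h = 1.0 / (n + 1)
H = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

# H is Hermitian, so its eigenvalues (the energies) are real.
E = np.linalg.eigvalsh(H)

exact = (np.arange(1, 4) * np.pi) ** 2  # (k*pi)^2 for k = 1, 2, 3
print(np.round(E[:3] / exact, 3).tolist())  # close to [1.0, 1.0, 1.0]
```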

5.4 Machine Learning


Eigenvalues play a significant role in machine learning!

 What are Eigenvalues in Machine Learning?

26
In machine learning, eigenvalues are used to analyze and transform
data. They represent the amount of variance explained by each
principal component.

 Applications of Eigenvalues in Machine Learning

1. *Principal Component Analysis (PCA)*: Eigenvalues are used to


select the most important principal components, reducing
dimensionality and improving model performance.

2. *Singular Value Decomposition (SVD)*: Eigenvalues are used to


decompose matrices, reducing noise and improving model
interpretability.

3. *Kernel Methods*: Eigenvalues are used to compute kernel matrices,


enabling non-linear transformations and improving model performance.

4. *Clustering*: Eigenvalues are used to identify clusters and determine the number of clusters.

5. *Anomaly Detection*: Eigenvalues are used to detect anomalies and outliers.

 Algorithms that Use Eigenvalues

1. *PCA*: Uses eigenvalues to select principal components.

2. *t-SNE*: Often initialized with a PCA projection, which is computed from eigenvalues.

3. *LLE*: Uses the eigenvectors of a reconstruction-weight matrix to compute the local linear embedding.

4. *Spectral Clustering*: Uses the eigenvectors of a graph Laplacian to identify clusters.
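As an illustration of the last item, here is a hedged sketch of spectral clustering on a toy graph (all node indices and edge weights are invented for the example): the sign pattern of the Laplacian's second-smallest eigenvector, the Fiedler vector, separates two weakly connected groups.

```python
import numpy as np

# Two triangles joined by one weak edge: a graph with two obvious clusters.
A = np.zeros((6, 6))
edges = [(0, 1), (0, 2), (1, 2),   # cluster 1
         (3, 4), (3, 5), (4, 5),   # cluster 2
         (2, 3)]                   # bridge between them
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
A[2, 3] = A[3, 2] = 0.1            # weaken the bridge

D = np.diag(A.sum(axis=1))
L = D - A                          # unnormalised graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)

# Second-smallest eigenvector (Fiedler vector): its signs split the graph.
fiedler = eigvecs[:, 1]
labels = (fiedler > 0).astype(int)
```

Real spectral-clustering pipelines (e.g. scikit-learn's) use several Laplacian eigenvectors followed by k-means, but the eigen-decomposition step is the same idea.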

 Benefits of Using Eigenvalues in Machine Learning

1. *Dimensionality Reduction*: Eigenvalues help reduce the dimensionality of data, improving model performance and reducing computational complexity.

2. *Noise Reduction*: Eigenvalues help reduce noise and improve model interpretability.

3. *Improved Model Performance*: Eigenvalues help improve model performance by selecting the most important features.

4. *Interpretability*: Eigenvalues provide insights into the underlying structure of the data.

 Popular Libraries for Eigenvalue Computation

1. *NumPy*: Provides functions for eigenvalue computation (`numpy.linalg.eig`, `numpy.linalg.eigh`).

2. *SciPy*: Provides functions for eigenvalue computation and decomposition, including sparse solvers in `scipy.sparse.linalg`.

3. *Scikit-learn*: Provides functions for PCA and other eigenvalue-based algorithms.
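The PCA workflow these libraries implement can be sketched directly with NumPy's eigendecomposition (the data below is synthetic, generated only for illustration):

```python
import numpy as np

# Synthetic data: 200 samples, 3 features with very different variances.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.1])

Xc = X - X.mean(axis=0)                  # centre the data
cov = np.cov(Xc, rowvar=False)           # 3x3 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order

# Sort descending: the largest eigenvalue explains the most variance.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()      # variance share per component
# Keep enough components to cover ~95% of the variance.
k = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1
X_reduced = Xc @ eigvecs[:, :k]          # project onto top-k components
```

`sklearn.decomposition.PCA` wraps this same computation (via SVD) and adds conveniences such as `explained_variance_ratio_`.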

By leveraging eigenvalues, machine learning models can better capture the underlying patterns and relationships in data, leading to improved performance and interpretability.

5.5 Medical Imaging


Eigenvectors play a significant role in medical imaging!

 Medical Imaging Modalities

Eigenvectors are used in various medical imaging modalities, including:

1. *Magnetic Resonance Imaging (MRI)*: Eigenvectors help reconstruct images from MRI data.

2. *Computed Tomography (CT)*: Eigenvectors aid in reconstructing images from CT data.

3. *Positron Emission Tomography (PET)*: Eigenvectors help reconstruct images from PET data.

4. *Ultrasound*: Eigenvectors aid in image processing and analysis.

 Applications of Eigenvectors in Medical Imaging

1. *Image Denoising*: Eigenvectors help remove noise from medical images.

2. *Image Segmentation*: Eigenvectors aid in segmenting images into different regions of interest.

3. *Image Registration*: Eigenvectors help align multiple images taken at different times or from different modalities.

4. *Image Reconstruction*: Eigenvectors aid in reconstructing images from raw data.

5. *Computer-Aided Diagnosis (CAD)*: Eigenvectors help detect and diagnose diseases from medical images.

 Techniques Using Eigenvectors in Medical Imaging

1. *Principal Component Analysis (PCA)*: Eigenvectors help reduce the dimensionality of medical images.

2. *Independent Component Analysis (ICA)*: ICA, typically preceded by an eigenvector-based whitening step, aids in separating mixed signals in medical images.

3. *Singular Value Decomposition (SVD)*: Eigenvectors help decompose medical images into their constituent parts.

4. *Karhunen-Loève Transform (KLT)*: Eigenvectors aid in transforming medical images into a new coordinate system.
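A hedged sketch of the SVD-based denoising idea on a synthetic "image" (a smooth rank-1 pattern plus noise; the sizes and noise level are arbitrary, not real scan data): keeping only the leading singular components retains most of the structure while discarding noise.

```python
import numpy as np

# Synthetic clean image with simple low-rank structure, plus noise.
rng = np.random.default_rng(1)
clean = np.outer(np.sin(np.linspace(0, 3, 64)),
                 np.cos(np.linspace(0, 3, 64)))   # rank-1 pattern
noisy = clean + 0.1 * rng.normal(size=clean.shape)

U, s, Vt = np.linalg.svd(noisy, full_matrices=False)

k = 1                                    # retain the dominant component
denoised = (U[:, :k] * s[:k]) @ Vt[:k]   # best rank-k approximation

# The truncated reconstruction should sit closer to the clean image
# than the raw noisy observation does.
err_noisy = np.linalg.norm(noisy - clean)
err_denoised = np.linalg.norm(denoised - clean)
```

Real medical images are not exactly low-rank, so in practice `k` is chosen from the decay of the singular-value spectrum rather than fixed in advance.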

 Benefits of Using Eigenvectors in Medical Imaging

1. *Improved Image Quality*: Eigenvectors help enhance image quality by removing noise and artifacts.

2. *Increased Diagnostic Accuracy*: Eigenvectors aid in detecting and diagnosing diseases more accurately.

3. *Reduced Computational Complexity*: Eigenvectors enable faster image processing and analysis.

4. *Improved Patient Outcomes*: Eigenvectors help improve patient outcomes by enabling more accurate diagnoses and treatments.

5.6 Financial Modeling


Eigenvalues play a significant role in financial modeling!

 What is Financial Modeling?

Financial modeling involves using mathematical models to analyze and forecast financial data. These models help investors, analysts, and portfolio managers make informed decisions.

 Role of Eigenvalues in Financial Modeling

1. _Risk Analysis_: Eigenvalues help quantify risk by identifying the most significant factors affecting a portfolio’s returns.

2. _Portfolio Optimization_: Eigenvalues enable portfolio managers to optimize their portfolios by identifying the optimal asset allocation.

3. _Factor Analysis_: Eigenvalues help identify underlying factors driving stock prices, such as macroeconomic variables or industry trends.

4. _Stress Testing_: Eigenvalues enable stress testing by simulating extreme scenarios and assessing their impact on a portfolio.

 Applications of Eigenvalues in Financial Modeling

1. _Value-at-Risk (VaR) Models_: Eigenvalues help estimate VaR, which measures the potential loss of a portfolio over a specific time horizon.

2. _Principal Component Analysis (PCA)_: Eigenvalues are used in PCA
to identify the most important factors driving stock prices.

3. _Factor-Based Models_: Eigenvalue decompositions underpin statistical factor models that explain asset returns in terms of a small number of common factors.

4. _Risk Parity Models_: Eigenvalues enable risk parity models to allocate risk equally across different asset classes.
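A minimal sketch of the factor-analysis idea, using an invented 3-asset covariance matrix (the numbers are illustrative, not market data): the largest eigenvalue gives the variance carried by the dominant risk factor, and its eigenvector gives each asset's exposure to that factor.

```python
import numpy as np

# Hypothetical covariance matrix of annual returns for 3 assets.
cov = np.array([[0.040, 0.018, 0.012],
                [0.018, 0.090, 0.030],
                [0.012, 0.030, 0.060]])

eigvals, eigvecs = np.linalg.eigh(cov)       # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # make descending

total_risk = eigvals.sum()                   # equals trace(cov)
share = eigvals[0] / total_risk              # variance share of factor 1
factor_weights = eigvecs[:, 0]               # each asset's exposure
```

A high `share` would indicate that a single factor (e.g. a broad market move) dominates the portfolio's risk, which is exactly what PCA-based risk models look for.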

 Benefits of Using Eigenvalues in Financial Modeling

1. _Improved Risk Management_: Eigenvalues help quantify risk and enable more effective risk management.

2. _Enhanced Portfolio Optimization_: Eigenvalues enable portfolio managers to optimize their portfolios more effectively.

3. _Better Stress Testing_: Eigenvalues enable more accurate stress testing and scenario analysis.

4. _Increased Transparency_: Eigenvalues provide insights into the underlying factors driving stock prices, enabling more informed investment decisions.

 Software Tools for Eigenvalue Analysis in Finance

1. _MATLAB_: Provides built-in functions for eigenvalue analysis, such as `eig()` and `eigs()`.

2. _Python_: Offers libraries like NumPy and SciPy for eigenvalue analysis.

3. _R_: Provides the built-in `eigen()` function and packages like `Matrix` for eigenvalue analysis.

4. _Financial Modeling Software_: Software like Bloomberg, FactSet, and BlackRock’s Aladdin platform provide eigenvalue analysis capabilities.

5.7 Climate Modeling


Eigenvectors play a significant role in climate modeling!

 What is Climate Modeling?

Climate modeling involves using mathematical models to simulate the Earth’s climate system, including the atmosphere, oceans, land surfaces, and ice.

 Role of Eigenvectors in Climate Modeling

1. _Empirical Orthogonal Function (EOF) Analysis_: Eigenvectors are used
in EOF analysis to identify patterns of climate variability, such as El Niño-
Southern Oscillation (ENSO).

2. _Principal Component Analysis (PCA)_: Eigenvectors are used in PCA to reduce the dimensionality of climate datasets and identify the most important modes of variability.

3. _Singular Value Decomposition (SVD)_: Eigenvectors are used in SVD to decompose climate datasets into their constituent parts and identify the most important patterns of variability.

 Applications of Eigenvectors in Climate Modeling

1. _Climate Pattern Identification_: Eigenvectors help identify patterns of climate variability, such as ENSO, the North Atlantic Oscillation (NAO), and the Pacific Decadal Oscillation (PDO).

2. _Climate Prediction_: Eigenvectors are used in climate prediction models to forecast future climate patterns and trends.

3. _Climate Data Assimilation_: Eigenvectors are used in climate data assimilation to combine model forecasts with observational data and produce the best possible estimate of the current climate state.

 Benefits of Using Eigenvectors in Climate Modeling

1. _Improved Climate Pattern Identification_: Eigenvectors help identify complex patterns of climate variability that may not be apparent through other methods.

2. _Enhanced Climate Prediction_: Eigenvectors improve climate prediction by identifying the most important patterns of variability and their relationships to climate forcings.

3. _Increased Efficiency_: Eigenvectors reduce the dimensionality of climate datasets, making it easier to analyze and visualize large datasets.

 Software Tools for Eigenvector Analysis in Climate Modeling

1. _MATLAB_: Provides built-in functions for EOF, PCA, and SVD analysis.

2. _Python_: Offers libraries such as NumPy, SciPy, and the `eofs` package for eigenvector analysis.

3. _R_: Provides packages such as `stats` and `vegan` for eigenvector analysis.
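The EOF analysis described above can be sketched via SVD on a synthetic space-time anomaly field (one planted spatial mode plus noise; all sizes and values are invented for illustration):

```python
import numpy as np

# Synthetic anomaly field: rows = time steps, columns = grid points.
rng = np.random.default_rng(2)
n_time, n_space = 120, 50
pattern = np.sin(np.linspace(0, np.pi, n_space))   # one spatial mode
index = rng.normal(size=n_time)                    # its time series
field = np.outer(index, pattern) + 0.2 * rng.normal(size=(n_time, n_space))

anom = field - field.mean(axis=0)          # remove the time mean
U, s, Vt = np.linalg.svd(anom, full_matrices=False)

eof1 = Vt[0]                   # leading spatial pattern (eigenvector)
pc1 = U[:, 0] * s[0]           # its principal-component time series
var_frac = s[0]**2 / np.sum(s**2)   # fraction of variance explained
```

With real data (e.g. sea-surface temperature anomalies), `eof1` would be a map such as the ENSO pattern and `pc1` the corresponding climate index; grid-cell area weighting would also be applied before the decomposition.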

6. Conclusion

 An eigenvector is a vector that, when multiplied by a transformation matrix, yields a scalar multiple of itself pointing along the same direction; this scalar multiple is known as the eigenvalue.
 Eigenvectors and eigenvalues are key concepts in feature-extraction techniques such as Principal Component Analysis, an algorithm used to reduce dimensionality while training a machine learning model.
 Eigenvalue and eigenvector concepts are used in several fields, including machine learning, quantum computing, communication system design, construction design, and electrical and mechanical engineering.

7. References
1. A. R. Rao and P. Bhimasankaram, Linear Algebra, Hindustan Book Agency.

2. Gilbert Strang, Linear Algebra and its Applications, Thomson, 2007.
