DELA Unit V
CHARACTERISTIC EQUATION
Let ‘A’ be a given square matrix and let λ be a scalar. The equation det(A - λI) = 0 is called the
characteristic equation of the matrix A.
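For a concrete illustration (a minimal NumPy sketch; the 2 x 2 matrix below is assumed only
for the example), np.poly returns the coefficients of det(A - λI) and np.roots gives its roots:

    import numpy as np

    # Illustrative 2x2 matrix (assumed for this example only)
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    # Coefficients of the characteristic polynomial det(A - lambda*I),
    # highest degree first: here lambda^2 - 7*lambda + 10
    coeffs = np.poly(A)
    print(coeffs)            # [ 1. -7. 10.]
    print(np.roots(coeffs))  # its roots, i.e. the Eigen values 5 and 2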
EIGEN VALUE
The values of λ obtained from the characteristic equation |A- λ I|=0 are called the Eigen
values of A.
EIGEN VECTOR
Let A be a square matrix of order ‘n’ and λ be a scalar. If X is a non-zero column vector such
that AX = λX, then X is called an Eigen vector of A corresponding to the Eigen value λ.
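A small NumPy check of the definition AX = λX (the matrix is assumed only for illustration):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    values, vectors = np.linalg.eig(A)   # columns of `vectors` are Eigen vectors
    for i, lam in enumerate(values):
        X = vectors[:, i]
        # A @ X should coincide with lam * X for every Eigen pair
        print(lam, np.allclose(A @ X, lam * X))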
NOTE:
If two or more Eigen values are equal, then the corresponding Eigen vectors may be linearly
independent or linearly dependent.
Property 1: (i) The sum of the Eigen values of a matrix is equal to the sum of the elements
of the principal diagonal (trace of the matrix); for a 3 x 3 matrix A = [aij],
λ1 + λ2 + λ3 = a11 + a22 + a33.
(ii) The product of the Eigen values of a matrix is equal to the determinant of the matrix, i.e.,
λ1 λ2 λ3 = |A|.
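A quick NumPy verification of Property 1 on an assumed 3 x 3 matrix:

    import numpy as np

    A = np.array([[2.0, 0.0, 1.0],
                  [0.0, 3.0, 0.0],
                  [1.0, 0.0, 2.0]])

    values = np.linalg.eigvals(A)
    print(np.isclose(values.sum(),  np.trace(A)))        # sum of Eigen values = trace
    print(np.isclose(values.prod(), np.linalg.det(A)))   # product of Eigen values = |A|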
Property 2: A square matrix A and its transpose A^T have the same Eigen values.
Property 3: The characteristic roots of a triangular matrix are just the diagonal elements of
the matrix.
Property 4: If λ is an Eigen value of a matrix A, then 1/λ (λ ≠ 0) is an Eigen value of A^-1.
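A short NumPy sketch checking Properties 2, 3 and 4 (the matrices A and T below are
assumed only for the example):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    ev = np.sort(np.linalg.eigvals(A))

    # Property 2: A and A^T have the same Eigen values
    print(np.allclose(ev, np.sort(np.linalg.eigvals(A.T))))

    # Property 4: the Eigen values of A^-1 are the reciprocals 1/λ
    print(np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))), np.sort(1.0 / ev)))

    # Property 3: for a triangular matrix, the Eigen values are the diagonal elements
    T = np.array([[1.0, 4.0, 5.0],
                  [0.0, 2.0, 6.0],
                  [0.0, 0.0, 3.0]])
    print(np.allclose(np.sort(np.linalg.eigvals(T)), np.diag(T)))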
Property 9: The Eigen values of a real symmetric matrix are real numbers.
Property 10: The Eigen vectors corresponding to distinct Eigen values of a real symmetric
matrix are orthogonal.
Property 12: Eigen vectors of a symmetric matrix corresponding to different Eigen values
are orthogonal.
Property 13: If A and B are n x n matrices and B is a non-singular matrix, then A and B^-1AB
have the same Eigen values.
Property 14: Two Eigen vectors X1 and X2 are called orthogonal vectors if X1^T X2 = 0.
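A small NumPy check of Properties 9, 10/12/14 and 13 (the symmetric matrix S, and the
matrices A and B used for the similarity transform, are assumed only for illustration):

    import numpy as np

    S = np.array([[2.0, 1.0, 0.0],
                  [1.0, 2.0, 1.0],
                  [0.0, 1.0, 2.0]])              # real symmetric matrix
    values, vectors = np.linalg.eigh(S)          # Property 9: real Eigen values
    X1, X2 = vectors[:, 0], vectors[:, 1]
    print(np.isclose(X1 @ X2, 0.0))              # Properties 10/12/14: X1^T X2 = 0

    A = np.array([[3.0, 1.0],
                  [0.0, 2.0]])
    B = np.array([[1.0, 2.0],
                  [0.0, 1.0]])                   # any non-singular matrix
    # Property 13: A and B^-1 A B have the same Eigen values
    print(np.allclose(np.sort(np.linalg.eigvals(A)),
                      np.sort(np.linalg.eigvals(np.linalg.inv(B) @ A @ B))))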
Note: The absolute value of a determinant (|detA|) is the product of the absolute values of the
eigen values of matrix A.
· Eigen vectors of a symmetric matrix are orthogonal, but only for distinct eigen values.
· The smallest Eigen value of a matrix A is the reciprocal of the largest Eigen value of A^-1,
i.e. of the inverse of A (a quick numerical check follows this list).
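A quick check of these notes; the matrix below is assumed to have positive Eigen values so
that the smallest/largest comparison applies directly:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    values = np.linalg.eigvals(A)

    # |det A| equals the product of the absolute values of the Eigen values
    print(np.isclose(abs(np.linalg.det(A)), np.prod(np.abs(values))))

    # smallest Eigen value of A = 1 / (largest Eigen value of A^-1)
    inv_values = np.linalg.eigvals(np.linalg.inv(A))
    print(np.isclose(values.min(), 1.0 / inv_values.max()))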
CAYLEY HAMILTON THEOREM:
Every square matrix satisfies its own characteristic equation; i.e., if the characteristic
equation of an n x n matrix A is λ^n + c1 λ^(n-1) + ... + cn = 0, then
A^n + c1 A^(n-1) + ... + cn I = 0.
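A minimal NumPy verification of the theorem on an assumed 2 x 2 matrix (A is substituted
for λ in its own characteristic polynomial):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])

    # Coefficients of det(A - lambda*I): here lambda^2 - 5*lambda - 2
    c = np.poly(A)

    # c[0]*A^2 + c[1]*A + c[2]*I should be the zero matrix
    p_of_A = c[0] * (A @ A) + c[1] * A + c[2] * np.eye(2)
    print(np.allclose(p_of_A, 0.0))   # True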
DIAGONALISATION OF A MATRIX:
The process of finding a matrix M such that M^-1AM = D, where D is a diagonal matrix, is
called diagonalisation of the matrix A. Here M is the modal matrix whose columns are Eigen
vectors of A, and the diagonal elements of D are the corresponding Eigen values.
Note: A^k = M D^k M^-1.
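A sketch of diagonalisation and of the note A^k = M D^k M^-1, using an assumed matrix with
distinct Eigen values (k = 3 is chosen only for the check):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    values, M = np.linalg.eig(A)                        # M: modal matrix of Eigen vectors
    D = np.diag(values)
    print(np.allclose(np.linalg.inv(M) @ A @ M, D))     # M^-1 A M = D

    k = 3
    print(np.allclose(M @ np.diag(values**k) @ np.linalg.inv(M),
                      np.linalg.matrix_power(A, k)))    # A^k = M D^k M^-1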
DIAGONALISATION BY ORTHOGONAL TRANSFORMATION:
If A is a real symmetric matrix, then the Eigen vectors of A will be not only linearly
independent but also pairwise orthogonal. If we normalize each Eigen vector Xr, i.e. divide
each element of Xr by the square root of the sum of the squares of all the elements of Xr, and
use the normalized Eigen vectors of A to form the normalized modal matrix N, then it can be
proved that N is an orthogonal matrix. By a property of orthogonal matrices, N^-1 = N^T.
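A small NumPy sketch of this construction for an assumed real symmetric matrix:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])                   # real symmetric

    values, vectors = np.linalg.eig(A)
    # Normalize each Eigen vector (np.linalg.eig already returns unit columns;
    # the explicit division mirrors the construction of N described above)
    N = vectors / np.linalg.norm(vectors, axis=0)

    print(np.allclose(N.T @ N, np.eye(2)))             # N is orthogonal: N^T N = I
    print(np.allclose(N.T @ A @ N, np.diag(values)))   # N^T A N = D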
REDUCTION OF A QUADRATIC FORM TO CANONICAL FORM:
A quadratic form Q = X^T A X in n variables can be reduced, by the orthogonal transformation
X = NY, to the canonical form λ1 y1^2 + λ2 y2^2 + ... + λn yn^2, where λ1, λ2, ..., λn are the
Eigen values of A. This form of Q is called the sum of the squares form of Q or the canonical
form of Q.
Rank: When the quadratic form is reduced to the canonical form, it contains only r non-zero
terms, where r is the rank of A.
Index: The number of positive terms in the canonical form is called the index (p) of the
quadratic form.
Signature: The difference between the number of positive and negative terms is called
signature (s) of the quadratic form [s = 2p-r].
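A sketch computing the canonical-form coefficients, rank, index and signature from the
Eigen values of an assumed symmetric matrix:

    import numpy as np

    A = np.array([[1.0, 1.0, 0.0],
                  [1.0, 2.0, 1.0],
                  [0.0, 1.0, 1.0]])              # matrix of an assumed quadratic form

    values = np.linalg.eigvalsh(A)               # canonical form: sum of values[i]*y_i^2
    r = int(np.sum(~np.isclose(values, 0.0)))    # rank r  = number of non-zero terms
    p = int(np.sum(values > 1e-12))              # index p = number of positive terms
    s = 2 * p - r                                # signature s = 2p - r
    print(values, r, p, s)                       # Eigen values 0, 1, 3 -> r=2, p=2, s=2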
Nature of the quadratic form (in terms of the rank r, the index p and the Eigen values of A;
see the sketch after this list):
(i) Positive definite: if r = n and p = n, or if all the Eigen values of A are positive.
(ii) Positive semi definite: if r < n and p = r, or if all the Eigen values of A are >= 0 and at
least one Eigen value is zero.
(iii) Negative definite: if r = n and p = 0, or if all the Eigen values of A are negative.
(iv) Negative semi definite: if r < n and p = 0, or if all the Eigen values of A are <= 0 and at
least one Eigen value is zero.
(v) Indefinite: in all other cases, i.e., if A has positive as well as negative Eigen values.
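A minimal classifier along these lines (a sketch only; the tolerance and the test matrices are
assumptions made for the example):

    import numpy as np

    def nature(A, tol=1e-12):
        # Classify a quadratic form from the signs of the Eigen values of A
        values = np.linalg.eigvalsh(A)
        if np.all(values >  tol): return "positive definite"
        if np.all(values < -tol): return "negative definite"
        if np.all(values >= -tol): return "positive semi definite"
        if np.all(values <=  tol): return "negative semi definite"
        return "indefinite"

    print(nature(np.array([[2.0, 1.0], [1.0, 2.0]])))   # positive definite
    print(nature(np.array([[1.0, 2.0], [2.0, 1.0]])))   # indefinite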
In this method the nature of the quadratic form is determined without reducing it to the
canonical form. Let A be the matrix of the quadratic form, of order n, and let D1, D2, ..., Dn
be the determinants of the leading principal sub-matrices of A (so D1 = a11 and Dn = |A|); a
numerical check follows the list below.
1. A Q.F is positive definite if D1, D2, D3, ..., Dn are all positive, i.e., Di > 0 for all i.
2. A Q.F is negative definite if D1, D3, D5, ... are all negative and D2, D4, D6, ... are all
positive, i.e., (-1)^i Di > 0 for all i.
3. A Q.F is positive semi-definite if Di >= 0 for all i and at least one Di = 0.
4. A Q.F is negative semi-definite if (-1)^i Di >= 0 for all i and at least one Di = 0.
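A sketch of the leading-principal-minor test for an assumed matrix (leading_minors is a
hypothetical helper written only for this example):

    import numpy as np

    def leading_minors(A):
        # D1, D2, ..., Dn: determinants of the leading principal sub-matrices of A
        return [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]

    A = np.array([[ 2.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  2.0]])
    D = leading_minors(A)
    print(D)                          # approximately [2.0, 3.0, 4.0], all Di > 0
    print(all(d > 0 for d in D))      # hence this Q.F is positive definite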