Eigen Vectors, Eigen Values and Matrices
Formula: [X]a = λa
Where X is a square matrix and 'a' is a non-zero input vector.
If this equation holds, λ (lambda) is an eigenvalue of the matrix [X] and 'a' is the corresponding eigenvector.
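As an illustration, the following minimal sketch (assuming NumPy, which the notes do not mention; the matrix X is a made-up example) computes the eigenvalues and eigenvectors of a small square matrix and checks that [X]a = λa holds for each pair:

    import numpy as np

    X = np.array([[4.0, 1.0],
                  [2.0, 3.0]])            # an arbitrary 2x2 square matrix

    eigvals, eigvecs = np.linalg.eig(X)   # columns of eigvecs are the eigenvectors

    for i in range(len(eigvals)):
        a = eigvecs[:, i]                 # i-th eigenvector
        lam = eigvals[i]                  # matching eigenvalue
        # X @ a should equal lam * a, up to floating-point error
        print(np.allclose(X @ a, lam * a))  # prints True for each pair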
Eigenvectors can also be used for reorienting the x, y coordinates into the directions found by PCA, since the covariance matrix is symmetric and its eigenvectors are orthogonal to each other.
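As a rough sketch of that idea (again assuming NumPy; the synthetic 2-D data and variable names are illustrative, not from the notes), the code below builds the covariance matrix of some correlated x, y points, checks that its eigenvectors are orthogonal, and projects the points onto those eigenvector directions:

    import numpy as np

    rng = np.random.default_rng(0)
    # correlated 2-D points (synthetic example data)
    points = rng.normal(size=(200, 2)) @ np.array([[3.0, 1.0],
                                                   [1.0, 1.0]])

    centered = points - points.mean(axis=0)
    cov = np.cov(centered, rowvar=False)        # 2x2 covariance matrix

    eigvals, eigvecs = np.linalg.eigh(cov)      # eigh, since cov is symmetric

    # eigenvectors of a symmetric matrix are orthogonal (here orthonormal)
    print(np.allclose(eigvecs.T @ eigvecs, np.eye(2)))   # True

    # reorient the x, y coordinates onto the PCA (eigenvector) axes
    reoriented = centered @ eigvecs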
There is also a relationship between the rank of a matrix and its eigenvectors. The number of linearly independent eigenvectors for the 0 eigenvalue of a matrix, say X, is equal to the nullity of the matrix, i.e. the dimension of its kernel. The kernel is the set of vectors in the domain of a linear mapping that are mapped to the zero vector.
Given a linear map L: A -> B, the kernel is the set of vectors 'a' contained in A for which L(a) = 0.
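A small sketch of the kernel (this one assumes SciPy's null_space helper, which the notes do not mention; the matrix is a made-up rank-1 example):

    import numpy as np
    from scipy.linalg import null_space

    X = np.array([[1.0, 2.0],
                  [2.0, 4.0]])        # rank 1, so the kernel is 1-dimensional

    K = null_space(X)                 # columns form a basis of the kernel
    print(K.shape[1])                 # 1 -> the nullity of X
    print(np.allclose(X @ K, 0))      # True: L(a) = 0 for every kernel vector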
The rank and the nullity add up to the number of columns of the matrix (the rank-nullity theorem). Therefore, the number of linearly independent eigenvectors for the 0 eigenvalue of a matrix and the rank of the matrix add up to the total number of columns of the matrix.
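The sketch below (NumPy again; the 3x3 matrix is an arbitrary singular example) checks this claim numerically: it counts the linearly independent eigenvectors for the 0 eigenvalue, computes the rank, and verifies that the two add up to the number of columns:

    import numpy as np

    X = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0],
                  [1.0, 1.0, 1.0]])   # singular 3x3 matrix of rank 2

    rank = np.linalg.matrix_rank(X)

    eigvals, eigvecs = np.linalg.eig(X)
    # eigenvectors whose eigenvalue is (numerically) 0
    zero_vecs = eigvecs[:, np.isclose(eigvals, 0.0)]
    nullity = np.linalg.matrix_rank(zero_vecs)   # how many are independent

    print(rank, nullity, rank + nullity == X.shape[1])   # 2 1 True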