
Eigen Vectors, Eigen Values and Matrices

Eigenvectors and eigenvalues have a relationship with square matrices. When a matrix multiplies an input vector, it results in an output vector that is mapped to a new space. Eigenvectors are vectors whose direction remains the same through this transformation. A square matrix can be decomposed into its eigenvectors and eigenvalues. The eigenvectors represent the axes along which the matrix transformation acts, while the eigenvalues are scalar coefficients associated with the eigenvectors. There is a relationship between a matrix's rank, its eigenvectors of eigenvalue 0, and the matrix's columns.

Uploaded by

Shashank Kumar



Eigen is a German word for "own". Every square matrix has eigenvalues and eigenvectors (possibly complex), and the two have a close relationship with the matrix itself. Matrices represent linear transformations: when we multiply an input vector, say 'a', by a matrix 'X', the result is an output vector 'b'. In other words, the matrix maps one vector to another. Each time we apply the matrix, vectors are carried into a new space. The vectors whose direction remains the same throughout this transformation are called eigenvectors.
An eigenvector responds to the matrix as if the matrix were a simple scalar coefficient. Eigenvectors can also be thought of as the axes along which the transformation acts. An n×n square matrix can have up to n linearly independent eigenvectors, one for each dimension; e.g. a 2×2 matrix can have 2 eigenvectors. Matrix decomposition is a way of breaking a matrix down into parts that reveal properties of the matrix as a whole.
One such technique, eigendecomposition, breaks a matrix down into its eigenvalues and eigenvectors.

Formula: X a = lambda * a
where X is a square matrix and 'a' is an eigenvector of X.
Here, lambda is the eigenvalue associated with 'a': multiplying the eigenvector by the matrix only scales it by lambda, without changing its direction.
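As a sketch of the formula above, the eigenvalues of a 2×2 matrix can be found by hand from the characteristic equation lambda^2 - trace*lambda + det = 0, and an eigenvector can then be checked against X a = lambda * a. The matrix below is a hypothetical example chosen for illustration:

```python
import math

# Hypothetical 2x2 example matrix (illustration only).
X = [[4.0, 1.0],
     [2.0, 3.0]]

# For a 2x2 matrix, the eigenvalues solve:
#   lambda^2 - trace*lambda + det = 0
trace = X[0][0] + X[1][1]                     # 7
det = X[0][0] * X[1][1] - X[0][1] * X[1][0]   # 12 - 2 = 10
disc = math.sqrt(trace**2 - 4 * det)          # sqrt(49 - 40) = 3
eig1 = (trace + disc) / 2                     # 5.0
eig2 = (trace - disc) / 2                     # 2.0

# An eigenvector for eig1: solve (X - eig1*I) a = 0.
# Row 1 gives (4 - 5)*a0 + 1*a1 = 0, so a1 = a0; take a = (1, 1).
a = [1.0, 1.0]

# Verify X a = eig1 * a: the matrix only scales the eigenvector.
Xa = [X[0][0] * a[0] + X[0][1] * a[1],
      X[1][0] * a[0] + X[1][1] * a[1]]
print(eig1, eig2)   # 5.0 2.0
print(Xa)           # [5.0, 5.0], i.e. eig1 * a
```

Multiplying 'a' by X gives back 5 times 'a', which is exactly the scalar-coefficient behaviour described above.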
Eigenvectors are also used in PCA (Principal Component Analysis) to reorient the x, y coordinates onto the new axes found by the PCA: because the covariance matrix is symmetric, its eigenvectors are orthogonal to each other and so form a valid set of axes.
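A minimal sketch of the orthogonality claim, using a hypothetical symmetric (covariance-style) 2×2 matrix: its two eigenvectors should have a dot product of zero.

```python
import math

# Hypothetical symmetric matrix, standing in for a covariance matrix.
C = [[2.0, 1.0],
     [1.0, 2.0]]

# Eigenvalues from lambda^2 - trace*lambda + det = 0.
trace = C[0][0] + C[1][1]                     # 4
det = C[0][0] * C[1][1] - C[0][1] * C[1][0]   # 3
disc = math.sqrt(trace**2 - 4 * det)          # 2
lam1, lam2 = (trace + disc) / 2, (trace - disc) / 2   # 3.0 and 1.0

# Eigenvectors: for lam1 = 3, (2-3)*v0 + v1 = 0  ->  v = (1, 1)
#               for lam2 = 1, (2-1)*v0 + v1 = 0  ->  v = (1, -1)
v1, v2 = [1.0, 1.0], [1.0, -1.0]

# Because C is symmetric, the eigenvectors are orthogonal:
dot = v1[0] * v2[0] + v1[1] * v2[1]
print(dot)  # 0.0
```

The zero dot product is what lets PCA use these eigenvectors as a new, perpendicular coordinate system for the data.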
There is also a relationship between the rank of a matrix and its eigenvectors. The number of linearly independent eigenvectors with eigenvalue 0 of a matrix, say X, is equal to the nullity of the matrix, i.e. the dimension of its kernel. The kernel of a linear map is the set of vectors in the domain that are mapped to the zero vector:
given a linear map L: A -> B, the kernel is the set of all 'a' contained in A with L(a) = 0.

By the rank-nullity theorem, the rank and the nullity add up to the number of columns of the matrix.
Therefore, the number of linearly independent eigenvectors with eigenvalue 0 and the rank of the matrix add up to give the total number of columns of the matrix.
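The rank-nullity relationship above can be sketched with a hypothetical singular 2×2 matrix whose second column is twice the first, so it has rank 1 and a one-dimensional kernel:

```python
# Hypothetical singular matrix: column 2 is twice column 1.
X = [[1.0, 2.0],
     [2.0, 4.0]]

# det = 0, so 0 is an eigenvalue of X.
det = X[0][0] * X[1][1] - X[0][1] * X[1][0]   # 0.0

# Kernel: X a = 0 means a0 + 2*a1 = 0; a = (2, -1) spans it, so nullity = 1.
a = [2.0, -1.0]
Xa = [X[0][0] * a[0] + X[0][1] * a[1],
      X[1][0] * a[0] + X[1][1] * a[1]]
print(Xa)  # [0.0, 0.0] -> 'a' is an eigenvector with eigenvalue 0

rank = 1     # only one linearly independent column
nullity = 1  # dimension of the kernel found above
# Rank-nullity: rank + nullity equals the number of columns.
print(rank + nullity)  # 2
```

Here 'a' is exactly an eigenvector of eigenvalue 0, and the one kernel direction plus the rank of 1 accounts for both columns of the matrix.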
