Math 270 5.4

The document discusses linear transformations and their relationship to matrix factorization and eigenvectors. It explains that any linear transformation between finite-dimensional vector spaces can be represented by a matrix relative to ordered bases of the domain and codomain. When the domain and codomain are the same and the same basis is used for both, this matrix is called the B-matrix of the transformation. Similar matrices, related by conjugation, represent the same transformation relative to different bases and have the same eigenvalues and characteristic polynomial.

5.4 Eigenvectors and Linear Transformations

4/21/2017 1
Eigenvectors and linear transformations

How is the factorization $A = PDP^{-1}$ related to linear transformations?

Any linear transformation $T : \mathbb{R}^n \to \mathbb{R}^m$ can be implemented by
left-multiplication by a matrix $A$, the standard matrix of $T$.

We want to extend this kind of representation to any linear
transformation between two finite-dimensional vector spaces.

Suppose $V$ is an $n$-dimensional vector space and $W$ is an $m$-dimensional vector space.
Let $T : V \to W$ be a linear transformation, and suppose further that $B$ and $C$ are ordered
bases for $V$ and $W$ respectively.

For any $x \in V$, the coordinate vector $[x]_B$ is in $\mathbb{R}^n$, and the coordinate
vector of its image, $[T(x)]_C$, is in $\mathbb{R}^m$.

(Diagram: $T$ sends $x \in V$ to $T(x) \in W$; taking coordinates sends $x$ down to
$[x]_B \in \mathbb{R}^n$ and $T(x)$ down to $[T(x)]_C \in \mathbb{R}^m$.)

Let $\{b_1, \ldots, b_n\}$ be the basis $B$ for $V$. If $x = r_1 b_1 + \cdots + r_n b_n$, then

$$[x]_B = \begin{bmatrix} r_1 \\ \vdots \\ r_n \end{bmatrix}$$

and, since $T$ is linear,

$$T(x) = T(r_1 b_1 + \cdots + r_n b_n) = r_1 T(b_1) + \cdots + r_n T(b_n) \qquad (*)$$

Now using the basis $C$ in $W$, we can write $(*)$ in terms of $C$-coordinate vectors:

$$[T(x)]_C = r_1 [T(b_1)]_C + \cdots + r_n [T(b_n)]_C \qquad (**)$$
But since $C$-coordinate vectors are in $\mathbb{R}^m$, the vector equation

$$[T(x)]_C = r_1 [T(b_1)]_C + \cdots + r_n [T(b_n)]_C$$

can be written as a matrix equation

$$[T(x)]_C = M [x]_B$$

with

$$M = \begin{bmatrix} [T(b_1)]_C & [T(b_2)]_C & \cdots & [T(b_n)]_C \end{bmatrix}$$

The matrix $M$ is a matrix representation of $T$, called the matrix
for $T$ relative to the bases $B$ and $C$.
$$[T(x)]_C = M [x]_B, \qquad M = \begin{bmatrix} [T(b_1)]_C & [T(b_2)]_C & \cdots & [T(b_n)]_C \end{bmatrix}$$
Example: Suppose $B = \{b_1, b_2\}$ is a basis for $V$ and $C = \{c_1, c_2, c_3\}$ is
a basis for $W$. Let $T : V \to W$ be a linear transformation with the
property that $T(b_1) = 3c_1 - 2c_2 + 5c_3$ and $T(b_2) = 4c_1 + 7c_2 - c_3$.
Find the matrix $M$ for $T$ relative to $B$ and $C$.

The $C$-coordinate vectors for the images of $b_1$ and $b_2$ are

$$[T(b_1)]_C = \begin{bmatrix} 3 \\ -2 \\ 5 \end{bmatrix} \quad \text{and} \quad [T(b_2)]_C = \begin{bmatrix} 4 \\ 7 \\ -1 \end{bmatrix}$$

Then the matrix

$$M = \begin{bmatrix} 3 & 4 \\ -2 & 7 \\ 5 & -1 \end{bmatrix}$$
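The arithmetic in this example can be checked numerically. The sketch below (using NumPy; the coordinate vector $[x]_B = (2, 1)$ is chosen here for illustration, not taken from the slides) builds $M$ from the $C$-coordinate columns and applies it to a $B$-coordinate vector:

```python
import numpy as np

# C-coordinates of T(b1) and T(b2), from the example.
Tb1_C = np.array([3, -2, 5])
Tb2_C = np.array([4, 7, -1])

# The matrix for T relative to B and C has these as its columns.
M = np.column_stack([Tb1_C, Tb2_C])

# If x = 2*b1 + 1*b2, then [x]_B = (2, 1) and [T(x)]_C = M [x]_B.
x_B = np.array([2, 1])
Tx_C = M @ x_B
print(Tx_C)  # -> [10  3  9], i.e. T(x) = 10c1 + 3c2 + 9c3
```

The result agrees with computing $T(x) = 2\,T(b_1) + T(b_2)$ directly and taking $C$-coordinates.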
Linear transformations on $\mathbb{R}^n$

Theorem (Diagonal matrix representation).
Suppose $A = PDP^{-1}$, where $D$ is a diagonal $n \times n$ matrix.
If $B$ is the basis for $\mathbb{R}^n$ formed from the columns of $P$,
then $D$ is the $B$-matrix for the transformation $x \mapsto Ax$.
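A minimal numerical sketch of the theorem; the matrices $P$ and $D$ below are chosen here for illustration, not taken from the slides:

```python
import numpy as np

# Illustrative case: A has eigenvalues 5 and 3 with
# eigenvectors (1, 1) and (1, -1).
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])    # columns of P form the basis B
D = np.diag([5.0, 3.0])
A = P @ D @ np.linalg.inv(P)   # A = P D P^{-1}

# The B-matrix of x |-> Ax is P^{-1} A P, which recovers D.
B_matrix = np.linalg.inv(P) @ A @ P
print(B_matrix)                # diagonal, with 5 and 3 on the diagonal
```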
Linear transformations $T : V \to V$, from a vector space $V$ into itself

When the domain and codomain of $T$ are the same, that is, when
$T : V \to W$ with $W = V$ and the basis $C$ equal to $B$, the matrix

$$M = \begin{bmatrix} [T(b_1)]_B & [T(b_2)]_B & \cdots & [T(b_n)]_B \end{bmatrix}$$

is called the $B$-matrix for $T$, written $[T]_B$.

The $B$-matrix for $T : V \to V$ satisfies the equation

$$[T(x)]_B = [T]_B [x]_B, \quad \forall x \in V.$$
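As a sketch of a $B$-matrix computation (the operator and basis below are chosen here for illustration, not taken from the slides), consider differentiation $T = d/dt$ on polynomials of degree at most 2, with basis $B = \{1, t, t^2\}$:

```python
import numpy as np

# T(1) = 0, T(t) = 1, T(t^2) = 2t, so the B-coordinate
# columns of [T]_B are (0,0,0), (1,0,0), (0,2,0).
T_B = np.column_stack([
    [0, 0, 0],   # [T(1)]_B
    [1, 0, 0],   # [T(t)]_B
    [0, 2, 0],   # [T(t^2)]_B
])

# p(t) = 3 + 5t + 2t^2 has [p]_B = (3, 5, 2), and T(p) = 5 + 4t.
p_B = np.array([3, 5, 2])
print(T_B @ p_B)   # -> [5 4 0], the B-coordinates of 5 + 4t
```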
Matrix similarity

Let $A$ and $B$ be $n \times n$ matrices. $A$ is said to be similar to $B$ if
there exists an invertible matrix $P$ such that

$$P^{-1} A P = B, \quad \text{or equivalently,} \quad A = P B P^{-1}.$$

If we write $Q$ for $P^{-1}$, we have $Q^{-1} B Q = A$.

So $B$ is similar to $A$ as well. Thus $A$ and $B$ are simply said to
be similar.

Theorem

If $n \times n$ matrices $A$ and $B$ are similar, then they have the
same characteristic polynomial and thus the same eigenvalues.
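The theorem can be illustrated numerically; the matrices $A$ and $P$ below are chosen here for illustration:

```python
import numpy as np

# B = P^{-1} A P is similar to A by construction.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0, 2.0],
              [0.0, 1.0]])
B = np.linalg.inv(P) @ A @ P

# Similar matrices share the characteristic polynomial,
# hence the same eigenvalues (compare after sorting).
eig_A = np.sort(np.linalg.eigvals(A).real)
eig_B = np.sort(np.linalg.eigvals(B).real)
print(eig_A, eig_B)   # the two lists agree
```

Here $A$ has trace 7 and determinant 10, so its characteristic polynomial is $\lambda^2 - 7\lambda + 10$ with roots 2 and 5, and $B$ has the same ones.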
End presentation

