AK Lecture Notes Sem 5 L2 Algebra and Matrix Representations of Linear Transformations

The document discusses linear transformations and their representation using matrices. It covers topics such as operations on linear transformations, the algebra of linear transformations, and representing linear transformations on vector spaces as matrices. It provides examples of defining linear transformations, calculating their sums and compositions, and representing them as matrices with respect to a basis. It also discusses properties of linear operators and how they form an associative algebra.


Algebra and Matrix Representations of Linear Transformations

Atanu Kumar

Assistant Professor
Department of Physics
Chandernagore College

August 24, 2020


Contents

- Operations on Linear Transformations
- Operations on Linear Operators
- Algebra of Linear Transformations
- Representation of Linear Transformations by Matrices
Operations on Linear Transformations
- Let f : U → V and g : U → V be two linear transformations and α ∈ F a scalar. Then
  1. f = g iff f(u) = g(u), ∀u ∈ U.
  2. The "addition" (f + g) : U → V, defined by (f + g)(u) = f(u) + g(u), ∀u ∈ U, is also a linear transformation.
  3. The "scalar multiplication" (αf) : U → V, defined by (αf)(u) = αf(u), ∀u ∈ U, is also a linear transformation.
- Theorem: Let U and V be two vector spaces over the same field F. The collection of all linear transformations from U to V forms a vector space Hom(U, V) over F under the operations "addition" and "scalar multiplication". The zero vector O is defined by O(u) = φ, ∀u ∈ U.
- Let U, V, W be three vector spaces over F, and let f : U → V and g : V → W be linear transformations. Their composition g ◦ f : U → W is defined by w = (g ◦ f)(u) = g(v) ∈ W, where v = f(u) ∈ V for u ∈ U. g ◦ f is also a linear transformation.
Figure: Composition of Linear Transformations
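These three operations can be sketched in Python, treating a linear map as a plain function on coordinate tuples (the helper names `add`, `scale`, `compose` are illustrative, not part of the notes):

```python
def add(f, g):
    # Pointwise "addition": (f + g)(u) = f(u) + g(u)
    return lambda u: tuple(a + b for a, b in zip(f(u), g(u)))

def scale(alpha, f):
    # "Scalar multiplication": (alpha f)(u) = alpha * f(u)
    return lambda u: tuple(alpha * a for a in f(u))

def compose(g, f):
    # Composition: (g o f)(u) = g(f(u)), with f : U -> V and g : V -> W
    return lambda u: g(f(u))

# f(x, y, z) = (y, x + z) as a map R^3 -> R^2
f = lambda u: (u[1], u[0] + u[2])
print(add(f, f)((1, 2, 3)))     # (4, 8) -- (f + f)(u) = 2 f(u)
print(scale(3, f)((1, 2, 3)))   # (6, 12)
```

Linearity of the results follows from the linearity of f and g; the helpers only encode the pointwise definitions above.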
Solved Problems

Let f : R³ → R² and g : R³ → R² be defined by f(x, y, z) = (y, x + z) and g(x, y, z) = (2z, x − y). Find formulas defining the mappings f + g and 3f − 2g.

    (f + g)(x, y, z) = f(x, y, z) + g(x, y, z)
                     = (y, x + z) + (2z, x − y)
                     = (y + 2z, 2x − y + z),

    (3f − 2g)(x, y, z) = 3f(x, y, z) − 2g(x, y, z)
                       = (3y, 3x + 3z) − (4z, 2x − 2y)
                       = (3y − 4z, x + 2y + 3z).

Let h : R² → R be defined by h(x, y) = x + y. Using the maps f and g of the previous problem, find the formulas defining the mappings h ◦ f, h ◦ g, and h ◦ (f + g). Verify that h ◦ (f + g) = h ◦ f + h ◦ g.

We have f(x, y, z) = (y, x + z), g(x, y, z) = (2z, x − y) and (f + g)(x, y, z) = (y + 2z, 2x − y + z). So

    (h ◦ f)(x, y, z) = h(y, x + z) = y + x + z,
    (h ◦ g)(x, y, z) = h(2z, x − y) = 2z + x − y,
    (h ◦ (f + g))(x, y, z) = h(y + 2z, 2x − y + z) = 2x + 3z.

Then

    (h ◦ f)(x, y, z) + (h ◦ g)(x, y, z) = y + x + z + 2z + x − y
                                        = 2x + 3z
                                        = (h ◦ (f + g))(x, y, z).
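The identity h ◦ (f + g) = h ◦ f + h ◦ g can also be spot-checked numerically on a few sample points; a minimal Python sketch:

```python
f = lambda u: (u[1], u[0] + u[2])     # f(x, y, z) = (y, x + z)
g = lambda u: (2*u[2], u[0] - u[1])   # g(x, y, z) = (2z, x - y)
h = lambda v: v[0] + v[1]             # h(x, y) = x + y

# Compare h o (f + g) with h o f + h o g and with the formula 2x + 3z
for u in [(1, 2, 3), (0, -1, 5), (2, 0, -4)]:
    fg = tuple(a + b for a, b in zip(f(u), g(u)))   # (f + g)(u)
    assert h(fg) == h(f(u)) + h(g(u)) == 2*u[0] + 3*u[2]
print("verified")
```

A finite set of sample points does not prove the identity, but for linear maps agreement on a basis already suffices.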
Show that f, g, h ∈ Hom(R², R²), defined by f(x, y) = (x, 2y), g(x, y) = (y, x + y), h(x, y) = (0, x), are linearly independent.

Let us consider a linear combination equated to the zero operator, af + bg + ch = O:

    af(x, y) + bg(x, y) + ch(x, y) = O(x, y)
    ⇒ a(x, 2y) + b(y, x + y) + c(0, x) = (0, 0)
    ⇒ (ax + by, 2ay + bx + by + cx) = (0, 0),

which leads to two algebraic equations:

    ax + by = 0,   (b + c)x + (2a + b)y = 0.

Putting x = 1, y = 0, we get a = 0 and b + c = 0. Then putting x = 0, y = 1, we get 2a + b = 0, so b = 0, which implies c = 0 as well. So af + bg + ch = O implies a = b = c = 0, and the operators are linearly independent.
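The same conclusion can be reached numerically by representing f, g, h as 2 × 2 matrices in the standard basis (anticipating the matrix representation discussed later in these notes) and checking the rank of the flattened system. A sketch with NumPy:

```python
import numpy as np

# Standard-basis matrices of f(x, y) = (x, 2y), g(x, y) = (y, x + y),
# h(x, y) = (0, x); columns are the images of (1, 0) and (0, 1).
F = np.array([[1, 0], [0, 2]])
G = np.array([[0, 1], [1, 1]])
H = np.array([[0, 0], [1, 0]])

# aF + bG + cH = O is a homogeneous linear system in (a, b, c);
# the three operators are independent iff the stacked system has rank 3.
M = np.vstack([F.ravel(), G.ravel(), H.ravel()])
print(np.linalg.matrix_rank(M))   # 3 -> f, g, h are linearly independent
```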
Operations on Linear Operators
Let us consider the linear operators on a vector space V.
- Two operators Â and B̂ are equal iff Âv = B̂v, ∀v ∈ V.
- For any two linear operators Â and B̂ on the vector space V, the sum of the operators Â + B̂ is defined by (Â + B̂)v = Âv + B̂v, ∀v ∈ V.
- The operator (−Â) is defined by (−Â)v = Â(−v), ∀v ∈ V, −v being the additive inverse of v. Then subtraction of the operators is given by Â − B̂ = Â + (−B̂).
- The Null Operator (Ô) maps each vector v to the null vector φ: Ôv = φ, ∀v ∈ V. It is easily shown that Â − Â = Ô, ∀Â.
- The Identity Operator (Î) is defined by Îv = v, ∀v ∈ V.
- For any two linear operators Â and B̂ on the vector space V, the product of the operators ÂB̂ is defined by (ÂB̂)v = Â(B̂v), ∀v ∈ V. If ÂB̂v = B̂Âv, ∀v ∈ V, then Â is said to commute with B̂. The operator [Â, B̂] = ÂB̂ − B̂Â is called the Commutator of Â and B̂.
Let us consider two operators Â and B̂ on R², defined by

    Â(x, y) = (x + y, x − y),   B̂(x, y) = (x − y, x + y).

The sum Â + B̂ is

    (Â + B̂)(x, y) = (x + y, x − y) + (x − y, x + y) = (2x, 2x).

The products of the operators, ÂB̂ and B̂Â, are given by

    (ÂB̂)(x, y) = Â(x − y, x + y) = (2x, −2y),
    (B̂Â)(x, y) = B̂(x + y, x − y) = (2y, 2x).

So the operators Â and B̂ do not commute.
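The same computations can be done in matrix form with NumPy, writing the operators as 2 × 2 matrices acting on column vectors:

```python
import numpy as np

A = np.array([[1, 1], [1, -1]])   # A(x, y) = (x + y, x - y)
B = np.array([[1, -1], [1, 1]])   # B(x, y) = (x - y, x + y)

print((A + B) @ np.array([1, 2]))   # [2 2] -> (2x, 2x) at (x, y) = (1, 2)
print(A @ B)                        # [[ 2  0] [ 0 -2]] -> (2x, -2y)
print(B @ A)                        # [[ 0  2] [ 2  0]] -> (2y, 2x)
print(A @ B - B @ A)                # nonzero commutator [A, B]
```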


Algebra of Linear Operators

- The space of all linear operators on V forms an algebraic structure A(V), which is richer than the vector space structure.
- If Â, B̂, Ĉ are linear operators on V, or in other words "vectors" in the algebra A(V), then their "addition" Â + B̂, "product" ÂB̂ and "scalar multiplication" αÂ all exist and belong to A(V).
- The operators in A(V) obey the following rules:
  1. Â(B̂ + Ĉ) = ÂB̂ + ÂĈ, ∀Â, B̂, Ĉ ∈ A(V)
  2. (Â + B̂)Ĉ = ÂĈ + B̂Ĉ, ∀Â, B̂, Ĉ ∈ A(V)
  3. α(ÂB̂) = (αÂ)B̂ = Â(αB̂), ∀Â, B̂ ∈ A(V) and ∀α ∈ F.
- If in addition Â(B̂Ĉ) = (ÂB̂)Ĉ holds, then A(V) is an associative algebra.
- If dim(V) = n, then dim(A(V)) = n².
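Since operators on R² can be written as 2 × 2 matrices (as the next example shows), these rules can be spot-checked numerically; a sketch with NumPy on random integer matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.integers(-5, 6, (2, 2)) for _ in range(3))
alpha = 3

# The algebra rules, checked on 2x2 matrices (operators on R^2):
assert (A @ (B + C) == A @ B + A @ C).all()          # rule 1
assert ((A + B) @ C == A @ C + B @ C).all()          # rule 2
assert (alpha * (A @ B) == (alpha * A) @ B).all()    # rule 3
assert (alpha * (A @ B) == A @ (alpha * B)).all()    # rule 3
assert (A @ (B @ C) == (A @ B) @ C).all()            # associativity
print("all rules hold")
```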
Let us consider the vector space R². The general form of a linear operator Â : R² → R² is

    Â [x]   [ax + by]   [a  b] [x]
      [y] = [cx + dy] = [c  d] [y].

Thus any linear operator on R² can be represented as a 2 × 2 real matrix. So A(R²) is the space of all 2 × 2 real matrices, and the basic operations of A(R²) are the usual matrix addition, matrix multiplication and scalar multiplication. The zero operator is

    [0  0]
    [0  0].

Shortly we will see that linear operators on any n-dimensional vector space Vₙ can be represented as n × n matrices.
Solved Problems
The operators Â and B̂, on the space of functions F, are defined by Âu(x) = du/dx and B̂u(x) = xu(x). Show that [Â, B̂] = Î and [Â², B̂] = 2Â.

    ÂB̂u = d/dx (xu) = u + x du/dx,   B̂Âu = x du/dx
    ⇒ [Â, B̂]u = ÂB̂u − B̂Âu = u = Îu.

Now Â²u = d²u/dx². Then

    Â²B̂u = d²/dx² (xu) = d/dx (u + x du/dx) = 2 du/dx + x d²u/dx²,
    B̂Â²u = x d²u/dx²
    ⇒ [Â², B̂]u = 2 du/dx = 2Âu.
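Both commutators can be checked symbolically; a sketch using SymPy (assuming it is available), with `Ahat` and `Bhat` as illustrative names for the two operators:

```python
import sympy as sp

x = sp.symbols('x')
u = sp.Function('u')(x)   # an arbitrary function u(x)

Ahat = lambda w: sp.diff(w, x)   # differentiation operator
Bhat = lambda w: x * w           # multiplication by x

comm1 = sp.expand(Ahat(Bhat(u)) - Bhat(Ahat(u)))              # [A, B] u
comm2 = sp.expand(Ahat(Ahat(Bhat(u))) - Bhat(Ahat(Ahat(u))))  # [A^2, B] u
print(comm1)   # u(x)                  -> [A, B] = identity
print(comm2)   # 2 du/dx               -> [A^2, B] = 2 A
```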
Matrix Representation of Linear Operators

Let us consider an n-dimensional real vector space V and a linear operator Â : V → V. Choose a set of basis vectors S = {e₁, e₂, ..., eₙ} in V. Then any vector u ∈ V can be written as a linear combination of the eᵢ's:

    u = u₁e₁ + u₂e₂ + ... + uₙeₙ = Σᵢ uᵢeᵢ,

where the scalars uᵢ are the components of u in the basis {eᵢ}.

v = Â(u), the image of u, is similarly expressed in terms of the basis and components:

    v = v₁e₁ + v₂e₂ + ... + vₙeₙ = Σᵢ vᵢeᵢ.
Since V is isomorphic to Rⁿ, u and v can be represented as column matrices:

    [u]_S = (u₁, u₂, ..., uₙ)ᵀ,   [v]_S = (v₁, v₂, ..., vₙ)ᵀ.

Now, since the eᵢ are vectors in V, so are their images Â(eᵢ). Hence each Â(eᵢ) can be written as a linear combination of the eⱼ's:

    Â(eᵢ) = a₁ᵢe₁ + a₂ᵢe₂ + ... + aₙᵢeₙ = Σⱼ aⱼᵢeⱼ.

Let us consider the aᵢⱼ as elements of an n × n matrix [Â]_S. To see that this is the matrix representation of the operator Â, let us evaluate v = Â(u):

    Â(u) = Â(Σᵢ uᵢeᵢ) = Σᵢ uᵢ Â(eᵢ) = Σᵢ uᵢ (Σⱼ aⱼᵢeⱼ)
         = Σⱼ uⱼ (Σᵢ aᵢⱼeᵢ)   (interchanging i and j)
         = Σᵢ (Σⱼ aᵢⱼuⱼ) eᵢ   (interchanging the order of the summations).

Comparing with v = Σᵢ vᵢeᵢ:

    vᵢ = Σⱼ aᵢⱼuⱼ,

which is the formula of matrix multiplication. Thus we have [v]_S = [Â]_S[u]_S, and we identify [Â]_S as the matrix representation of Â.
If the basis set S = {eᵢ} is an orthonormal one, i.e. ⟨eᵢ, eⱼ⟩ = δᵢⱼ, then the matrix elements of [Â]_S are evaluated as

    ⟨eᵢ, Â(eⱼ)⟩ = ⟨eᵢ, Σₖ aₖⱼeₖ⟩ = Σₖ aₖⱼ⟨eᵢ, eₖ⟩ = Σₖ aₖⱼδᵢₖ = aᵢⱼ.
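For an orthonormal basis this inner-product formula gives the matrix elements directly; a small NumPy check in the standard basis of R², using the operator Â(x, y) = (2x − 7y, 4x + 3y) from the solved problem below:

```python
import numpy as np

# Sample operator A(x, y) = (2x - 7y, 4x + 3y) on R^2
A = lambda v: np.array([2*v[0] - 7*v[1], 4*v[0] + 3*v[1]])

# Orthonormal (standard) basis of R^2
e = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

# a_ij = <e_i, A(e_j)> for an orthonormal basis
M = np.array([[e[i] @ A(e[j]) for j in range(2)] for i in range(2)])
print(M)   # [[ 2. -7.] [ 4.  3.]]
```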

Example: Consider the differential operator d/dx on P₂. For an arbitrary polynomial a(x) = a₀ + a₁x + a₂x² ∈ P₂,

    b(x) = (d/dx) a(x) = a₁ + 2a₂x.

Comparing with b(x) = b₀ + b₁x + b₂x², we have b₀ = a₁, b₁ = 2a₂, b₂ = 0. So the matrix representation of d/dx in the basis set {1, x, x²} is

    [d/dx] = [0  1  0]
             [0  0  2]
             [0  0  0].
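Acting with this matrix on a polynomial's coordinate vector reproduces differentiation; a quick NumPy check with a sample polynomial:

```python
import numpy as np

# Matrix of d/dx on P2 in the basis {1, x, x^2}
D = np.array([[0, 1, 0],
              [0, 0, 2],
              [0, 0, 0]])

a = np.array([5, 3, 4])   # a(x) = 5 + 3x + 4x^2
b = D @ a                 # coordinates of a'(x)
print(b)                  # [3 8 0] -> a'(x) = 3 + 8x
```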
Solved Problems
Consider the linear operator Â on R² and the basis S:
Â(x, y) = (2x − 7y, 4x + 3y) and S = {e₁, e₂} = {(1, 0), (1, −1)}.
Find the matrix representation [Â]_S. Verify that [Â]_S[u]_S = [Âu]_S for the vector u = (4, −3) in R².

First let us find the components of an arbitrary vector v = (x, y) in the basis S:

    (x, y) = a(1, 0) + b(1, −1)  ⇒  x = a + b, y = −b  ⇒  a = x + y, b = −y
    ⇒  v = (x + y)e₁ − ye₂,  or  [v]_S = (x + y, −y)ᵀ.

Then operating Â on the basis vectors we get

    Âe₁ = Â(1, 0) = (2, 4) = 6e₁ − 4e₂,
    Âe₂ = Â(1, −1) = (9, 1) = 10e₁ − e₂.

So the matrix representation of Â in the basis S is

    [Â]_S = [ 6  10]
            [−4  −1].

Now the matrix representations of u = (4, −3) and Âu are found as

    u = (4, −3) = (4 − 3)e₁ + 3e₂ = e₁ + 3e₂  ⇒  [u]_S = (1, 3)ᵀ,
    Âu = (8 + 21, 16 − 9) = (29, 7) = 36e₁ − 7e₂  ⇒  [Âu]_S = (36, −7)ᵀ.

Then, calculating [Â]_S[u]_S,

    [Â]_S[u]_S = [ 6  10] [1]  =  [36]
                 [−4  −1] [3]     [−7],

it is verified that [Â]_S[u]_S = [Âu]_S.
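The whole verification can be automated; a NumPy sketch that finds coordinates in the basis S by solving linear systems (the basis vectors are the columns of E):

```python
import numpy as np

A = lambda v: np.array([2.0*v[0] - 7.0*v[1], 4.0*v[0] + 3.0*v[1]])
E = np.array([[1.0, 1.0],
              [0.0, -1.0]])        # columns: e1 = (1, 0), e2 = (1, -1)

# Columns of [A]_S are the S-coordinates of A(e1) and A(e2)
A_S = np.linalg.solve(E, np.column_stack([A(E[:, 0]), A(E[:, 1])]))
u = np.array([4.0, -3.0])
u_S = np.linalg.solve(E, u)        # [u]_S
Au_S = np.linalg.solve(E, A(u))    # [Au]_S

print(A_S)          # [[ 6. 10.] [-4. -1.]]
print(A_S @ u_S)    # [36. -7.]
print(Au_S)         # [36. -7.]
```

Solving E c = w expresses w in the basis S, which is exactly the hand computation carried out above.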


