
Linear Algebra ReSupply Date VI. Matrix Representation

We have discussed a good deal of theory about vector spaces and linear transformations; now we make those abstract concepts concrete. Previously, when discussing a basis, we usually treated it as a set and did not consider any relationship among its elements (vectors). Here we will use the order of these vectors, i.e. the notion of an ordered basis.

Definition 1. Ordered Basis

Let V be a vector space over F with dim(V) < ∞. An ordered basis β for V is a basis for V endowed with a specific order on its vectors.

Example. Ordered Basis


Let V = R3 and consider

(i) β1 = {(1, 0, 0), (0, 1, 0), (0, 0, 1)}.

(ii) β2 = {(0, 1, 0), (1, 0, 0), (0, 0, 1)}.

We can see that β1 ≠ β2 as ordered bases, while β1 = β2 as unordered bases (i.e. as sets).

A vector or matrix written with respect to different ordered bases looks different, but the properties between them are unchanged. Next we discuss the matrix representation written with respect to ordered bases.

Definition 2. Column Vector

Let β = {u1, ..., un} be an ordered basis for V. For each x ∈ V we can write x = a1u1 + · · · + anun for some unique ai ∈ F, and we define

        [ a1 ]
[x]β =  [ ⋮  ]
        [ an ]

which is called the column vector (coordinate vector) of x relative to β.
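As a small sketch of Definition 2, using the two ordered bases from the Ordered Basis example: since every basis vector there is a standard unit vector, each coordinate of x can simply be read off at the position of the 1 (this shortcut is an assumption of the example, not the general method).

```python
# Coordinate (column) vectors of x with respect to the two ordered bases
# beta1 and beta2 from the Ordered Basis example. Each basis vector is a
# standard unit vector, so the coordinate along (0, ..., 1, ..., 0) is the
# entry of x at the position of the 1.

def coords(x, basis):
    """Coordinate vector [x]_beta for a basis of standard unit vectors."""
    return [x[b.index(1)] for b in basis]

beta1 = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
beta2 = [(0, 1, 0), (1, 0, 0), (0, 0, 1)]
x = (1, 2, 3)

print(coords(x, beta1))  # [1, 2, 3]
print(coords(x, beta2))  # [2, 1, 3]
```

Note how the same x gets different column vectors under the two ordered bases, even though β1 = β2 as sets.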

Notation. Matrix Representation

Let T ∈ L(V, W) with dim(V), dim(W) < ∞, and let β = {v1, ..., vn}, γ = {w1, ..., wm} be ordered bases for V and W. The linear transformation T is uniquely determined by {T(v1), ..., T(vn)}, and each T(vj) has a unique expression in terms of {w1, ..., wm}, i.e.

T(vj) = ∑_{i=1}^{m} aij wi

for some aij ∈ F.

1 | Linear Algebra ReSupply Date VI JunWei,Zhang Edit


Definition 3. Matrix Representation
Using the notation above, the m × n matrix A with Aij = aij (the ij-component of A) is called the matrix representation of T with respect to the fixed ordered bases β, γ, denoted A = [T]γβ, with A ∈ Mm×n(F). If T ∈ L(V, V) and β = γ, we write [T]β.

Remark. Matrix Representation


Let T, U ∈ L(V, W ), if [T ]γβ = [U ]γβ , then T = U .

Example. Matrix Representation


Let T ∈ L(R2, R3), β = {(1, 0), (0, 1)}, and γ = {(1, 0, 0), (0, 1, 0), (0, 0, 1)}.
If T(a1, a2) = (a1 + 3a2, a1, 2a1 − 4a2), find [T]γβ.

Solution.
Since T(1, 0) = (1, 1, 2) = 1(1, 0, 0) + 1(0, 1, 0) + 2(0, 0, 1) and T(0, 1) = (3, 0, −4) = 3(1, 0, 0) + 0(0, 1, 0) − 4(0, 0, 1), we have

        [ 1   3 ]
[T]γβ = [ 1   0 ]
        [ 2  −4 ]
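The computation above can be sketched in code: with standard bases, the j-th column of [T]γβ is just T applied to the j-th basis vector (a minimal illustration, not tied to any library).

```python
# The j-th column of [T]^gamma_beta is the gamma-coordinate vector of T(v_j).
# With standard bases, the coordinates are the entries themselves.

def T(a1, a2):
    return (a1 + 3 * a2, a1, 2 * a1 - 4 * a2)

beta = [(1, 0), (0, 1)]
# Columns of the matrix representation:
cols = [T(*v) for v in beta]
# Store as rows of a 3x2 matrix (entry at row i, column j).
M = [[cols[j][i] for j in range(2)] for i in range(3)]
print(M)  # [[1, 3], [1, 0], [2, -4]]
```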

In secondary school we learned that matrices behave linearly; here, all of the properties are the same as the ones we learned back then.


Definition 4. Addition and Scalar Multiplication of Linear Transformations
Let T, U : V → W be arbitrary functions, where V, W are vector spaces over F. We define the following:
(a) (T + U)(x) = T(x) + U(x) for all x ∈ V.

(b) (cT)(x) = cT(x) for all x ∈ V, c ∈ F.

Theorem 1.
Let T, U ∈ L(V, W), where V, W are vector spaces over F. Then we have

(a) aT + U ∈ L(V, W) for all a ∈ F.

(b) S = {T : V → W : T ∈ L(V, W)} is a vector space.

Proof.
We show the following:

(a) For all x, y ∈ V and a, c ∈ F, we have

(aT + U)(cx + y) = aT(cx + y) + U(cx + y) = c(aT(x) + U(x)) + (aT(y) + U(y))
= c(aT + U)(x) + (aT + U)(y)

Thus, we know that aT + U ∈ L(V, W) for all a ∈ F.

(b) Take the zero function as the zero element of S; checking the remaining vector space axioms is routine.



Theorem 2.
Let T, U ∈ L(V, W), where V, W are vector spaces over F with dim(V), dim(W) < ∞. Then we have

(a) [T + U]γβ = [T]γβ + [U]γβ.

(b) [aT]γβ = a[T]γβ.

Proof.
Let β = {v1, ..., vn} and γ = {w1, ..., wm}, so that

T(vj) = ∑_{i=1}^{m} aij wi ;  U(vj) = ∑_{i=1}^{m} bij wi

(a) Since (T + U)(vj) = T(vj) + U(vj) = ∑_{i=1}^{m} (aij + bij) wi, we have

([T + U]γβ)ij = aij + bij = ([T]γβ)ij + ([U]γβ)ij

Thus, we get [T + U]γβ = [T]γβ + [U]γβ.

(b) Since (aT)(vj) = aT(vj) = ∑_{i=1}^{m} (a · aij) wi, we have

([aT]γβ)ij = a · aij = a([T]γβ)ij

Thus, we get [aT]γβ = a[T]γβ.
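Theorem 2(a) can be sanity-checked numerically. The sketch below assumes standard bases for R2 and R3, so the matrix of a map is obtained by applying it to the unit vectors and reading the results as columns; the maps T and U are made-up examples.

```python
# Entrywise check that [T + U] == [T] + [U] for two sample maps R^2 -> R^3,
# using standard bases on both sides.

def matrix_rep(f, n, m):
    """m x n matrix of f: R^n -> R^m with respect to the standard bases."""
    cols = [f(tuple(1 if k == j else 0 for k in range(n))) for j in range(n)]
    return [[cols[j][i] for j in range(n)] for i in range(m)]

T = lambda v: (v[0] + 3 * v[1], v[0], 2 * v[0] - 4 * v[1])
U = lambda v: (v[1], v[0], v[0] + v[1])
TplusU = lambda v: tuple(t + u for t, u in zip(T(v), U(v)))

A = matrix_rep(T, 2, 3)
B = matrix_rep(U, 2, 3)
C = matrix_rep(TplusU, 2, 3)
# Entrywise: [T + U] == [T] + [U]
assert all(C[i][j] == A[i][j] + B[i][j] for i in range(3) for j in range(2))
```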

Theorem 3.
Let T ∈ L(V, W), U ∈ L(W, Z), where V, W, Z are vector spaces over F with dim(V), dim(W), dim(Z) < ∞. Then UT ∈ L(V, Z).

Proof.
By definition, we have

U T (ax + y) = U (aT (x) + T (y)) = aU (T (x)) + U (T (y)) = aU T (x) + U T (y)

for all x, y ∈ V , a ∈ F .
Thus, we get U T ∈ L(V, Z).



Theorem 4.
Let T, U1, U2 ∈ L(V, V), where V is a vector space over F. Then we have

(a) T(U1 + U2) = TU1 + TU2 and (U1 + U2)T = U1T + U2T

(b) T(U1U2) = (TU1)U2

(c) T IdV = T = IdV T

(d) a(U1U2) = (aU1)U2 = U1(aU2) for all a ∈ F.

Proof.
We show the following:

(a) (i) Since T(U1 + U2)(x) = T((U1 + U2)(x)) = T(U1(x) + U2(x)) = T(U1(x)) + T(U2(x)),
then we have T(U1 + U2)(x) = TU1(x) + TU2(x).
Thus, we get T(U1 + U2) = TU1 + TU2.

(ii) Since (U1 + U2)T(x) = (U1 + U2)(T(x)) = U1(T(x)) + U2(T(x)), then we have

(U1 + U2)T(x) = U1T(x) + U2T(x)

Thus, we get (U1 + U2)T = U1T + U2T.

(b) Since T(U1U2)(x) = T(U1(U2(x))) = (TU1)(U2(x)) = ((TU1)U2)(x), then we have

T(U1U2) = (TU1)U2

(c) Since T IdV(x) = T(IdV(x)) = T(x) = IdV(T(x)) = IdV T(x), then we get

T IdV = T = IdV T

(d) Since a(U1U2)(x) = a(U1(U2(x))) = (aU1)(U2(x)) and, by linearity of U1, a(U1(U2(x))) = U1(aU2(x)) = U1((aU2)(x)), then we have

a(U1U2) = (aU1)U2 = U1(aU2)

Above we finished discussing matrix addition and scalar multiplication; next we discuss multiplication between matrices. The definition we use here is the same as before, but stated in a more theoretical form.

Definition 5. Matrix Multiplication

Let A ∈ Mm×n(F), B ∈ Mn×p(F). We define the matrix multiplication AB by

(AB)ij = ∑_{k=1}^{n} Aik Bkj

for all i ∈ [1, m], j ∈ [1, p].
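Definition 5 translates directly into code; a minimal sketch with matrices stored as lists of rows:

```python
# (AB)_ij = sum over k of A_ik * B_kj, straight from the definition.

def matmul(A, B):
    m, n, p = len(A), len(B), len(B[0])
    assert all(len(row) == n for row in A), "inner dimensions must agree"
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```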



Basically, the relations between matrices mirror the relations between linear transformations; the one caution is that for matrix representations you must keep track of which vector spaces are involved and which bases correspond to them. Also remember: when proving statements about matrix operations, work at the level of components (entries), which makes the problems much simpler.

Theorem 5.
Let T ∈ L(V, W), U ∈ L(W, Z), where V, W, Z are vector spaces over F with dim(V), dim(W), dim(Z) < ∞, and let α = {v1, ..., vn}, β = {w1, ..., wm}, γ = {z1, ..., zp} be ordered bases for V, W, Z.
If B = [T]βα and A = [U]γβ, then AB = [UT]γα.

Proof.
By definition, we have

(UT)(vj) = U(T(vj)) = U(∑_{k=1}^{m} Bkj wk) = ∑_{k=1}^{m} Bkj U(wk) = ∑_{k=1}^{m} ∑_{i=1}^{p} Bkj Aik zi

= ∑_{i=1}^{p} (∑_{k=1}^{m} Aik Bkj) zi = ∑_{i=1}^{p} cij zi

Thus, we get ([UT]γα)ij = cij = ∑_{k=1}^{m} Aik Bkj = (AB)ij.
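Theorem 5 can be checked on a concrete composition. In the sketch below, the maps T and U are made-up examples and standard bases are assumed throughout, so matrix representations reduce to applying the map to unit vectors.

```python
# Check that the matrix of the composition U o T equals the product [U][T].

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matrix_rep(f, n, m):
    """m x n matrix of f: R^n -> R^m with respect to the standard bases."""
    cols = [f(tuple(1 if k == j else 0 for k in range(n))) for j in range(n)]
    return [[cols[j][i] for j in range(n)] for i in range(m)]

T = lambda v: (v[0] + v[1], v[0] - v[1], 2 * v[0])   # R^2 -> R^3
U = lambda w: (w[0] + w[1] + w[2], w[1] - w[2])      # R^3 -> R^2
UT = lambda v: U(T(v))                               # composition, R^2 -> R^2

B = matrix_rep(T, 2, 3)
A = matrix_rep(U, 3, 2)
assert matrix_rep(UT, 2, 2) == matmul(A, B)
```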

Theorem 6.
Let T ∈ L(V, W), U ∈ L(W, Z), where V, W, Z are finite-dimensional vector spaces over F, and let α = {v1, ..., vn}, β = {w1, ..., wm}, γ = {z1, ..., zp} be ordered bases for V, W, Z. Then

[UT]γα = [U]γβ [T]βα

(The proof is immediate from Theorem 5.)

Remark. Commutative in Matrix Multiplication


Matrix multiplication is not commutative in general.

Example. Commutative in Matrix Multiplication


By definition, we have

[ 1 1 ] [ 0 1 ]   [ 1 1 ]
[ 0 0 ] [ 1 0 ] = [ 0 0 ]

[ 0 1 ] [ 1 1 ]   [ 0 0 ]
[ 1 0 ] [ 0 0 ] = [ 1 1 ]
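The two products above can be verified directly; a minimal sketch:

```python
# AB and BA for the 2x2 example, multiplied straight from the definition.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 1], [0, 0]]
B = [[0, 1], [1, 0]]
print(matmul(A, B))  # [[1, 1], [0, 0]]
print(matmul(B, A))  # [[0, 0], [1, 1]]
assert matmul(A, B) != matmul(B, A)
```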

Remark. Cancellation Law in Matrix Multiplication

Matrix multiplication does not satisfy a cancellation law: AB = AC does not imply B = C, even when A ≠ O.

5 | Linear Algebra ReSupply Date VI JunWei,Zhang Edit


Example. Cancellation
 Law in Matris Multiplication
0 0 0
 
Let A =  1 0 0 

, then we have
0 1 0
    
0 0 0 0 0 0 0 0 0
    
A2 = 
 1 0 0   1 0 0  =  0 0 0  , A 3 = O3
   
0 1 0 0 1 0 1 0 0
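Checking the example in code (a small sketch):

```python
# A^2 is nonzero but A^3 = O, so from A * A^2 = A * O we cannot cancel A.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
O = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]

A2 = matmul(A, A)
A3 = matmul(A2, A)
assert A2 != O                        # A^2 is nonzero ...
assert A3 == O                        # ... yet A^3 = O
assert matmul(A, A2) == matmul(A, O)  # AB = AC with B != C
```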

Definition 6. Transpose
The transpose of A ∈ Mm×n(F), denoted A^T ∈ Mn×m(F), is defined by (A^T)ij = Aji.

Theorem 7. The Property of the Transpose

Let A ∈ Mm×n(F), B ∈ Mn×p(F). Then (AB)^T = B^T A^T.

Proof.
By definition, we have

((AB)^T)ij = (AB)ji = ∑_{k=1}^{n} Ajk Bki = ∑_{k=1}^{n} (B^T)ik (A^T)kj = (B^T A^T)ij

Thus, we get (AB)^T = B^T A^T.
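Theorem 7 can be verified entrywise on a concrete pair of matrices; a minimal sketch, with A and B as arbitrary examples:

```python
# Entrywise check that (AB)^T == B^T A^T.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

A = [[1, 2, 3], [4, 5, 6]]          # 2 x 3
B = [[7, 8], [9, 10], [11, 12]]     # 3 x 2

assert transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))
```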

Finally, when proving properties of the transpose, also try to work at the component (entry) level rather than writing out the whole matrix (writing it out can help as a sanity check, but it is not the best way to present the proof).

Notation. Reference Material

(1) Edition: NTNU Math113, Jun-Wei Zhang. Date: 2022-01-10.

(2) Linear Algebra (4th ed., 2003), Stephen H. Friedberg, Arnold J. Insel, Lawrence E. Spence.

(3) Linear Algebra (2nd ed., 1971), Kenneth M. Hoffman, Ray Kunze.
