Inner Product Spaces: Unit - II
Definition
The vector space V over F is said to be an inner product space if there is defined, for any two
vectors u, v ∈ V, an element (u, v) in F such that
1. (u, v) is the complex conjugate of (v, u) ;
2. (u, u) ≥ 0, and (u, u) = 0 if and only if u = 0 ;
3. (αu + βv, w) = α(u, w) + β(v, w) for any u, v, w ∈ V and α, β ∈ F.
Example :
1) In F^(n), where F is the field of real numbers, define for u = (u1, ..., un) and v = (v1, ..., vn) :
(u, v) = u1v1 + ... + unvn. This defines an inner product on F^(n).
2) Let V be the set of all real continuous functions on [0, 1]. If f(t), g(t) ∈ V, define
(f(t), g(t)) = ∫₀¹ f(t) g(t) dt. This defines an inner product on V.
Definition :
If u ∈ V, then the length (or norm) of u, written ||u||, is defined by ||u|| = √(u, u).
Lemma :
If u, v ∈ V and α, β ∈ F, then
(αu + βv, αu + βv) = αᾱ(u, u) + αβ̄(u, v) + ᾱβ(v, u) + ββ̄(v, v),
where the bar denotes complex conjugation.
Proof : By property 3,
(αu + βv, αu + βv) = α(u, αu + βv) + β(v, αu + βv),
but, using properties 1 and 3, (u, αu + βv) = ᾱ(u, u) + β̄(u, v) and (v, αu + βv) = ᾱ(v, u) + β̄(v, v).
∴ (αu + βv, αu + βv) = αᾱ(u, u) + αβ̄(u, v) + ᾱβ(v, u) + ββ̄(v, v).
Corollary : ||αu|| = |α| ||u||.
Proof : ||αu||² = (αu, αu) = αᾱ(u, u), by the above lemma. Since αᾱ = |α|² and (u, u) = ||u||²,
||αu||² = |α|² ||u||² ; taking positive square roots yields ||αu|| = |α| ||u||.
Lemma :
If a, b, c are real numbers such that a > 0 and aλ² + 2bλ + c ≥ 0 for all real numbers λ, then
b² ≤ ac.
Proof : Completing the square, aλ² + 2bλ + c = a(λ + b/a)² + c − b²/a.
Since this is greater than or equal to 0 for all λ, in particular this must be true for λ = −b/a. Thus
c − b²/a ≥ 0, and since a > 0 we get b² ≤ ac.
Theorem : If u, v ∈ V, then |(u, v)| ≤ ||u|| ||v||.
Proof :
Case I : Suppose α = (u, v) is real. If u = 0 the result is trivial, so assume (u, u) > 0. For any real number λ,
0 ≤ (λu + v, λu + v) = λ²(u, u) + 2(u, v)λ + (v, v),
since (v, u) = (u, v) when (u, v) is real. By the preceding lemma, (u, v)² ≤ (u, u)(v, v), i.e. |(u, v)| ≤ ||u|| ||v||.
Case II : If α = (u, v) is not real, then it certainly is not 0, so that u/α is meaningful.
Now (u/α, v) = (1/α)(u, v) = 1, and it is certainly real.
Therefore, by Case I, 1 = |(u/α, v)| ≤ ||u/α|| ||v||.
Since ||u/α|| = ||u|| / |α|, we get 1 ≤ ||u|| ||v|| / |α|, that is, |(u, v)| = |α| ≤ ||u|| ||v||.
Example 1 : Applying the theorem in F^(n) with the inner product above gives
|u1v1 + ... + unvn|² ≤ (|u1|² + ... + |un|²)(|v1|² + ... + |vn|²).
Example 2 : Applying it to the space of continuous functions on [0, 1] with the integral inner product gives
( ∫₀¹ f(t) g(t) dt )² ≤ ∫₀¹ f(t)² dt · ∫₀¹ g(t)² dt.
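The inequality in Example 1 can be checked numerically; the following is a small sketch using numpy with arbitrarily chosen vectors in R^5 (not part of the notes).

```python
import numpy as np

# Numerical check of the Cauchy-Schwarz inequality |(u, v)| <= ||u|| ||v||
# in R^n with the standard inner product; the vectors are illustrative.
rng = np.random.default_rng(0)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

lhs = abs(np.dot(u, v))                       # |(u, v)|
rhs = np.linalg.norm(u) * np.linalg.norm(v)   # ||u|| ||v||
print(lhs <= rhs + 1e-12)  # True
```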
Definition : If W is a subspace of V, the orthogonal complement of W, written W⊥, is defined by
W⊥ = { x ∈ V | (x, w) = 0 for all w ∈ W }.
Lemma : W⊥ is a subspace of V, and V = W ⊕ W⊥.
Proof : i) If a, b ∈ W⊥ and α, β ∈ F, then for every w ∈ W,
(αa + βb, w) = α(a, w) + β(b, w) = 0,
∴ αa + βb ∈ W⊥, so W⊥ is a subspace of V.
To show V = W + W⊥, choose an orthonormal basis w1, ..., wr of W and extend it to an orthonormal basis w1, ..., wn of V. For any v ∈ V,
v = α1w1 + ... + αnwn ; taking u = α1w1 + ... + αrwr and w = α_{r+1}w_{r+1} + ... + αnwn,
∴ v = u + w with u ∈ W and w ∈ W⊥ (each wi with i > r is orthogonal to every element of W).
Thus V = W + W⊥.
ii) Let u ∈ W ∩ W⊥ be arbitrary; then u ∈ W and u ∈ W⊥
⇒ (u, u) = 0 ⇒ ||u|| = 0 ⇒ u = 0.
∴ W ∩ W⊥ = {0}, so the sum is direct.
Definition : The set of vectors {v1, v2, ..., vn} in V is an orthonormal set if
(i) each vi has length 1, i.e. (vi, vi) = 1 ;
(ii) for i ≠ j, (vi, vj) = 0.
Lemma : If B = {v1, v2, ..., vn} is an orthonormal set, then the vectors in B are linearly independent.
Moreover, if w = Σ_{i=1}^{n} αi vi, then
αi = (w, vi).
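The coefficient formula αi = (w, vi) can be illustrated numerically; here is a small numpy sketch (the orthonormal basis is produced from an arbitrary invertible matrix via a QR factorization, an illustrative choice).

```python
import numpy as np

# If {v_1, ..., v_n} is an orthonormal basis and w = sum a_i v_i,
# then a_i = (w, v_i).  Illustrative values in R^3.
Q, _ = np.linalg.qr(np.array([[2., 1., 0.],
                              [1., 3., 1.],
                              [0., 1., 1.]]))
basis = Q.T               # rows of `basis` form an orthonormal basis of R^3
w = np.array([1., 2., 3.])
coeffs = basis @ w        # a_i = (w, v_i)
reconstructed = coeffs @ basis   # sum a_i v_i recovers w
print(np.allclose(reconstructed, w))  # True
```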
Lemma : If {v1, ..., vn} is an orthonormal set in V and if w ∈ V, then
u = w − (w, v1)v1 − ... − (w, vn)vn is orthogonal to each of v1, ..., vn.
Proof : (u, vi) = (w − (w, v1)v1 − ... − (w, vn)vn, vi)
= (w, vi) − (w, v1)(v1, vi) − ... − (w, vi)(vi, vi) − ... − (w, vn)(vn, vi)
= (w, vi) − (w, vi) = 0, since (vj, vi) = 0 for j ≠ i and (vi, vi) = 1,
and i is arbitrary.
Theorem :
Let V be a finite-dimensional inner product space; then V has an orthonormal set as a basis.
Proof : Let V be of dimension n over F and let v1, v2, ..., vn be a basis of V. From this basis
we shall construct an orthonormal set of n vectors.
Let u1 = v1,
u2 = v2 − ((v2, u1)/||u1||²) u1, so that u2 is orthogonal to u1 ;
u3 = v3 − ((v3, u2)/||u2||²) u2 − ((v3, u1)/||u1||²) u1, so that u3 is orthogonal to u1 and u2 ;
and, in general,
u_{i+1} = v_{i+1} − Σ_{j=1}^{i} ((v_{i+1}, uj)/||uj||²) uj, so that u_{i+1} is orthogonal to u1, ..., ui.
To check this, first
(u2, u1) = (v2 − ((v2, u1)/||u1||²) u1, u1)
= (v2, u1) − ((v2, u1)/||u1||²)(u1, u1)
= (v2, u1) − (v2, u1) = 0, since (u1, u1) = ||u1||².
Similarly, for j = 1, 2,
(u3, uj) = (v3, uj) − ((v3, u2)/||u2||²)(u2, uj) − ((v3, u1)/||u1||²)(u1, uj)
= (v3, uj) − (v3, uj) = 0,
using (u2, u1) = 0 and (uj, uj) = ||uj||². The same computation shows that (u_{i+1}, uj) = 0 for
j = 1, ..., i, so {u1, ..., un} is an orthogonal set of non-zero vectors.
Finally, set
w1 = u1/||u1||, w2 = u2/||u2||, ..., wn = un/||un||.
∴ {w1, w2, ..., wn} is an orthonormal set, which is therefore linearly independent, and dim(V) = n = the number of elements
in this set. Therefore it forms a basis of V.
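The construction in the proof (the Gram-Schmidt process) can be sketched numerically; the sample vectors below are arbitrary illustrative choices, not from the notes.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors, following the
    construction in the proof: subtract projections, then normalize."""
    ws = []
    for v in vectors:
        u = v.astype(float).copy()
        for w in ws:
            u -= np.dot(v, w) * w   # subtract (v, w) w; each w has length 1
        ws.append(u / np.linalg.norm(u))
    return np.array(ws)

basis = gram_schmidt([np.array([1., 1., 0.]),
                      np.array([1., 0., 1.]),
                      np.array([0., 1., 1.])])
print(np.allclose(basis @ basis.T, np.eye(3)))  # True: orthonormal rows
```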
Linear Transformations
1) HOM (V, V) : the set of all vector space homomorphisms of V into itself.
2) HOM (V, V) forms a vector space over F under the operations of addition and scalar multiplication
defined as
(T1 + T2)(v) = T1(v) + T2(v) and (αT1)(v) = α(T1(v)).
Example :
Show that T1T2 ∈ HOM (V, V).
Solution : Let T1, T2 ∈ HOM (V, V); we define T1T2(v) = T1(T2(v)) for any v ∈ V. For α, β ∈ F and u, v ∈ V,
T1T2(αu + βv) = T1(T2(αu + βv)) = T1(αT2(u) + βT2(v))
= T1(αT2(u)) + T1(βT2(v)) = α(T1(T2(u))) + β(T1(T2(v))) = α(T1T2)(u) + β(T1T2)(v).
2) (T1 + T2)T3 = T1T3 + T2T3
Solution : For any v ∈ V,
((T1 + T2)T3)(v) = (T1 + T2)(T3(v)) = T1(T3(v)) + T2(T3(v)) = (T1T3)(v) + (T2T3)(v).
3) T3(T1 + T2) = T3T1 + T3T2, similarly, using the linearity of T3.
4) T1(T2T3) = (T1T2)T3, since composition of mappings is associative: for any v ∈ V,
(T1(T2T3))(v) = T1(T2(T3(v))) = ((T1T2)T3)(v).
5) α(T1T2) = (αT1)T2 = T1(αT2)
∴ Property 1 gives closure with respect to multiplication, and properties 2, 3, 4 make HOM (V, V) an
associative ring.
Definition : An associative ring A is called an algebra over F if A is a vector space over F such that
∀a, b ∈ A and α ∈ F
α ( ab ) = (α a ) b = a (α b )
Note : HOM (V, V) is an algebra over F. We denote it by A (V), and whenever we want to emphasize the
role of the field F we shall denote it by AF (V).
Definition : A linear transformation on V over F is an element of AF (V). A (V) is the ring, or algebra,
of linear transformations on V.
Lemma : If A is an algebra with unit element over F, then A is isomorphic to a subalgebra of A (V) for
some vector space V over F.
Proof : Since A is an algebra over F, it must be a vector space over F. We shall use V = A to prove the
lemma.
For a ∈ A, define Ta : V → V by Ta(v) = va for every v ∈ V (= A).
We assert that Ta is a linear transformation on V (= A) :
Ta(v1 + v2) = (v1 + v2)a = v1a + v2a = Ta(v1) + Ta(v2),
and Ta(αv) = (αv)a = α(va) = αTa(v).
Now define ψ : A → A (V) by ψ(a) = Ta. Since a = b ⇒ Ta = Tb, ψ is well defined.
If a, b ∈ A and α, β ∈ F, then for all v ∈ A the transformation corresponding to αa + βb satisfies
v(αa + βb) = α(va) + β(vb) = (αTa + βTb)(v),
i.e. ψ(αa + βb) = αψ(a) + βψ(b).
A similar computation, using the associativity of A, shows that ψ preserves products; and ψ is one-to-one,
since Ta = 0 gives 0 = Ta(e) = ea = a. Thus A is isomorphic to ψ(A), a subalgebra of A (V).
Lemma : Let A be an algebra with unit element over F, and suppose that A is of dimension m over F.
Then every element in A satisfies some nontrivial polynomial in F[x] of degree at most m.
Proof : Let e be the unit element of A. If a ∈ A, consider the m + 1 elements e, a, a², ..., aᵐ in A. Since A is
m-dimensional over F, we know that "if v1, ..., vn is a basis of V over F and if w1, ..., wm in V are linearly
independent over F, then m ≤ n."
Since dim A = m < m + 1,
the elements e, a, a², ..., aᵐ must be linearly dependent over F. In other words, there are elements
α0, α1, ..., αm ∈ F, not all zero, such that α0e + α1a + ... + αmaᵐ = 0. But then a satisfies the nontrivial
polynomial q(x) = α0 + α1x + ... + αmxᵐ, of degree at most m, in F[x].
Theorem : If V is an n-dimensional vector space over F, then given any element T in A (V), there
exists a non-trivial polynomial q(x) ∈ F[x] of degree at most n² such that q(T) = 0.
Proof : A (V) is of dimension n² over F as a vector space, so the result follows from the above lemma applied to the algebra A (V).
Example : If TS = UT = I then S = U
Solution : S = IS = (UT) S = U (TS) = UI = U.
Definition : An element T in A (V) is invertible, or regular, if it is both right- and left-invertible, i.e. if there
is an element S ∈ A (V) such that ST = TS = 1. We write S as T⁻¹.
Example : Let F be the field of real numbers and let V be F[x], the set of all polynomials in x over
F. On V let T be differentiation and let S be integration, defined by S(xⁱ) = x^{i+1}/(i + 1).
Then ST ≠ 1, since ST kills the constant term, whereas TS = 1.
Theorem : If V is finite-dimensional over F, then T ∈ A (V) is invertible if and only if the constant term
of the minimal polynomial for T is not 0.
Proof : Let p(x) = α0 + α1x + ... + αkxᵏ be the minimal polynomial for T over F.
First suppose α0 ≠ 0. Since p(T) = 0,
0 = α0 + α1T + ... + αkTᵏ, so
1 = T · (−1/α0)(αkT^{k−1} + α_{k−1}T^{k−2} + ... + α1).
∴ S = −(1/α0)(αkT^{k−1} + ... + α1) acts as an inverse for T.
Whence T is invertible.
Conversely, if α0 = 0, then
α1T + α2T² + ... + αkTᵏ = 0, i.e. (α1 + α2T + ... + αkT^{k−1})T = 0.
If T were invertible, multiplying on the right by T⁻¹ would give α1 + α2T + ... + αkT^{k−1} = 0, a
polynomial relation for T of degree k − 1 < k, contradicting the minimality of p(x). Hence T is not invertible.
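The proof's formula for T⁻¹ can be sketched numerically. As an illustrative assumption, the characteristic polynomial is used in place of the minimal polynomial, since T also satisfies it (by the Cayley-Hamilton theorem) and its constant term is nonzero exactly when T is invertible.

```python
import numpy as np

# If a_k T^k + ... + a_1 T + a_0 = 0 with a_0 != 0, then
# T^{-1} = -(1/a_0)(a_k T^{k-1} + ... + a_1).
# Illustrative 2x2 matrix; np.poly gives characteristic polynomial
# coefficients, highest degree first.
T = np.array([[2., 1.],
              [0., 3.]])
a2, a1, a0 = np.poly(T)        # x^2 - 5x + 6, so a0 = 6 != 0
T_inv = -(a2 * T + a1 * np.eye(2)) / a0
print(np.allclose(T_inv @ T, np.eye(2)))  # True
```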
Corollary : If V is finite-dimensional over F and if T ∈ A (V) is invertible, then T⁻¹ is a polynomial
expression in T over F.
Proof : Since T is invertible, the constant term α0 of its minimal polynomial α0 + α1x + ... + αkxᵏ is not 0, and by the proof above
T⁻¹ = −(1/α0)(α1 + α2T + ... + αkT^{k−1}).
Corollary : If V is finite-dimensional over F and if T ∈ A (V) is singular, then there exists an S ≠ 0 in A (V) such that
ST = TS = 0.
Proof : Because T is not regular, the constant term of its minimal polynomial p(x) = α1x + ... + αkxᵏ must be 0.
If S = α1 + α2T + ... + αkT^{k−1}, then S ≠ 0 (otherwise T would satisfy a polynomial of degree k − 1 < k),
and ST = TS = p(T) = 0.
Corollary : T ∈ A (V) is singular if and only if there exists a v ≠ 0 in V with T(v) = 0.
Proof : If T is singular, choose S ≠ 0 as above with ST = TS = 0, and choose w ∈ V with S(w) ≠ 0.
Let v = S(w); then T(v) = T(S(w)) = (TS)(w) = 0(w) = 0.
We have produced a non-zero vector v in V which is annihilated by T.
Conversely, if T(v) = 0 with v ≠ 0, then T cannot be regular: if S were an inverse of T, then
v = (ST)(v) = S(T(v)) = S(0) = 0, a contradiction.
T(V) = { T(v) | v ∈ V }.
Theorem : If V is finite-dimensional over F, then T ∈ A (V) is regular if and only if T maps V onto V.
Proof : If T is regular, then for any v ∈ V we have v = T(T⁻¹(v)) ∈ T(V), so T is onto.
Conversely, suppose T is singular. Then there is a v1 ≠ 0 with T(v1) = 0, which
we can extend to form a basis v1, v2, ..., vn of V. Then every element in T(V) is a linear
combination of T(v2), ..., T(vn), since T(v1) = 0.
Therefore dim T(V) ≤ n − 1 < n = dim V. But then T(V) must be different from V, i.e. T is
not onto. Hence, if T is onto, T must be regular.
Definition : If V is finite-dimensional over F, then the rank of T is the dimension of T(V), the range of
T, over F.
We denote the rank of T by r(T).
Note :
1) If r(T) = dim V, then T is regular.
2) If r(T) = 0, then T = 0, and so T is singular.
Lemma : If V is finite-dimensional over F, then for S, T ∈ A (V) :
1) r(ST) ≤ r(T) ;
2) r(TS) ≤ r(T) (so r(ST) ≤ min { r(T), r(S) }) ;
3) r(ST) = r(TS) = r(T) if S is regular.
Proof of 3) : If S is regular then S(V) = V,
∴ TS(V) = T(S(V)) = T(V),
∴ r(TS) = dim((TS)(V)) = dim(T(V)) = r(T).
On the other hand, if T(V) has w1, ..., wm as a basis, the regularity of S implies that
S(w1), ..., S(wm) are linearly independent.
Since S(w1), ..., S(wm) span ST(V), they form a basis of it, so that r(ST) = m = r(T).
Corollary : If T ∈ A (V ) and if S ∈ A( V ) is regular then
r ( T ) = r ( STS −1 )
Example : Let V and W be vector spaces over the field F and let T be a linear transformation from V
into W. If T is invertible, then the inverse function T⁻¹ is a linear transformation from W onto V.
Solution : When T is one-one and onto, there is a uniquely determined inverse function T⁻¹ which
maps W onto V, such that T⁻¹T is the identity on V and TT⁻¹ is the identity on W.
To show that T⁻¹ is linear, let w1, w2 ∈ W and α, β ∈ F, and set
v1 = T⁻¹(w1) and v2 = T⁻¹(w2). Since T is linear,
T(αv1 + βv2) = αT(v1) + βT(v2) = αw1 + βw2.
∴ T⁻¹(αw1 + βw2) = T⁻¹T(αv1 + βv2) = αv1 + βv2 = αT⁻¹(w1) + βT⁻¹(w2).
Characteristic Roots
In what follows, V will always denote a finite-dimensional vector space over a field F.
We know that "if V is a finite-dimensional vector space over F, then T ∈ A (V) is singular if and only if there exists a v ≠ 0
in V such that T(v) = 0."
Definition : If T ∈ A (V), then λ ∈ F is called a characteristic root of T if λ − T is singular; a v ≠ 0 with (λ − T)(v) = 0 is called a characteristic vector of T belonging to λ.
If λ is a characteristic root of T, then, by the result quoted above,
there is a vector v ≠ 0 in V such that (λ − T)(v) = 0
⇒ λv − T(v) = 0 ⇒ T(v) = λv.
Conversely, let T(v) = λv for some v ≠ 0 in V; then (λ − T)(v) = 0, so λ − T is singular.
Further, if T(v) = λv, then
T²(v) = T(λv) = λT(v) = λ²v, and in general Tᵏ(v) = λᵏv. Hence for any q(x) = α0xᵐ + α1x^{m−1} + ... + αm in F[x],
q(T)(v) = (α0λᵐ + α1λ^{m−1} + ... + αm)v = q(λ)v.
Next, for any regular S, (STS⁻¹)ᵏ = STᵏS⁻¹.
∴ if q(x) = α0 + α1x + ... + αmxᵐ, then
q(STS⁻¹) = α0 + α1STS⁻¹ + ... + αmSTᵐS⁻¹
= Sα0S⁻¹ + Sα1TS⁻¹ + ... + SαmTᵐS⁻¹
= S q(T) S⁻¹.
Thus if p(x) is the minimal polynomial for T, then it follows easily that p(x) is also the
minimal polynomial for STS⁻¹.
Theorem : If λ1, ..., λk in F are distinct characteristic roots of T ∈ A (V) and if v1, ..., vk are characteristic
vectors of T belonging to λ1, ..., λk respectively, then v1, ..., vk are linearly independent over F.
Proof : Suppose v1, ..., vk are linearly dependent over F; then there is a relation of the form
α1v1 + ... + αkvk = 0, where α1, ..., αk ∈ F and not all of them are 0. Among all such relations there is one
having as few non-zero coefficients as possible.
By suitably renumbering the vectors, we can assume this shortest relation to be
β1v1 + β2v2 + ... + βjvj = 0, with every βi ≠ 0. ..... (1)
Applying T to (1) gives λ1β1v1 + λ2β2v2 + ... + λjβjvj = 0; multiplying (1) by λ1 and subtracting yields
(λ2 − λ1)β2v2 + ... + (λj − λ1)βjvj = 0.
Now λi − λ1 ≠ 0 for i > 1, and βi ≠ 0, whence (λi − λ1)βi ≠ 0.
But then we have produced a shorter relation than (1) between v1, ..., vk. This contradiction
proves the theorem.
Corollary : If T ∈ A (V) and dim V = n, then T can have at most n distinct characteristic roots in F.
Proof : Any set of linearly independent vectors in V can have at most n elements, and by the above theorem any set of
distinct characteristic roots of T gives rise to a corresponding set of linearly
independent characteristic vectors; hence there are at most n such roots.
Corollary : If T ∈ A (V), dim V = n, and T has n distinct characteristic roots in F, then there
is a basis of V over F which consists of characteristic vectors of T.
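The second corollary can be illustrated numerically: a matrix with distinct characteristic roots is diagonal in the basis of its characteristic vectors. A small numpy sketch with an arbitrary 2 × 2 matrix:

```python
import numpy as np

# A matrix with n distinct characteristic roots is diagonal in the
# basis of its characteristic vectors.  Illustrative values.
A = np.array([[4., 1.],
              [2., 3.]])                     # roots 5 and 2, distinct
eigvals, eigvecs = np.linalg.eig(A)
D = np.linalg.inv(eigvecs) @ A @ eigvecs     # change to that basis
print(np.allclose(D, np.diag(eigvals)))      # True
```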
Matrices
Let V be an n-dimensional vector space over F and let v1, ..., vn be a basis of V over F.
If T ∈ A (V), then each T(vi) lies in V, so
T(v1) = α11v1 + α12v2 + ... + α1nvn,
T(v2) = α21v1 + α22v2 + ... + α2nvn,
...
T(vn) = αn1v1 + αn2v2 + ... + αnnvn.
This system can be written more compactly as
T(vi) = Σ_{j=1}^{n} αij vj for i = 1, ..., n.
Definition : Let V be an n-dimensional vector space over F and let v1, ..., vn be a basis of V over F.
If T ∈ A (V), then the matrix of T in the basis v1, ..., vn, written as m(T), is the n × n matrix (αij) whose (i, j) entry is the coefficient αij above.
Example : Let F be a field and let V be the set of all polynomials in x of degree n − 1 or less over F. On
V let D be defined by D(p(x)) = dp(x)/dx, i.e.
D(β0 + β1x + ... + β_{n−1}x^{n−1}) = β1 + 2β2x + ... + (n − 1)β_{n−1}x^{n−2}.
Solution :
1) For α, β ∈ F and p(x), q(x) ∈ V,
αp(x) + βq(x) ∈ V and
D(αp(x) + βq(x)) = D(αp(x)) + D(βq(x))
= αD(p(x)) + βD(q(x)),
since D(α) = 0 = D(β) (the derivative of a constant is 0).
∴ D is a linear transformation.
2) Take as basis for V : v1 = 1, v2 = x, v3 = x², ..., vn = x^{n−1}. Then
D(v1) = D(1) = 0 = 0v1 + 0v2 + ... + 0vn,
D(v2) = D(x) = 1 = v1,
D(v3) = D(x²) = 2x = 2v2,
...
D(vn) = D(x^{n−1}) = (n − 1)x^{n−2} = (n − 1)v_{n−1}.
This basis gives

            0   0   0  ---   0    0
            1   0   0  ---   0    0
∴ m(D) =    0   2   0  ---   0    0
            0   0   3  ---   0    0
                      ...
            0   0   0  ---  n−1   0

In the reversed basis w1 = x^{n−1}, w2 = x^{n−2}, ..., wn = 1, we get
D(w1) = (n − 1)w2, D(w2) = (n − 2)w3, ..., D(w_{n−1}) = wn, and D(wn) = D(1) = 0 = 0w1 + ... + 0wn,
so that

            0  n−1   0   ---  0   0
            0   0   n−2  ---  0   0
∴ m(D) =               ...
            0   0    0   ---  0   1
            0   0    0   ---  0   0
3) Take as basis u1 = 1, u2 = 1 + x, u3 = 1 + x², ..., un = 1 + x^{n−1}.
(These form a basis: if Σ αi ui = 0, comparing coefficients of 1, x, ..., x^{n−1} forces all αi = 0.)
D(u1) = 0,
D(u2) = D(1 + x) = 1 = u1,
D(u3) = D(1 + x²) = 2x = 2x + 2 − 2 = 2((1 + x) − 1) = 2(u2 − u1),
...
D(un) = D(1 + x^{n−1}) = (n − 1)x^{n−2} = (n − 1)(u_{n−1} − u1).
Hence

             0      0   0  ---   0    0
             1      0   0  ---   0    0
∴ m(D) =    −2      2   0  ---   0    0
            −3      0   3  ---   0    0
                         ...
           −(n−1)   0   0  ---  n−1   0
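The matrix of D in the basis 1, x, ..., x^{n−1} can be built and applied numerically; the following numpy sketch uses the notes' row convention (row i holds the coefficients of D applied to the i-th basis vector), with an illustrative polynomial.

```python
import numpy as np

# Matrix of differentiation D on polynomials of degree < n, in the
# basis 1, x, ..., x^{n-1}, row convention: row i gives D(x^i).
n = 4
M = np.zeros((n, n))
for i in range(1, n):
    M[i, i - 1] = i          # D(x^i) = i x^{i-1}

# Apply D to p(x) = 1 + 2x + 3x^2 + 4x^3 (coefficients, low degree first).
p = np.array([1., 2., 3., 4.])
dp = p @ M                   # coordinate rows act on the left
print(dp)                    # coefficients of D(p) = 2 + 6x + 12x^2
```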
4) Let T be a linear transformation of an n-dimensional vector space V, and suppose T has n distinct
characteristic roots. What is the matrix of T?
Solution : Let λ1, ..., λn be the n distinct characteristic roots of T.
We know that "if T ∈ A (V), dim V = n, and T has n distinct characteristic roots in F, then
there is a basis of V over F which consists of characteristic vectors of T."
In such a basis v1, ..., vn, with T(vi) = λi vi,

            λ1   0   0  ---   0
∴ m(T) =     0  λ2   0  ---   0
                     ...
             0   0   0  ---  λn
Note :
If we have a basis v1, ..., vn of V over F, a given matrix

       β11  ...  β1n
M =         ...        with βij ∈ F
       βn1  ...  βnn

gives rise to a linear transformation T defined on V by T(vi) = Σ_{j=1}^{n} βij vj on this basis.
Thus every possible square array serves as the matrix of some linear transformation in the
basis v1, ..., vn.
Let V be an n-dimensional vector space over F and let v1, ..., vn be a basis; suppose S, T ∈ A (V).
Then S = T if and only if m(S) = m(T).
Now let m(S) = (αij) and m(T) = (βij), so that S(vi) = Σ_{j=1}^{n} αij vj and T(vi) = Σ_{j=1}^{n} βij vj.
∴ (S + T)(vi) = S(vi) + T(vi) = Σ αij vj + Σ βij vj = Σ (αij + βij) vj.
By what is meant by the matrix of a linear transformation in a given basis, m(S + T) = (λij), where
λij = αij + βij for every i and j.
Now for γ ∈ F we show that m(γS) = (μij), where μij = γαij for every i and j :
(γS)(vi) = γ Σ_{j=1}^{n} αij vj = Σ_{j=1}^{n} (γαij) vj,
so m(γS) = (γαij) = (μij).
For m(ST), we have
ST(vi) = S(T(vi)) = S( Σ_{j=1}^{n} βij vj ) = Σ_{j=1}^{n} βij S(vj).
But S(vj) = Σ_{k=1}^{n} αjk vk,
∴ ST(vi) = Σ_{j=1}^{n} βij (αj1 v1 + αj2 v2 + ... + αjn vn)
= Σ_{j=1}^{n} βij αj1 v1 + Σ_{j=1}^{n} βij αj2 v2 + ... + Σ_{j=1}^{n} βij αjn vn
= Σ_{k=1}^{n} ( Σ_{j=1}^{n} βij αjk ) vk.
Hence m(ST) = (σik), where
σik = Σ_{j=1}^{n} βij αjk.
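With the row convention T(vi) = Σ βij vj used above, coordinate rows act on the left, and the entry formula for σik is an ordinary matrix product of the two coefficient arrays. A small numpy sketch (the matrices and vector are arbitrary illustrative values):

```python
import numpy as np

# Row convention: coordinates of T(v) are  (coords of v) @ m(T).
# Then composing ST(v) = S(T(v)) multiplies the coefficient arrays.
rng = np.random.default_rng(1)
mS = rng.standard_normal((3, 3))   # alpha_ij for S
mT = rng.standard_normal((3, 3))   # beta_ij  for T

def apply(m, coords):
    return coords @ m              # coordinate rows act on the left

v = rng.standard_normal(3)
st = apply(mS, apply(mT, v))       # coordinates of ST(v) = S(T(v))
print(np.allclose(st, v @ (mT @ mS)))  # True: entries sigma_ik above
```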
Fn : the set of all n × n matrices with entries from F.
Fn is an algebra over F under the operations :
i) (αij) + (βij) = (αij + βij) ;
ii) γ(αij) = (γαij) for γ ∈ F ;
iii) (αij)(βij) = (σij), where for every i and j, σij = Σ_{k=1}^{n} αik βkj.
Theorem : The set of all n × n matrices over F forms an associative algebra, Fn, over F. If V is an n-
dimensional vector space over F, then A (V) and Fn are isomorphic as algebras over F.
Proof : Fix a basis v1, ..., vn of V and define the mapping φ : A (V) → Fn by φ(T) = m(T), the matrix of T in this basis.
Let S, T ∈ A (V), with S(vi) = Σ αij vj and T(vi) = Σ βij vj.
φ is one-one : φ(S) = φ(T) iff
Σ_{j=1}^{n} αij vj = Σ_{j=1}^{n} βij vj for each i,
i.e. (αij) = (βij), i.e. S(vi) = T(vi) for each i, i.e. S = T.
φ is onto : given any A = (βij) ∈ Fn, the linear transformation T defined on the basis by T(vi) = Σ βij vj satisfies φ(T) = A.
Linearity : for α, β ∈ F and T, S ∈ A (V), αT + βS ∈ A (V), and
φ(αT + βS) = m(αT + βS) = α m(T) + β m(S) = αφ(T) + βφ(S), since
(αT + βS)(vi) = (αT)(vi) + (βS)(vi) = αT(vi) + βS(vi)
= α Σ_{j=1}^{n} βij vj + β Σ_{j=1}^{n} αij vj = Σ_{j=1}^{n} (αβij + βαij) vj.
φ(ST) = φ(S)φ(T), i.e. m(ST) = m(S)m(T) :
ST(vi) = Σ_{j=1}^{n} γij vj, where γij = Σ_{k=1}^{n} αik βkj,
which is exactly the product of the matrices (αij) and (βij) in Fn.
∴ φ is a homomorphism of algebras.
Hence φ is an isomorphism.
Theorem : If V is n-dimensional over F and if T ∈ A (V) has the matrix m1(T) in the basis v1, ..., vn
and the matrix m2(T) in the basis w1, ..., wn of V over F, then there is an element C ∈ Fn such that
m2(T) = C⁻¹ m1(T) C.
Proof : Let S ∈ A (V) be defined by S(vi) = wi for i = 1, ..., n. Since S carries the basis v1, ..., vn onto the basis w1, ..., wn, S maps V onto V, and so,
since we know "if V is finite-dimensional over F then T ∈ A (V) is regular iff T maps V onto V," S is regular.
Let m2(T) = (βij), so that T(wi) = Σ_{j=1}^{n} βij wj. Substituting wi = S(vi), we obtain
T(S(vi)) = Σ_{j=1}^{n} βij S(vj)
⇒ TS(vi) = S( Σ_{j=1}^{n} βij vj )
⇒ S⁻¹(TS(vi)) = S⁻¹S( Σ βij vj )
⇒ (S⁻¹TS)(vi) = Σ βij vj
∴ m1(S⁻¹TS) = (βij) = m2(T).
But m1(S⁻¹TS) = m1(S⁻¹) m1(T) m1(S) = (m1(S))⁻¹ m1(T) m1(S),
so taking C = m1(S) gives m2(T) = C⁻¹ m1(T) C.
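The conclusion m2(T) = C⁻¹ m1(T) C says that matrices of the same transformation in two bases are similar; similar matrices share, for example, trace and determinant. A numpy sketch with arbitrary illustrative values:

```python
import numpy as np

# Similar matrices m2 = C^{-1} m1 C represent the same transformation
# in two bases; they share trace and determinant.  Illustrative values.
m1 = np.array([[1., 2.],
               [3., 4.]])
C = np.array([[2., 1.],
              [1., 1.]])          # det = 1, so C is invertible
m2 = np.linalg.inv(C) @ m1 @ C

print(np.isclose(np.trace(m1), np.trace(m2)))            # True
print(np.isclose(np.linalg.det(m1), np.linalg.det(m2)))  # True
```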
Example 1 : Let

    A =  1   1        B =  1   1
         1  −1             0   1

Then

    AB =  1   2       BA =  2   0
          1   0             1  −1

    AB − BA =  −1  2      (AB − BA)² =  −1  2   −1  2   =   1  0
                0  1                     0  1    0  1       0  1

Example 2 : Let

    A =  1   1        B =  2   3
         1  −1             1   2

Then

    AB =  3   5       BA =  5  −1
          1   1             3  −1

    AB − BA =  −2  6      (AB − BA)² =  −8   0   =  (−8)  1  0
               −2  2                     0  −8            0  1
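Both worked examples, in which (AB − BA)² turns out to be a scalar matrix, can be verified with a few lines of numpy:

```python
import numpy as np

# Verify both worked examples: (AB - BA)^2 is a scalar matrix.
for A, B, scalar in [
    (np.array([[1, 1], [1, -1]]), np.array([[1, 1], [0, 1]]), 1),
    (np.array([[1, 1], [1, -1]]), np.array([[2, 3], [1, 2]]), -8),
]:
    comm = A @ B - B @ A
    print(np.array_equal(comm @ comm, scalar * np.eye(2, dtype=int)))  # True
```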
PROBLEMS :
1. Prove that S ∈ A (V) is regular if and only if, whenever v1, ..., vn ∈ V are linearly
independent, S(v1), ..., S(vn) are also linearly independent.
3. Prove that the minimal polynomial of T over F divides all polynomials satisfied by T over F.
4. If V is two-dimensional over a field F, prove that every element in A (V) satisfies a polynomial
of degree 2 over F.
5. Prove that, given the matrix

        0    1   0
    A = 0    0   1    ∈ F3  (where the characteristic of F is not 2),
        6  −11   6

then
(a) A³ − 6A² + 11A − 6 = 0 ;
(b) there exists a matrix C ∈ F3 such that

             1  0  0
    CAC⁻¹ =  0  2  0
             0  0  3
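Over the real field, both parts of Problem 5 can be checked numerically (part (b) via the characteristic roots, which turn out to be 1, 2, 3):

```python
import numpy as np

# Problem 5(a): A^3 - 6A^2 + 11A - 6I = 0.
# Problem 5(b): the characteristic roots of A are 1, 2, 3, so A is
# similar to diag(1, 2, 3).
A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [6., -11., 6.]])
I = np.eye(3)
lhs = A @ A @ A - 6 * A @ A + 11 * A - 6 * I
print(np.allclose(lhs, 0))                 # True

eigvals = np.sort(np.linalg.eig(A)[0].real)
print(np.allclose(eigvals, [1., 2., 3.]))  # True
```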
❏❏❏