
Lec4 Vector Spaces. Basis and Dimension

This document discusses vector spaces and their bases and dimensions. It provides definitions of basis and linearly independent sets. It also discusses properties of vector spaces including that any basis of a vector space has the same number of vectors. The dimension of a vector space is defined as the number of vectors in any of its bases. Examples of bases for various vector spaces are given to illustrate the concepts.


VECTOR SPACES:

BASIS AND
DIMENSION
Lecturer: Askarbekkyzy Aknur
Vector spaces
First, we state two equivalent definitions of a
basis of a vector space V.
Definition 1. A set of vectors S = {v₁, ..., vₙ} is a basis of V if
1. S is linearly independent.
2. S spans V.
Definition 2. A set of vectors S = {v₁, ..., vₙ} is a basis of V if every v ∈ V can be written uniquely as a linear combination of the basis vectors.
Proposition 1. Definitions 1 and 2 are equivalent.
Example 1. a) Let V = ℝ³ over ℝ. Consider the vectors
e₁ = (1, 0, 0)ᵀ, e₂ = (0, 1, 0)ᵀ, e₃ = (0, 0, 1)ᵀ.
We know from Lecture 3 that these vectors span ℝ³. Furthermore, they are independent. Consider
λ₁e₁ + λ₂e₂ + λ₃e₃ = 0.
We have
λ₁(1, 0, 0)ᵀ + λ₂(0, 1, 0)ᵀ + λ₃(0, 0, 1)ᵀ = (0, 0, 0)ᵀ.
One can easily check that λ₁ = λ₂ = λ₃ = 0. So e₁, e₂ and e₃ are independent in ℝ³. Hence {e₁, e₂, e₃} is a basis of ℝ³. This basis is called the standard basis of ℝ³.
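The independence check above can also be done numerically. The following is a sketch outside the original notes, assuming NumPy is available: the columns are independent exactly when the matrix has full rank, i.e. the homogeneous system has only the trivial solution.

```python
import numpy as np

# Stack e1, e2, e3 as the columns of a matrix E.
E = np.column_stack([[1, 0, 0],
                     [0, 1, 0],
                     [0, 0, 1]])
# The system l1*e1 + l2*e2 + l3*e3 = 0 is E @ l = 0; it has only the
# trivial solution exactly when rank(E) equals the number of columns.
rank = int(np.linalg.matrix_rank(E))
independent = rank == E.shape[1]
print(rank, independent)  # 3 True
```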
b) This example generalizes the preceding one. Consider the following n vectors in ℝⁿ:
e₁ = (1, 0, ..., 0)ᵀ, e₂ = (0, 1, ..., 0)ᵀ, ..., eₙ = (0, 0, ..., 1)ᵀ.
These vectors are linearly independent. Furthermore, any vector
v = (a₁, a₂, ..., aₙ)ᵀ ∈ ℝⁿ
can be written as a linear combination of e₁, e₂, ..., eₙ. Specifically, v = a₁e₁ + a₂e₂ + ⋯ + aₙeₙ. Accordingly, the vectors form a basis of ℝⁿ, which is called the standard basis of ℝⁿ.
c) Let V = M₂,₂ be the vector space of all 2 × 2 matrices over ℝ. The following four matrices from M₂,₂ form a basis of M₂,₂ over ℝ:
[1 0]  [0 1]  [0 0]  [0 0]
[0 0], [0 0], [1 0], [0 1].
They span M₂,₂, since
[a b]     [1 0]     [0 1]     [0 0]     [0 0]
[c d] = a [0 0] + b [0 0] + c [1 0] + d [0 1],
and their independence is evident.
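Since the coordinates of a 2 × 2 matrix in this basis are just its four entries read row by row, the expansion can be checked mechanically. A sketch (not from the lecture, NumPy assumed; the sample entries are arbitrary):

```python
import numpy as np

# The four basis matrices from the example, in row-by-row order.
basis = [np.array([[1, 0], [0, 0]]), np.array([[0, 1], [0, 0]]),
         np.array([[0, 0], [1, 0]]), np.array([[0, 0], [0, 1]])]

M = np.array([[5, -1],
              [2,  7]])           # an arbitrary 2x2 matrix
coords = M.flatten()              # (a, b, c, d): the coordinates of M
recombined = sum(t * B for t, B in zip(coords, basis))
assert (recombined == M).all()    # M = a*B1 + b*B2 + c*B3 + d*B4
```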
d) Let Pₙ(t) be the vector space of all polynomials of degree less than or equal to n. The set S = {1, t, t², ..., tⁿ} is a basis of Pₙ(t). Any polynomial can be written as
a₀ + a₁t + ... + aₙtⁿ = a₀·1 + a₁·t + ... + aₙ·tⁿ,
and the independence of 1, t, ..., tⁿ is evident.
The following is a fundamental result in linear
algebra.
Theorem 1. Let 𝑉 be a vector space such that
one basis has 𝑚 vectors and another basis has 𝑛
vectors. Then 𝑚 = 𝑛.
Definition 3. A vector space V is said to be of finite dimension n, or n-dimensional, written dim V = n, if V has a basis with n vectors. If a vector space V does not have a finite basis, then V is said to be of infinite dimension, or infinite-dimensional.
The vector space {0} is defined to have dimension 0.
Example 4.
dim ℝⁿ = n,
dim Mₘ,ₙ = mn,
dim Pₙ(t) = n + 1.
Theorem 2. Let V be a vector space of finite dimension n. Then:
◦ Any n + 1 or more vectors in V are linearly dependent.
◦ Any linearly independent set S = {v₁, ..., vₙ} with n vectors is a basis of V.
◦ Any spanning set T = {w₁, ..., wₙ} of V with n elements is a basis of V.
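The first part of the theorem is easy to probe numerically: any n + 1 vectors in ℝⁿ form a matrix of rank at most n, so they must be dependent. A sketch assuming NumPy; the random vectors are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
vectors = rng.integers(-5, 6, size=(n + 1, n))  # 5 random vectors in R^4
# The rank of a 5x4 matrix is at most 4, so the 5 rows cannot be independent.
rank = int(np.linalg.matrix_rank(vectors))
dependent = rank < len(vectors)
print(rank, dependent)
```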
Example 2. a) Let V = ℝ³. Then dim ℝ³ = 3. We know the vectors
e₁ = (1, 0, 0)ᵀ, e₂ = (0, 1, 0)ᵀ, e₃ = (0, 0, 1)ᵀ
form a basis of ℝ³. Then for any nonzero v ∈ ℝ³, by the first part of the Theorem above the four vectors e₁, e₂, e₃, v are not linearly independent; consequently {e₁, e₂, e₃, v} is not a basis of ℝ³.
b) Let V = ℝ³. It is easy to show that
v₁ = (2, 0, 0)ᵀ, v₂ = (1, 3, 0)ᵀ, v₃ = (1, 2, 3)ᵀ
are linearly independent in ℝ³ and dim ℝ³ = 3. Then by the second part of the Theorem above, {v₁, v₂, v₃} is a basis of ℝ³.
c) Let V = ℝ³. We know from Lecture 3 that the vectors
f₁ = (1, 1, 1)ᵀ, f₂ = (1, 1, 0)ᵀ, f₃ = (1, 0, 0)ᵀ
span ℝ³. Taking into account that dim ℝ³ = 3 and the third part of the Theorem above, {f₁, f₂, f₃} is a basis of ℝ³.
In an echelon form matrix, no nonzero row is a linear combination of the other nonzero rows. The nonzero rows of an echelon form matrix therefore make up a linearly independent set. Namely, the nonzero rows of an echelon matrix with n columns give us a set of linearly independent vectors in ℝⁿ.
Example 3. a) Given vectors v₁ = (1, 1, 0)ᵀ, v₂ = (1, 3, 2)ᵀ, v₃ = (4, 9, 5)ᵀ. We check whether they are linearly dependent or not by an echelon matrix (from Lecture 3 we know that they are linearly dependent). First we write them as the rows of a matrix:
[1 1 0]
[1 3 2]
[4 9 5]
Perform the following sequence of elementary operations: R₂ → −R₁ + R₂, R₃ → −4R₁ + R₃, R₃ → −5R₂ + 2R₃:
[1 1 0]   [1 1 0]   [1 1 0]   [1 1 0]
[1 3 2] ~ [0 2 2] ~ [0 2 2] ~ [0 2 2]
[4 9 5]   [4 9 5]   [0 5 5]   [0 0 0]
Finally, we have an echelon matrix with two nonzero rows. It means the third row is a linear combination of the first and second rows; namely, v₃ is a linear combination of v₁ and v₂. Hence, they are linearly dependent in ℝ³.
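The reduction above is just Gaussian elimination, and counting the nonzero rows of the result gives the rank. A minimal sketch in pure Python with exact fractions (the helper `echelon` is ours, not from the lecture):

```python
from fractions import Fraction

def echelon(rows):
    """Reduce a list of rows to (row) echelon form with exact arithmetic."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivot_row = 0
    for col in range(len(m[0])):
        # Find a row at or below pivot_row with a nonzero entry in this column.
        for r in range(pivot_row, len(m)):
            if m[r][col] != 0:
                m[pivot_row], m[r] = m[r], m[pivot_row]
                break
        else:
            continue
        # Eliminate the entries below the pivot.
        for r in range(pivot_row + 1, len(m)):
            factor = m[r][col] / m[pivot_row][col]
            m[r] = [a - factor * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return m

rows = echelon([[1, 1, 0], [1, 3, 2], [4, 9, 5]])
nonzero = [r for r in rows if any(x != 0 for x in r)]
print(len(nonzero))  # 2 nonzero rows, so v1, v2, v3 are dependent
```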
b) Given vectors v₁ = (1, 2, 1)ᵀ, v₂ = (3, −1, −1)ᵀ, v₃ = (−2, 2, 3)ᵀ of ℝ³. We want to check whether the set {v₁, v₂, v₃} is a basis of ℝ³. First, we write them as the rows of a matrix:
[ 1  2  1]
[ 3 −1 −1]
[−2  2  3]
After applying the sequence of elementary operations R₂ → −3R₁ + R₂, R₃ → 2R₁ + R₃, R₃ → 7R₃, R₃ → 6R₂ + R₃, one obtains the echelon matrix
[1  2  1]
[0 −7 −4]
[0  0 11]
As we see, there are 3 nonzero rows in the echelon matrix, and therefore these rows define 3 linearly independent vectors in ℝ³ obtained from v₁, v₂, v₃. Then v₁, v₂, v₃ are linearly independent vectors in a three-dimensional vector space. Hence the set {v₁, v₂, v₃} is a basis of ℝ³.
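The same conclusion follows from a one-line rank computation; since the rank equals dim ℝ³ = 3, Theorem 2 gives a basis. A sketch assuming NumPy:

```python
import numpy as np

V = np.array([[ 1,  2,  1],   # v1
              [ 3, -1, -1],   # v2
              [-2,  2,  3]])  # v3
rank = int(np.linalg.matrix_rank(V))
print(rank)  # 3, so {v1, v2, v3} is a basis of R^3
```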
c) Given the set S = {v₁, v₂, v₃, v₄}, where
v₁ = (1, 3, 1, −2, −3)ᵀ, v₂ = (1, 4, 3, −1, −4)ᵀ, v₃ = (2, 3, −4, −7, −3)ᵀ, v₄ = (3, 8, 1, −7, −8)ᵀ.
We will extend the set S to a basis of ℝ⁵. First of all, we need to determine whether these vectors are linearly independent or not. If not, we will delete the dependent vectors from S. Writing them as the rows of a matrix and reducing:
[1 3  1 −2 −3]
[1 4  3 −1 −4]
[2 3 −4 −7 −3]
[3 8  1 −7 −8]
~
[1  3  1 −2 −3]
[0  1  2  1 −1]
[0 −3 −6 −3  3]
[0 −1 −2 −1  1]
~
[1 3 1 −2 −3]
[0 1 2  1 −1]
[0 0 0  0  0]
[0 0 0  0  0]
~
[1 3 1 −2 −3]
[0 1 2  1 −1]
So we observe that v₁, v₂, v₃, v₄ are linearly dependent and only two vectors are linearly independent. To have a basis of ℝ⁵ we need three more vectors, so that the new five vectors are linearly independent together with
{(1, 3, 1, −2, −3)ᵀ, (0, 1, 2, 1, −1)ᵀ}.
We add three more rows:
[1 3 1 −2 −3]
[0 1 2  1 −1]
[0 0 1  0  0]
[0 0 0  1  0]
[0 0 0  0  1]
and have an echelon matrix with 5 nonzero rows. Now we have five linearly independent vectors. The set of vectors
{(1, 3, 1, −2, −3)ᵀ, (0, 1, 2, 1, −1)ᵀ, (0, 0, 1, 0, 0)ᵀ, (0, 0, 0, 1, 0)ᵀ, (0, 0, 0, 0, 1)ᵀ}
is a basis of ℝ⁵. We note that this extension is of course not unique. We could take three other vectors so that the set of five vectors forms a basis of ℝ⁵.
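The extension step can be automated: keep trying standard basis vectors and retain the ones that raise the rank. The helper below is a hypothetical sketch (NumPy assumed) and may pick a different, equally valid, extension than the one chosen above:

```python
import numpy as np

def extend_to_basis(rows, n):
    """Greedily append standard basis vectors of R^n that raise the rank."""
    vecs = [np.array(r) for r in rows]  # assumed linearly independent
    for i in range(n):
        if len(vecs) == n:
            break
        e = np.zeros(n, dtype=int)
        e[i] = 1
        # e_i is independent of the current set iff appending it raises the rank.
        if np.linalg.matrix_rank(np.vstack(vecs + [e])) > len(vecs):
            vecs.append(e)
    return vecs

# The two independent vectors surviving from S.
basis = extend_to_basis([[1, 3, 1, -2, -3], [0, 1, 2, 1, -1]], 5)
assert len(basis) == 5
assert np.linalg.matrix_rank(np.vstack(basis)) == 5  # a basis of R^5
```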
Definition 3. The rank of a matrix A, written rank(A), is equal to the number of nonzero rows in its echelon form matrix.
Example 4. Let
A = [1 1 0]
    [1 3 2]
    [4 9 5].
Then the rank of A is 2, written rank(A) = 2, since its echelon matrix
[1 1 0]
[0 2 2]
has two nonzero rows.
We consider an application of rank to the space of solutions of a homogeneous system of linear equations in n unknowns. Recall that a homogeneous system has either infinitely many solutions or only the zero solution. The set of solutions of such a system forms a vector space with respect to the operations in ℝⁿ.
Theorem 3. Let W be the space of solutions of a homogeneous system of linear equations in n unknowns and A be the matrix of coefficients of the unknowns, with rank(A) = r. Then dim W = n − r.
Example 5. Consider
x + y − z = 0
2x − 3y + z = 0
x − 4y + 2z = 0.
Then
A = [1  1 −1]
    [2 −3  1]
    [1 −4  2].
An echelon matrix of A is
[1  1 −1]
[0 −5  3]
and rank(A) = 2. Then dim W = 3 − 2 = 1, that is, the space of solutions has dimension one. Of course, the system has infinitely many solutions, of the form (x, y, z) = ((2/5)z, (3/5)z, z), but as a vector space all solutions are linear combinations of the one vector
(2/5, 3/5, 1)ᵀ.
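Theorem 3 and this example can be verified directly: the rank of A is 2, so dim W = 3 − 2 = 1, and the basis vector found above is annihilated by A. A sketch assuming NumPy:

```python
import numpy as np

A = np.array([[1,  1, -1],
              [2, -3,  1],
              [1, -4,  2]])
n = A.shape[1]
r = int(np.linalg.matrix_rank(A))
dim_W = n - r                     # Theorem 3: dim W = n - rank(A)
v = np.array([2/5, 3/5, 1.0])     # the basis vector of W found above
print(r, dim_W)                   # 2 1
assert np.allclose(A @ v, 0)      # v indeed solves the homogeneous system
```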
Definition 4. Let 𝑈 and 𝑊 be subsets of a vector
space 𝑉 . The sum of U and W, written 𝑈 + 𝑊 ,
consists of all sums 𝑢 + 𝑤 where 𝑢 ∈ 𝑈 and 𝑤 ∈ 𝑊.
Theorem 4. Suppose 𝑈 and 𝑊 are subspaces of 𝑉.
Then 𝑈 + 𝑊 and 𝑈 ∩ 𝑊 are subspaces of 𝑉.
Theorem 5. Suppose U and W are finite dimensional subspaces of V. Then
dim(U + W) = dim U + dim W − dim(U ∩ W).
Example 6. Given
u₁ = (1, 3, −2, 2, 3)ᵀ, u₂ = (1, 4, −3, 4, 2)ᵀ, u₃ = (2, 3, −1, −2, 9)ᵀ,
w₁ = (1, 3, 0, 2, 1)ᵀ, w₂ = (1, 5, −6, 6, 3)ᵀ, w₃ = (2, 5, 3, 2, 1)ᵀ.
Let U be the space spanned by u₁, u₂, u₃ and W be the space spanned by w₁, w₂, w₃. Namely, any vector of U and W is a linear combination of u₁, u₂, u₃ and w₁, w₂, w₃, respectively. They are subspaces of ℝ⁵.
We will find bases and dimensions of U, W, U + W and U ∩ W.
To construct a basis of U we need to derive linearly independent vectors from u₁, u₂, u₃. We write them as the rows of a matrix and find its echelon form.
[1 3 −2  2 3]
[1 4 −3  4 2]
[2 3 −1 −2 9]
~
[1  3 −2  2  3]
[0  1 −1  2 −1]
[0 −3  3 −6  3]
~
[1 3 −2 2  3]
[0 1 −1 2 −1]
[0 0  0 0  0]
~
[1 3 −2 2  3]
[0 1 −1 2 −1]
So there are only two nonzero rows, i.e., the rank of the matrix is two. Thus
u₁ = (1, 3, −2, 2, 3)ᵀ, u₂′ = (0, 1, −1, 2, −1)ᵀ
are linearly independent and they span U; therefore {u₁, u₂′} is a basis and dim U = 2.
In a similar way, we find a basis of W and its dimension.
[1 3  0 2 1]
[1 5 −6 6 3]
[2 5  3 2 1]
~
[1  3  0  2  1]
[0  2 −6  4  2]
[0 −1  3 −2 −1]
~
[1  3  0  2  1]
[0  1 −3  2  1]
[0 −1  3 −2 −1]
~
[1 3  0 2 1]
[0 1 −3 2 1]
[0 0  0 0 0]
~
[1 3  0 2 1]
[0 1 −3 2 1]
So there are only two nonzero rows, i.e., the rank of the matrix is two. Thus
w₁ = (1, 3, 0, 2, 1)ᵀ, w₂′ = (0, 1, −3, 2, 1)ᵀ
are linearly independent and they span W; therefore {w₁, w₂′} is a basis and dim W = 2.
By the definition of U + W, the vectors u₁, u₂′, w₁, w₂′ span U + W.
[1 3 −2 2  3]
[0 1 −1 2 −1]
[1 3  0 2  1]
[0 1 −3 2  1]
~
[1 3 −2 2  3]
[0 1 −1 2 −1]
[0 0  2 0 −2]
[0 0 −2 0  2]
~
[1 3 −2 2  3]
[0 1 −1 2 −1]
[0 0  2 0 −2]
[0 0  0 0  0]
~
[1 3 −2 2  3]
[0 1 −1 2 −1]
[0 0  1 0 −1]
So, there are only three nonzero rows, i.e., the rank of the matrix is three. Thus
u₁ = (1, 3, −2, 2, 3)ᵀ, u₂′ = (0, 1, −1, 2, −1)ᵀ, w₁′ = (0, 0, 1, 0, −1)ᵀ
are linearly independent and they span U + W; therefore {u₁, u₂′, w₁′} is a basis and dim(U + W) = 3.
To find the dimension of U ∩ W we use the formula given above and have dim(U ∩ W) = dim U + dim W − dim(U + W) = 2 + 2 − 3 = 1.
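All four dimensions in this example reduce to rank computations on the original spanning vectors, with dim(U ∩ W) obtained from Theorem 5. A sketch assuming NumPy:

```python
import numpy as np

U_rows = np.array([[1, 3, -2,  2, 3],   # u1, u2, u3 as rows
                   [1, 4, -3,  4, 2],
                   [2, 3, -1, -2, 9]])
W_rows = np.array([[1, 3,  0, 2, 1],    # w1, w2, w3 as rows
                   [1, 5, -6, 6, 3],
                   [2, 5,  3, 2, 1]])
dim_U = int(np.linalg.matrix_rank(U_rows))                         # 2
dim_W = int(np.linalg.matrix_rank(W_rows))                         # 2
dim_sum = int(np.linalg.matrix_rank(np.vstack([U_rows, W_rows])))  # 3
dim_int = dim_U + dim_W - dim_sum   # Theorem 5 gives dim(U ∩ W) = 1
print(dim_U, dim_W, dim_sum, dim_int)  # 2 2 3 1
```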
Now we find a basis of U ∩ W; it consists of one vector. Let v = (a₁, a₂, a₃, a₄, a₅)ᵀ and v ∈ U ∩ W. Then v must be written as a linear combination of the basis vectors of each subspace:
v = λ₁u₁ + λ₂u₂′ = μ₁w₁ + μ₂w₂′,
λ₁(1, 3, −2, 2, 3)ᵀ + λ₂(0, 1, −1, 2, −1)ᵀ = μ₁(1, 3, 0, 2, 1)ᵀ + μ₂(0, 1, −3, 2, 1)ᵀ.
We obtain a system of five linear equations in the unknowns λ₁, λ₂, μ₁, μ₂:
λ₁ − μ₁ = 0
3λ₁ + λ₂ − 3μ₁ − μ₂ = 0
−2λ₁ − λ₂ + 3μ₂ = 0
2λ₁ + 2λ₂ − 2μ₁ − 2μ₂ = 0
3λ₁ − λ₂ − μ₁ − μ₂ = 0
The solution set is {(λ₁, λ₂, μ₁, μ₂) = (μ₂, μ₂, μ₂, μ₂) | μ₂ ∈ ℝ}. Let μ₂ = 1. Then
v = u₁ + u₂′ = (1, 4, −3, 4, 2)ᵀ
is a vector in U ∩ W and spans it. Hence {v} is a basis of U ∩ W.
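That v = (1, 4, −3, 4, 2)ᵀ really lies in both subspaces can be confirmed with a rank-based membership test: a vector belongs to a span iff appending it does not raise the rank. A sketch assuming NumPy (the helper `in_span` is ours, not from the lecture):

```python
import numpy as np

U_basis = np.array([[1, 3, -2, 2,  3],   # u1, u2' as rows
                    [0, 1, -1, 2, -1]])
W_basis = np.array([[1, 3,  0, 2,  1],   # w1, w2' as rows
                    [0, 1, -3, 2,  1]])
v = np.array([1, 4, -3, 4, 2])

def in_span(rows, v):
    # v lies in the row span iff appending it leaves the rank unchanged.
    return np.linalg.matrix_rank(np.vstack([rows, v])) == np.linalg.matrix_rank(rows)

print(in_span(U_basis, v), in_span(W_basis, v))  # True True
```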
Definition 5. The vector space V is said to be the direct sum of its subspaces U and W, denoted by V = U ⊕ W, if every v ∈ V can be written in one and only one way as v = u + w, where u ∈ U and w ∈ W.
Theorem 6. The vector space V is the direct sum of its subspaces U and W if and only if
1. V = U + W.
2. U ∩ W = {0}.
Example 7. a) Let V = ℝ³ and U = {(a, b, c)ᵀ | a = c} and W = {(a, b, c)ᵀ | a + b + c = 0}. We show that V = U + W but the sum is not direct. Suppose v = (a, b, c)ᵀ ∈ V. Then
(a, b, c)ᵀ = (c, a + b − c, c)ᵀ + (a − c, c − a, 0)ᵀ,
where the first summand lies in U and the second in W. Then V = U + W. We note that also
(a, b, c)ᵀ = (c − 1, a + b − c + 2, c − 1)ᵀ + (a − c + 1, c − a − 2, 1)ᵀ.
There are two ways of expressing v as a sum of vectors of U and W. Therefore, the sum is not direct.
b) Let V = ℝ³ and
U = {(a, b, c)ᵀ | a = c} and W = {(0, 0, c)ᵀ | c ∈ ℝ}.
We show that V = U ⊕ W. Note that
(a, b, c)ᵀ = (a, b, a)ᵀ + (0, 0, c − a)ᵀ.
Then V = U + W. Let v = (a, b, c)ᵀ ∈ U ∩ W. Membership in W implies a = b = 0, and membership in U implies a = c. Then a = b = c = 0. Thus v = (0, 0, 0)ᵀ and U ∩ W = {0}. Hence V = U ⊕ W.
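The decomposition in part b) is explicit, so the direct-sum property can be spot-checked: split any v into its U-part and W-part and confirm the pieces land in the right subspaces. A sketch in plain Python (the sample vector is arbitrary):

```python
def decompose(v):
    """Split v = (a, b, c) into a U-part (a, b, a) and a W-part (0, 0, c - a)."""
    a, b, c = v
    return (a, b, a), (0, 0, c - a)

v = (4, -1, 9)
u, w = decompose(v)
assert tuple(x + y for x, y in zip(u, w)) == v  # u + w = v
assert u[0] == u[2]                             # u is in U (first = third)
assert w[0] == 0 and w[1] == 0                  # w is in W
```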
