
Linear Algebra MA2033

Bases and Dimension, Rank of a Matrix

Note: Vector spaces fall into two categories:


A vector space V is said to be finite-dimensional if there is a finite set of vectors in V that spans V and
is said to be infinite-dimensional if no such set exists.

Definition 1 (Basis).
If S = {v1, v2, ..., vn} is a set of vectors in a finite-dimensional vector space V, then S is called a basis for
V if:
(a) S spans V.
(b) S is linearly independent.

Examples.
i. The standard unit vectors,
e1 = (1, 0, 0, ..., 0), e2 = (0, 1, 0, ..., 0), ..., en = (0, 0, 0, ..., 1)
form a basis for R^n that we call the standard basis for R^n.

Thus, in particular, e1 = (1, 0, 0), e2 = (0, 1, 0), e3 = (0, 0, 1) is the standard basis for R^3.

ii. Show that the vectors v1 = (1, 2, 1), v2 = (2, 9, 0), and v3 = (3, 3, 4) form a basis for R^3.


Solution.
 We must show that these vectors are linearly independent and span R^3.
 To prove linear independence, we must show that the vector equation
c1 v1 + c2 v2 + c3 v3 = 0
where c1, c2, and c3 are scalars, has only the trivial solution;
 To prove that the vectors span R^3, we must show that every vector b = (b1, b2, b3) in R^3 can be
expressed as c1 v1 + c2 v2 + c3 v3 = b, where c1, c2, and c3 are scalars.

By equating corresponding components on the two sides, these two equations can be expressed as the
linear systems

 c1 + 2c2 + 3c3 = 0                c1 + 2c2 + 3c3 = b1
2c1 + 9c2 + 3c3 = 0      and      2c1 + 9c2 + 3c3 = b2
 c1       + 4c3 = 0                c1       + 4c3 = b3

Thus, we have reduced the problem to showing that the homogeneous system has only the trivial
solution and that the non-homogeneous system is consistent for all values of b1, b2, and b3.

But the two systems have the same coefficient matrix

A = [1 2 3; 2 9 3; 1 0 4]

so it follows from parts (b), (e), and (g) of the Equivalent Statements Theorem provided in Lecture Note 2
that we can prove both results at the same time by showing that det(A) ≠ 0.
Since det(A) = -1 ≠ 0, the vectors v1, v2, and v3 form a basis for R^3.
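The determinant check at the end of this example is easy to reproduce. Below is a minimal sketch (assuming the example's vectors are v1 = (1, 2, 1), v2 = (2, 9, 0), v3 = (3, 3, 4)) that builds the coefficient matrix and expands a 3 × 3 determinant along its first row:

```python
def det3(a):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
          - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
          + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))

v1, v2, v3 = (1, 2, 1), (2, 9, 0), (3, 3, 4)
A = [list(row) for row in zip(v1, v2, v3)]  # the vectors become the columns of A
d = det3(A)
print(d)  # -1, which is nonzero, so {v1, v2, v3} is a basis for R^3
```

A nonzero determinant settles both linear independence and spanning at once, exactly as the Equivalent Statements Theorem argument says.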

 From Examples i and ii you can see that a vector space can have more than one basis.
iii. Show that the matrices
M1 = [1 0; 0 0], M2 = [0 1; 0 0], M3 = [0 0; 1 0], M4 = [0 0; 0 1]
form a basis for the vector space M22 of 2 × 2 matrices.

Solution. We must show that the matrices are linearly independent and span M22.

To prove linear independence we must show that the equation

c1 M1 + c2 M2 + c3 M3 + c4 M4 = 0                    (1)
has only the trivial solution, where 0 is the 2 × 2 zero matrix;

To prove that the matrices span M22 we must show that every 2 × 2 matrix
B = [a b; c d]
can be expressed as c1 M1 + c2 M2 + c3 M3 + c4 M4 = B                    (2)

The matrix forms of Equations (1) and (2) are

c1 [1 0; 0 0] + c2 [0 1; 0 0] + c3 [0 0; 1 0] + c4 [0 0; 0 1] = [0 0; 0 0]
and
c1 [1 0; 0 0] + c2 [0 1; 0 0] + c3 [0 0; 1 0] + c4 [0 0; 0 1] = [a b; c d]
which can be rewritten as
[c1 c2; c3 c4] = [0 0; 0 0]   and   [c1 c2; c3 c4] = [a b; c d]
Since the first equation has only the trivial solution
c1 = c2 = c3 = c4 = 0
the matrices are linearly independent.


Since the second equation has the solution
c1 = a, c2 = b, c3 = c, c4 = d
the matrices span M22.


This proves that the matrices M1, M2, M3, M4 form a basis for M22.

 More generally, the mn different matrices whose entries are zero except for a single entry of 1 form a
basis for Mmn called the standard basis for Mmn.
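The span argument above amounts to reading the coefficients straight off the entries of B. A small sketch, using a hypothetical 2 × 2 matrix B, that verifies the decomposition entrywise:

```python
# Standard basis of M22, each matrix stored as a list of rows.
M1 = [[1, 0], [0, 0]]
M2 = [[0, 1], [0, 0]]
M3 = [[0, 0], [1, 0]]
M4 = [[0, 0], [0, 1]]

def combine(c1, c2, c3, c4):
    """Form the linear combination c1*M1 + c2*M2 + c3*M3 + c4*M4 entrywise."""
    return [[c1 * M1[i][j] + c2 * M2[i][j] + c3 * M3[i][j] + c4 * M4[i][j]
             for j in range(2)] for i in range(2)]

B = [[7, -2], [5, 3]]                      # an arbitrary 2x2 matrix (hypothetical)
a, b, c, d = B[0][0], B[0][1], B[1][0], B[1][1]
print(combine(a, b, c, d) == B)  # True: the coefficients are just the entries of B
```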

iv. Since the vectors 1, x, x^2, ..., x^n span Pn and they are linearly independent,


{1, x, x^2, ..., x^n} form a basis for Pn that we call the standard basis for Pn.

Note: The simplest of all vector spaces is the zero-vector space V = {0}.


o This space is finite-dimensional because it is spanned by the vector 0.
o However, it has no basis in the sense of Definition 1 because {0} is a linearly dependent set.
o However, we will find it useful to define the empty set to be a basis for this vector space.

Theorem 1 (Uniqueness of Basis Representation).


If S = {v1, v2, ..., vn} is a basis for a vector space V, then every vector v in V can be expressed in the
form v = c1 v1 + c2 v2 + ... + cn vn in exactly one way.

Proof.
Suppose that some vector v can be written as
v = c1 v1 + c2 v2 + ... + cn vn
and also as
v = k1 v1 + k2 v2 + ... + kn vn
Subtracting the second equation from the first gives

0 = (c1 - k1) v1 + (c2 - k2) v2 + ... + (cn - kn) vn
Since the right side of this equation is a linear combination of vectors in S, the linear independence of
S implies that
c1 - k1 = 0, c2 - k2 = 0, ..., cn - kn = 0
that is,
c1 = k1, c2 = k2, ..., cn = kn
Thus, the two expressions for v are the same.

Note. Observe that in R^3, for example, the coordinates (a, b, c) of a vector v = (a, b, c) are precisely the
coefficients in the formula v = a e1 + b e2 + c e3 that expresses v as a linear combination of the
standard basis vectors for R^3.

The following definition generalizes this idea.

Definition 2.
If S = {v1, v2, ..., vn} is a basis for a vector space V, and
v = c1 v1 + c2 v2 + ... + cn vn
is the expression for a vector v in terms of the basis S, then the scalars c1, c2, ..., cn are called the
coordinates of v with respect to the basis S.
The vector (c1, c2, ..., cn) in R^n constructed from these coordinates is called the coordinate vector of v
with respect to S; it is denoted by (v)_S.

Example. We showed in a previous example that the vectors v1 = (1, 2, 1), v2 = (2, 9, 0),


v3 = (3, 3, 4) form a basis for R^3.
a) Find the coordinate vector of v = (5, -1, 9) with respect to the basis B = {v1, v2, v3}.
b) Find the vector v in R^3 whose coordinate vector with respect to B is (v)_B = (-1, 3, 2).

Solution.

(a) We must first express v as a linear combination of the vectors in B; that is, we must find values of
c1, c2, and c3 such that
(5, -1, 9) = c1 (1, 2, 1) + c2 (2, 9, 0) + c3 (3, 3, 4)
Equating corresponding components gives
 c1 + 2c2 + 3c3 = 5
2c1 + 9c2 + 3c3 = -1
 c1       + 4c3 = 9
Solving this system we obtain c1 = 1, c2 = -1, c3 = 2. Therefore, (v)_B = (1, -1, 2).

(b) v = (-1) v1 + 3 v2 + 2 v3 = (-1)(1, 2, 1) + 3 (2, 9, 0) + 2 (3, 3, 4) = (11, 31, 7)
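Finding a coordinate vector is just solving a linear system. The sketch below (assuming the example's basis vectors and v = (5, -1, 9)) performs Gauss-Jordan elimination with exact fractions to recover (v)_B:

```python
from fractions import Fraction

def solve3(A, b):
    """Solve a 3x3 system A c = b by Gauss-Jordan elimination (exact arithmetic)."""
    M = [[Fraction(A[i][j]) for j in range(3)] + [Fraction(b[i])] for i in range(3)]
    for col in range(3):
        piv = next(r for r in range(col, 3) if M[r][col] != 0)   # find a pivot row
        M[col], M[piv] = M[piv], M[col]
        M[col] = [x / M[col][col] for x in M[col]]               # scale to a leading 1
        for r in range(3):
            if r != col and M[r][col] != 0:                      # clear the column
                M[r] = [x - M[r][col] * y for x, y in zip(M[r], M[col])]
    return [M[i][3] for i in range(3)]

v1, v2, v3 = (1, 2, 1), (2, 9, 0), (3, 3, 4)
A = [list(row) for row in zip(v1, v2, v3)]   # columns are the basis vectors
coords = [int(c) for c in solve3(A, (5, -1, 9))]
print(coords)  # [1, -1, 2] -> (v)_B = (1, -1, 2)
```

By Theorem 1 this solution is unique, which is what makes the coordinate vector well defined.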

Theorem 2.
All bases for a finite-dimensional vector space have the same number of vectors.

Theorem 3.
Let V be an n-dimensional vector space, and let S = {v1, v2, ..., vn} be any basis.
a) If a set in V has more than n vectors, then it is linearly dependent.
b) If a set in V has fewer than n vectors, then it does not span V.
Proof.

If S = {v1, v2, ..., vn} is an arbitrary basis for V,


then the linear independence of S implies that any set in V with more than n vectors is linearly
dependent, and any set in V with fewer than n vectors does not span V.
Thus, unless a set in V has exactly n vectors it cannot be a basis.
This argument implies that Theorem 2 is true.

Definition 2 (Dimension).
The dimension of a finite-dimensional vector space V ( dim(V) ) is the number of vectors in a basis
for V. In addition, the zero-vector space is defined to have dimension zero.

Examples.
i. Dimensions of Some Familiar Vector Spaces
dim(R^n) = n        [The standard basis has n vectors.]
dim(Pn) = n + 1     [The standard basis has n + 1 vectors.]
dim(Mmn) = mn       [The standard basis has mn vectors.]

ii. Dimension of Span(S)


If S = {v1, v2, ..., vr}, then every vector in span(S) is expressible as a linear combination of the vectors
in S. Thus, if the vectors in S are linearly independent, they automatically form a basis for span(S), from
which we can conclude that dim( span{v1, v2, ..., vr} ) = r.
In words, the dimension of the space spanned by a linearly independent set of vectors is equal to the
number of vectors in that set.

iii. Dimension of a Solution Space


Find a basis for and the dimension of the solution space of the homogeneous system

 x1 + 3x2 - 2x3        + 2x5        = 0
2x1 + 6x2 - 5x3 - 2x4 + 4x5 -  3x6 = 0
            5x3 + 10x4       + 15x6 = 0
2x1 + 6x2        + 8x4 + 4x5 + 18x6 = 0

Solution. The solution of this system is
x1 = -3r - 4s - 2t,  x2 = r,  x3 = -2s,  x4 = s,  x5 = t,  x6 = 0
which can be written in vector form as


(x1, x2, x3, x4, x5, x6) = (-3r - 4s - 2t, r, -2s, s, t, 0)
or, alternatively, as
(x1, x2, x3, x4, x5, x6) = r(-3, 1, 0, 0, 0, 0) + s(-4, 0, -2, 1, 0, 0) + t(-2, 0, 0, 0, 1, 0)
This shows that the vectors
v1 = (-3, 1, 0, 0, 0, 0),  v2 = (-4, 0, -2, 1, 0, 0),  v3 = (-2, 0, 0, 0, 1, 0)
span the solution space.
These vectors are linearly independent as well (verify), so the solution space has dimension 3.
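The dimension of a solution space (the nullity) can be checked mechanically: row-reduce the coefficient matrix and count the non-pivot (free) columns. A sketch, assuming the coefficient matrix shown in the example:

```python
from fractions import Fraction

def pivot_columns(M):
    """Return the pivot-column indices found while reducing M by Gauss-Jordan."""
    A = [[Fraction(x) for x in row] for row in M]
    pivots, r = [], 0
    for c in range(len(A[0])):
        piv = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if piv is None:
            continue                                 # free column, no pivot here
        A[r], A[piv] = A[piv], A[r]
        A[r] = [x / A[r][c] for x in A[r]]           # scale to a leading 1
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    return pivots

A = [[1, 3, -2, 0, 2, 0],
     [2, 6, -5, -2, 4, -3],
     [0, 0, 5, 10, 0, 15],
     [2, 6, 0, 8, 4, 18]]
piv = pivot_columns(A)
print(piv, 6 - len(piv))  # pivots in columns 0, 2, 5; nullity = 6 - 3 = 3
```

Three free columns correspond to the three parameters r, s, t in the general solution.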

Definition 3.

For an m × n matrix A = [a_ij],

the vectors

r1 = [a11 a12 ... a1n]
r2 = [a21 a22 ... a2n]
...
rm = [am1 am2 ... amn]

in R^n that are formed from the rows of A are called the row vectors of A, and the vectors

c1 = [a11; a21; ...; am1], c2 = [a12; a22; ...; am2], ..., cn = [a1n; a2n; ...; amn]

in R^m formed from the columns of A are called the column vectors of A.

 The following definition introduces two important vector spaces associated with a matrix.

Definition 4 (Column Space and Row Space).


If A is an m × n matrix, then the subspace of R^n spanned by the row vectors of A is called the row
space of A ( row(A) ), and the subspace of R^m spanned by the column vectors of A is called the column
space of A ( col(A) ).

Theorem 4.
If a matrix R is in row echelon form, then the row vectors with the leading 1's (the nonzero row vectors)
form a basis for the row space of R, and the column vectors with the leading 1's of the row vectors form
a basis for the column space of R.
The proof is omitted.

Example. Since the matrix R = [1 -2 5 0 3; 0 1 3 0 0; 0 0 0 1 0; 0 0 0 0 0] is in row echelon form, it follows from Theorem

4 that the vectors


r1 = [1 -2 5 0 3]
r2 = [0 1 3 0 0]
r3 = [0 0 0 1 0]
form a basis for the row space of R, and the vectors

c1 = [1; 0; 0; 0], c2 = [-2; 1; 0; 0], c4 = [0; 0; 1; 0]

form a basis for the column space of R.

Theorem 5. Elementary row operations do not change the row space of a matrix.
Notes.
 Suppose B is the matrix obtained by applying one of the elementary row operations to a matrix A:
Interchange Ri and Rj; Replace Ri by kRi (k ≠ 0); Replace Ri by Ri + kRj.
Then each row of B is a row of A or a linear combination of rows of A. Hence, the row space of B is
contained in the row space of A.
On the other hand, we can apply the inverse elementary row operation on B to obtain A. Hence, the row
space of A is contained in the row space of B.
Accordingly, A and B have the same row space. This will be true each time we apply an elementary row
operation. Thus, we have proved the above theorem.

 This Theorem does not imply that elementary row operations do not change the column space of a
matrix.
To see why this is not true, compare the matrices A = [1 3; -1 -3] and B = [1 3; 0 0].
The matrix B can be obtained from A by adding the first row to the second.
However, this operation has changed the column space of A, since the column space of A consists of all
scalar multiples of [1; -1] whereas the column space of B consists of all scalar multiples of [1; 0], and the two
are different spaces.
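This contrast can be seen computationally: two matrices have the same row space exactly when they have the same reduced row echelon form, while row-reducing the transposes compares the column spaces. A sketch, assuming the pair A = [1 3; -1 -3] and B = [1 3; 0 0] discussed above:

```python
from fractions import Fraction

def rref(M):
    """Reduced row echelon form by Gauss-Jordan elimination (exact arithmetic)."""
    A = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(A[0])):
        piv = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        A[r] = [x / A[r][c] for x in A[r]]
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        r += 1
    return A

def transpose(M):
    return [list(col) for col in zip(*M)]

A = [[1, 3], [-1, -3]]
B = [A[0], [A[1][j] + A[0][j] for j in range(2)]]   # add row 1 to row 2

same_row_space = rref(A) == rref(B)                 # equal RREF <=> equal row space
same_col_space = rref(transpose(A)) == rref(transpose(B))
print(same_row_space, same_col_space)  # True False
```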

Example (Basis for a Row Space by Row Reduction).


Find a basis for the row space of the matrix

A = [1 -3 4 -2 5 4; 2 -6 9 -1 8 2; 2 -6 9 -1 9 7; -1 3 -4 2 -5 -4]

Solution. Since elementary row operations do not change the row space of a matrix, we can find a basis
for the row space of A by finding a basis for the row space of any row echelon form of A.
Reducing A to row echelon form, we obtain (verify)

R = [1 -3 4 -2 5 4; 0 0 1 3 -2 -6; 0 0 0 0 1 5; 0 0 0 0 0 0]

By Theorem 4, the nonzero row vectors of R form a basis for the row space of R and hence form a basis
for the row space of A. These basis vectors are
r1 = [1 -3 4 -2 5 4]
r2 = [0 0 1 3 -2 -6]
r3 = [0 0 0 0 1 5]

Theorem 6. If A and B are row equivalent matrices, then:


(a) A given set of column vectors of A is linearly independent if and only if the corresponding column
vectors of B are linearly independent.
(b) A given set of column vectors of A forms a basis for the column space of A if and only if the
corresponding column vectors of B form a basis for the column space of B.
The proof is omitted.

Example (Basis for a Column Space by Row Reduction).


Find a basis for the column space of the matrix

A = [1 -3 4 -2 5 4; 2 -6 9 -1 8 2; 2 -6 9 -1 9 7; -1 3 -4 2 -5 -4]

that consists of column vectors of A.

Solution.
A row echelon form of A is given by

R = [1 -3 4 -2 5 4; 0 0 1 3 -2 -6; 0 0 0 0 1 5; 0 0 0 0 0 0]

Keeping in mind that A and R can have different column spaces, we cannot find a basis for the column
space of A directly from the column vectors of R.
However, it follows from Theorem 6 that if we can find a set of column vectors of R that forms a basis
for the column space of R, then the corresponding column vectors of A will form a basis for the column
space of A.
Since the first, third, and fifth columns of R contain the leading 1's of the row vectors, the vectors

c1' = [1; 0; 0; 0], c3' = [4; 1; 0; 0], c5' = [5; -2; 1; 0]

form a basis for the column space of R. Thus, the corresponding column vectors of A, which are

c1 = [1; 2; 2; -1], c3 = [4; 9; 9; -4], c5 = [5; 8; 9; -5]

form a basis for the column space of A.
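Theorem 6 in action: locate the pivot columns of a row echelon form, then take the corresponding columns of A itself. A sketch, assuming the example's matrix A:

```python
from fractions import Fraction

def rref_with_pivots(M):
    """Gauss-Jordan elimination; returns (RREF, pivot column indices)."""
    A = [[Fraction(x) for x in row] for row in M]
    pivots, r = [], 0
    for c in range(len(A[0])):
        piv = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if piv is None:
            continue                                # no pivot: free column
        A[r], A[piv] = A[piv], A[r]
        A[r] = [x / A[r][c] for x in A[r]]          # scale row to a leading 1
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    return A, pivots

A = [[1, -3, 4, -2, 5, 4],
     [2, -6, 9, -1, 8, 2],
     [2, -6, 9, -1, 9, 7],
     [-1, 3, -4, 2, -5, -4]]
_, pivots = rref_with_pivots(A)
# Basis for col(A): the columns of A itself at the pivot positions.
col_basis = [[A[i][c] for i in range(len(A))] for c in pivots]
print(pivots)     # [0, 2, 4] -> first, third, and fifth columns
print(col_basis)  # [[1, 2, 2, -1], [4, 9, 9, -4], [5, 8, 9, -5]]
```

The pivot positions of the reduced form coincide with those of any echelon form, so the same columns of A are selected.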

More Examples.

i. Find a basis for the row space of A = [1 -2 0 0 3; 2 -5 -3 -2 6; 0 5 15 10 0; 2 6 18 8 6] consisting entirely of row vectors

from A.
Solution. We will transpose A, thereby converting the row space of A into the column space of A^T; then
we will use the method of the above Example to find a basis for the column space of A^T; and then we will
transpose again to convert column vectors back to row vectors.

Transposing A yields A^T = [1 2 0 2; -2 -5 5 6; 0 -3 15 18; 0 -2 10 8; 3 6 0 6], and then reducing this matrix to row echelon


form we obtain

[1 2 0 2; 0 1 -5 -10; 0 0 0 1; 0 0 0 0; 0 0 0 0]

The first, second, and fourth columns contain the leading 1's, so the corresponding column vectors in A^T
form a basis for the column space of A^T; these are

c1 = [1; -2; 0; 0; 3], c2 = [2; -5; -3; -2; 6], c4 = [2; 6; 18; 8; 6]
Transposing again and adjusting the notation appropriately yields the basis vectors
r1 = [1 -2 0 0 3], r2 = [2 -5 -3 -2 6], r4 = [2 6 18 8 6]
for the row space of A.

ii. Find a subset of the vectors


v1 = (1, -2, 0, 3), v2 = (2, -5, -3, 6),
v3 = (0, 1, 3, 0), v4 = (2, -1, 4, -7), v5 = (5, -8, 1, 2)
that forms a basis for the subspace of R^4 spanned by these vectors, and express each vector not in the
basis as a linear combination of the basis vectors.

Solution.
Note that span{v1, v2, v3, v4, v5} = col(A), where A has v1, ..., v5 as its columns:

A = [1 2 0 2 5; -2 -5 1 -1 -8; 0 -3 3 4 1; 3 6 0 -7 2]

A row echelon form R of A is given by

R = [1 2 0 2 5; 0 1 -1 -3 -2; 0 0 0 1 1; 0 0 0 0 0]

The leading 1's occur in columns 1, 2, and 4, so {w1, w2, w4} is a basis for col(R) and,
consequently, {v1, v2, v4} is a basis for col(A).
We can simply express w3 and w5 in terms of the basis vectors with smaller subscripts:
w3 = 2w1 - w2,  w5 = w1 + w2 + w4

The corresponding relationships in terms of the given set of vectors are
v3 = 2v1 - v2,  v5 = v1 + v2 + v4
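The dependency relations can be read off the reduced row echelon form: each non-pivot column of the RREF holds the coefficients that express the corresponding vector in terms of the earlier pivot vectors. A sketch, assuming the example's vectors:

```python
from fractions import Fraction

def rref_with_pivots(M):
    """Gauss-Jordan elimination; returns (RREF, pivot column indices)."""
    A = [[Fraction(x) for x in row] for row in M]
    pivots, r = [], 0
    for c in range(len(A[0])):
        piv = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        A[r] = [x / A[r][c] for x in A[r]]
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    return A, pivots

v = [(1, -2, 0, 3), (2, -5, -3, 6), (0, 1, 3, 0), (2, -1, 4, -7), (5, -8, 1, 2)]
A = [[v[j][i] for j in range(5)] for i in range(4)]   # the vectors become columns
R, pivots = rref_with_pivots(A)

dep3 = [int(R[i][2]) for i in range(len(pivots))]   # column 3 of the RREF
dep5 = [int(R[i][4]) for i in range(len(pivots))]   # column 5 of the RREF
print(pivots, dep3, dep5)  # [0, 1, 3] [2, -1, 0] [1, 1, 1]
# i.e. v3 = 2*v1 - v2 and v5 = v1 + v2 + v4; check the first relation directly:
print(tuple(2 * a - b for a, b in zip(v[0], v[1])) == v[2])  # True
```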


Theorem 7. The row space and the column space of a matrix have the same dimension.
The proof is omitted.

Definition 5 (Rank).
The common dimension of the row space and column space of a matrix A is called the rank of A
( rank(A) ).

Examples.
i. Find the rank of the matrix A.

Solution.
The reduced row echelon form of A has two leading 1's (verify). Hence its row
and column spaces are both two-dimensional, and
rank(A) = 2.

ii. What is the maximum possible rank of an m × n matrix A that is not square?
Solution.
Since the row vectors of A lie in R^n and the column vectors in R^m, the row space of A is at most
n-dimensional and the column space of A is at most m-dimensional.
Since the rank of A is the common dimension of its row and column space, it follows that the rank is at
most the smaller of m and n. That is, rank(A) ≤ min(m, n).
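The bound rank(A) ≤ min(m, n) is easy to confirm numerically. This sketch uses a hypothetical 3 × 5 matrix whose third row is the sum of the first two, so its rank falls strictly below min(m, n):

```python
from fractions import Fraction

def rank(M):
    """Rank = number of pivots found during Gauss-Jordan elimination."""
    A = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(A[0])):
        piv = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        A[r] = [x / A[r][c] for x in A[r]]
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

A = [[1, 2, 0, 1, 0],
     [0, 1, 1, 0, 2],
     [1, 3, 1, 1, 2]]   # row 3 = row 1 + row 2, so the rows are dependent
rk = rank(A)
print(rk, min(len(A), len(A[0])))  # 2 3 -- rank(A) <= min(m, n)
```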

Exercises.
1. Show that the vector space P of all polynomials with real coefficients is infinite-dimensional by
showing that it has no finite spanning set.
Solution.
If there were a finite spanning set, say S = {p1, p2, ..., pk}, then the degrees of the polynomials in S
would have a maximum value, say n; and this in turn would imply that any linear combination of the
polynomials in S would have degree at most n.
Thus, there would be no way to express the polynomial x^(n+1) as a linear combination of the polynomials
in S, contradicting the fact that the vectors in S span P.

2. Determine whether the following sets of vectors are bases for R^3.


(a) {(1, 0, 1), (1, 1, 0), (0, 1, 1), (2, 1, 1)}
(b) {(1, 0, 1), (1, 1, 0), (1, 2, -1)}
(c) {(1, 0, 1), (0, 1, 1), (1, 2, -1)}

3. (a) Are the vectors ( ) and ( ) linearly independent? Are these


vectors perpendicular to each other? Explain your answers.
(b) Do the vectors ( ), ( ), ( ), ( ) define a
basis of ? Explain.
(c) Do the vectors ( ), ( ), ( ) define a basis of the
subspace defined by (the set of solutions of) the 3-dimensional plane in
? Explain.
(d) Find such that the set of vectors ( ),( ), ( ), ( ) does not form a
basis for . Is this unique? Why?

4. Consider the subset * + of , where ( ) ( ) and ( ) for


all .
a. Show that is a spanning set for and that is linearly independent, and hence that is a basis
for .
b. Obtain the coordinate vector of in the basis , where ( ) for all .

5. Consider the vector space . Replace the polynomial ( ) in ( ( )) such that it becomes
a basis for that vector space. Calculate the coordinates of ( ) relative to the chosen basis.

6. Let V be a finite-dimensional vector space and let W be a subspace of V. Prove that
dim(W) ≤ dim(V).

7. Let V be the set of all possible infinite tuples of real numbers. That
means V = {(a1, a2, a3, ...) | ai ∈ R}. Assuming that V is a vector space, prove that V is
infinite-dimensional.

8. Find a basis for the solution space of the system of equations Ax = 0 when

(a) A = ( )

(b) A = ( )

What is the dimension of the solution space in each case?

9. Determine the dimensions of the following subspaces of .


(a) all vectors of the form ( )
(b) *( )| +

10. Find a basis for the subspace of consisting of all matrices such that . What is
the dimension of the subspace of consisting of all matrices such that ?

11. Suppose that a set of vectors S = {v1, v2, ..., vn} is a spanning set of a vector space V. Prove or
disprove: if w is another vector in V, then the set {v1, v2, ..., vn, w} is a spanning set for V.

12. Suppose that v1, v2, ..., vk are a linearly dependent set of vectors in a vector space V. For any
vector w in V, prove that the set of vectors v1, v2, ..., vk, w is linearly dependent as well.

13. Let U and W be subspaces of V such that U ∩ W = {0} and dim(U) + dim(W) = dim(V).


a) If u + w = 0, where u ∈ U and w ∈ W, then show that u = 0 and w = 0.
b) If S is a basis for the subspace U and T is a basis for the subspace W, then show that the union
S ∪ T is a basis for V.
c) If v is in V, then show that v can be written in the form v = u + w, where u ∈ U and w ∈ W.
d) Show that the representation obtained in part (c) is unique.

14. Consider the matrix A = ( ).

(a) Compute the row echelon form of A.


(b) Determine the rank of A.
(c) Determine a basis of the column space of A.

(d) For what value(s) of b is the system Ax = b solvable?

15. Let A = ( ).

(a) Find a basis for the row space of A. What is the rank of A?

(b) Find a basis for the set of solutions of Ax = 0
and determine the dimension.

16. Find a basis for span(S) where S = {( ), ( ), ( ), ( )}.

17. Let v1 = ( ), v2 = ( ), and v3 = ( ). Determine the dimension of span{v1, v2, v3}


and determine if ( ) lies in span{v1, v2, v3}.

18. Suppose *( )( )( )+ and


*( )( )+. Find a basis for and hence prove that .

19. Show that if A is any matrix, then rank(A) = rank(A^T).


Proof. rank(A^T) = dim(row space of A^T) = dim(column space of A) = rank(A).
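The identity rank(A) = rank(A^T) can be spot-checked by row-reducing a matrix and its transpose; the sketch below reuses the 4 × 6 matrix from the earlier row-space example:

```python
from fractions import Fraction

def rank(M):
    """Rank = number of pivots found during Gauss-Jordan elimination."""
    A = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(A[0])):
        piv = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        A[r] = [x / A[r][c] for x in A[r]]
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

A = [[1, -3, 4, -2, 5, 4],
     [2, -6, 9, -1, 8, 2],
     [2, -6, 9, -1, 9, 7],
     [-1, 3, -4, 2, -5, -4]]
At = [list(col) for col in zip(*A)]   # transpose of A
print(rank(A), rank(At))  # 3 3: the two ranks agree
```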

20. Let A be an m × n matrix and B be an n × p matrix. Then prove the following.


(a) rank(AB) ≤ rank(A). (Hint: you can use question 6 to prove this.)
(b) If the matrix B is nonsingular, then rank(AB) = rank(A).
