
APPLIED LINEAR ALGEBRA

3. Vector Spaces
(Sections 7.4-7.5 of Kreyszig)

Nguyễn Anh Tú

[email protected]
Contents

1 Vector spaces

2 Subspaces, Linear combination and Span

3 Linear Independence

4 Basis and dimensions

5 Row and column spaces, rank of a matrix

6 Coordinates and changes of bases


Section 1

Vector spaces
Introduction
• Many physical quantities, such as area, length, mass, and
temperature, are completely determined by their magnitude.
Such quantities are called scalars.
• Other physical quantities can only be determined by both
a magnitude and a direction, e.g. forces, velocities,
electromagnetic fields. These quantities are called
vectors.
• The goal of this section is to study vector spaces and their
fundamental properties:

1. Linear dependence/independence.

2. Spanning sets.

3. Bases and dimensions.


What is a Vector?

• A vector, represented by an arrow, has both a direction


and a magnitude. Magnitude is shown as the length of a
line segment. Direction is shown by the orientation of the
line segment, and by an arrow at one end.

• Equal vectors have the same length and direction but may
have different starting points.
What is a Vector?

Each of the directed line segments in the above figure


represents the same vector. In each case the vector starts at a
specific point then moves 2 units to the left and 5 units up.
Notation: ~v = ⟨−2, 5⟩ or ~v = (−2, 5).
Note: It is important to distinguish the vector ~v = (−2, 5)
from the point A(−2, 5).
Vectors
• Given two points A(a1, a2) and B(b1, b2), the vector AB
has the representation AB = (b1 − a1, b2 − a2).

• The magnitude, or length, of the vector ~v = (a, b) is
given by

‖~v‖ = √(a² + b²)

• Example: if ~v = (−3, 4) then its magnitude is

‖~v‖ = √(9 + 16) = 5

• Any vector with magnitude of 1 is called a unit vector,


e.g., v~1 = (0, 1), or ~v2 = (1, 0) (standard basis vectors).
• The zero vector, ~0 = (0, 0), is a vector that has
magnitude zero, but no specific direction.
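The magnitude formula and unit vectors can be checked numerically; a minimal sketch using NumPy, with the vector from the example above:

```python
import numpy as np

# Magnitude of v = (-3, 4): sqrt((-3)^2 + 4^2) = sqrt(25) = 5
v = np.array([-3.0, 4.0])
magnitude = np.linalg.norm(v)

# Dividing a nonzero vector by its magnitude yields a unit vector
unit = v / magnitude
```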
Vector Spaces
A vector space is a nonempty set V of objects, called vectors,
on which are defined two operations, called addition and
multiplication by scalars (real numbers), satisfying the
following ten axioms for all u, v, w ∈ V and all scalars c, d:
1. The sum of u and v, denoted by u + v, is in V.
2. u + v = v + u.
3. (u + v) + w = u + (v + w).
4. There is a zero vector 0 in V such that u + 0 = u.
5. For each u in V, there is a vector −u in V such that
u + (−u) = 0.
6. The scalar multiple of u by c, denoted by cu, is in V.
7. c(u + v) = cu + cv.
8. (c + d)u = cu + du.
9. c(du) = (cd)u.
10. 1u = u.
Vector Spaces

• Technically, V is a real vector space. All of the theory in


this chapter also holds for a complex vector space in
which the scalars are complex numbers. From now on, all
scalars are assumed to be real.
• The zero vector in Axiom 4 is unique. The vector −u is
called the negative vector of u.

Properties
For any u in V and scalar c,
1. 0u = 0, where 0 is the zero vector of V.
2. c0 = 0.
3. −u = (−1)u.
Vector Spaces
Example: Three-dimensional vector space
Let V be the set of all arrows (directed line segments) in
three-dimensional space, with two arrows regarded as equal if
they have the same length and point in the same direction.
Define addition by the parallelogram rule and for each v in V ,
define cv to be the arrow whose length is |c| times the length
of v , pointing in the same direction as v if c > 0 and
otherwise pointing in the opposite direction.
Show that V is a vector space. This space is a common model
in physical problems for various forces.
Vector Spaces

Spaces of Matrices
The set of all m × n matrices with matrix addition and
multiplication of a matrix by a real number (scalar
multiplication), is a vector space (verify). We denote this
vector space by Mmn .
Vector Spaces

Example: Vector space of Matrices with zero trace


Let V be the set of all 2 × 2 matrices with trace equal to zero,
that is,

V = { A = [a b; c d] : Tr(A) = a + d = 0 }

V is a vector space with the standard matrix addition, and the


standard scalar multiplication of matrices.
Vector Spaces

Example: n-dimensional vector space


Let Rⁿ be the set of all vectors of the form

u = (a1, a2, ..., an)ᵀ

This is the set of all matrices of size n × 1, a special case of
the previous example. So Rⁿ is a vector space.
Vector Spaces
Example: discrete-time signals
Let S be the space of all doubly infinite sequences of numbers
(usually written in a row rather than a column) with operations

{yk } + {zk } = {yk + zk } ; c {yk } = {cyk }

Elements of S arise in engineering, for example, whenever a


signal is measured (or sampled) at discrete times. A signal
might be electrical, mechanical, optical, and so on. For
convenience, we will call S the space of (discrete-time) signals.
Vector Spaces
Example: The vector space of polynomials of degree at most n
For n > 0, the set Pn of polynomials of degree at most n
consists of all polynomials of the form

p(x) = a0 + a1 x + a2 x 2 + ... + an x n ,

where the coefficients a0 , a1 , ..., an and the variable x are real


numbers. If all the coefficients are zero, p is called the zero
polynomial.
If q(x) = b0 + b1 x + ... + bn x n , then we define

(p + q)(x) = p(x) + q(x) = (a0 + b0) + (a1 + b1)x + ... + (an + bn)xⁿ

(cp)(x) = cp(x) = ca0 + (ca1)x + (ca2)x² + ... + (can)xⁿ


Then Pn is a vector space.
Vector Spaces
Example: The vector space of all real-valued functions
Let V be the set of all real-valued functions defined on a set
D. (Typically, D is the set of real numbers or some interval on
the real line.) Functions are added in the usual way

(f + g ) (x) = f (x) + g (x)

(αf ) (x) = αf (x)


Two functions in V are equal if and only if their values are
equal for every x in D.
Hence the zero vector in V is the function that is identically
zero, f(x) = 0 for all x, and the negative of f is (−1)f. Axioms
1 and 6 are obviously true, and the other axioms follow from
properties of the real numbers, so V is a vector space.
Section 2

Subspaces, Linear combination and Span


Subspaces

Definition
A subspace of a vector space V is a subset H of V that has
three properties:
a. The zero vector of V is in H.
b. H is closed under vector addition. That is, for each u and v
in H, the sum u + v is in H.
c. H is closed under multiplication by scalars. That is, for each
u in H and each scalar c, the vector cu is in H.

Properties (a), (b), and (c) guarantee that a subspace H of V


is itself a vector space, under the vector space operations
already defined in V .
Subspaces
Example

(a) A line H through 0 is a subspace of V = R².

(b) The x1x2-plane

H = {(x, y, 0) : x, y ∈ R}

is a subspace of V = R³.
Subspaces

Example
The set consisting of only the zero vector in a vector space V
is a subspace of V , called the zero subspace and written as
{0}.

Example
Let P be the set of all polynomials with real coefficients, with
operations in P defined as for functions. Then P is a subspace
of the space of all real-valued functions defined on R. Also, for
each n > 0, Pn is a subspace of P.
Subspaces

Example
A line in R2 not containing the origin is not a subspace of R2 .
A plane in R3 not containing the origin is not a subspace of
R3 .
Subspaces

Example
Which of the given subsets of the vector space P2 are
subspaces?
(a) a2 t 2 + a1 t + a0 , where a1 = 0, a0 = 0
(b) a2 t 2 + a1 t + a0 , where a1 = 2a0
(c) a2 t 2 + a1 t + a0 , where a2 + a1 + a0 = 2

Answers: (a) and (b).


Subspaces

Exercises
Let W be the set of all 3 × 3 matrices of the form
 
a 0 b
 0 c 0 
d 0 e

Show that W is a subspace of M33 .


Subspace

Exercises
Which of the following subsets of the vector space Mmn are
subspaces?
(a) The set of all n × n symmetric matrices.
(b) The set of all n × n diagonal matrices.
(c) The set of all n × n invertible matrices.

Answer: The subsets in (a) and (b) are subspaces.


The subset in (c) is not, because it does not contain the zero
matrix.
Null space

Example
If A is an m × n matrix, then the homogeneous system of m
equations in n unknowns with coefficient matrix A can be
written as
Ax = 0
where x is a vector in Rn and 0 is the zero vector. Show that
set W of all solutions is a subspace of Rn .
W is called the solution space of the homogeneous system, or
the null space of the matrix A.
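Numerically, a basis for such a null space can be read off from the singular value decomposition; a sketch with NumPy, using an illustrative rank-1 matrix (not one from the text):

```python
import numpy as np

# Null space of A = the solution space of Ax = 0.  The rows of V^T from
# the SVD that correspond to (near-)zero singular values span it.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])          # rank 1, so nullity = 3 - 1 = 2
U, s, Vt = np.linalg.svd(A)
tol = 1e-10
rank = int(np.sum(s > tol))
null_basis = Vt[rank:].T                 # columns form a basis of Null(A)

residual = np.linalg.norm(A @ null_basis)  # ~0: each basis vector solves Ax = 0
```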
Linear combination

Definition
Let v1, v2, ..., vk be vectors in a vector space V. A vector v in
V is a linear combination of v1, v2, ..., vk if

v = a1v1 + a2v2 + ... + akvk = Σ_{j=1}^{k} aj vj

Example
If v1 = [1 0 1]ᵀ and v2 = [0 1 1]ᵀ, then every w = [a  b  a+b]ᵀ
is a linear combination of v1, v2 since w = av1 + bv2.
A Subspace Spanned by a Set

The next theorem gives one of the most common ways to


define a subspace.
Theorem
If v1 , v2 , ..., vk are vectors in a space V , then

span {v1 , v2 , ..., vk } = {c1 v1 + c2 v2 + ... + ck vk : cj ∈ R}

is a subspace of V .

We call span{v1, v2, ..., vk} the subspace spanned (or
generated) by {v1, v2, ..., vk}.
A Subspace Spanned by a Set
Example
Given v1 and v2 in R3 ,

H = span {v1 , v2 } = {av1 + bv2 : a, b ∈ R}

is a plane, which is a subspace of R3 .


A Subspace Spanned by a Set

Example
Consider the set S of 2 × 3 matrices given by

S = { [1 0 0; 0 0 0], [0 1 0; 0 0 0], [0 0 0; 0 1 0], [0 0 0; 0 0 1] }

Then span S is the set in M23 consisting of all matrices of the
form

[a1 a2 0; 0 a3 a4]

where aj ∈ R.
A Subspace Spanned by a Set

Example
Let S = {t², t, 1}; then span S is a subspace of P2. Indeed,
span S is the subspace of all polynomials of the form
a2t² + a1t + a0.

Example
Let

S = { [1 0; 0 0], [0 0; 0 1] }

Then span S is the subspace of all 2 × 2 diagonal matrices.
A Subspace Spanned by a Set
Example
Let

H = { (a − 3b, b − a, a, b)ᵀ : a, b ∈ R }

Show that H is a subspace of R⁴.


Proof:
An arbitrary vector in H has the form

(a − 3b, b − a, a, b)ᵀ = a(1, −1, 1, 0)ᵀ + b(−3, 1, 0, 1)ᵀ

Thus, H = span{v1, v2}, where
v1 = (1, −1, 1, 0)ᵀ, v2 = (−3, 1, 0, 1)ᵀ.
Hence H is a subspace of R⁴.
A Subspace Spanned by a Set
Example
Let

v1 = (1, 2, 1)ᵀ, v2 = (1, 0, 2)ᵀ, v3 = (1, 1, 0)ᵀ, v = (2, 1, 5)ᵀ

Is v ∈ span{v1, v2, v3}?

Solution: Find a1, a2, a3 such that

a1(1, 2, 1)ᵀ + a2(1, 0, 2)ᵀ + a3(1, 1, 0)ᵀ = (2, 1, 5)ᵀ

Solve this linear system to obtain a1 = 1, a2 = 2, a3 = −1.
Thus, v = v1 + 2v2 − v3, so v ∈ span{v1, v2, v3}.
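The span-membership test above is just a linear solve; a sketch in NumPy with the same vectors:

```python
import numpy as np

# Is v in span{v1, v2, v3}?  Look for coefficients a with [v1 v2 v3] a = v.
v1 = np.array([1.0, 2.0, 1.0])
v2 = np.array([1.0, 0.0, 2.0])
v3 = np.array([1.0, 1.0, 0.0])
v  = np.array([2.0, 1.0, 5.0])

A = np.column_stack([v1, v2, v3])
a = np.linalg.solve(A, v)     # A is invertible here, so the solve succeeds
```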
A Subspace Spanned by a Set

Example
For what value(s) of a will y be in the subspace of R³ spanned
by v1, v2, v3, if

v1 = (1, −1, −2)ᵀ, v2 = (5, −4, −7)ᵀ, v3 = (−3, 1, 0)ᵀ, and y = (−4, 3, a)ᵀ

Answer: a = 5.
A Subspace Spanned by a Set

Example
In P2 let

v1 = 2t 2 +t +2, v2 = t 2 −2t, v3 = 5t 2 −5t +2, v4 = −t 2 −3t −2

Determine whether the vector

v = t2 + t + 2

belongs to span {v1 , v2 , v3 , v4 }.


A Subspace Spanned by a Set

Solution: Find scalars a1, a2, a3, a4 such that

a1 v1 + a2 v2 + a3 v3 + a4 v4 = v

(2a1 + a2 + 5a3 − a4)t² + (a1 − 2a2 − 5a3 − 3a4)t + (2a1 + 2a3 − 2a4)

= t² + t + 2
Thus we get the linear system:

2a1 + a2 + 5a3 − a4 = 1

a1 − 2a2 − 5a3 − 3a4 = 1

2a1 + 2a3 − 2a4 = 2
A Subspace Spanned by a Set

To determine whether this system of linear equations is
consistent, we form the augmented matrix and transform it to
reduced row echelon form, obtaining (verify)

[1 0 1 −1 | 0]
[0 1 3  1 | 0]
[0 0 0  0 | 1]

which indicates that the system is inconsistent; that is, it has


no solution. Hence v does not belong to span {v1 , v2 , v3 , v4 }.
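The same conclusion can be reached numerically by comparing ranks: v lies in the span iff appending its coordinate vector as an extra column leaves the rank unchanged. A sketch with NumPy, using coordinates relative to the basis (t², t, 1):

```python
import numpy as np

# Columns: coordinate vectors of v1..v4 relative to the basis (t^2, t, 1)
A = np.array([[2.0,  1.0,  5.0, -1.0],
              [1.0, -2.0, -5.0, -3.0],
              [2.0,  0.0,  2.0, -2.0]])
v = np.array([1.0, 1.0, 2.0])   # coordinates of t^2 + t + 2

rank_A  = np.linalg.matrix_rank(A)
rank_Av = np.linalg.matrix_rank(np.column_stack([A, v]))
in_span = (rank_A == rank_Av)   # False: the system is inconsistent
```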
A Subspace Spanned by a Set

Example
Let V be the vector space R³. Let

v1 = (1, 2, 1)ᵀ, v2 = (1, 0, 2)ᵀ, v3 = (1, 1, 0)ᵀ

Show that span{v1, v2, v3} = R³.

Solution:
Pick any

v = (a, b, c)ᵀ ∈ V

We must find a1, a2, a3 with a1v1 + a2v2 + a3v3 = v.
A Subspace Spanned by a Set

Solution (Cont.):
This leads to the linear system

a1 + a2 + a3 = a

2a1 + a3 = b
a1 + 2a2 = c
A solution is (verify)

a1 = (−2a + 2b + c)/3,  a2 = (a − b + c)/3,  a3 = (4a − b − 2c)/3

Since this system is solvable for every choice of a, b, c, we
conclude span{v1, v2, v3} = R³.
A Subspace Spanned by a Set

Example
Explain why the set S is not a spanning set for the vector
space V.
(a) S = {t³, t², t}, V = P3
(b) S = { (1, 0)ᵀ, (0, 0)ᵀ }, V = R²
(c) S = { [1 0; 0 1], [0 1; 1 0] }, V = M22
Section 3

Linear Independence
Linear Independence

Definition
The vectors v1, v2, v3, ..., vk in a vector space V are said to be
linearly dependent if there exist constants a1, a2, ..., ak, not all
zero, such that

a1v1 + a2v2 + ... + akvk = Σ_{j=1}^{k} aj vj = 0

Otherwise, v1, v2, v3, ..., vk are called linearly independent.

That is, v1, v2, v3, ..., vk are linearly independent if
Σ_{j=1}^{k} aj vj = 0 ⇔ aj = 0, ∀j = 1, ..., k.
Linear Independence

Example

v1 = (1, 2, 1)ᵀ, v2 = (1, 0, 2)ᵀ, v3 = (2, 2, 3)ᵀ

are linearly dependent since v1 + v2 − v3 = 0.

Example

v1 = (1, 0)ᵀ, v2 = (0, 2)ᵀ

are linearly independent since av1 + bv2 = 0 iff a = b = 0.
Linear Independence
Example
Determine whether the vectors

v1 = (3, 2, 1)ᵀ, v2 = (1, 2, 0)ᵀ, v3 = (−1, 2, −1)ᵀ

are linearly independent.
Solution
Forming the equation:

a1 v1 + a2 v2 + a3 v3 = 0

a1(3, 2, 1)ᵀ + a2(1, 2, 0)ᵀ + a3(−1, 2, −1)ᵀ = (0, 0, 0)ᵀ
Linear Independence
We obtain the homogeneous system

3a1 + a2 − a3 = 0

2a1 + 2a2 + 2a3 = 0

a1 − a3 = 0
Doing the row operations

[3 1 −1 0]                    [1 0 −1 0]
[2 2  2 0]  Row operations →  [0 1  2 0]
[1 0 −1 0]                    [0 0  0 0]

The nontrivial solutions are

k(1, −2, 1)ᵀ, k ≠ 0

so the vectors are linearly dependent!


Linear Independence

Example
Determine whether the vectors

v1 = (1, 0, 1)ᵀ, v2 = (0, 1, 1)ᵀ, v3 = (1, 1, 1)ᵀ

are linearly independent.
Solution
Forming the equation:

a1(1, 0, 1)ᵀ + a2(0, 1, 1)ᵀ + a3(1, 1, 1)ᵀ = (0, 0, 0)ᵀ
Linear Independence

Doing the row operations

[1 0 1 0]      [1 0 0 0]
[0 1 1 0]  →   [0 1 0 0]
[1 1 1 0]      [0 0 1 0]

Thus the only solution is the trivial solution a1 = a2 = a3 = 0,


so the vectors are linearly independent.
Linear Independence

Example
Are the vectors

v1 = [2 1; 0 1], v2 = [1 2; 1 0], v3 = [0 −3; −2 1]

in M22 linearly independent?
Solution:
Setting up the equation:

a1[2 1; 0 1] + a2[1 2; 1 0] + a3[0 −3; −2 1] = [0 0; 0 0]
Linear Independence

[2a1 + a2   a1 + 2a2 − 3a3]   [0 0]
[a2 − 2a3   a1 + a3       ] = [0 0]

Solving the linear system for the aj:

[2 1  0 0]                    [1 0  1 0]
[1 2 −3 0]  Row operations →  [0 1 −2 0]
[0 1 −2 0]                    [0 0  0 0]
[1 0  1 0]                    [0 0  0 0]

The nontrivial solutions are

k(−1, 2, 1)ᵀ, k ≠ 0

so the vectors are linearly dependent.


Linear Independence

Example
Are the vectors

v1 = t 2 + t + 2, v2 = 2t 2 + t, v3 = 3t 2 + 2t + 2
in P2 linearly independent?
Answer: The given vectors are linearly dependent (indeed v1 + v2 = v3).
Linear Independence

Theorem
Let S = {v1 , v2 , ..., vn } be a set of n vectors in Rn . Let A be
the matrix whose columns (rows) are the elements of S. Then
S is linearly independent if and only if det(A) ≠ 0.

Proof:
We prove the result for column vectors.

Suppose that S is linearly independent. Then it follows that
the reduced row echelon form of A is In. Thus, A is row
equivalent to In, and hence det(A) ≠ 0.

Conversely, if det(A) ≠ 0, then A is row equivalent to In.
Hence, the columns of A are linearly independent.
Linear Independence

Example
Is S = { (1, 2, 3)ᵀ, (0, 1, 2)ᵀ, (3, 0, −1)ᵀ } a linearly independent set
of vectors in R³?
Solution
We form the matrix A whose columns are the vectors in S:

A = [1 0 3; 2 1 0; 3 2 −1]

Then det(A) = 2 ≠ 0, so S is linearly independent.
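The determinant test is easy to run numerically; a sketch with NumPy for the matrix above:

```python
import numpy as np

# n vectors in R^n are independent iff the determinant of the matrix
# with those vectors as columns is nonzero (up to floating-point tolerance).
A = np.array([[1.0, 0.0,  3.0],
              [2.0, 1.0,  0.0],
              [3.0, 2.0, -1.0]])
d = np.linalg.det(A)
independent = abs(d) > 1e-12
```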
Linear Independence
Theorem
Let S1 and S2 be finite subsets of a vector space and let S1 be
a subset of S2 . Then the following statements are true:
(a) If S1 is linearly dependent, so is S2 .
(b) If S2 is linearly independent, so is S1 .

Proof: Let

S1 = {v1 , v2 , ..., vk } , S2 = {v1 , v2 , ..., vk , vk+1 , .., vm }

(a) Since S1 is linearly dependent, there exist constants


a1, a2, ..., ak, not all zero, such that

a1v1 + a2v2 + ... + akvk = Σ_{j=1}^{k} aj vj = 0
Linear Independence

Proof (Cont.) Therefore,

a1 v1 + a2 v2 + . . . + ak vk + 0vk+1 + . . . + 0vm = 0

Since not all the coefficients in the equations above are zero,
we conclude that S2 is linearly dependent.

Statement (b) is the contrapositive of statement (a), so it is


logically equivalent to statement (a).
Linear Independence

Remarks
• The set S = {0} consisting of only the vector 0 is linearly
dependent.
From this it follows that if S is any set of vectors that
contains 0, then S must be linearly dependent.
• A set of vectors consisting of a single nonzero vector is
linearly independent.
• If v1 , v2 , ..., vk are vectors in a vector space V and for
some i 6= j, vi = vj , then v1 , v2 , ..., vk are linearly
dependent.
Linear Independence
Theorem
The nonzero vectors v1 , v2 , ..., vk in a vector space V are
linearly dependent if and only if one of the vectors vj (j ≥ 2) is
a linear combination of the preceding vectors v1 , v2 , ..., vj−1 .

Example

v1 = (1, 2, −1)ᵀ, v2 = (1, −2, 1)ᵀ, v3 = (−3, 2, −1)ᵀ, v4 = (2, 0, 0)ᵀ

v1 + v2 + 0v3 − v4 = 0

so v1, v2, v3, and v4 are linearly dependent. We then have

v4 = v1 + v2 + 0v3.
Section 4

Basis and dimensions


Basis and dimensions
Definition
The vectors v1 , v2 , ..., vk in a vector space V are said to form a
basis for V if
(a) v1 , v2 , ..., vk span V and
(b) v1 , v2 , ..., vk are linearly independent.

Example
Let V = R³. The vectors

v1 = (1, 0, 0)ᵀ, v2 = (0, 1, 0)ᵀ, v3 = (0, 0, 1)ᵀ

form a basis for R³, called the natural basis or standard basis
for R³.
Basis and dimensions

Example
Generally, the natural basis or standard basis for Rⁿ is denoted
by
{e1, e2, ..., en}
where ei = (0, ..., 0, 1, 0, ..., 0)ᵀ has a 1 in the ith position and
0 elsewhere.
Basis and dimensions
Example
Show that the set

S = {t² + 1, t − 1, 2t + 2}

is a basis for the vector space P2.

Solution We must show that S spans V and is linearly
independent.
To show that it spans V, we take any vector in V, that is, a
polynomial at² + bt + c, and find constants a1, a2 and a3 such
that

at² + bt + c = a1(t² + 1) + a2(t − 1) + a3(2t + 2)

Basis and dimensions
We find

a1 = a,  a2 = (a + b − c)/2,  a3 = (c + b − a)/4.

Hence S spans V.
To show that S is linearly independent, we solve

a1(t² + 1) + a2(t − 1) + a3(2t + 2) = 0

a1t² + (a2 + 2a3)t + (a1 − a2 + 2a3) = 0

This can hold for all values of t only if

a1 = a2 + 2a3 = a1 − a2 + 2a3 = 0

Thus a1 = a2 = a3 = 0.
Remark: The set S = {tⁿ, tⁿ⁻¹, ..., t, 1} forms a basis for the
vector space Pn, called the natural, or standard, basis for Pn.
Basis and dimensions

Example
Show that the set

S = {v1, v2, v3, v4}

where

v1 = (1, 0, 1, 0)ᵀ, v2 = (0, 1, −1, 2)ᵀ, v3 = (0, 2, 2, 1)ᵀ, v4 = (1, 0, 0, 1)ᵀ

is a basis for the vector space R⁴.


Basis and dimensions
Hint
(a) To show that S spans R⁴, we let v = (a, b, c, d)ᵀ in R⁴ and
find a1, a2, a3 and a4 such that

v = a1v1 + a2v2 + a3v3 + a4v4.

(b) S is linearly independent since det(A) = 1, where

A =
[1  0 0 1]
[0  1 2 0]
[1 −1 2 0]
[0  2 1 1]
Basis and dimensions

Example
Find a basis for the subspace V of P2 consisting of all vectors
of the form at² + bt + c, where c = a − b.

Hint:
S = {t² + 1, t − 1}


Remarks
A vector space V is called finite-dimensional if there is a finite
subset of V that is a basis for V . If there is no such finite
subset of V , then V is called infinite-dimensional.
Basis and dimensions

Theorem
If S = {v1 , v2 , .., vn } is a basis for a vector space V , then every
vector in V can be written in one and only one way as a linear
combination of the vectors in S.

Proof: Suppose

v = a1 v1 + a2 v2 + ... + an vn , and v = b1 v1 + b2 v2 + ... + bn vn

This implies

(a1 − b1 ) v1 + (a2 − b2 ) v2 + ... + (an − bn ) vn = 0

Since S is linearly independent, we have aj = bj for


j = 1, 2, ..., n.
Basis and dimensions
Theorem
Let S = {v1 , v2 , .., vn } be a set of nonzero vectors in a vector
space V and let W = span S. Then some subset of S is a
basis for W .

Procedure for finding a basis from a subset


• Form the equation

a1 v1 + a2 v2 + ... + an vn = 0

• Construct the augmented matrix and transform it to row


echelon form.
• The set of vectors vj ’s corresponding to pivot columns of
the row echelon form is a basis for W = span S.
Finding a basis

Example
Find a basis for Col A = span{v1, v2, ..., v5}, where

A = [v1 v2 · · · v5] =
[1 4 0  2 0]
[0 0 1 −1 0]
[0 0 0  0 2]
[0 0 0  0 0]
Finding a basis

Solution
Each nonpivot column of A is a linear combination of the pivot
columns. In fact, v2 = 4v1 and v4 = 2v1 − v3.
Let

S = {v1, v3, v5} = { (1, 0, 0, 0)ᵀ, (0, 1, 0, 0)ᵀ, (0, 0, 2, 0)ᵀ }

Since no vector in S is a linear combination of the vectors that
precede it, S is linearly independent.

Thus S is a basis for Col A.
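The pivot columns, and hence a basis for Col A, can also be read off with a symbolic row reduction; a sketch using SymPy's Matrix.rref:

```python
from sympy import Matrix

# A basis for Col A: the pivot columns of A, identified from the RREF.
A = Matrix([[1, 4, 0,  2, 0],
            [0, 0, 1, -1, 0],
            [0, 0, 0,  0, 2],
            [0, 0, 0,  0, 0]])
rref_form, pivots = A.rref()       # pivots: tuple of pivot-column indices
basis = [A.col(j) for j in pivots]  # original columns at those indices
```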


Finding a basis
Example
Let

v1 = (1, −3, 4)ᵀ, v2 = (6, 2, −1)ᵀ, v3 = (2, −2, 3)ᵀ, v4 = (−4, −8, 9)ᵀ.

Find a basis for the subspace W spanned by {v1 , v2 , v3 , v4 }.

Solution: Row reducing the matrix [v1 v2 v3 v4] to echelon
form shows that the first two columns are the pivot columns.
Hence S = {v1, v2} is a basis for the subspace W spanned by
{v1, v2, v3, v4}.
Basis

Theorem
Let S = {v1, v2, ..., vn} be a basis for a vector space V, and let
T = {w1, w2, ..., wm} be a set of vectors in V.
• If T is linearly independent, then m ≤ n.
• If T spans V, then m ≥ n.

Corollary
If S = {v1, v2, ..., vn} and T = {w1, w2, ..., wm} are bases for a
vector space V, then n = m.
Dimensions

Definition
The dimension of a nonzero vector space V is the number of
vectors in a basis for V , denoted by dim V .
We also define the dimension of the trivial vector space {0} to
be zero.

Example
• S = {t², t, 1} is a basis for P2, so dim P2 = 3.
• dim Rn = n.
• dim Mm,n = mn.
For a set of exactly dim V vectors, only one of the two
conditions for being a basis is needed. I.e.,
Theorem
Let V be an n-dimensional vector space.
• If S = {v1 , v2 , ..., vn } is a linearly independent set of
vectors in V, then S is a basis for V .
• If S = {v1 , v2 , ..., vn } spans V , then S is a basis for V .
Basis
Definition
Let S be a set of vectors in a vector space V . A subset T of S
is called a maximal independent subset of S if T is a linearly
independent set of vectors that is not properly contained in
any other linearly independent subset of S.

Example
Let

S = {v1, v2, v3} = { (1, 0)ᵀ, (0, 2)ᵀ, (1, 2)ᵀ }
Then maximal independent subsets of S are {v1 , v2 }, {v2 , v3 },
and {v1 , v3 }.

Theorem
Let S be a finite subset of the vector space V that spans V . A
maximal independent subset T of S is a basis for V .
Section 5

Row and column spaces, rank of a matrix


Row Space and column space
Definition
Let

A =
[a11 a12 . . . a1n]
[a21 a22 . . . a2n]
[ ·   ·        ·  ]
[am1 am2 · · · amn]

be an m × n matrix.
The subspace of Rn spanned by the rows of A is called the row
space of A, denoted by Row A.
The subspace of Rm spanned by the columns of A is called the
column space of A, denoted by Col A.

Definition
The dimension of the row (column) space of A is called the
row (column) rank of A.
Row Space and column space
Theorem
If A and B are two m × n row (column) equivalent matrices,
then the row (column) spaces of A and B are equal.
As a consequence, if A and B are row equivalent, then row
rank A = row rank B; and if A and B are column equivalent,
then column rank A = column rank B.

Example
Find the row space, the null space, and the column space of
the matrix

A =
[−2 −5   8 0 −17]
[ 1  3  −5 1   5]
[ 3 11 −19 7   1]
[ 1  7 −13 5 −13]
Row Space and column space

Solution: We have

A ∼ B =
[1 3 −5  1   5]
[0 1 −2  2  −7]
[0 0  0 −4  20]
[0 0  0  0   0]

The first three rows of B form a basis for the row space of A
(as well as the row space of B).
Thus, a basis for Row A is

{(1, 3, −5, 1, 5) , (0, 1, −2, 2, −7) , (0, 0, 0, −4, 20)}


Row Space and column space
Solution (Cont.) For the column space, observe from B that
the pivots are in columns 1, 2, 4. So a basis for Col A is

{ (−2, 1, 3, 1)ᵀ, (−5, 3, 11, 7)ᵀ, (0, 1, 7, 5)ᵀ }

For the null space of A, we find all x = (x1, ..., x5)ᵀ such that
Ax = 0. Since the third and fifth columns are nonpivot
columns, x3 and x5 can take any values. By back substitution,
we obtain

x1 = −x3 − x5,  x2 = 2x3 − 3x5,  x4 = 5x5.

The null space of A has a basis

{ (−1, 2, 1, 0, 0)ᵀ, (−1, −3, 0, 5, 1)ᵀ }.
Rank-Nullity

Definition
The nullity of A is the dimension of the null space of A, that
is, the dimension of the solution space of Ax = 0.
Thus, in the previous example, nullity A = 2 and
row rank A = column rank A = 3.
Theorem
Let A be an m × n matrix. Then
• The row rank and column rank of A are equal. They equal
the number of pivot columns of the echelon form of A.
• The nullity of A equals the number of non-pivot columns
of the echelon form of A.
• It follows that rank A + nullity A = n.
Rank-Nullity

Example
Let

A = [3 −1 2; 2 1 3; 7 1 8]

We have

A ∼ B = [1 0 1; 0 1 1; 0 0 0]

Thus, nullity A = 1, rank A = 2, and nullity A + rank A = 3 =
number of columns of A.
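The rank-nullity relation for this example can be verified numerically; a sketch with NumPy, counting the nullity as the number of near-zero singular values:

```python
import numpy as np

# Check rank A + nullity A = n for the 3x3 example above.
A = np.array([[3.0, -1.0, 2.0],
              [2.0,  1.0, 3.0],
              [7.0,  1.0, 8.0]])
n = A.shape[1]
rank = int(np.linalg.matrix_rank(A))
s = np.linalg.svd(A, compute_uv=False)
nullity = int(np.sum(s <= 1e-10))   # near-zero singular values
```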
Rank-Nullity
Example
Let

A =
[1  1 4 1 2]
[0  1 2 1 1]
[0  0 0 1 2]
[1 −1 0 0 2]
[2  1 6 0 1]

We have

A ∼ B =
[1 0 2 0  1]
[0 1 2 0 −1]
[0 0 0 1  2]
[0 0 0 0  0]
[0 0 0 0  0]

Thus, nullity A = 2, rank A = 3, and nullity A + rank A = 5 =
number of columns of A.
Rank and Invertibility
Suppose A is a square matrix of size n. Since rank A is the
number of pivot columns of A, it follows that rank A = n if
and only if A is invertible. Thus,
Corollary
Let A be an n × n matrix. The following are equivalent:
(a) A is invertible.
(b) rank A = n.
(c) det(A) ≠ 0.
(d) The homogeneous system Ax = 0 has only the trivial
solution.
(e) The linear system Ax = b has a unique solution for every
vector b ∈ Rⁿ.
Section 6

Coordinates and changes of bases


Coordinates
Definition: Coordinates
Let V be an n-dimensional vector space, with a basis

S = {v1 , v2 , ..., vn } .

Every vector v ∈ V can be uniquely expressed in the form

v = a1v1 + a2v2 + ... + anvn.

We define

[v]S = (a1, a2, ..., an)ᵀ
and call [v ]S ∈ Rn the coordinate vector of v with respect to
the basis S.
Coordinates

Example
Consider the vector space R³ and let S = {v1, v2, v3} be the
basis for R³, where

v1 = (1, 1, 0)ᵀ, v2 = (2, 0, 1)ᵀ, v3 = (0, 1, 2)ᵀ

If

v = (1, 1, −5)ᵀ

compute [v]S.
Coordinates

Solution
To find [v]S, we need to find the constants a1, a2, a3 such that
a1v1 + a2v2 + a3v3 = v. Solve the linear system with
augmented matrix

[1 2 0 |  1]
[1 0 1 |  1]
[0 1 2 | −5]

We get a1 = 3, a2 = −1, a3 = −2. Thus,

[v]S = (3, −1, −2)ᵀ
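Finding a coordinate vector is one linear solve; a sketch with NumPy for this example:

```python
import numpy as np

# [v]_S solves S_matrix @ a = v, where the columns of S_matrix
# are the basis vectors v1, v2, v3.
S_matrix = np.array([[1.0, 2.0, 0.0],
                     [1.0, 0.0, 1.0],
                     [0.0, 1.0, 2.0]])   # columns: v1, v2, v3
v = np.array([1.0, 1.0, -5.0])
coords = np.linalg.solve(S_matrix, v)    # the coordinate vector [v]_S
```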
Isomorphism

Definition
Let V and W be real vector spaces. A bijection L from V to
W is called an isomorphism if
(a) L (v + w ) = L (v ) + L (w ),
(b) L (cv ) = cL (v ).
If there is an isomorphism from V to W , we say that V is
isomorphic to W .

Theorem
(a) Two finite-dimensional vector spaces are isomorphic if and
only if their dimensions are equal.
(b) If V is an n−dimensional real vector space, then V is
isomorphic to Rn .
Changes of bases - Transition matrices
Let S = {v1 , v2 , ..., vn } and T = {w1 , w2 , ..., wn } be two
ordered bases for the n−dimensional vector space V . Let v be
a vector in V and let

[v]T = (c1, c2, ..., cn)ᵀ

Question: Can we compute [v ]S via [v ]T ?


We have

[v]S = [c1w1 + c2w2 + ... + cnwn]S
     = [c1w1]S + [c2w2]S + ... + [cnwn]S
     = c1[w1]S + c2[w2]S + ... + cn[wn]S
Changes of bases - Transition matrices
Let the coordinate vector of wj with respect to S be denoted
by

[wj]S = (a1j, a2j, ..., anj)ᵀ
The n × n matrix whose jth column is [wj ]S is called the
transition matrix (or the change-of-coordinates matrix) from
the T-basis to the S-basis and is denoted by PT →S . That is,

PT →S = ([w1 ]S , [w2 ]S , ..., [wn ]S )

Therefore,

[v ]S = PT →S [v ]T
Changes of bases - Transition matrices
Example
Let T = {w1, w2}, S = {v1, v2}, where

w1 = (−9, 1)ᵀ, w2 = (−5, −1)ᵀ, v1 = (1, −4)ᵀ, v2 = (3, −5)ᵀ.

Find the transition matrix PT →S from the T-basis to the


S-basis.
Solution: Let

[w1]S = (x1, x2)ᵀ, [w2]S = (y1, y2)ᵀ

We need to solve the two linear systems

[v1 v2] (x1, x2)ᵀ = w1  and  [v1 v2] (y1, y2)ᵀ = w2
Changes of bases - Transition matrices

Example
We can solve both systems simultaneously.

[v1 v2 | w1 w2] =
[ 1  3 | −9 −5]      [1 0 |  6  4]
[−4 −5 |  1 −1]  ∼   [0 1 | −5 −3]

[w1]S = (x1, x2)ᵀ = (6, −5)ᵀ,  [w2]S = (y1, y2)ᵀ = (4, −3)ᵀ

Thus,

PT→S = [6 4; −5 −3]
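The transition matrix can be computed in one call by solving for all columns at once; a sketch with NumPy for this example:

```python
import numpy as np

# P_{T->S} solves V P = W, where the columns of V and W are the
# S-basis and T-basis vectors, respectively.
V = np.array([[ 1.0,  3.0],
              [-4.0, -5.0]])    # columns: v1, v2
W = np.array([[-9.0, -5.0],
              [ 1.0, -1.0]])    # columns: w1, w2
P = np.linalg.solve(V, W)       # solves for all columns of P at once

# Converting coordinates: [v]_S = P @ [v]_T for any v
```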
Changes of bases - Transition matrices

Example
Let
T = {w1, w2, w3}, S = {v1, v2, v3},
where

w1 = (6, 3, 3)ᵀ, w2 = (4, −1, 3)ᵀ, w3 = (5, 5, 2)ᵀ

v1 = (2, 0, 1)ᵀ, v2 = (1, 2, 0)ᵀ, v3 = (1, 1, 1)ᵀ
Find the transition matrix PT →S from the T-basis to the
S-basis.
Changes of bases - Transition matrices
To find [wj]S, j = 1, 2, 3, we can solve the three systems
simultaneously:

[v1 v2 v3 | w1 w2 w3] =
[2 1 1 | 6  4 5]      [1 0 0 | 2  2 1]
[0 2 1 | 3 −1 5]  ∼   [0 1 0 | 1 −1 2]
[1 0 1 | 3  3 2]      [0 0 1 | 1  1 1]

PT→S = [2 2 1; 1 −1 2; 1 1 1]

Q: Verify that [v]S = PT→S [v]T.
Exercises

Section 7.4 (p. 287): 1-5, 12-21, 27-31.
