
(c)2023 C. Karimianpour

MAT188 WEEK 6 Subspaces, span, linear independence and basis

PCE 6 Learning Standards


(1) I can visualize all the subspaces of R, R2 , and R3 .
(2) Given a subset of Rn , I can verify whether it is a subspace or not.
(3) Given a set of vectors ⃗v1 , · · · , ⃗vn in R, R2 or R3 I can visualize span(⃗v1 , · · · , ⃗vn ).
(4) I know the definition of the kernel and image of a linear transformation T and can decide if a
given vector is in ker(T ) or im(T ) or can find vectors in ker(T ) and im(T ).
(5) Given a linear transformation T , I can describe the im(T ) in set builder notation.
(6) Given a linear transformation T , I can describe the ker(T ) and im(T ) as a span of vectors.
(7) Given a linear transformation T with domain or codomain R, R2 , R3 I can visualize ker(T ) and
im(T ).
(8) Given a linear transformation T (⃗x) = A⃗x, I can connect ker(T ) and im(T ) to the solutions to
A⃗x = ⃗b, for suitable choices of ⃗b.
(9) Given a linear transformation T , I can connect the notion of injective/surjective to ker(T )/im(T ).
(10) Given a set of vectors, I can determine if the set is linearly independent.
(11) Given a subspace, I can determine its basis.
(12) Given a set of vectors, I can reduce it to a linearly independent set.
(13) Given a set of vectors, I can visualize if a vector is redundant.
(14) Given a linear transformation T (⃗x) = A⃗x, I can connect ker(T ) with the linear independence
of columns of A and im(T ) with the span of the columns of A.

Vocabulary: image of a linear transformation, span of vectors, kernel of a linear transformation, subspace, linear independence, linear relation, basis.

Reading from the textbook: Sec 3.1, Sec 3.2


Introduce
This week, we will learn about subspaces, and develop an efficient and precise way of describing them.
Intuitively speaking, a subspace of Rn is a copy of Rk , for some k ≤ n, that sits inside Rn . For example, consider
a line ℓ passing through the origin in R3 . The line itself, viewed purely geometrically, is a copy of R
sitting inside R3 . We say ℓ is a subspace of R3 . We will define the concept of subspace mathematically,
and precisely.

Most subspaces - for instance, a line through the origin in R3 - are infinite sets of vectors. We will
develop a way to describe a subspace in terms of a finite number of vectors that generate it - the same
way we described a line through the origin using a direction vector. We will discuss a minimal set of
vectors required to describe a subspace, and how to find such vectors.

We have been working with subspaces ever since we started finding a general solution to a homogeneous
system A⃗x = ⃗0. We have also developed proper mathematical notation to describe subspaces when we
write a general solution in vector parametric form. This week, we will develop more precise and more
powerful terminology to talk about these familiar concepts. It is crucial that you pay extra attention to
new definitions and terminology this week and next week. The sooner you master the definitions
of these new terms, the happier you will be!
 
Consider a plane V in R3 defined as V = {(x, y, z)^T | x + 2y + 3z = 0}. The set V is a subset of R3 which
describes a plane through the origin. Geometrically speaking, V “looks like” a copy of R2 , positioned
inside R3 . Note that V contains the vector zero. If you pick two vectors ⃗v1 and ⃗v2 on V , their sum ⃗v1 +⃗v2 ,
and any scalar multiple c1⃗v1 or c2⃗v2 of them are also on V . We say V is closed under vector addition and
scalar multiplication. Indeed, any linear combination c1⃗v1 + c2⃗v2 is also in V .
Let's verify these claims mathematically. Suppose ⃗v1 = (x1, y1, z1)^T and ⃗v2 = (x2, y2, z2)^T are in V . That means

x1 + 2y1 + 3z1 = 0 and x2 + 2y2 + 3z2 = 0.


 
We want to check if ⃗v1 + ⃗v2 = (x1 + x2, y1 + y2, z1 + z2)^T is also in V . We need to check if ⃗v1 + ⃗v2 satisfies the equation
of V :

(x1 + x2) + 2(y1 + y2) + 3(z1 + z2) = x1 + x2 + 2y1 + 2y2 + 3z1 + 3z2 = (x1 + 2y1 + 3z1) + (x2 + 2y2 + 3z2) = 0 + 0 = 0.

Hence, ⃗v1 +⃗v2is in V , we way V is closed under vector addition. Similarly, take any scalar k, and any
x
vector ⃗v = y  on V .To check if k⃗v is in V , we verify if it satisfies the equation of V .
z

kx + 2ky + 3kz = k(x + 2y + 3z) = k(0) = 0.

Hence k⃗v is also in V . We say V is closed under scalar multiplication.


When showing that a set W is a subspace, we can combine criteria (2) and (3) by showing that, for any
⃗x, ⃗y ∈ W and any k1 , k2 ∈ R, we have k1⃗x + k2 ⃗y ∈ W .
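If you like, you can also check this closure computation with software. Here is a minimal sketch in Python using the sympy library (the tool and the variable names are our own choice, not part of these notes); it verifies symbolically that a linear combination k1⃗v1 + k2⃗v2 of two vectors on the plane x + 2y + 3z = 0 again satisfies the plane equation.

from sympy import symbols, simplify, Matrix

# Symbolic check that the plane x + 2y + 3z = 0 is closed under linear combinations.
x1, y1, z1, x2, y2, z2, k1, k2 = symbols('x1 y1 z1 x2 y2 z2 k1 k2')

v1 = Matrix([x1, y1, z1])   # assumed to satisfy x1 + 2*y1 + 3*z1 = 0
v2 = Matrix([x2, y2, z2])   # assumed to satisfy x2 + 2*y2 + 3*z2 = 0
w = k1*v1 + k2*v2           # an arbitrary linear combination of v1 and v2

# Plugging w into the plane equation and regrouping gives
# k1*(x1 + 2*y1 + 3*z1) + k2*(x2 + 2*y2 + 3*z2), which is k1*0 + k2*0 = 0 on V.
expr = w[0] + 2*w[1] + 3*w[2]
print(simplify(expr - k1*(x1 + 2*y1 + 3*z1) - k2*(x2 + 2*y2 + 3*z2)))   # prints 0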

Figure 6.1. V is closed under addition and scalar multiplication.

Definition (Subspace)

A subspace of Rn is a subset W of Rn which contains ⃗0 and is closed under addition and scalar
multiplication. That is, a subspace of Rn is a subset W ⊆ Rn such that
(1) ⃗0 ∈ W ;
(2) if ⃗x, ⃗y ∈ W , then also ⃗x + ⃗y ∈ W ;
(3) if ⃗x ∈ W and k is any scalar, then also k⃗x ∈ W .

Example The zero set {⃗0} and Rn itself are subspaces of Rn . It is easy to check that they both
satisfy the definition. We call {⃗0} the trivial subspace of Rn .
Exercise Consider the unit square S in R2 . Is S a subspace of R2 ?
 
Back to our example V = {(x, y, z)^T | x + 2y + 3z = 0} ⊆ R3 . V is a subspace of R3 . Take a minute to
verify that we can describe V in vector parametric form as

V = {t(−2, 1, 0)^T + s(−3, 0, 1)^T | s, t ∈ R}.
   
That is, V is the set of all linear combinations of ⃗v1 = (−2, 1, 0)^T and ⃗v2 = (−3, 0, 1)^T. We introduce a new
terminology for the set of all linear combinations of a number of vectors.

Definition (Span)

Let ⃗v1 , ⃗v2 , · · · , ⃗vm be vectors in Rn . The set of all linear combinations of ⃗v1 , ⃗v2 , · · · , ⃗vm is called
their span. That is
span(⃗v1 , ⃗v2 , · · · , ⃗vm ) = {c1⃗v1 + c2⃗v2 + · · · + cm⃗vm | c1 , c2 , · · · , cm ∈ R}.
If span(⃗v1 , ⃗v2 , · · · , ⃗vm ) = V then {⃗v1 , ⃗v2 , · · · , ⃗vm } is called a spanning set for V , or is said to span
V.

We can say ⃗x ∈ span(⃗u1 , . . . , ⃗um ) if ⃗x can be written as a linear combination of the vectors ⃗u1 , . . . , ⃗um .
Given a vector ⃗b ∈ Rn , how can we determine if ⃗b is in span(⃗u1 , . . . , ⃗um )? This is asking if we
can find c1 , . . . , cm such that c1⃗u1 + c2⃗u2 + · · · + cm⃗um = ⃗b. We can transform the equation into a matrix

vector equation

[ ⃗u1 ⃗u2 · · · ⃗um ] (c1 , c2 , . . . , cm )^T = ⃗b.

Thus, determining if ⃗b is in span(⃗u1 , . . . , ⃗um ) is equivalent to determining if the system A⃗c = ⃗b is
consistent, where A is the matrix that has ⃗u1 , . . . , ⃗um as its columns.
   
Example Let V be the plane defined by x + 2y + 3z = 0, and let ⃗v1 = (−2, 1, 0)^T and ⃗v2 = (−3, 0, 1)^T. Then
V is spanned by {⃗v1 , ⃗v2 }. Alternatively, we can say span(⃗v1 , ⃗v2 ) is V , or {⃗v1 , ⃗v2 } is a spanning set for V .
Every vector on this plane is in span(⃗v1 , ⃗v2 ). For example, (−5, 1, 1)^T ∈ span(⃗v1 , ⃗v2 ), because
(−5, 1, 1)^T = ⃗v1 + ⃗v2 ; that is, it can be written as a linear combination of ⃗v1 and ⃗v2 .
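The consistency test described above is easy to carry out with software. The following is a small sketch in Python using sympy (our own choice of tool); it checks whether ⃗b = (−5, 1, 1)^T lies in span(⃗v1 , ⃗v2 ) by comparing the rank of A with the rank of the augmented matrix [A | ⃗b].

from sympy import Matrix

v1 = Matrix([-2, 1, 0])
v2 = Matrix([-3, 0, 1])
b  = Matrix([-5, 1, 1])

A   = Matrix.hstack(v1, v2)    # matrix with v1 and v2 as its columns
aug = Matrix.hstack(A, b)      # augmented matrix [A | b]

# b is in span(v1, v2) exactly when A c = b is consistent,
# i.e. when appending b as a column does not increase the rank.
print(A.rank() == aug.rank())  # True
print(aug.rref()[0])           # the last column of the RREF gives c1 = 1, c2 = 1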

Try this GeoGebra applet to explore how the set of all linear combinations of these two vectors creates
the entire plane.

We have already seen that span(⃗v1 , ⃗v2 ) is the plane V , which is a subspace of R3 . In general, the span
of vectors in Rn satisfies the properties in the definition of a subspace and hence is a subspace of Rn .
Indeed, given a collection of vectors, taking their span is a natural way of constructing the smallest subspace
that contains them all!

Theorem (span is a subspace). Given vectors ⃗v1 , ⃗v2 , · · · , ⃗vm in Rn , span(⃗v1 , ⃗v2 , · · · , ⃗vm ) is a subspace of
Rn .

Image Consider a linear transformation T : Rm → Rn . The set of all outputs of T creates a subset of
the codomain of T . We will see that this set is a subspace of the codomain. Let’s recall the definition of
the image of a function.

Definition (Image)

The image of f : X → Y is defined to be the set im(f ) = {f (x) : x ∈ X} = {y ∈ Y | f (x) = y for some x ∈ X}.

Example Suppose T : R3 → R3 is the projection into the x-axis. Hence, T (⃗x) = A⃗x, where A =

1 0 0
0 0 0. Since T projects vectors onto the x axis, every output of T is on this line. On the flip side,
0 0 0
every vector on the x-axis can be thought of as the projection of some vector in R3 (indeed infinitely
many vectors). So,

im(T ) = {T (⃗x) | ⃗x ∈ R3 } = x-axis



Let's verify this argument algebraically:

im(T ) = {T (⃗x) | ⃗x ∈ R3 } = {A⃗x | ⃗x ∈ R3 }
       = {A(x1, x2, x3)^T | xi ∈ R}
       = {x1 (1, 0, 0)^T + x2 (0, 0, 0)^T + x3 (0, 0, 0)^T | xi ∈ R}
       = {x1 (1, 0, 0)^T | x1 ∈ R} = span(⃗e1 ) = x-axis.

In the example above, im(T ) is a subspace of R3 because it contains the origin, and is closed under
vector addition and scalar multiplication.
Exercise Show that the x-axis satisfies the definition of a subspace.

We can also argue that since we could describe it as span(⃗e1 ), and span of vectors is always a subspace,
im(T ) is a subspace of R3 .
Let's generalize what we saw in this example. Let A = [⃗a1 ⃗a2 · · · ⃗am ] be an n × m matrix. Suppose
T : Rm → Rn is given by T (⃗x) = A⃗x. Then
im(T ) = {T (⃗x) | ⃗x ∈ Rm } = {A⃗x | ⃗x ∈ Rm }
= {x1⃗a1 + · · · + xm⃗am | x1 , x2 , · · · , xm ∈ R}
= span(⃗a1 , ⃗a2 , · · · , ⃗am ).
So, the image of T can be described as the span of the columns of A. Another common terminology for the
span of columns of a matrix A is the column space of A, denoted by Col(A). Sometimes we may use the
notation im(A) instead of im(T ). Since im(T ) = Col(A) is the span of the columns of A and a span of
vectors is always a subspace, im(T ) is a subspace of Rn .
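As a computational aside (a sketch using sympy, our own choice of tool), Col(A) can be read off directly: sympy returns a list of vectors that span the column space. For the projection example above it returns just ⃗e1 , confirming that im(T ) is the x-axis.

from sympy import Matrix

A = Matrix([[1, 0, 0],
            [0, 0, 0],
            [0, 0, 0]])

# A list of column vectors spanning Col(A) = im(T); here it is just e1.
for v in A.columnspace():
    print(v.T)                 # prints Matrix([[1, 0, 0]])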
Theorem (Image is a subspace). Let T : Rm → Rn be a linear transformation given by T (⃗x) = A⃗x.
Then im(T ) = Col(A) is a subspace of Rn .
We can verify this claim directly using the definition of a subspace.
Proof. We will verify the three conditions in the definition of a subspace, combining the last two
conditions into one.
1) We should show ⃗0 ∈ im(T ). To see why this is true, note that A⃗0 = ⃗0 for any
matrix A; therefore ⃗0 ∈ im(T ).
2) Given ⃗y1 , ⃗y2 ∈ im(T ) and scalars k1 , k2 , we should show k1 ⃗y1 + k2 ⃗y2 ∈ im(T ).
Let’s see why this is true. Suppose ⃗y1 and ⃗y2 are in the image of a linear transformation T . This means,
there must be vectors ⃗x1 and ⃗x2 in the domain of T such that T (⃗x1 ) = ⃗y1 and T (⃗x2 ) = ⃗y2 . We now consider
T (k1⃗x1 + k2⃗x2 ). Using properties of linear transformations we have T (k1⃗x1 + k2⃗x2 ) = k1 T (⃗x1 ) + k2 T (⃗x2 ) =
k1 ⃗y1 + k2 ⃗y2 . Therefore, k1 ⃗y1 + k2 ⃗y2 is also in the image of T !
Combining these results implies that for a linear transformation T : Rm → Rn , the set im(T ) is a subspace
of its codomain, Rn . □

Solidify
Kernel Continue considering a linear transformation T : Rm → Rn , defined by T (⃗x) = A⃗x. The set of all
vectors in the domain of T that are mapped to the zero vector is a subset of the domain. We will see that this
set is a subspace of the domain, and is of special interest because it can say a lot about the map T .

Definition (Kernel)

The kernel of a linear transformation T : Rm −→ Rn is the set of all vectors ⃗v in the domain such
that T (⃗v ) = ⃗0. That is,
ker(T ) = {⃗v ∈ Rm | T (⃗v ) = ⃗0}.

Let’s rephrase this in terms of a familiar concept. The set of vectors that are mapped to zero is exactly
the set of all solutions to T (⃗x) = A⃗x = ⃗0.
ker(T ) = {⃗x ∈ Rm | T (⃗x) = ⃗0} = {⃗x ∈ Rm | A⃗x = ⃗0} = the general solution to A⃗x = ⃗0.
Another common terminology for ker(T ) is the null space of A, denoted by Null(A).
Example Suppose T : R3 → R3 is the projection onto the x-axis. Hence, T (⃗x) = A⃗x, where

A =
[ 1 0 0 ]
[ 0 0 0 ]
[ 0 0 0 ].

Then ker(T ) is the general solution to A⃗x = ⃗0. Verify that the general solution in vector parametric form is

ker(T ) = {t(0, 1, 0)^T + s(0, 0, 1)^T | s, t ∈ R}
        = span(⃗e2 , ⃗e3 ) = yz-plane.
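The same computation can be done with sympy's null space routine (a sketch; the tool is our own choice). It returns a list of vectors spanning ker(T ), here ⃗e2 and ⃗e3 , which matches the yz-plane found above.

from sympy import Matrix

A = Matrix([[1, 0, 0],
            [0, 0, 0],
            [0, 0, 0]])

# Vectors spanning Null(A) = ker(T); the kernel is the yz-plane.
for v in A.nullspace():
    print(v.T)                 # prints Matrix([[0, 1, 0]]) and Matrix([[0, 0, 1]])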

Exercise Use the definition of a subspace to show that the yz-plane is a subspace of R3 .
We can also argue that since we could describe it as span(⃗e2 , ⃗e3 ), and a span of vectors is always a
subspace, ker(T ) is a subspace of R3 . The two observations we made here hold in general: 1) the kernel
of a linear transformation can always be described as a span of a number of vectors; 2) the kernel of a
linear transformation is a subspace of its domain.
Theorem (Kernel is a subspace). Let T : Rm → Rn be defined by T (⃗x) = A⃗x. Then ker(T ) = Null(A) is a
subspace of Rm .

Proof. We first show that, for any linear transformation T : Rm → Rn , ⃗0 ∈ ker(T ). This property is true
for the same reason why ⃗0 ∈ im(T ), since T (⃗0) = ⃗0.
Next we show that if ⃗x1 , ⃗x2 ∈ ker(T ) then, for any scalars k1 and k2 , k1⃗x1 + k2⃗x2 ∈ ker(T ). The verification
of this is straightforward, since by linearity of T , we have T (k1⃗x1 + k2⃗x2 ) = k1 T (⃗x1 ) + k2 T (⃗x2 ) = ⃗0. □


The kernel and image of a linear transformation give us a lot of information about T : Rm → Rn . In
particular, these concepts are closely connected to injectivity and surjectivity. The map T is
surjective exactly when im(T ) is the entire codomain. The map T is injective exactly when the equation
A⃗x = ⃗0 has a unique solution. Since A⃗0 = ⃗0, the unique solution has to be the vector zero. In other
words, T is injective exactly when the kernel of T has only one element: the zero vector.

Theorem. For a linear transformation T : Rm → Rn , T is injective if and only if ker(T ) = {⃗0}, and T
is surjective if and only if im(T ) = Rn .
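Here is a small sketch of how these two criteria can be tested computationally with sympy (the helper names is_injective and is_surjective are our own, not standard library functions).

from sympy import Matrix

def is_injective(A):
    # T(x) = A x is injective exactly when ker(T) = {0},
    # i.e. the null space contains no nonzero vectors.
    return len(A.nullspace()) == 0

def is_surjective(A):
    # T(x) = A x is surjective exactly when im(T) = Col(A) is all of R^n,
    # i.e. the columns span R^n, which happens when rank(A) equals the number of rows.
    return A.rank() == A.rows

A = Matrix([[1, 0, 0],
            [0, 0, 0],
            [0, 0, 0]])        # the projection onto the x-axis
print(is_injective(A), is_surjective(A))   # False False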

Linear Independence In general, subspaces are infinite sets of vectors1. We describe them as the span
of a collection of vectors. For instance, a line through the origin is a subspace that can be described as
a span of a single vector. How many vectors do we need to describe a given subspace? We will walk
through this idea using an example.

Consider a linear transformation T (⃗x) = A⃗x where

A =
[ 1 2 1 2 ]
[ 1 2 2 3 ]
[ 1 2 3 4 ].

We know that the image of T
is a subspace of R3 that is spanned by the columns of A. Let ⃗v1 , ⃗v2 , ⃗v3 , and ⃗v4 denote the columns of
A, from left to right. Then im(T ) = span(⃗v1 , · · · , ⃗v4 ). In other words, {⃗v1 , · · · , ⃗v4 } is a spanning set for
im(T ). If we plot these vectors, geometrically they all lie in the same plane in R3 . That is the set of all
of their linear combinations also lies in the same plane.

Observe that ⃗v2 = 2⃗v1 and ⃗v4 = ⃗v1 + ⃗v3 . That is, ⃗v2 and ⃗v4 are in span(⃗v1 , ⃗v3 ). So any of their linear
combinations can be achieved by some linear combination of ⃗v1 and ⃗v3 . That is, they are “redundant” as far
as the span is concerned. Hence, im(T ) = span(⃗v1 , ⃗v2 , ⃗v3 , ⃗v4 ) = span(⃗v1 , ⃗v3 ). How can we tell which
vectors can be written as a linear combination of the rest and hence can be removed from the spanning
set without changing the subspace it spans? Let's find the general solution to A⃗x = ⃗0 (or ker(T )). The
matrix A row reduces to

[ 1 2 0 1 ]
[ 0 0 1 1 ]
[ 0 0 0 0 ].

Note that the second and fourth columns are NOT pivot columns.
The kernel is

ker(T ) = {t(−2, 1, 0, 0)^T + s(−1, 0, −1, 1)^T | t, s ∈ R} = span((−2, 1, 0, 0)^T, (−1, 0, −1, 1)^T).

1There is one exception to this sentence. Which subspace is an exception?


   
If we plug (−2, 1, 0, 0)^T and (−1, 0, −1, 1)^T back into the equation A⃗x = ⃗0, it should hold. This gives us the vector
zero as a linear combination of the columns of A. Precisely, we get these linear combinations:

−2⃗v1 + ⃗v2 = ⃗0 and − ⃗v1 − ⃗v3 + ⃗v4 = ⃗0


Each of the above equations is called a (non-trivial) linear relation among the columns of A. Isolating
⃗v2 and ⃗v4 , which are the columns of A whose corresponding columns in RREF(A) are NOT pivot columns,
tells us that they can be written as linear combinations of ⃗v1 and ⃗v3 , and hence are redundant.
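The pivot-column test is easy to carry out with sympy (a sketch, using the matrix A from this example; rref() returns the reduced matrix together with the indices of the pivot columns).

from sympy import Matrix

A = Matrix([[1, 2, 1, 2],
            [1, 2, 2, 3],
            [1, 2, 3, 4]])

R, pivots = A.rref()
print(pivots)                        # (0, 2): the first and third columns are pivot columns
print([v.T for v in A.nullspace()])  # each null-space vector encodes a linear relation

# Keeping only the pivot columns gives a linearly independent spanning set for im(T).
print([A.col(j).T for j in pivots])  # v1 = (1, 1, 1)^T and v3 = (1, 2, 3)^T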

Definition (Linear Relation)

Let ⃗v1 , ⃗v2 , · · · , ⃗vn be vectors in a subspace V of Rm . A linear relation among ⃗v1 , ⃗v2 , · · · , ⃗vn is any
equation of the form
c1⃗v1 + · · · + cn⃗vn = ⃗0,
where the ci are scalars.

In fact, any vector in the general solution to A⃗x = ⃗0, or equivalently in ker(T ), will give us a linear
relation among ⃗v1 , · · · , ⃗v4 . Any non-zero vector in the kernel of T gives us a non-trivial relation. In our
example, we could find redundant vectors precisely because we had non-zero vectors in ker(T ).

Definition (Linearly Dependent)

Let ⃗v1 , ⃗v2 , · · · , ⃗vn be vectors in a subspace V of Rm . We say ⃗v1 , ⃗v2 , · · · , ⃗vn are linearly dependent
if there are scalars c1 , . . . , cn that are not all zero such that c1⃗v1 + · · · + cn⃗vn = ⃗0.

The vectors ⃗v1 , ⃗v2 , ⃗v3 , ⃗v4 in our example are linearly dependent because we can write −2⃗v1 + ⃗v2 = ⃗0 or
−⃗v1 − ⃗v3 + ⃗v4 = ⃗0.
Theorem (linear dependency and nontrivial relations). The vectors ⃗v1 , · · · , ⃗vn are linearly dependent if
and only if there are non-trivial relations among them.

Definition (Linearly Independent)

Let ⃗v1 , ⃗v2 , · · · , ⃗vn be vectors in a subspace V of Rm . We say ⃗v1 , ⃗v2 , · · · , ⃗vn are linearly indepen-
dent if c1⃗v1 + · · · + cn⃗vn = ⃗0 has only one solution: c1 = · · · = cn = 0.

Theorem (linear dependency and nontrivial relations). The vectors ⃗v1 , · · · , ⃗vn are linearly independent
if and only if the only relation among them is the trivial relation.
     
Example Show that the vectors ⃗u1 = (1, 0, −1)^T, ⃗u2 = (2, 1, −1)^T, ⃗u3 = (−2, 1, 4)^T are linearly independent.

To show that ⃗u1 , ⃗u2 , ⃗u3 are linearly independent, we should show that the only solution to c1⃗u1 + c2⃗u2 + c3⃗u3 = ⃗0
is c1 = c2 = c3 = 0. We consider the matrix equation

[  1  2 −2 ]   [ c1 ]   [ 0 ]
[  0  1  1 ]   [ c2 ] = [ 0 ].
[ −1 −1  4 ]   [ c3 ]   [ 0 ]

Transforming into the augmented matrix and reducing to RREF yields

[ 1 0 0 | 0 ]
[ 0 1 0 | 0 ]
[ 0 0 1 | 0 ].

Therefore, c1 = c2 = c3 = 0 is the unique solution to the equation. Hence ⃗u1 , ⃗u2 and ⃗u3
are linearly independent.
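The same check can be automated (a sketch with sympy, our own choice of tool): the vectors are linearly independent exactly when the RREF of the matrix with ⃗u1 , ⃗u2 , ⃗u3 as columns has a pivot in every column, or equivalently when the null space is trivial.

from sympy import Matrix

A = Matrix([[ 1,  2, -2],
            [ 0,  1,  1],
            [-1, -1,  4]])     # columns are u1, u2, u3

print(A.rref()[0])             # the 3x3 identity: a pivot in every column
print(A.nullspace())           # []: only the trivial relation c1 = c2 = c3 = 0 exists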

Expand
First, let's summarize what we have learned so far about when a set of vectors is linearly independent.

Theorem. For a list of vectors ⃗v1 , ⃗v2 , · · · , ⃗vm in Rn , the following statements are equivalent:

(1) ⃗v1 , ⃗v2 , · · · , ⃗vm are linearly independent.


(2) The set {⃗v1 , ⃗v2 , · · · , ⃗vm } contains no redundant vector.
(3) None of the ⃗vi ’s can be written as a linear combination of the rest.
(4) None of the ⃗vi ’s is in the span of the rest.
(5) The only relation among ⃗v1 , ⃗v2 , · · · , ⃗vm is the trivial relation.
(6) The only solution to c1⃗v1 + c2⃗v2 + · · · + cm⃗vm = ⃗0 is c1 = c2 = · · · = cm = 0.
(7) Null(A) = {⃗0}, where A = [⃗v1 ⃗v2 · · · ⃗vm ].
(8) The rank of A = [⃗v1 ⃗v2 · · · ⃗vm ] is m.
(9) The linear map T (⃗x) = A⃗x is injective.
(10) The kernel of the map T (⃗x) = A⃗x is {⃗0}.

Given a subspace V of Rn , we can describe V as a span of a number of vectors. However, there may
be redundancy in our spanning set. In other words, our spanning set may be linearly dependent. We
can identify and remove redundant vectors to reduce the spanning set into one whose vectors are linearly
independent. The ideal way of describing a subspace is via a linearly independent spanning set.

Definition (Basis)

A basis of a subspace V of Rn is a linearly independent set of vectors in V that spans V . Put


differently, a basis is a spanning set for V which is linearly independent.

Let’s emphasize the definition. Suppose ⃗v1 , ⃗v2 , · · · , ⃗vm form a basis for a subspace V in Rn . That
means two conditions hold:

(1) ⃗v1 , ⃗v2 , · · · ⃗vm are linearly independent.


(2) Given any vector ⃗v ∈ V , ⃗v ∈ span(⃗v1 , ⃗v2 , · · · ⃗vm ), that is we can write ⃗v as a linear combination of
⃗v1 , ⃗v2 , · · · , ⃗vm .

Example The vectors ⃗e1 , ⃗e2 , · · · , ⃗en form a basis of Rn . To see why this is true, we should check the two
conditions in the definition of a basis.

(1) Note that, given any vector ⃗v = (v1 , v2 , · · · , vn )^T ∈ Rn , we can write ⃗v = v1⃗e1 + v2⃗e2 + · · · + vn⃗en . That is,
{⃗e1 , ⃗e2 , · · · , ⃗en } is a spanning set for Rn , or equivalently Rn = span(⃗e1 , ⃗e2 , · · · , ⃗en ).
(2) The only solution to c1⃗e1 + c2⃗e2 + · · · + cn⃗en = ⃗0 is the trivial solution. Hence, they are linearly
independent.

The list of vectors ⃗e1 , ⃗e2 , · · · , ⃗en form a basis for Rn . We call this basis the standard basis for Rn .

    
Example The vectors ⃗u1 = (1, 0, −1)^T, ⃗u2 = (2, 1, −1)^T, ⃗u3 = (−2, 1, 4)^T form a basis for R3 .
To see why this holds we should show that 1) ⃗u1 , ⃗u2 and ⃗u3 are linearly independent, and 2) every
vector ⃗v ∈ R3 is a linear combination of ⃗u1 , ⃗u2 and ⃗u3 .
We already verified (1) in previous examples. To see (2), let ⃗v be an arbitrary vector in R3 . Consider
the equation
(6.1) c1⃗u1 + c2⃗u2 + c3⃗u3 = ⃗v
Then ⃗v ∈ span(⃗u1 , ⃗u2 , ⃗u3 ) exactly when (6.1) is consistent. Well, we saw (in a previous example)
that the coefficient matrix of this system row reduces to the identity. That means every row has a leading
one, and hence, regardless of what the augmented column is, it is not possible for a leading one to
appear in the augmented column. Hence (6.1) is consistent. That is, ⃗v ∈ span(⃗u1 , ⃗u2 , ⃗u3 ), and hence
R3 = span(⃗u1 , ⃗u2 , ⃗u3 ).
   
Example The list of vectors ⃗u1 = (1, 0, 3)^T, ⃗u2 = (0, 1, 2)^T form a basis for the plane −3x − 2y + z = 0.
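A quick check of this last example (a sympy sketch; the plane and the two vectors are exactly the ones stated above):

from sympy import Matrix

u1 = Matrix([1, 0, 3])
u2 = Matrix([0, 1, 2])

# Both vectors satisfy -3x - 2y + z = 0, so they lie on the plane ...
print(-3*u1[0] - 2*u1[1] + u1[2], -3*u2[0] - 2*u2[1] + u2[2])   # 0 0
# ... and they are linearly independent (the 3x2 matrix has rank 2),
# so together they form a basis of the plane.
print(Matrix.hstack(u1, u2).rank())                             # 2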

Exercise Suppose ⃗v1 , ⃗v2 , · · · , ⃗vm is a basis for a subspace V in Rn . Given a vector ⃗v ∈ V , is it possible to
write ⃗v as a linear combination of these basis vectors in multiple ways? In other words, how many solutions
does the equation c1⃗v1 + c2⃗v2 + · · · + cm⃗vm = ⃗v have?

Once you do the exercise above, you will be able to see why the following theorem holds.
Theorem. Consider the list of vectors ⃗v1 , ⃗v2 , · · · ⃗vm in a subspace V of Rn . The vectors ⃗v1 , ⃗v2 , · · · ⃗vm
form a basis for V if and only if every vector ⃗v in V can be expressed uniquely as a linear combination
⃗v = c1⃗v1 + c2⃗v2 + · · · + cm⃗vm .

In this case, the coefficients c1 , c2 , · · · , cm are called the coordinates of ⃗v with respect to the basis ⃗v1 , ⃗v2 , · · · , ⃗vm .
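As a concrete sketch (sympy again, and the vector ⃗v = (1, 2, 3)^T is our own arbitrary choice), the coordinates of ⃗v with respect to the basis ⃗u1 , ⃗u2 , ⃗u3 from the earlier example are found by solving the square system [⃗u1 ⃗u2 ⃗u3 ]⃗c = ⃗v, which has exactly one solution because the columns form a basis of R3.

from sympy import Matrix

B = Matrix([[ 1,  2, -2],
            [ 0,  1,  1],
            [-1, -1,  4]])     # columns are the basis vectors u1, u2, u3
v = Matrix([1, 2, 3])          # an arbitrary vector in R^3

c = B.inv() * v                # unique, since the columns of B form a basis of R^3
print(c.T)                     # the coordinates of v with respect to u1, u2, u3
print((B*c - v).T)             # Matrix([[0, 0, 0]]): the combination reproduces v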

For a complete list of numbered definitions for chapter 3, please see the end of PCE 7.
Definition (Subspace). A subspace of Rn is a subset W of Rn which contains ⃗0 and is closed under
addition and scalar multiplication. That is, a subspace of Rn is a subset W ⊆ Rn such that
(1) ⃗0 ∈ W ;
(2) if ⃗x, ⃗y ∈ W , then also ⃗x + ⃗y ∈ W ;
(3) if ⃗x ∈ W and k is any scalar, then also k⃗x ∈ W .
Definition (Span). Let ⃗v1 , ⃗v2 , · · · , ⃗vm be vectors in Rn . The set of all linear combinations of ⃗v1 , ⃗v2 , · · · , ⃗vm
is called their span. That is
span(⃗v1 , ⃗v2 , · · · , ⃗vm ) = {c1⃗v1 + c2⃗v2 + · · · + cm⃗vm | c1 , c2 , · · · , cm ∈ R}.
If span(⃗v1 , ⃗v2 , · · · , ⃗vm ) = V then {⃗v1 , ⃗v2 , · · · , ⃗vm } is called a spanning set for V , or is said to span V .
Definition (Kernel). The kernel of a linear transformation T : Rm −→ Rn is the set of all vectors ⃗v in
the domain such that T (⃗v ) = ⃗0. That is,
ker(T ) = {⃗v ∈ Rm | T (⃗v ) = ⃗0}.
Definition (Image). The image of f : X → Y is defined to be the set im(f ) = {f (x) : x ∈ X} = {y ∈
Y | f (x) = y for some x ∈ X}.
Definition (Linear Transformation). A linear transformation from Rm to Rn is a mapping T : Rm −→ Rn
that satisfies:
(1) T (⃗x + ⃗y ) = T (⃗x) + T (⃗y ) for all vectors ⃗x, ⃗y ∈ Rm ;
(2) T (k⃗x) = kT (⃗x) for all vectors ⃗x ∈ Rm and all scalars k ∈ R.
Definition (Surjective and Injective). The function f : X → Y is surjective or onto if for all y ∈ Y ,
there exists some x ∈ X such that f (x) = y. In this case, we also say "f is a surjection." The function
f is injective or one-to-one if for all y ∈ Y there exists at most one x ∈ X such that f (x) = y.
Equivalently: whenever f (x1 ) = f (x2 ) for x1 , x2 ∈ X, it follows that x1 = x2 . In this case, we also say
"f is an injection.".
Definition (Linearly Independent). Let ⃗v1 , ⃗v2 , · · · , ⃗vn be vectors in a subspace V of Rm . We say ⃗v1 , ⃗v2 , · · · , ⃗vn
are linearly independent if c1⃗v1 + · · · + cn⃗vn = ⃗0 has only one solution: c1 = · · · = cn = 0.
Definition (Basis). A basis of a subspace V of Rn is a linearly independent set of vectors in V that
spans V . Put differently, a basis is a spanning set for V which is linearly independent.
