Chapter 5 (5.1–5.3)
5.1 Vector Spaces
Definition (1/2)
Let V be an arbitrary nonempty set of objects on which
two operations are defined, addition and multiplication
by scalars (numbers). By addition we mean a rule for
associating with each pair of objects u and v in V an
object u + v, called the sum of u and v; by scalar
multiplication we mean a rule for associating with each
scalar k and each object u in V an object ku, called the
scalar multiple of u by k. If the following axioms are
satisfied by all objects u, v, w in V and all scalars k and
l, then we call V a vector space and we call the objects
in V vectors.
Definition (2/2)
1) If u and v are objects in V, then u + v is in V.
2) u + v = v + u
3) u + (v + w) = (u + v) + w
4) There is an object 0 in V, called a zero vector for V, such that 0 + u = u + 0 = u for all u in V.
5) For each u in V, there is an object -u in V, called a negative of u, such that u + (-u) = (-u) + u = 0.
6) If k is any scalar and u is any object in V, then ku is in V.
7) k(u + v) = ku + kv
8) (k + l)u = ku + lu
9) k(lu) = (kl)u
10) 1u = u
Remark
Example 1
Rn Is a Vector Space
Example 2
A Vector Space of 2×2 Matrices (1/4)
Show that the set V of all 2×2 matrices with real entries is a vector
space if vector addition is defined to be matrix addition and vector
scalar multiplication is defined to be matrix scalar multiplication.
Solution.
In this example we will find it convenient to verify the axioms in
the following order: 1, 6, 2, 3, 7, 8, 9, 4, 5, and 10.
Let
$$u = \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix} \quad\text{and}\quad v = \begin{bmatrix} v_{11} & v_{12} \\ v_{21} & v_{22} \end{bmatrix}$$
Example 2
A Vector Space of 2×2 Matrices (2/4)
To prove Axiom 1, we must show that u + v is an object in V, that is, a 2×2 matrix; this is so because
$$u + v = \begin{bmatrix} u_{11}+v_{11} & u_{12}+v_{12} \\ u_{21}+v_{21} & u_{22}+v_{22} \end{bmatrix}$$
Similarly, Axiom 6 holds because for any real number k we have
$$ku = \begin{bmatrix} ku_{11} & ku_{12} \\ ku_{21} & ku_{22} \end{bmatrix}$$
so ku is a 2×2 matrix and consequently an object in V. Axioms 2, 3, 7, 8, and 9 follow from the standard properties of matrix arithmetic.
Example 2
A Vector Space of 2×2 Matrices (3/4)
To prove Axiom 4, we define 0 to be
$$\mathbf{0} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$$
With this definition,
$$\mathbf{0} + u = \begin{bmatrix} 0+u_{11} & 0+u_{12} \\ 0+u_{21} & 0+u_{22} \end{bmatrix} = \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix} = u$$
and similarly u + 0 = u.
To prove Axiom 5, we define the negative of u to be
$$-u = \begin{bmatrix} -u_{11} & -u_{12} \\ -u_{21} & -u_{22} \end{bmatrix}$$
With this definition,
$$u + (-u) = \begin{bmatrix} u_{11}-u_{11} & u_{12}-u_{12} \\ u_{21}-u_{21} & u_{22}-u_{22} \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix} = \mathbf{0}$$
and similarly (-u) + u = 0.
Example 2
A Vector Space of 2×2 Matrices (4/4)
Finally, Axiom 10 is a simple computation:
$$1u = 1\begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix} = \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix} = u$$
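As an illustration (not part of the original text), the axioms verified above can be spot-checked numerically for sample matrices. The sketch below uses NumPy; the matrices u, v and the scalar k are arbitrary choices.

```python
import numpy as np

# Sample 2x2 matrices u, v and a scalar k (arbitrary values for illustration)
u = np.array([[1.0, 2.0], [3.0, 4.0]])
v = np.array([[5.0, -1.0], [0.5, 2.0]])
k = -3.0
zero = np.zeros((2, 2))          # the zero vector of Axiom 4

# Axioms 1 and 6: u + v and k*u are again 2x2 matrices (closure)
assert (u + v).shape == (2, 2) and (k * u).shape == (2, 2)

# Axiom 2: commutativity of addition
assert np.allclose(u + v, v + u)

# Axioms 4 and 5: zero vector and negatives
assert np.allclose(u + zero, u)
assert np.allclose(u + (-u), zero)

# Axiom 10: 1*u = u
assert np.allclose(1 * u, u)

print("All sampled axioms hold for these matrices.")
```

Such a check only tests particular matrices, of course; the algebraic argument above is what proves the axioms for all of V.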
Example 3
A Vector Space of m×n Matrices
Example 2 is a special case of a more general class of vector spaces. The arguments in that example can be adapted to show that the set V of all m×n matrices with real entries, together with the operations of matrix addition and matrix scalar multiplication, is a vector space. The m×n zero matrix is the zero vector 0, and if u is the m×n matrix U, then the matrix -U is the negative -u of the vector u. We shall denote this vector space by the symbol Mmn.
Example 4
A Vector Space of Real-Valued Functions
(1/2)
Let V be the set of real-valued functions defined on the entire real line (-∞, ∞). If f = f(x) and g = g(x) are two such functions and k is any real number, define the sum function f + g and the scalar multiple kf, respectively, by
$$(f + g)(x) = f(x) + g(x) \quad\text{and}\quad (kf)(x) = kf(x)$$
Example 4
A Vector Space of Real-Valued Functions
(2/2)
In other words, the value of the function f + g at x is obtained by adding together the values of f and g at x (Figure 5.1.1a). Similarly, the value of kf at x is k times the value of f at x (Figure 5.1.1b). In the exercises we shall ask you to show that V is a vector space with respect to these operations. This vector space is denoted by F(-∞, ∞). If f and g are vectors in this space, then to say that f = g is equivalent to saying that f(x) = g(x) for all x in the interval (-∞, ∞).
The vector 0 in F(-∞, ∞) is the constant function that is identically zero for all values of x.
The negative of a vector f is the function -f defined by (-f)(x) = -f(x). Geometrically, the graph of -f is the reflection of the graph of f across the x-axis (Figure 5.1.1c).
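The pointwise operations on F(-∞, ∞) mirror directly into code. The sketch below (an illustration, not from the original text) represents vectors of F(-∞, ∞) as Python callables and builds f + g and kf exactly as defined above.

```python
import math

def add(f, g):
    """Sum vector: (f + g)(x) = f(x) + g(x)."""
    return lambda x: f(x) + g(x)

def scale(k, f):
    """Scalar multiple: (k f)(x) = k * f(x)."""
    return lambda x: k * f(x)

f = math.sin           # f(x) = sin x
g = lambda x: x ** 2   # g(x) = x^2

h = add(f, g)          # h(x) = sin x + x^2
p = scale(3, f)        # p(x) = 3 sin x

print(h(1.0))   # sin(1) + 1
print(p(0.5))   # 3 * sin(0.5)
```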
Remark
Example 5
A Set That Is Not a Vector Space
Let V = R2 and define addition and scalar multiplication operations as follows: If u = (u1, u2) and v = (v1, v2), then define
$$u + v = (u_1 + v_1,\; u_2 + v_2) \quad\text{and}\quad ku = (ku_1,\; 0)$$
The addition operation is the standard one, but the scalar multiplication is not. Axiom 10 fails: if u = (u1, u2) with u2 ≠ 0, then 1u = (u1, 0) ≠ u. Thus V is not a vector space with these operations.
Example 6
Every Plane Through the Origin Is a
Vector Space
Let V be any plane through the origin in R3. From Example 1, we know that R3 itself is a vector space under the standard operations. Thus, Axioms 2, 3, 7, 8, 9, and 10 hold for all points in R3 and consequently for all points in the plane V. We therefore need only show that Axioms 1, 4, 5, and 6 are satisfied.
Since the plane V passes through the origin, it has an equation of
the form ax + by + cz = 0. If u = (u1, u2, u3) and v = (v1, v2, v3)
are points in V, then au1 + bu2 + cu3 = 0 and av1 + bv2 + cv3 = 0.
Adding these equations gives a(u1 + v1) + b(u2 + v2) + c(u3 + v3) = 0.
Axiom 1: u + v = (u1 + v1, u2 + v2, u3 + v3); thus u + v lies in the
plane V.
Axioms 4 and 6: left as exercises.
Axiom 5: Multiplying au1 + bu2 + cu3 = 0 through by -1 gives a(-u1) + b(-u2) + c(-u3) = 0; thus -u = (-u1, -u2, -u3) lies in V.
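The algebra above is easy to spot-check numerically. A minimal sketch with NumPy (the plane coefficients and the points u, v are sample values chosen to satisfy ax + by + cz = 0; this is an illustration, not part of the original text):

```python
import numpy as np

n = np.array([1.0, -2.0, 3.0])   # (a, b, c) of the plane ax + by + cz = 0
u = np.array([2.0, 1.0, 0.0])    # 1*2 - 2*1 + 3*0 = 0, so u lies in the plane
v = np.array([3.0, 3.0, 1.0])    # 1*3 - 2*3 + 3*1 = 0, so v lies in the plane

# Axiom 1: u + v still satisfies the plane equation
assert np.isclose(n @ (u + v), 0.0)

# Axioms 5 and 6: -u and any scalar multiple ku also lie in the plane
k = 7.5
assert np.isclose(n @ (-u), 0.0)
assert np.isclose(n @ (k * u), 0.0)
```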
Example 7
The Zero Vector Space
Let V consist of a single object, which we denote by 0, and define
0 + 0 = 0 and k0 = 0 for all scalars k.
We call this the zero vector space.
Theorem 5.1.1
Let V be a vector space, u a vector in V,
and k a scalar; then:
a) 0u = 0
b) k0 = 0
c) (-1)u = -u
d) If ku = 0, then k = 0 or u = 0.
5.2 Subspaces
Definition
A subset W of a vector space V is called a subspace of V if W is itself a vector space under the addition and scalar multiplication defined on V.
Theorem 5.2.1
If W is a set of one or more vectors from a vector space V, then W is a subspace of V if and only if the following conditions hold:
a) If u and v are vectors in W, then u + v is in W.
b) If k is any scalar and u is any vector in W, then ku is in W.
Remark
Example 1
Testing for a Subspace
Example 2
Lines Through the Origin Are Subspaces
Show that a line through the origin of R3 is a
subspace of R3.
Solution.
Let W be a line through the origin of R3. Then W consists of all scalar multiples of some nonzero vector v, that is, all points of the form tv. If u1 = t1v and u2 = t2v are points of W, then u1 + u2 = (t1 + t2)v lies in W, and for any scalar k, ku1 = (kt1)v lies in W. Thus W is closed under addition and scalar multiplication, so by Theorem 5.2.1 it is a subspace of R3.
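As a numerical illustration (not a proof, and not part of the original text), the sketch below represents a line W through the origin by a direction vector v and checks that sums and scalar multiples of sample points of W remain on the line.

```python
import numpy as np

v = np.array([1.0, -2.0, 4.0])    # direction vector of the line W = {t v}

def on_line(p, direction):
    """A point p lies on the line spanned by `direction` iff p x direction = 0."""
    return np.allclose(np.cross(p, direction), 0.0)

u1, u2 = 2.0 * v, -0.5 * v        # two sample points of W
assert on_line(u1 + u2, v)        # closed under addition
assert on_line(3.7 * u1, v)       # closed under scalar multiplication
```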
Example 3
A Subset of R2 That Is Not a Subspace
Remark
Subspaces of R3
{0}
Lines through the origin
Planes through the origin
R3
Example 4
Subspaces of Mnn
Example 5
A Subspace of Polynomials of Degree ≤ n
Let n be a nonnegative integer, and let W consist of the zero function together with all polynomial functions of degree at most n, that is, all functions expressible in the form
$$p(x) = a_0 + a_1 x + \cdots + a_n x^n \qquad (1)$$
where a0, a1, …, an are real numbers. If
$$p(x) = a_0 + a_1 x + \cdots + a_n x^n \quad\text{and}\quad q(x) = b_0 + b_1 x + \cdots + b_n x^n$$
then
$$(p + q)(x) = p(x) + q(x) = (a_0 + b_0) + (a_1 + b_1)x + \cdots + (a_n + b_n)x^n$$
and
$$(kp)(x) = kp(x) = ka_0 + ka_1 x + \cdots + ka_n x^n$$
Both p + q and kp have the form (1), so W is closed under addition and scalar multiplication and is therefore a subspace of F(-∞, ∞). We denote this subspace by Pn.
Example 6
Subspaces of Functions Continuous on (-∞, ∞) (1/3)
Example 6
Subspaces of Functions Continuous on (-∞, ∞) (2/3)
To take this a step further, for each positive integer m, the functions with continuous mth derivatives on (-∞, ∞) form a subspace of C1(-∞, ∞), as do the functions that have continuous derivatives of all orders. We denote the subspace of functions with continuous mth derivatives on (-∞, ∞) by Cm(-∞, ∞), and we denote the subspace of functions that have continuous derivatives of all orders on (-∞, ∞) by C∞(-∞, ∞). Finally, it is a theorem of calculus that polynomials have continuous derivatives of all orders, so Pn is a subspace of C∞(-∞, ∞).
Example 6
Subspaces of Functions Continuous on (-∞, ∞) (3/3)
Solution Space of
Homogeneous Systems
Theorem 5.2.2
If Ax = 0 is a homogeneous linear
system of m equations in n unknowns,
then the set of solution vectors is a
subspace of Rn.
Example 7
Solution Spaces That Are Subspaces
of R3 (1/2)
In each part, determine the solution space of the homogeneous system and interpret it geometrically.
$$\text{(a)}\;\; \begin{bmatrix} 1 & -2 & 3 \\ 2 & -4 & 6 \\ 3 & -6 & 9 \end{bmatrix}\begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \qquad
\text{(b)}\;\; \begin{bmatrix} 1 & -2 & 3 \\ -3 & 7 & -8 \\ -2 & 4 & -6 \end{bmatrix}\begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$$
$$\text{(c)}\;\; \begin{bmatrix} 1 & -2 & 3 \\ -3 & 7 & -8 \\ 4 & 1 & 2 \end{bmatrix}\begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \qquad
\text{(d)}\;\; \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}\begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$$
Example 7
Solution Spaces That Are Subspaces
of R3 (2/2)
Solution.
(a) x = 2s - 3t, y = s, z = t
x = 2y - 3z, or x - 2y + 3z = 0
This is the equation of the plane through the origin with n = (1, -2,
3) as a normal vector.
(b) x = -5t , y = -t, z=t
which are parametric equations for the line through the origin
parallel to the vector v = (-5, -1, 1).
(c) The solution is x = 0, y = 0, z = 0, so the solution space is the
origin only, that is {0}.
(d) The solutions are x = r, y = s, z = t, where r, s, and t have arbitrary values, so the solution space is all of R3.
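The four solution spaces can also be obtained mechanically by computing a null space basis for each coefficient matrix. A brief SymPy sketch (the matrices are those of parts (a)–(d) above; this is an illustration, not part of the original text):

```python
from sympy import Matrix

systems = {
    "a": Matrix([[1, -2, 3], [2, -4, 6], [3, -6, 9]]),
    "b": Matrix([[1, -2, 3], [-3, 7, -8], [-2, 4, -6]]),
    "c": Matrix([[1, -2, 3], [-3, 7, -8], [4, 1, 2]]),
    "d": Matrix([[0, 0, 0], [0, 0, 0], [0, 0, 0]]),
}

for name, A in systems.items():
    basis = A.nullspace()        # basis of the solution space of Ax = 0
    dim = len(basis)             # 2 = plane, 1 = line, 0 = {0}, 3 = all of R^3
    print(name, dim, [list(b) for b in basis])
```

The dimension of each null space matches the geometric description above: a plane for (a), a line for (b), the origin for (c), and all of R3 for (d).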
Definition
A vector w is a linear combination of the vectors v1, v2, …, vr if it can be expressed in the form
w = k1v1 + k2v2 + ⋯ + krvr
where k1, k2, …, kr are scalars.
Example 8
Vectors in R3 Are Linear
Combinations of i, j, and k
Every vector v = (a, b, c) in R3 is
expressible as a linear combination of
the standard basis vectors
i = (1, 0, 0), j = (0, 1, 0), k = (0, 0, 1)
since
v = (a, b, c) = a(1, 0, 0) + b(0, 1, 0) +
c(0, 0, 1) = ai + bj + ck
Example 9
Checking a Linear Combination (1/2)
Consider the vectors u = (1, 2, -1) and v = (6, 4, 2) in R3. Show that w = (9, 2, 7) is a linear combination of u and v and that w′ = (4, -1, 8) is not a linear combination of u and v.
Solution.
In order for w to be a linear combination of u and v, there must
be scalars k1 and k2 such that w = k1u + k2v;
(9, 2, 7) = k1(1, 2, -1) + k2(6, 4, 2) = (k1 + 6k2, 2k1 + 4k2, -k1 + 2k2)
Equating corresponding components gives
k1 + 6k2 = 9
2k1 + 4k2 = 2
-k1 + 2k2 = 7
Solving this system yields k1 = -3, k2 = 2, so
w = -3u + 2v
Example 9
Checking a Linear Combination (2/2)
Similarly, for w′ to be a linear combination of u and v, there must be scalars k1 and k2 such that w′ = k1u + k2v;
(4, -1, 8) = k1(1, 2, -1) + k2(6, 4, 2)
or
(4, -1, 8) = (k1 + 6k2, 2k1 + 4k2, -k1 + 2k2)
Equating corresponding components gives
k1 + 6k2 = 4
2 k1 + 4k2 = -1
- k1 + 2k2 = 8
This system of equations is inconsistent, so no such scalars k1 and k2 exist. Consequently, w′ is not a linear combination of u and v.
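The same membership test can be carried out numerically: a vector is a linear combination of u and v exactly when the overdetermined system with u and v as columns is consistent. A NumPy sketch (an illustration, not part of the original text; the tolerance is an arbitrary choice):

```python
import numpy as np

u = np.array([1.0, 2.0, -1.0])
v = np.array([6.0, 4.0, 2.0])
A = np.column_stack([u, v])      # columns are u and v

def in_span(w, A, tol=1e-10):
    """True if w is (numerically) a linear combination of the columns of A."""
    coeffs = np.linalg.lstsq(A, w, rcond=None)[0]
    return np.allclose(A @ coeffs, w, atol=tol), coeffs

print(in_span(np.array([9.0, 2.0, 7.0]), A))    # (True, [-3.,  2.])
print(in_span(np.array([4.0, -1.0, 8.0]), A))   # (False, ...)
```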
Theorem 5.2.3
a) If v1, v2, …, vr are vectors in a vector space V, then the set W of all linear combinations of v1, v2, …, vr is a subspace of V.
b) W is the smallest subspace of V that contains v1, v2, …, vr, in the sense that every other subspace of V that contains v1, v2, …, vr must contain W.
Definition
If S = {v1, v2, …, vr} is a set of vectors in a vector space V, then the subspace W of V consisting of all linear combinations of the vectors in S is called the space spanned by v1, v2, …, vr, and we say that the vectors v1, v2, …, vr span W. We write W = span(S) or W = span{v1, v2, …, vr}.
Example 10
Spaces Spanned by One or Two
Vectors (1/2)
Example 10
Spaces Spanned by One or Two
Vectors (2/2)
Example 11
Spanning Set for Pn
The polynomials 1, x, x2, …, xn span the vector space Pn defined in Example 5, since each polynomial p in Pn can be written as
p = a0 + a1x + ⋯ + anxn
which is a linear combination of 1, x, x2, …, xn. We can denote this by writing
Pn = span{1, x, x2, …, xn}
Example 12
Three Vectors That Do Not Span R3
Determine whether v1 = (1, 1, 2), v2 = (1, 0, 1), and v3 = (2, 1, 3) span the vector
space R3.
Solution.
We must determine whether an arbitrary vector b = (b1, b2, b3) in R3 can be expressed as
a linear combination b = k1v1 + k2v2 + k3v3
Expressing this equation in terms of components gives
(b1, b2, b3) = k1(1, 1, 2) + k2(1, 0, 1) + k3(2, 1, 3)
or
(b1, b2, b3) = (k1 + k2 + 2k3, k1 + k3, 2k1 + k2 + 3k3)
or
k1 + k2 + 2k3 = b1
k1        + k3 = b2
2k1 + k2 + 3k3 = b3
By parts (e) and (g) of Theorem 4.3.4, this system is consistent for all b1, b2, and b3 if and only if the coefficient matrix
$$A = \begin{bmatrix} 1 & 1 & 2 \\ 1 & 0 & 1 \\ 2 & 1 & 3 \end{bmatrix}$$
has a nonzero determinant. However, det(A) = 0, so v1, v2, and v3 do not span R3.
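The determinant condition is easy to evaluate numerically. A minimal NumPy sketch (illustration only, not part of the original text):

```python
import numpy as np

# Coefficient matrix of the system k1 + k2 + 2k3 = b1, k1 + k3 = b2, 2k1 + k2 + 3k3 = b3
A = np.array([[1.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 3.0]])

print(np.linalg.det(A))   # approximately 0, so v1, v2, v3 do not span R^3
```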
Theorem 5.2.4
If S = {v1, v2, …, vr} and S′ = {w1, w2, …, wk} are two sets of vectors in a vector space V, then
span{v1, v2, …, vr} = span{w1, w2, …, wk}
if and only if each vector in S is a linear combination of the vectors in S′ and each vector in S′ is a linear combination of the vectors in S.
5.3 Linear Independence
Definition
If S = {v1, v2, …, vr} is a nonempty set of vectors, then the vector equation
k1v1 + k2v2 + ⋯ + krvr = 0
has at least one solution, namely
k1 = 0, k2 = 0, …, kr = 0
If this is the only solution, then S is called a linearly independent set. If there are other solutions, then S is called a linearly dependent set.
Example 1
A Linearly Dependent Set
The vectors v1 = (2, -1, 0, 3), v2 = (1, 2, 5, -1), and v3 = (7, -1, 5, 8) form a linearly dependent set, since 3v1 + v2 - v3 = 0.
Example 2
A Linearly Dependent Set
The polynomials
p1=1-x, p2=5+3x-2x2, and p3=1+3x-x2
form a linearly dependent set in P2 since
3p1-p2+2p3=0.
Example 3
Linearly Independent Sets
Consider the vectors i=(1, 0, 0), j=(0, 1, 0), and
k=(0, 0, 1) in R3. In terms of components the vector
equation
k1i+k2j+k3k=0
becomes
k1(1, 0, 0)+k2(0, 1, 0)+k3(0, 0, 1)=(0, 0, 0)
or equivalently,
(k1, k2, k3)=(0, 0, 0)
So the set S={i, j, k} is linearly independent.
A similar argument can be used to show that the vectors
e1 = (1, 0, 0, …, 0), e2 = (0, 1, 0, …, 0), …, en = (0, 0, 0, …, 1)
form a linearly independent set in Rn.
Example 4
Determining Linear
Independence/Dependence (1/2)
Determine whether the vectors
v1=(1, -2, 3), v2=(5, 6, -1), v3=(3, 2, 1)
form a linearly dependent set or a linearly independent set.
Solution.
In terms of components the vector equation
k1v1+k2v2+k3v3=0
becomes
k1(1, -2, 3) + k2(5, 6, -1) + k3(3, 2, 1) = (0, 0, 0)
Equating corresponding components gives the homogeneous system
k1 + 5k2 + 3k3 = 0
-2k1 + 6k2 + 2k3 = 0
3k1 - k2 + k3 = 0
Thus, v1, v2, and v3 form a linearly dependent set if this system has a nontrivial solution, or a linearly independent set if it has only the trivial solution. Solving this system yields
k1 = -(1/2)t, k2 = -(1/2)t, k3 = t
Example 4
Determining Linear
Independence/Dependence (2/2)
Thus, the system has nontrivial solutions
and v1,v2, and v3 form a linearly
dependent set. Alternatively, we could
show the existence of nontrivial
solutions by showing that the
coefficient matrix has determinant zero.
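Either test mentioned above (solving the homogeneous system or computing the determinant) is quick to carry out in code. A SymPy sketch (an illustration, not part of the original text; the vectors of this example are placed as the columns of A):

```python
from sympy import Matrix

# Columns of A are v1, v2, v3, so A k = 0 is the system k1 v1 + k2 v2 + k3 v3 = 0
A = Matrix([[1, 5, 3],
            [-2, 6, 2],
            [3, -1, 1]])

print(A.det())         # 0, so a nontrivial solution exists
print(A.nullspace())   # basis of solutions: a multiple of (-1/2, -1/2, 1)
```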
Example 5
Linearly Independent Set in Pn
Show that the polynomials
1, x, x2, …, xn
form a linearly independent set of vectors in Pn.
Solution.
Let p0 = 1, p1 = x, p2 = x2, …, pn = xn and assume
a0p0 + a1p1 + a2p2 + ⋯ + anpn = 0
or equivalently,
a0 + a1x + a2x2 + ⋯ + anxn = 0  for all x in (-∞, ∞)   (1)
We must show that
a0 = a1 = a2 = ⋯ = an = 0
Recall from algebra that a nonzero polynomial of degree n has at most n distinct roots. But this implies that a0 = a1 = a2 = ⋯ = an = 0; otherwise, it would follow from (1) that a0 + a1x + a2x2 + ⋯ + anxn is a nonzero polynomial with infinitely many roots.
Theorem 5.3.1
a) A set S with two or more vectors is linearly dependent if and only if at least one of the vectors in S is expressible as a linear combination of the other vectors in S.
b) A set S with two or more vectors is linearly independent if and only if no vector in S is expressible as a linear combination of the other vectors in S.
Example 6
Example 1 Revisited
In Example 1 we saw that the vectors
v1=(2, -1, 0, 3), v2=(1, 2, 5, -1), and v3=(7, -1,
5, 8)
form a linearly dependent set. In this example each vector is expressible as a linear combination of the other two, since it follows from the equation 3v1 + v2 - v3 = 0 that
v1 = -(1/3)v2 + (1/3)v3, v2 = -3v1 + v3, and v3 = 3v1 + v2
Example 7
Example 3 Revisited
Consider the vectors i=(1, 0, 0), j=(0, 1, 0), and
k=(0, 0, 1) in R3.
Suppose that k is expressible as
k=k1i+k2j
Then, in terms of components,
(0, 0, 1)=k1(1, 0, 0)+k2(0, 1, 0) or (0, 0, 1)=(k1, k2, 0)
But the last equation is not satisfied by any values of k1
and k2, so k cannot be expressed as a linear
combination of i and j. Similarly, i is not expressible
as a linear combination of j and k, and j is not
expressible as a linear combination of i and k.
Theorem 5.3.2
a) A finite set of vectors that contains the zero vector is linearly dependent.
b) A set with exactly two vectors is linearly independent if and only if neither vector is a scalar multiple of the other.
Example 8
Using Theorem 5.3.2b
Geometric Interpretation of
Linear Independence (1/2)
Geometric Interpretation of
Linear Independence (2/2)
Theorem 5.3.3
Let S = {v1, v2, …, vr} be a set of vectors in Rn. If r > n, then S is linearly dependent.
Remark
Linear Independence of
Functions (1/2)
If f1 = f1(x), f2 = f2(x), …, fn = fn(x) are functions that are n - 1 times differentiable on the interval (-∞, ∞), then the determinant
$$W(x) = \begin{vmatrix} f_1(x) & f_2(x) & \cdots & f_n(x) \\ f_1'(x) & f_2'(x) & \cdots & f_n'(x) \\ \vdots & \vdots & & \vdots \\ f_1^{(n-1)}(x) & f_2^{(n-1)}(x) & \cdots & f_n^{(n-1)}(x) \end{vmatrix}$$
is called the Wronskian of f1, f2, …, fn. As we will now show, this determinant is useful for ascertaining whether the functions f1, f2, …, fn form a linearly independent set of vectors in the vector space C(n-1)(-∞, ∞). Suppose, for the moment, that f1, f2, …, fn are linearly dependent vectors in C(n-1)(-∞, ∞). Then there exist scalars k1, k2, …, kn, not all zero, such that
Linear Independence of
Functions (2/2)
$$k_1 f_1(x) + k_2 f_2(x) + \cdots + k_n f_n(x) = 0$$
$$k_1 f_1'(x) + k_2 f_2'(x) + \cdots + k_n f_n'(x) = 0$$
$$\vdots$$
$$k_1 f_1^{(n-1)}(x) + k_2 f_2^{(n-1)}(x) + \cdots + k_n f_n^{(n-1)}(x) = 0$$
for all x in (-∞, ∞). (The first equation expresses the linear dependence; the others follow by differentiating it n - 1 times.) Thus, the linear dependence of f1, f2, …, fn implies that the linear system
$$\begin{bmatrix} f_1(x) & f_2(x) & \cdots & f_n(x) \\ f_1'(x) & f_2'(x) & \cdots & f_n'(x) \\ \vdots & \vdots & & \vdots \\ f_1^{(n-1)}(x) & f_2^{(n-1)}(x) & \cdots & f_n^{(n-1)}(x) \end{bmatrix}\begin{bmatrix} k_1 \\ k_2 \\ \vdots \\ k_n \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}$$
has a nontrivial solution for every x in (-∞, ∞). This in turn implies that for every x the coefficient matrix is not invertible, so its determinant, the Wronskian W(x), is zero for every x in (-∞, ∞).
Theorem 5.3.4
If the functions f1, f2, …, fn have n - 1 continuous derivatives on the interval (-∞, ∞), and if the Wronskian of these functions is not identically zero on (-∞, ∞), then these functions form a linearly independent set of vectors in C(n-1)(-∞, ∞).
Example 9
Linearly Independent Set in C1(-∞, ∞)
Show that the functions f1 = x and f2 = sin x form a linearly independent set of vectors in C1(-∞, ∞).
Solution.
The Wronskian is
$$W(x) = \begin{vmatrix} x & \sin x \\ 1 & \cos x \end{vmatrix} = x\cos x - \sin x$$
This function is not identically zero on the interval (-∞, ∞), so f1 and f2 form a linearly independent set.
Example 10
Linearly Independent Set in C2(-∞, ∞)
Show that the functions f1 = 1, f2 = e^x, and f3 = e^{2x} form a linearly independent set of vectors in C2(-∞, ∞).
Solution.
The Wronskian is
$$W(x) = \begin{vmatrix} 1 & e^x & e^{2x} \\ 0 & e^x & 2e^{2x} \\ 0 & e^x & 4e^{2x} \end{vmatrix} = 2e^{3x}$$
This function is not zero for any x in the interval (-∞, ∞), and so is certainly not identically zero; hence f1, f2, and f3 form a linearly independent set.
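Both Wronskians can be checked symbolically. A short SymPy sketch (an illustration, not part of the original text) builds each Wronskian matrix directly and simplifies its determinant:

```python
from sympy import symbols, sin, cos, exp, Matrix, simplify

x = symbols('x')

# Example 9: f1 = x, f2 = sin x
W1 = Matrix([[x, sin(x)],
             [1, cos(x)]]).det()
print(simplify(W1))        # x*cos(x) - sin(x), not identically zero

# Example 10: f1 = 1, f2 = e^x, f3 = e^(2x)
W2 = Matrix([[1, exp(x),   exp(2*x)],
             [0, exp(x), 2*exp(2*x)],
             [0, exp(x), 4*exp(2*x)]]).det()
print(simplify(W2))        # 2*exp(3*x), never zero
```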
Remark