
Math 115a: Selected Solutions for HW 2

October 15, 2005

Exercise 1.4.10: Show that if

    M1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad
    M2 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}, \quad
    M3 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix},

then the span of {M1, M2, M3} is the set of all symmetric 2 × 2 matrices.

Solution: Let M be an arbitrary symmetric 2 × 2 matrix; we will denote

    M = \begin{pmatrix} a & b \\ b & d \end{pmatrix}.

Via a rather superficial inspection, we see that

M = aM1 + dM2 + bM3 .
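Spelling this identity out entry by entry (a routine check, included here only for completeness):

    a \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} + d \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} + b \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} a & b \\ b & d \end{pmatrix} = M.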

Since we've written an arbitrary symmetric matrix as a linear combination of the Mi's, we conclude that {M1, M2, M3} spans our space in question. Conversely, each Mi is itself symmetric, and any linear combination of symmetric matrices is again symmetric, so span({M1, M2, M3}) is contained in the set of symmetric 2 × 2 matrices. Together, the two inclusions give the desired equality.

Exercise 1.4.14: Show that if S1 and S2 are arbitrary subsets of a vector space V, then span(S1 ∪ S2) = span(S1) + span(S2). (The sum of two subsets is defined in the exercises in Section 1.3.)

Solution: In order to prove equality of two sets, we need to prove mutual inclusion.

⊆: Let v ∈ span(S1 ∪ S2). Then v can be written as a linear combination of vectors in S1 ∪ S2, i.e.

    v = \sum_i a_i x_i + \sum_j b_j y_j,

where a_i, b_j ∈ F and x_i ∈ S1, y_j ∈ S2. (We note that the two sums are finite, although we will not use this fact in this proof.) Since \sum_i a_i x_i ∈ span(S1) and \sum_j b_j y_j ∈ span(S2), we conclude that v ∈ span(S1) + span(S2).

⊇: Let v ∈ span(S1) + span(S2). Then by definition,

    v = \sum_i a_i x_i + \sum_j b_j y_j,

where a_i, b_j ∈ F and x_i ∈ S1, y_j ∈ S2. This is clearly a linear combination of vectors from S1 ∪ S2. Therefore v ∈ span(S1 ∪ S2).
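As a quick concrete illustration (not part of the original solution), take V = R^2 with S1 = {(1, 0)} and S2 = {(1, 1)}. Then

    span(S1 ∪ S2) = R^2   and   span(S1) + span(S2) = {(a, 0) : a ∈ R} + {(b, b) : b ∈ R} = R^2,

so the two sides agree, as the argument above guarantees.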

Exercise 1.5.15: Let S = {u1 , u2 , ..., un } be a finite set of vectors. Prove that
S is linearly dependent if and only if u1 = 0 or uk+1 ∈ span({u1 , u2 , ..., uk }) for
some k (1 ≤ k < n).

Proof:
(⇒) Suppose that S is linearly dependent. We need to prove that either u1 = 0 or there exists some k such that uk+1 ∈ span({u1, u2, ..., uk}). If u1 = 0, then we are done. So let us suppose that u1 ≠ 0. Then what we must prove is that the second part of the statement holds: there exists some k such that uk+1 ∈ span({u1, u2, ..., uk}). We proceed by contradiction. Suppose that there is no such k; to rephrase, for every k we have uk+1 ∉ span({u1, u2, ..., uk}). We now use this assumption repeatedly, as follows: since u1 ≠ 0, the set {u1} is linearly independent, and u2 ∉ span({u1}) then implies that {u1, u2} is a linearly independent set. Similarly, u3 ∉ span({u1, u2}) implies that {u1, u2, u3} is a linearly independent set. We can continue in this fashion until we get the following final statement: un ∉ span({u1, u2, ..., un−1}) implies that S = {u1, u2, ..., un} is a linearly independent set. However, our initial assumption is that S is linearly dependent. This is our contradiction. Therefore our supposition was false; there must exist some k such that uk+1 ∈ span({u1, u2, ..., uk}).

(⇐) Suppose that u1 = 0 or uk+1 ∈ span({u1, u2, ..., uk}) for some k (1 ≤ k < n). If u1 = 0, then 0 ∈ S, which immediately implies that S is linearly dependent (why?). So suppose that u1 ≠ 0. Then the hypothesis forces the second alternative: there exists some k such that uk+1 ∈ span({u1, u2, ..., uk}). Therefore T = {u1, u2, ..., uk+1} is a linearly dependent set. Since T ⊆ S, this implies that S is linearly dependent.
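The (⇒) direction of this proof is effectively an algorithm: scan the vectors in order and stop at the first one that lies in the span of its predecessors. Below is a minimal numerical sketch of that scan for vectors in R^n using numpy; the function name, the rank-based membership test, and the tolerance are illustrative choices and are not part of the original solution.

import numpy as np

def first_dependent_index(vectors, tol=1e-10):
    """Return 1 if u1 = 0; otherwise the first (1-based) index k+1 such that
    u_{k+1} lies in span({u1, ..., uk}); return None if the list is linearly
    independent.  Vectors are given as 1-D arrays or lists over R^n."""
    vectors = [np.asarray(v, dtype=float) for v in vectors]
    if np.allclose(vectors[0], 0.0, atol=tol):
        return 1  # u1 = 0, so S is linearly dependent immediately
    for k in range(1, len(vectors)):
        A = np.column_stack(vectors[:k])        # columns u1, ..., uk
        B = np.column_stack(vectors[:k + 1])    # columns u1, ..., u_{k+1}
        # u_{k+1} is in span({u1, ..., uk}) iff appending it does not raise the rank.
        if np.linalg.matrix_rank(B, tol=tol) == np.linalg.matrix_rank(A, tol=tol):
            return k + 1
    return None

# Example: u3 = u1 + u2, so the scan reports index 3.
print(first_dependent_index([[1, 0, 0], [0, 1, 0], [1, 1, 0], [0, 0, 1]]))  # prints 3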

Exercise 1.6.12: Let u, v, w be distinct vectors of a vector space V. Show that if {u, v, w} is a basis for V, then {u + v + w, v + w, w} is also a basis for V.

Solution: Let {u, v, w} be a basis for V . Since this is a three element set, we
conclude that the dimension of V must be 3. Looking at {u + v + w, v + w, w},
we see that this is also a three element set. Therefore if we can prove that this
set is either linearly independent or spans V , then we are done (make sure you
understand why this is true). We will show that {u + v + w, v + w, w} is a
linearly independent set. Let a1 , a2 , a3 ∈ F such that

a1 (u + v + w) + a2 (v + w) + a3 (w) = 0.

We will show that this implies that a1 = a2 = a3 = 0, by rewriting the equality as:

    0 = a1 (u + v + w) + a2 (v + w) + a3 w
      = a1 u + (a1 + a2) v + (a1 + a2 + a3) w.
Since {u, v, w} is a basis for V , it is a linearly independent set. Therefore from
the last equality, we can conclude that a1 = a1 + a2 = a1 + a2 + a3 = 0, and
from here it follows immediately that a1 = a2 = a3 = 0. We have therefore proven that {u + v + w, v + w, w} is a linearly independent set, and hence it is a basis for V.
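For intuition (this is not needed given the dimension argument above, but it exhibits the spanning condition directly as well), the original basis vectors can be recovered from the new set:

    w = w,   v = (v + w) − w,   u = (u + v + w) − (v + w),

so every vector of V, being a linear combination of u, v, w, is also a linear combination of u + v + w, v + w, w.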

Exercise 1.6.20: Let V be a vector space having dimension n, and let S be a subset of V that generates V.
(a) Prove that there is a subset of S that is a basis for V .
(b) Prove that S contains at least n elements.

Solution:
(a): Let {β1, β2, ..., βn} be a basis for V. Since span(S) = V, each of the βi's can be written as a finite linear combination of elements from S. More specifically,

    β1 = \sum_{i ∈ I1} a_{1,i} s_i
    β2 = \sum_{i ∈ I2} a_{2,i} s_i
    ⋮
    βn = \sum_{i ∈ In} a_{n,i} s_i

where all of the a_{j,i}'s are scalars, and the Ij's are finite index sets (see the note at the end of the proof). Let us define the set

    J = \bigcup_{j=1}^{n} Ij,

the finite union of all the index sets. Now consider the subset of the vector space

    T = { s_i : i ∈ J }.

Since T is a set made up of elements from S, T ⊆ S. Since J is a finite index set, T is also a finite set. Furthermore, we have constructed T so that it contains the elements of S which "build" each of the βi's. Therefore

    {β1, β2, ..., βn} ⊆ span(T) ⊆ span(S) = V,

and consequently

    V = span({β1, β2, ..., βn}) ⊆ span(span(T)) = span(T) ⊆ V,

which forces span(T) = V.

Since T spans V and it is a finite set, by the Replacement Theorem (1.10) we can find a subset of T, call it B, that is a basis for V. It is clear that B is a subset of S, as it is a subset of T. This finishes our proof.
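To see the content of part (a) concretely in the special case V = R^n (this is only an illustration; the argument above, via the Replacement Theorem, handles an arbitrary finite-dimensional V), one can walk through a finite spanning set and keep a vector only if it is not already in the span of the vectors kept so far. A minimal numpy sketch, with an illustrative function name and a rank-based test:

import numpy as np

def extract_basis(spanning_vectors, tol=1e-10):
    # Greedily keep a vector only if it increases the rank of the kept set;
    # for a finite list spanning R^n this returns a basis contained in it.
    basis = []
    for v in spanning_vectors:
        candidate = basis + [np.asarray(v, dtype=float)]
        if np.linalg.matrix_rank(np.column_stack(candidate), tol=tol) == len(candidate):
            basis = candidate
    return basis

# Example: four vectors spanning R^2; the extracted basis keeps two of them.
print(extract_basis([[1, 0], [2, 0], [0, 1], [1, 1]]))  # [array([1., 0.]), array([0., 1.])]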
