Unit 5
5.1 INTRODUCTION
In this Block we continue our study of Linear Algebra with a discussion of
the solution of simultaneous linear equations. Solving systems of linear
equations efficiently is one of the fundamental problems of Mathematics.
Efficient methods for solving systems of linear equations have many
applications, including in new areas like Data Science.
In Sec. 5.2 we discuss the concept of linear independence of vectors and the
dimension of a vector space. In Sec. 5.3, we introduce you to two of the
important concepts in Linear Algebra, the concepts of basis and dimension of a
vector space. In Sec. 5.4, we will determine the dimensions of some
subspaces. In Sec. 5.5, we will determine the dimension of a quotient space.
Objectives
After studying this unit, you should be able to:
𝛼1 v1 + 𝛼2 v2 + ⋯ + 𝛼n vn = 0
for 𝛼i ∈ F, 1 ≤ i ≤ n, then 𝛼i = 0.
𝛼1 v1 + 𝛼2 v2 + ⋯ + 𝛼n vn = 0
Note that in two dimensions, if two vectors v1 and v2 are linearly dependent,
then there are 𝛼1, 𝛼2, not both zero, such that 𝛼1 v1 + 𝛼2 v2 = 0. Without loss of
generality, we may assume that 𝛼1 ≠ 0. We then have v1 = −(𝛼2/𝛼1) v2. So, v1 is a
scalar multiple of v2 and hence v1 and v2 are collinear. Thus, in ℝ2, two vectors
are linearly dependent iff they are collinear.
Solution:
𝛼1 + 𝛼2 + 𝛼3 = 0
𝛼2 + 𝛼3 = 0
𝛼3 = 0
𝛼1 − 𝛼2 + 𝛼3 = 0
𝛼2 + 2𝛼3 = 0
2𝛼1 + 2𝛼2 + 8𝛼3 = 0
𝛼1 + 2𝛼2 + 7𝛼3 = 0
[1 0 3 0]
[0 1 2 0]
[0 0 0 0]
[0 0 0 0]
∗∗∗
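Row-reduction checks like the one above can be reproduced mechanically. Here is a short sketch using sympy; the three vectors are illustrative choices, not the ones from the exercise:

```python
from sympy import Matrix

# Hypothetical example: test whether three vectors in R^3 are linearly
# independent by row-reducing the homogeneous system
# a1*v1 + a2*v2 + a3*v3 = 0.
v1, v2, v3 = (1, 0, 0), (1, 1, 0), (1, 1, 1)

# Columns of A are the vectors; the system is A @ (a1, a2, a3)^T = 0.
A = Matrix([v1, v2, v3]).T
rref, pivots = A.rref()

# Every column is a pivot column <=> only the trivial solution exists.
independent = len(pivots) == A.cols
print(independent)  # True
```

The same test works for any number of vectors: the set is independent exactly when every column of the coefficient matrix is a pivot column.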
Solution: The zero element of this vector space is the zero function, i.e., it is
the function 0 such that 0 (x) = 0 ∀ x ∈ ℝ. So we have to determine a, b ∈ ℝ
such that, ∀ x ∈ ℝ, a sin x + b eˣ = 0.
∗∗∗
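A functional identity of this kind only needs to fail at finitely many sample points. A quick numeric sketch (the sample points x = 0 and x = π/2 are our choice):

```python
import numpy as np

# a*sin(x) + b*e^x = 0 for all x. Plugging in x = 0 and x = pi/2
# gives a 2x2 homogeneous system; a nonzero determinant forces a = b = 0.
xs = np.array([0.0, np.pi / 2])
M = np.column_stack([np.sin(xs), np.exp(xs)])

# det(M) != 0 => only the trivial solution a = b = 0, so
# {sin x, e^x} is linearly independent.
print(abs(np.linalg.det(M)) > 1e-9)  # True
```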
where m > 0.
Now S is linearly dependent. Therefore, for some scalars 𝛼1 , 𝛼2 , … , 𝛼k , not
all zero, we have
∑ᵢ₌₁ᵏ 𝛼ᵢ uᵢ = 0
Now, what happens if one of the vectors in a set can be written as a linear
combination of the other vectors in the set? The next theorem states that such
a set is linearly dependent.
Unit 4: Bases and Dimension
Theorem 3: Let S = {v1 , v2 , … , vn } be a subset of a vector space V over field F.
Then S is linearly dependent if and only if some vector of S is a linear
combination of the rest of the vectors of S.
We now prove (ii), which is the converse of (i). Since S is linearly dependent,
there exist 𝛼i ∈ F, not all zero, such that
𝛼1 v1 + 𝛼2 v2 + … + 𝛼n vn = 0.
Now, let us look at the situation in ℝ3, where we know that i and j are linearly
independent. Can you immediately say whether the set {i, j, (3, 4, 5)} is
linearly independent or not? The following theorem will help you to do this.
𝛼v + 𝛼1 v1 + ⋯ + 𝛼n vn = 0.
Block 2: Vector Spaces-II
Now, if 𝛼 = 0, this implies that there exist scalars 𝛼1 , 𝛼2 , … , 𝛼n , not all zero, such
that
𝛼1 v1 + ⋯ + 𝛼n vn = 0.
v = (−𝛼1/𝛼) v1 + ⋯ + (−𝛼n/𝛼) vn,
i.e., v is a linear combination of v1, v2, …, vn, i.e., v ∈ [S], which contradicts our
assumption.
Using this theorem we can immediately see that the set {i, j, (3, 4, 5)} is linearly
independent, since (3, 4, 5) is not a linear combination of i and j.
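This span test is easy to automate. A small numpy sketch of the same argument:

```python
import numpy as np

# Theorem 4 in action: (3, 4, 5) has a nonzero third component, so it is
# not a linear combination of i = (1, 0, 0) and j = (0, 1, 0). Adding it
# to the independent set {i, j} keeps the set independent: the rank
# jumps from 2 to 3.
S = np.array([[1, 0, 0], [0, 1, 0]])
augmented = np.vstack([S, [3, 4, 5]])
print(np.linalg.matrix_rank(S), np.linalg.matrix_rank(augmented))  # 2 3
```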
If you’ve done Exercise 3) you will have found that, by adding a vector to a
linearly independent set, it may not remain linearly independent. Theorem 4
tells us that if, to a linearly independent set, we add a vector which is not in
the linear span of the set, then the augmented set will remain linearly
independent. Thus, the way of generating larger and larger linearly
independent subsets of a non-zero vector space V is as follows:
4. If [S2 ] = V, the process ends. Otherwise, we can find a still larger set S3
which is linearly independent. It is clear that, in this way, we either reach a
set which generates V or we go on getting larger and larger linearly
independent subsets of V.
In the next example, we will give an infinite set that is linearly independent.
Example 3: Prove that the infinite subset S = {1, x, x2 , … … }, of the vector space
P of all real polynomials in x, is linearly independent.
Now, suppose
∑ᵢ₌₁ᵏ 𝛼ᵢ x^(aᵢ) = 0, where 𝛼ᵢ ∈ ℝ ∀ i
∗∗∗
Thus, B ⊆ V is a basis of V if B is linearly independent and every vector of V is
a linear combination of a finite number of vectors of B. (See also Prof. Strang's
lecture on basis and dimension.)
You have already seen that we can write every element of ℝ3 as a linear
combination of i = (1, 0, 0), j = (0, 1, 0) and k = (0, 0, 1). We proved in Example
1 that this set is linearly independent. So, {i, j, k} is a basis for ℝ3. The
following example shows that ℝ2 has more than one basis.
⟹𝛼=𝛽=0
∗∗∗
∗∗∗
E7) Prove that {1, x + 1, x2 + 2x} is a basis of the vector space, P2 , of all
polynomials of degree less than or equal to 2.
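One way to attack exercises like E7 computationally is to encode each polynomial by its coefficient vector (the encoding below is our choice, not part of the exercise):

```python
from sympy import Matrix

# E7 sketch: write each polynomial in {1, x+1, x^2+2x} by its
# coefficients (constant, x, x^2). The set is a basis of P2 iff this
# 3x3 coefficient matrix is invertible.
B = Matrix([
    [1, 0, 0],   # 1
    [1, 1, 0],   # x + 1
    [0, 2, 1],   # x^2 + 2x
])
print(B.det())  # 1, nonzero => linearly independent, hence a basis of P2
```

Since dim P2 = 3, three independent polynomials automatically span, so the determinant test settles the whole question.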
We have already mentioned that no proper subset of a basis can generate the
whole vector space. We will now prove another important characteristic of a
basis, namely, no linearly independent subset of a vector space can contain
more vectors than a basis of the vector space. In other words, a basis
contains the maximum possible number of linearly independent vectors. In the
next section, we will discuss the dimensions of some subspaces.
w1 = ∑ᵢ₌₁ⁿ 𝛼ᵢ vᵢ, 𝛼ᵢ ∈ F ∀ i = 1, …, n.
Note that we have been able to replace v1 by w1 in B in such a way that the
new set still generates V. Next, let
S′2 = {w2 , w1 , v2 , v3 , … , vn } .
w2 = 𝛽1 w1 + 𝛽2 v2 + ⋯ + 𝛽n vn , 𝛽i ∈ F ∀ i = 1 , … , n
v2 = (1/𝛽2) w2 − (𝛽1/𝛽2) w1 − (𝛽3/𝛽2) v3 − ⋯ − (𝛽n/𝛽2) vn,
Now, suppose n < m. Then, after n steps, we will have replaced all v′i s by
corresponding w′i s and we shall have a set Sn = {wn , wn−1 , … , w2 , w1 } with
[Sn ] = V. But then, this means that wn+1 ∈ V = [Sn ], i.e., wn+1 is a linear
combination of w1 , w2 , … , wn . This implies that the set {w1 , … , wn , wn+1 } is
linearly dependent. This contradicts the fact that {w1 , w2 , … , wm } is linearly
independent. Hence, m ≤ n. ■
Solution: You know that (1,0) and (0,1) form a basis of ℝ2 over ℝ. Thus, to
show that the given set forms a basis, we only have to show that the 2 vectors
in it are linearly independent. For this, consider the equation
𝛼 (1, 4) + 𝛽 (0, 1) = 0, where 𝛼, 𝛽 ∈ ℝ. Then (𝛼, 4𝛼 + 𝛽) = (0, 0) ⟹ 𝛼 = 0, 𝛽 = 0.
∗∗∗
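The determinant gives the same conclusion in one step. A minimal numpy sketch:

```python
import numpy as np

# The two vectors (1, 4) and (0, 1) form a basis of R^2 iff the matrix
# with these vectors as rows is invertible (nonzero determinant).
M = np.array([[1, 4], [0, 1]])
print(round(np.linalg.det(M)))  # 1, nonzero => linearly independent, a basis
```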
a) Is {u, v + w, w + t, t + u} a basis of V?
b) Is {u, t} a basis of V?
We now give two results that you must always keep in mind when dealing with
vector spaces. They depend on Theorem 5.
Theorem 6: If one basis of a vector space contains n vectors, then all its bases
contain n vectors.
So far we have been saying that “if a vector space has a basis, then …”. Now
we state the following theorem (without proof).
v = 𝛼1 v1 + ⋯ + 𝛼n vn = 𝛽1 v1 + ⋯ + 𝛽n vn .
The coordinates of a vector will depend on the particular basis chosen, as can
be seen in the following example.
Solution:
∗∗∗
Note: The basis B1 = {i, j} has the pleasing property that, for every vector
(p, q), the coordinates of (p, q) relative to B1 are (p, q). For this reason B1 is
called the standard basis of ℝ2, and the coordinates of a vector relative to the
standard basis are called the standard coordinates of the vector. In fact, this is
the basis we normally use for plotting points in 2-dimensional space.
Solution:
a) Let 2x + 1 = 𝛼 (5) + 𝛽 (3x) = 3𝛽x + 5𝛼.
Then 3𝛽 = 2, 5𝛼 = 1. So, the coordinates of 2x + 1 relative to B are
(1/5, 2/3).
E11) Find a standard basis of ℝ3 and a standard basis of the vector space P2
of all polynomials of degree ≤ 2.
E12) For the basis B = {(1, 2, 0), (2, 1, 0), (0, 0, 1)} of ℝ3, find the coordinates of
(−3, 5, 2).
E13) Prove that, for any basis B = {v1 , v2 , … , vn } of a vector space V, the
coordinates of 0 are (0, 0, …, 0).
E14) For the basis B = {3, 2x + 1, x2 − 2} of the vector space P2 of all
polynomials of degree ≤ 2, find the coordinates of
a) 6x + 6 b) (x + 1)2 c) x2
E15) For the basis B = {u, v} of ℝ2 , the coordinates of (1, 0) are (1/2, 1/2) and
the coordinates of (2, 4) are (3, −1). Find u, v.
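Exercises like E15 reduce to a small linear system, since the coordinate equations are linear in the unknown vectors u and v. A hedged numpy sketch (the matrix encoding below is one possible setup):

```python
import numpy as np

# E15 sketch: the coordinate data gives the equations
#   (1/2)u + (1/2)v = (1, 0)
#   3u    -    v    = (2, 4)
# For each component, solve the 2x2 system A @ (u_c, v_c) = rhs_c.
A = np.array([[0.5, 0.5], [3.0, -1.0]])
rhs = np.array([[1.0, 0.0], [2.0, 4.0]])  # rows: right sides of the two equations
uv = np.linalg.solve(A, rhs)              # row 0 is u, row 1 is v
print(uv)  # u = (1, 1), v = (1, -1)
```

A quick check: 3(1, 1) − (1, −1) = (2, 4), as required.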
We now continue the study of vector space by looking into their ‘dimension’, a
concept directly related to the basis of a vector space.
5.3.1 Dimension
So far we have seen that, if a vector space has a basis of n vectors, then every
basis has n vectors in it. Thus, given a vector space, the number of elements in
its different bases remains constant.
∗∗∗
E17) Prove that the real vector space ℂ of all complex numbers has dimension
two.
E18) Prove that the vector space Pn , of all polynomials of degree at most n,
has dimension n + 1.
In the next subsection, we will see how to complete a linearly independent set
in a vector space to a basis of the vector space when the dimension of the
vector space is finite.
Proof: Since m < n, W is not a basis of V (Theorem 6). Hence, [W] ≠ V. Thus,
we can find a vector v1 ∈ V such that v1 ∉ [W]. Therefore, by Theorem 4,
W1 = W ∪ {v1} is a linearly independent set containing m + 1 vectors. If
m + 1 = n, then W1 is a linearly independent set with n vectors in the
n-dimensional space V, so W1 is a basis of V (Theorem 5, Corollary 1). That is,
{w1, …, wm, v1} is a basis of V. If m + 1 < n, then [W1] ≠ V, so there is a v2 ∈ V
such that v2 ∉ [W1]. Then W2 = W1 ∪ {v2} is linearly independent and contains
m + 2 vectors. So, if m + 2 = n, then
[S] = {𝛼(2, 3, 1) |𝛼 ∈ ℝ}
= {(2𝛼, 3𝛼, 𝛼) |𝛼 ∈ ℝ }
Now we have to find v1 ∈ ℝ3 such that v1 ∉ [S], i.e., such that v1 ≠ (2𝛼, 3𝛼, 𝛼)
for any 𝛼 ∈ ℝ. We can take v1 = (1, 1, 1). Then
[S1] = {(2𝛼 + 𝛽, 3𝛼 + 𝛽, 𝛼 + 𝛽) | 𝛼, 𝛽 ∈ ℝ}
Now select v2 ∈ ℝ3 such that v2 ∉ [S1 ]. We can take v2 = (3, 4, 0). How do we
‘hit upon’ this v2 ? There are many ways. What we have done here is to take
𝛼 = 1 = 𝛽; then 2𝛼 + 𝛽 = 3, 3𝛼 + 𝛽 = 4, 𝛼 + 𝛽 = 2. So (3, 4, 2) belongs to [S1].
Then, by changing the third component from 2 to 0, we get (3, 4, 0), which is
not in [S1]. Since v2 ∉ [S1], S1 ∪ {v2} is linearly independent. That is,
∗∗∗
Note: Since we had a large number of choices for both v1 and v2 , it is obvious
that we could have extended S to get a basis of ℝ3 in many ways.
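The extension procedure above can be phrased as a greedy rank check: keep a candidate vector only if it is outside the span of what we already have. An illustrative sketch (the candidate list simply mirrors the choices made in the text):

```python
import numpy as np

# Extending S = {(2, 3, 1)} to a basis of R^3: add a candidate only if
# the rank grows, i.e. the candidate is not in the span of the current set.
basis = [np.array([2, 3, 1])]
candidates = [np.array(v) for v in [(1, 1, 1), (3, 4, 0), (1, 0, 0)]]

for c in candidates:
    if np.linalg.matrix_rank(np.vstack(basis + [c])) > len(basis):
        basis.append(c)
    if len(basis) == 3:
        break

print([v.tolist() for v in basis])  # [[2, 3, 1], [1, 1, 1], [3, 4, 0]]
```

Because there are many valid candidates, different candidate orders yield different bases, exactly as the note observes.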
Solution: We note that P2 has dimension 3, a basis being {1, x, x2 } (see E19).
So we have to add only one polynomial to S to get a basis of P2.
This shows that [S] does not contain any polynomials of degree 2. So we can
choose x2 ∈ P2 because x2 ∉ [S]. So S can be extended to {x + 1, 3x + 2, x2 },
which is a basis of P2 . Have you wondered why there is no constant term in
this basis? A constant term is not necessary. Observe that 1 is a linear
combination of x + 1 and 3x + 2, namely, 1 = 3(x + 1) − 1·(3x + 2). So, 1 ∈ [S]
and hence, ∀ 𝛼 ∈ ℝ, 𝛼·1 = 𝛼 ∈ [S].
∗∗∗
E20) Complete S = {(1, 0, 1), (2, 3, −1)} in two different ways to get two distinct
bases of ℝ3 .
a) S = {2, x2 + x, 3x3 }
b) S = {x2 + 2, x2 − 3x}
to get a basis of P3 .
Theorem 12: Let V be a vector space over a field F such that dim V = n. Let W
be a subspace of V. Then dim W ≤ n.
Proof: Since W is a vector space over F in its own right, it has a basis.
Suppose dim W = m . Then the number of elements in W’s basis is m. These
elements form a linearly independent subset of W, and hence, of V. Therefore,
by Theorem 7, m ≤ n. ■
Remark 2: If W is a subspace of V such that dim W = dim V = n then
W = V, since the basis of W is a set of linearly independent elements in V, and
we can appeal to Corollary 1.
Solution: By Theorem 12, since dim ℝ2 = 2, the only possibilities for dim V
are 0,1 and 2.
V = {𝛼 (𝛽1 , 𝛽2 ) |𝛼 ∈ ℝ} .
∗∗∗
dim V = 1 ⟹ V = {𝛼(𝛽1, 𝛽2, 𝛽3) | 𝛼 ∈ ℝ}
S = {(x, y, z) ∈ ℝ3 | x/𝛽1 = y/𝛽2 = z/𝛽3}
Let (x1 , y1 , z1 ), (x2 , y2 , z2 ) ∈ S. The set S is a line through the origin.
Further,
x1/𝛽1 = y1/𝛽2 = z1/𝛽3 and x2/𝛽1 = y2/𝛽2 = z2/𝛽3
Check that
(𝛼x1 + 𝛽x2)/𝛽1 = (𝛼y1 + 𝛽y2)/𝛽2 = (𝛼z1 + 𝛽z2)/𝛽3
S = {(x, y, z) ∈ ℝ3 | x = 0, y/𝛽2 = z/𝛽3}
Then, S is a line through the origin that lies on the yz-plane. Again, we can
show that S is a subspace and
S = V = { 𝛼 (0, 𝛽2 , 𝛽3 )| 𝛼 ∈ ℝ}
S = {(x, y, z) ∈ ℝ3 | y = 0, x/𝛽1 = z/𝛽3}
S = V = { 𝛼 (𝛽1 , 0, 𝛽3 )| 𝛼 ∈ ℝ}
If 𝛽3 = 0 and 𝛽1 ≠ 0, 𝛽2 ≠ 0, we let
S = {(x, y, z) ∈ ℝ3 | z = 0, x/𝛽1 = y/𝛽2}
S = V = {𝛼(𝛽1, 𝛽2, 0) | 𝛼 ∈ ℝ}
V = [u1 , u2 ]
𝛼1 x + 𝛼2 y + 𝛼3 z = 0
𝛽1 x + 𝛽2 y + 𝛽3 z = 0
The system has two equations in three unknowns. So, since a homogeneous
system with fewer equations than unknowns has a nonzero solution, this has a
nonzero solution (a, b, c). Let
S = {(x, y, z) ∈ ℝ3 | ax + by + cz = 0}
Then, S is a subspace of ℝ3 and u1 , u2 ∈ S. Since S contains a basis of V, it
contains V.
Solving ax + by + cz = 0 for x (assuming a ≠ 0), we can write
(x, y, z) = y(−b/a, 1, 0) + z(−c/a, 0, 1).
So, S is spanned by v1 = (−b/a, 1, 0) and v2 = (−c/a, 0, 1). Check that {v1, v2} is linearly
independent. Therefore, dim S = 2. Since V ⊆ S and dim V = dim S, it follows
that S = V.
dim V = 3 ⟹ V = ℝ3 .
∗∗∗
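The plane case above is a nullspace computation. A short sympy sketch, with illustrative values of a, b, c (assumed here, with a ≠ 0):

```python
from sympy import Matrix

# The plane S = {(x, y, z) : a*x + b*y + c*z = 0} is the nullspace of
# the 1x3 matrix [a b c], so dim S = 3 - rank = 2.
a, b, c = 1, 2, 3
N = Matrix([[a, b, c]]).nullspace()
print(len(N))  # 2, i.e. dim S = 2
print(N[0].T, N[1].T)  # basis vectors, here (-2, 1, 0) and (-3, 0, 1)
```

Note that the nullspace basis sympy returns is exactly of the form (−b/a, 1, 0), (−c/a, 0, 1) used in the text.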
Now let us go further and discuss the dimensions of sums of subspaces (see
Unit 3). If U and W are subspaces of a vector space V, then so are U + W and
U ∩ W. Thus, all these subspaces have dimensions. We relate these
dimensions in the following theorem.
of U and a basis of W.
where 𝛼i , 𝛽j , 𝜏k ∈ F ∀ i, j, k.
Then
∑ᵢ₌₁ʳ 𝛼ᵢvᵢ + ∑ⱼ₌ᵣ₊₁ᵐ 𝛽ⱼuⱼ = −∑ₖ₌ᵣ₊₁ⁿ 𝜏ₖwₖ …(2)
That is,
∑ᵢ₌₁ʳ 𝛼ᵢvᵢ + ∑ⱼ₌ᵣ₊₁ᵐ 𝛽ⱼuⱼ = ∑ᵢ₌₁ʳ 𝛿ᵢvᵢ …(3)
and
−∑ₖ₌ᵣ₊₁ⁿ 𝜏ₖwₖ = ∑ᵢ₌₁ʳ 𝛿ᵢvᵢ …(4)
where 𝛿i ∈ F ∀ i = 1, … , r
∑ 𝛿i vi + ∑ 𝜏k wk = 0,
Thus, ∑ 𝛼ᵢvᵢ + ∑ 𝛽ⱼuⱼ + ∑ 𝜏ₖwₖ = 0
⟹ 𝛼i = 0, 𝛽j = 0, 𝜏k = 0 ∀ i, j, k.
So A ∪ B is linearly independent.
∴ A ∪ B is a basis of U + W, and
dim(U + W) = dim U + dim W − dim(U ∩ W). ■
We give a corollary to Theorem 13 now.
⟹ 5 + 4 − dim(U ∩ W) ≤ 7
⟹ dim(U ∩ W) ≥ 2
Thus, dim (U ∩ W) = 2, 3 or 4.
∗∗∗
(a, b, c, d) ∈ V ⟺ b − 2c + d = 0.
⟺ (a, b, c, d) = (a, b, c, 2c − b) = a(1, 0, 0, 0) + b(0, 1, 0, −1) + c(0, 0, 1, 2)
This shows that every vector in V is a linear combination of the three linearly
independent vectors (1, 0, 0, 0), (0, 1, 0, −1), (0, 0, 1, 2). Thus, a basis of V is
{(1, 0, 0, 0), (0, 1, 0, −1), (0, 0, 1, 2)}. Hence, dim V = 3.
Next, (a, b, c, d) ∈ W ⟺ a = d, b = 2c
⟺ (a, b, c, d) = a(1, 0, 0, 1) + c(0, 2, 1, 0),
so {(1, 0, 0, 1), (0, 2, 1, 0)} is a basis of W and dim W = 2.
Finally, (a, b, c, d) ∈ V ∩ W ⟺ b − 2c + d = 0, a = d, b = 2c, which forces
d = 0, a = 0 and b = 2c. So V ∩ W = {c(0, 2, 1, 0) | c ∈ ℝ} and dim(V ∩ W) = 1.
Therefore, dim(V + W) = dim V + dim W − dim(V ∩ W) = 3 + 2 − 1 = 4.
∗∗∗
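The dimension count in this example can be verified numerically by stacking the two bases and taking ranks (a sketch; the basis vectors are the ones computed above):

```python
import numpy as np

# Verifying dim(V + W) = dim V + dim W - dim(V ∩ W):
# V has basis (1,0,0,0), (0,1,0,-1), (0,0,1,2) and W has basis
# (1,0,0,1), (0,2,1,0). dim(V + W) is the rank of the stacked bases.
V = np.array([[1, 0, 0, 0], [0, 1, 0, -1], [0, 0, 1, 2]])
W = np.array([[1, 0, 0, 1], [0, 2, 1, 0]])

dim_sum = np.linalg.matrix_rank(np.vstack([V, W]))
dim_cap = np.linalg.matrix_rank(V) + np.linalg.matrix_rank(W) - dim_sum
print(dim_sum, dim_cap)  # 4 1
```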
V = {(a, b, c) |b + 2c = 0}
W = {(a, b, c) |a + b + c = 0}
Let us now look at the dimension of a quotient space. Before going further it
may help to revise Sec. 3.5.
We also showed that it is a vector space. Hence, it must have a basis and a
dimension. The following theorem tells us what dim V/W should be.
Theorem 14: If W is a subspace of a finite-dimensional space V, then
dim (V/W) = dim V − dim W.
Then ∑ᵢ₌₁ᵏ 𝛼ᵢ(vᵢ + W) = W
⟹ (∑ᵢ₌₁ᵏ 𝛼ᵢvᵢ) + W = W
⟹ ∑ᵢ₌₁ᵏ 𝛼ᵢvᵢ ∈ W
⟹ ∑ 𝛼i vi − ∑ 𝛽j wj = 0
𝛽j = 0, 𝛼i = 0 ∀ j, i.
Thus,
∑ 𝛼i (vi + W) = W ⟹ 𝛼i = 0 ∀ i.
So B is linearly independent.
Therefore,
v + W = (∑ᵢ 𝛼ᵢwᵢ + ∑ⱼ 𝛽ⱼvⱼ) + W
= {(∑ᵢ 𝛼ᵢwᵢ) + W} + {(∑ⱼ 𝛽ⱼvⱼ) + W}
= W + ∑ⱼ₌₁ᵏ 𝛽ⱼ(vⱼ + W), since ∑ᵢ 𝛼ᵢwᵢ ∈ W
= ∑ⱼ₌₁ᵏ 𝛽ⱼ(vⱼ + W)
So, v + W ∈ [B].
Let us use this theorem to evaluate the dimensions of some familiar quotient
spaces.
Now,
This shows that every element of P4 /P2 is a linear combination of the two
elements (x4 + P2 ) and (x3 + P2 ).
∴ 𝛼x⁴ + 𝛽x³ = ax² + bx + c for some a, b, c ∈ ℝ
⟹ 𝛼 = 0, 𝛽 = 0, a = 0, b = 0, c = 0.
Thus, dim (P4 /P2 ) = 2 . Also dim (P4 ) = 5 , dim (P2 ) = 3, (see E19). Hence
dim (P4 /P2 ) = dim (P4 ) − dim (P2 ) is verified.
∗∗∗
E26) Let V be an n-dimensional real vector space. Find dim(V/V) and
dim(V/{0}).
5.6 SUMMARY
In this unit, we
1. defined a linearly independent set and gave examples of linearly
independent and linearly dependent sets of vectors;
4. discussed how to determine whether a set forms a basis for a vector space or not;
5.7 SOLUTIONS/ANSWERS
Writing the vectors as columns, suppose
𝛼1(−1, 1, 1) + 𝛼2(1, −1, 1) + 𝛼3(1, 1, 0) = (0, 0, 0)
or
−𝛼1 + 𝛼2 + 𝛼3 = 0
𝛼1 − 𝛼2 + 𝛼3 = 0
𝛼1 + 𝛼2 = 0
The RREF of the augmented matrix is
[1 0 0 0]
[0 1 0 0]
[0 0 1 0]
There are no non-pivot columns, except the last column, which doesn't
correspond to a variable. The solution to the system is 𝛼1 = 0, 𝛼2 = 0
and 𝛼3 = 0. So, the vectors are linearly independent.
b) Writing as column vectors, suppose
𝛼1(1, 3, 2) + 𝛼2(2, 1, −1) + 𝛼3(1, 2, 1) = (0, 0, 0) …(5)
or
𝛼1 + 2𝛼2 + 𝛼3 = 0
3𝛼1 + 𝛼2 + 2𝛼3 = 0
2𝛼1 − 𝛼2 + 𝛼3 = 0
[1 2 1 0]
[3 1 2 0]
[2 −1 1 0]
The RREF is
[1 0 3/5 0]
[0 1 1/5 0]
[0 0 0 0]
The third column is a non-pivot column, so the system has nonzero
solutions. Hence, the vectors are linearly dependent.
E2) Suppose 𝛼 ∈ F such that 𝛼v = 0. Then, from Unit 3 you know that 𝛼 = 0
or v = 0. But v ≠ 0, so 𝛼 = 0, and {v} is linearly independent.
E3) The set S = {(1, 0), (0, 1)} is a linearly independent subset of ℝ2. Now,
suppose ∃ T such that S ⊊ T ⊆ ℝ2. Let (x, y) ∈ T such that (x, y) ∉ S. Then
we can always find a, b, c ∈ ℝ, not all zero, such that
a(1, 0) + b(0, 1) + c(x, y) = (0, 0). (Take a = −x, b = −y, c = 1, for example.)
∴ S ∪ {(x, y)} is linearly dependent. Since this is contained in T, T is
linearly dependent. The answer to the question in this exercise is 'No'.
⟹ 𝛽0 + 𝛽1 + ⋯ + 𝛽k = 0, 𝛽1 = 0 = 𝛽2 = ⋯ = 𝛽k
⟹ 𝛽0 = 0 = 𝛽1 = ⋯ = 𝛽k
⟹ T is linearly independent.
Thus, every finite subset of {1, x + 1, x2 + 1, …} is linearly independent.
Therefore, {1, x + 1, x2 + 1, …} is a linearly independent set.
⟹ (a + d) u + bv + (b + c) w + (c + d) t = 0
E10) You know that {1, x, x2 , x3 } is a basis of P3 , and contains 4 vectors. The
given set contains 6 vectors, and hence, by Theorem 7, it must be linearly
dependent.
Then (1, 0, 0) ∉ [S] and (0, 1, 0) ∉ [S]. ∴ {(1, 0, 1), (2, 3, −1), (1, 0, 0)} and
{(1, 0, 1), (2, 3, −1), (0, 1, 0)} are two distinct bases of ℝ3.
∴ dim(V ∩ W) ≥ 1.
∴ 1 ≤ dim(V ∩ W) ≤ 2.
E24) a) Any element of V is v = (a, b, c) with b + 2c = 0.
∴ dim W = 2.
∴ dim(V ∩ W) = 1.
E25) 0, n.
x1 + 2x 2 + x 3 = 8
x1 − x 2 + x 3 = 2
x1 + x 2 + 3x 3 = 8
The augmented matrix is
[1 2 1 8]
[1 −1 1 2]
[1 1 3 8]
and row reduction gives
[1 2 1 8]
[0 1 0 2]
[0 0 1 1]
x1 + 2x 2 + x 3 = 8
x2 =2
x3 = 1
***
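The same solution can be obtained directly. A minimal numpy sketch of the system above:

```python
import numpy as np

# Solving the system
#   x1 + 2x2 +  x3 = 8
#   x1 -  x2 +  x3 = 2
#   x1 +  x2 + 3x3 = 8
A = np.array([[1, 2, 1], [1, -1, 1], [1, 1, 3]], dtype=float)
b = np.array([8, 2, 8], dtype=float)
x = np.linalg.solve(A, b)
print(x)  # [3. 2. 1.]
```

Back substitution in the row-reduced system gives the same values: x3 = 1, x2 = 2, and then x1 = 8 − 2·2 − 1 = 3.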
Example 2: In this example, we will see how to determine the current flow in
an electrical circuit using Kirchhoff's laws. We begin with the simple circuit in
Fig. 1.
Block 2: Miscellaneous Examples and Exercises
Fig. 1
In the figure, the battery symbol denotes a voltage source. The bigger stroke,
on the right, is the positive terminal and the smaller stroke, on the left, is the
negative terminal. The current always flows from the positive terminal of the
battery to the negative terminal, and this is taken as the direction of the current.
There are two resistors, of resistance 2 ohms and 4 ohms. When the current
passes through a resistor, there is a drop in voltage. The voltage drop is
governed by Ohm's law V = IR, where V is the voltage, R is the resistance
and I is the current. Kirchhoff's Voltage Law states the following:
Kirchhoff's Voltage Law: Around any closed loop, the sum of the voltage
drops equals the sum of the voltages supplied.
In the figure above, there are two voltage drops, one of 2I and the other of 4I.
Their sum equals the voltage supplied by the battery. The drop in voltage is
2I + 4I = 6I and the voltage supplied is 30 volts. So, 6I = 30 and I, the current
flow, is 30/6 = 5 amperes.
Let us now look at a slightly more complicated situation. Consider the figure
below:
Fig. 2
In this circuit, there are junctions at A and B, indicated by dots. There
are three branches, each with its own current flow. The current flows at the
junctions follow another law, called Kirchhoff's Current Law, which is
as follows:
Kirchhoff's Current Law: The current flow into any junction is equal
to the current flow out of the junction.
We consider the closed path ABCD. The total voltage supply in this path is
zero. I2 is in a direction opposite to I1. We have
Fig. 3
Fig. 4
1 0 1 0 0
0 1 2 −2 0
0 1 2 −2 0
0 2 3 −4 0
R3 → R3 − R 2 gives
1 0 1 0 0
0 1 2 −2 0
0 0 0 0 0
0 2 3 −4 0
R4 → R4 − 2R2 gives
1 0 1 0 0
0 1 2 −2 0
0 0 0 0 0
0 0 −1 0 0
The third row is an all zeros row and this is followed by a non-zero row. We
interchange the third and fourth rows to get
1 0 1 0 0
0 1 2 −2 0
0 0 −1 0 0
0 0 0 0 0
R3 → ( −1)R3 gives
1 0 1 0 0
0 1 2 −2 0
0 0 1 0 0
0 0 0 0 0
R2 → R 2 − 2R3 gives
1 0 1 0 0
0 1 0 −2 0
0 0 1 0 0
0 0 0 0 0
R1 → R1 − R3 gives
1 0 0 0 0
0 1 0 0 0
0 0 1 0 0
0 0 0 0 0
The rank is 3 and there are 4 variables. So, there is 4 − 3 = 1 free variable.
However, the column corresponding to the free variable is zero. So, we get the
unique solution x1 = 0, x2 = 0, x3 = 0 and x4 = 0.
***
Try the next exercise to check your understanding of the above example.
x1 + x 2 + 2x 3 + x 4 = 0
x1 + x 2 + x 3 + x 4 = 0
x1 + x 2 + 4x 3 + x 4 = 0
Example 4: You would have studied various chemical reactions in school. For
each reaction, there is a chemical equation. For example, when hydrogen and
oxygen react, we get water. The chemical formula for the hydrogen molecule
is H2 and the chemical formula for oxygen is O2. We represent the reaction by
the equation
2H2 + O2 → 2H2O
Note the factor 2 in H2 and H2O. Without these factors, the number of
hydrogen atoms and oxygen atoms in the reactants and the products will not
be equal. We have added these factors to balance the equation. More often
than not, we can balance a chemical equation by trial and error. However,
some reactions are so complicated that we need to use Linear Algebra
to balance the equations. Consider the reaction of sodium sulphite (Na2SO3)
with nitric acid (HNO3). The products are sodium nitrate (NaNO3), sulphur
dioxide (SO2) and water (H2O). Let us write the reaction in the form
x1Na2SO3 + x 2HNO3 → x 3NaNO3 + x 4SO2 + x 5H2O
We now write down the number of atoms of each element in the reactants and
products.
Element     Reactants    Products
Sodium      2x1          x3
Hydrogen    x2           2x5
Sulphur     x1           x4
Oxygen      3x1 + 3x2    3x3 + 2x4 + x5
Since the number of atoms in products and reactants are the same, we get the
following system of homogeneous equations:
2x1 − x3 = 0
x2 − 2x 5 = 0
x1 − x4 = 0
3x1 + 3x 2 − 3x 3 − 2x 4 − x5 = 0
The solutions are the scalar multiples of (1, 2, 2, 1, 1). Since we want an
integer solution, we set the parameter equal to 1 to get the solution (1, 2, 2, 1, 1).
So, the balanced equation is
Na2SO3 + 2HNO3 → 2NaNO3 + SO2 + H2O
We can multiply both sides of the equation by any integer to get infinitely many
solutions, but they are essentially the same.
***
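Balancing reduces to computing a nullspace. A sympy sketch of the sodium sulphite example, where each row of the matrix is one of the four element-balance equations above (products moved to the left-hand side):

```python
from sympy import Matrix

# Balancing x1*Na2SO3 + x2*HNO3 -> x3*NaNO3 + x4*SO2 + x5*H2O.
A = Matrix([
    [2, 0, -1,  0,  0],   # sodium:   2x1 - x3
    [0, 1,  0,  0, -2],   # hydrogen: x2 - 2x5
    [1, 0,  0, -1,  0],   # sulphur:  x1 - x4
    [3, 3, -3, -2, -1],   # oxygen:   3x1 + 3x2 - 3x3 - 2x4 - x5
])
(v,) = A.nullspace()        # the solution space is one-dimensional
coeffs = v / min(v)         # scale so the smallest entry is 1
print(list(coeffs))  # [1, 2, 2, 1, 1]
```

Any nonzero scalar multiple of this vector balances the equation; the smallest positive integer solution gives the conventional form.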
Here is an exercise for you to try.
E 6) In each of the following cases, check whether b is in [S]. If b is in
[S], write b as a linear combination of elements of S.
a) b = (3, 1, 0), S = {(1, 0, 1), (0, 1, −1), (0, −1, 1)}
b) b = (1, 1, 5), S = {(1, 1, 2), (−1, −1, 1), (2, 2, 1)}
c) b = (0, 3, 1), S = {(1, 1, 1), (2, 1, −1), (−1, 0, 2)}
r(A) = dim(RS(A)) = dim(CS(Aᵗ)) = r(Aᵗ)
***
Next, we consider the problem of checking whether a given set of vectors is
linearly independent or not. If we are working in ℝⁿ, we can use the concept
of the rank of a matrix.
Example 8: Check whether the following subsets of ℝ3 or ℝ4 (as the case
may be) are linearly independent or not.
b) Let au + bv = 0, a, b ∈ ℝ.
Then (−a, 6a, −12a) + (b/2, −3b, 6b) = (0, 0, 0),
i.e., −a + b/2 = 0, 6a − 3b = 0, −12a + 6b = 0. Each of these equations is
equivalent to 2a − b = 0, which is satisfied by many non-zero values of a and
b (e.g., a = 1, b = 2). So, {u, v} is linearly dependent.
c) Suppose au + bv = 0, a, b ∈ ℝ. Then
Subtracting (2) from (3), we get a = b. Putting this in (1), we have
5b = 0. ∴ b = 0, and so a = b = 0. Hence, {u, v} is linearly independent.
***
You know that the set {1, x, x2, …, xn} ⊆ P is linearly independent. For larger
and larger n, this set becomes a larger and larger linearly independent subset
of P. This example shows that in the vector space P, we can have as large a
linearly independent set as we wish. In contrast to this situation, look at the
following example, in which more than two vectors cannot be linearly
independent.
Example 9: Prove that in ℝ2 any three vectors form a linearly dependent set.
Let u = (a1, a2), v = (b1, b2) and w = (c1, c2) be any three vectors in ℝ2. We
wish to prove that there are real numbers 𝛼, 𝛽, 𝛾, not all zero, such that
𝛼u + 𝛽v + 𝛾w = 0. That is, 𝛼u + 𝛽v = −𝛾w. This reduces to the pair of
equations
𝛼a1 + 𝛽b1 = −𝛾c1
𝛼a2 + 𝛽b2 = −𝛾c2
If a1b2 − a2b1 ≠ 0, then we can give 𝛾 a non-zero value and get the
corresponding values of 𝛼 and 𝛽. Thus, if a1b2 − a2b1 ≠ 0, we see that
{u, v, w} is a linearly dependent set.
If a1b2 − a2b1 = 0 and, say, a1 ≠ 0, b1 ≠ 0, then
b1u − a1v = (b1a1, b1a2) − (a1b1, a1b2) = (0, b1a2 − a1b2) = (0, 0),
i.e., b1u − a1v + 0·w = 0, with b1 ≠ 0.
Hence, in this case also, {u, v, w} is a linearly dependent set.
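Example 9 can also be seen as a rank statement. A small numpy sketch, with arbitrary illustrative vectors:

```python
import numpy as np

# Any three vectors in R^2 span at most a 2-dimensional space, so the
# matrix with these vectors as rows has rank at most 2 < 3; the set
# must be linearly dependent.
u, v, w = (1, 2), (3, -1), (4, 4)
rank = np.linalg.matrix_rank(np.array([u, v, w]))
print(rank < 3)  # True: {u, v, w} is linearly dependent
```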
a) (1,2,3),(2,3,1),(3,1,2)
b) (1,2,3),(2,3,1),( −3, − 4,1)
c) ( −2,7,0),(4,17,2),(5, − 2,1)
d) ( −2,7,0),(4,17,2)
E8) Prove that in the vector space of all functions from ℝ to ℝ, the set
{sin x, cos x} is linearly independent, and the set
{sin x, cos x, sin(x + 𝜋/6)} is linearly dependent.
E9) Determine whether each of the following subsets of P is linearly
independent or not.
a) {x 2 ,x 2 + 2}
b) {x2 + 1, x2 + 11, 2x2 − 3}
c) {3,x + 1,x 2 ,x 2 + 2x + 5}
d) {1,x 2 ,x 3 + x 2 + 1}
SOLUTIONS/ANSWERS
Element     Reactants   Products
Carbon      6x1         x2 + 2x3
Hydrogen    12x1        6x3
Oxygen      6x1         2x2 + x3
So, the system of equations is
6x1 − x2 − 2x3 = 0
12x1 − 6x3 = 0
6x1 − 2x2 − x3 = 0
The associated matrix is
A = [6 −1 −2]
    [12 0 −6]
    [6 −2 −1]
The RREF of A is
[1 0 −1/2]
[0 1 −1]
[0 0 0]
The non-pivot column corresponds to x3. Taking x3 = 𝛼, the
solution set is x1 = 𝛼/2, x2 = 𝛼 and x3 = 𝛼, i.e., (𝛼/2, 𝛼, 𝛼).
Since we need a solution in natural numbers, we take 𝛼 = 2 and
get the solution x1 = 1, x2 = 2, and x3 = 2. The balanced equation
is
C6H12O6 → 2CO2 + 2C2H5OH
E 6) a) The associated matrix is
[1 0 0 3]
[0 1 −1 1]
[1 −1 1 0]
The row operations R3 → R3 − R1, R3 → R3 + R2 give the matrix
[1 0 0 3]
[0 1 −1 1]
[0 0 0 −2]
In the last row, the entries in all the columns except the last are
zero, and the last entry is −2. Therefore, the system of equations
is inconsistent and b is not in [S].
b) The associated matrix is
[1 −1 2 1]
[1 −1 2 1]
[2 1 1 5]
The RREF is
[1 0 1 2]
[0 1 −1 1]
[0 0 0 0]
The equations are consistent. The third column is the non-pivot
column, so we take the third variable as the free variable. We get
𝛼1 = 2 − 𝛼3 and 𝛼2 = 1 + 𝛼3. Taking 𝛼3 = 0, we get 𝛼1 = 2,
𝛼2 = 1 and 𝛼3 = 0. So,
2(1, 1, 2) + (−1, −1, 1) + 0(2, 2, 1) = (1, 1, 5)
c) The associated matrix is
[1 2 −1 0]
[1 1 0 3]
[1 −1 2 1]
The row operations R2 → R2 − R1 and R3 → R3 − R1, followed
by further reduction, give the matrix
[1 2 −1 0]
[0 1 −1 −3]
[0 0 0 8]
In the last row, the entries in all the columns except the last are
zero, and the last entry is 8. So, the associated system of
equations is inconsistent. Therefore b is not in [S].
c) Linearly dependent
e) Linearly independent.
E 8) To show that {sin x, cos x} is linearly independent, suppose a, b ∈ ℝ such
that a sin x + b cos x = 0 for all x ∈ ℝ.