
Module 1: Matrices and Linear Algebra

Lesson 4

Vector Spaces, Linear Dependence and Independence

4.1 Introduction

In this lecture we discuss the basic algebraic structure involved in linear
algebra. This structure is known as the vector space. A vector space is a non-empty
set that satisfies some conditions with respect to addition and scalar multiplication.
Recall that by a scalar we mean a real or a complex number. The set ℝ of all real
numbers is called the real field and the set ℂ of all complex numbers is called
the complex field. From here onwards, by a field F we mean either the set of real
numbers or the set of complex numbers. The elements of a vector space are usually
known as vectors and those of the field F are called scalars. In this lecture we
also discuss linear dependence and independence of vectors.

4.2 Vector Spaces

A non-empty set V together with two operations called addition (denoted by +) and
scalar multiplication (denoted by .), in short (V, +, .), is a vector space over a field
F if the following hold:

(1) V is closed under scalar multiplication, i.e. for every element α ∈ F and u ∈ V,
α.u ∈ V. (In place of α.u we usually write simply αu.)

(2) (V, +) is a commutative group, that is, (i) for every pair of elements u, v ∈ V,
u + v ∈ V; (ii) + is associative and commutative on V; (iii) V has a zero element,
denoted by 0, with respect to +, i.e. u + 0 = 0 + u = u for every element u of V;
and finally (iv) every element u of V has an additive inverse, i.e. there exists
v ∈ V such that u + v = v + u = 0.

(3) For α, β ∈ F and u ∈ V, (α + β).u = α.u + β.u.

(4) For α ∈ F and u, w ∈ V, α.(u + w) = α.u + α.w.

(5) For α, β ∈ F and u ∈ V, α.(β.u) = (αβ).u.

(6) 1.u = u, for all u ∈ V, where 1 is the multiplicative identity of F.

If V is a vector space over F then the elements of V are called vectors and the
elements of F are called scalars.

For vectors v1, v2, . . . , vn in V and scalars α1, α2, . . . , αn in F the expression
α1v1 + α2v2 + . . . + αnvn is called a linear combination of v1, v2, . . . , vn. Notice
that V contains all finite linear combinations of its elements; hence it is also called
a linear space.

Examples 4.2.1: Here we give examples of some vector spaces.

(1) ℂ is a vector space over ℝ. But ℝ is not a vector space over ℂ as it is not
closed under scalar multiplication.

(2) If F = ℝ or ℂ then Fn = {(x1, x2, . . . , xn) : xi ∈ F, 1 ≤ i ≤ n} is a vector space
over F where addition and scalar multiplication are as defined below:

For x = (x1, x2, . . . , xn), y = (y1, y2, . . . , yn) ∈ Fn and α ∈ F,

x + y = (x1 + y1, x2 + y2, . . . , xn + yn),

αx = (αx1, αx2, . . . , αxn).

Fn is also called the n-tuple space.
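
To make these two operations concrete, here is a minimal Python sketch (our own
illustration, not part of the lesson; the helper names add and scale are ours):

    # Component-wise operations on the n-tuple space F^n (here F is the reals).
    def add(x, y):
        # x + y = (x1 + y1, x2 + y2, ..., xn + yn)
        return tuple(xi + yi for xi, yi in zip(x, y))

    def scale(alpha, x):
        # alpha.x = (alpha*x1, alpha*x2, ..., alpha*xn)
        return tuple(alpha * xi for xi in x)

    x, y = (1.0, 2.0, 3.0), (4.0, 5.0, 6.0)
    print(add(x, y))      # (5.0, 7.0, 9.0)
    print(scale(2.0, x))  # (2.0, 4.0, 6.0)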



(3) The Space of m × n Matrices: Here Fm × n is the set of all m × n matrices over
F. Fm × n is a vector space over F with respect to matrix addition and scalar
multiplication of matrices.

(4) The space of polynomials over F: Let P(F) be the set of all polynomials over
F, i.e.,

P(F) = {a0 + a1x + . . . + anxn : ai ∈ F, 0 ≤ i ≤ n, n ≥ 0 is an integer}.

P(F) is a vector space over F with respect to addition and scalar multiplication
of polynomials, that is,

(a0 + a1x + . . . + anxn) + (b0 + b1x + . . . + bmxm) = c0 + c1x + c2x2 + . . . + ckxk,

where ci = ai + bi, k = max{m, n}, with ai = 0 for i > n and bj = 0 for j > m. And

α(a0 + a1x + . . . + anxn) = αa0 + αa1x + . . . + αanxn.
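
These operations can also be written out directly. Below is a minimal Python
sketch (our own illustration), representing a polynomial a0 + a1x + . . . + anxn by
its coefficient list [a0, a1, . . . , an]; the names poly_add and poly_scale are ours:

    # Polynomial addition: ci = ai + bi, with missing coefficients taken as 0.
    def poly_add(p, q):
        k = max(len(p), len(q))        # k = max{m, n} (list length is degree + 1)
        p = p + [0] * (k - len(p))     # pad with ai = 0 for i > n
        q = q + [0] * (k - len(q))     # pad with bj = 0 for j > m
        return [a + b for a, b in zip(p, q)]

    # Scalar multiplication: alpha.(a0 + a1*x + ...) = alpha*a0 + alpha*a1*x + ...
    def poly_scale(alpha, p):
        return [alpha * a for a in p]

    # (1 + 2x) + (3 + x + 4x^2) = 4 + 3x + 4x^2
    print(poly_add([1, 2], [3, 1, 4]))  # [4, 3, 4]
    print(poly_scale(2, [1, 2]))        # [2, 4]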

The following results can be verified easily (the proofs are left as an exercise).

Theorem 4.2.1: If V is a vector space over F then

(a) α.0 = 0, for α ∈ F; here 0 is the additive identity of V, i.e. the zero vector.

(b) 0.u = 0, for u ∈ V; here the 0 on the left hand side is the scalar zero, i.e.
the additive identity of F, and the 0 on the right hand side is the zero vector in V.

(c) (−α).u = −(α.u), for all α ∈ F, u ∈ V.

(d) If u ≠ 0 in V then α.u = 0 implies α = 0.
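
As a sample of the exercise: (a) follows from α.0 = α.(0 + 0) = α.0 + α.0; adding
the additive inverse of α.0 to both sides gives α.0 = 0. For (d), if α ≠ 0 then
u = 1.u = (α⁻¹α).u = α⁻¹.(α.u) = α⁻¹.0 = 0 by (a), contradicting u ≠ 0; hence α = 0.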

4.3 Subspaces

For every algebraic structure we have the concept of sub-structures. Here we
discuss subspaces of vector spaces.

Let V be a vector space over F. A subset W of V is called a subspace of V if W is
closed under ‘+’ and ‘.’ (the addition and scalar multiplication of V). In other
words, (i) for u, v ∈ W, u + v ∈ W, and (ii) for u ∈ W and α ∈ F, αu ∈ W.

The above two conditions can be combined and expressed in a single statement:
W is a subspace of V if and only if for u, v ∈ W and scalars α, β ∈ F,
αu + βv ∈ W.
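
This combined criterion is easy to test numerically. The following minimal Python
sketch (our own illustration; the choice of W and the helper name in_W are
assumptions made for the example) checks it for W = {(x1, x2, x3) ∈ ℝ³ :
x1 + x2 + x3 = 0}:

    import numpy as np

    # W = {x in R^3 : x1 + x2 + x3 = 0} is a subspace of R^3.
    def in_W(x, tol=1e-12):
        return abs(x.sum()) < tol

    u = np.array([1.0, -1.0, 0.0])  # u is in W
    v = np.array([2.0, 3.0, -5.0])  # v is in W
    alpha, beta = 4.0, -7.0

    # The criterion: alpha*u + beta*v must land back in W.
    print(in_W(u), in_W(v), in_W(alpha * u + beta * v))  # True True True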

Example 4.3.1: Here we give some examples of subspaces.

(1) The zero vector of the vector space V alone, i.e. {0}, and the vector space V
itself are subspaces of V. These are called the trivial subspaces of V.

(2) Let V = ℝ², the Euclidean plane, and let W be a straight line in ℝ² passing
through the origin (0, 0), i.e. W = {(x, y) ∈ ℝ² : ax + by = 0} for fixed scalars
a, b. Then W is a subspace of ℝ². Whereas the straight lines which do not pass
through the origin are not subspaces of ℝ².

(3) The set of all n × n symmetric matrices over F forms a subspace of Fn × n (F is
a field).

(4) The set of all n × n Hermitian matrices is not a subspace of ℂn × n (the
collection of all n × n complex matrices), because if A is a Hermitian matrix
then the diagonal entries of A are real and so iA is not a Hermitian matrix.
(However, the set of all n × n Hermitian matrices forms a vector space over ℝ.)

4.4 Linear Span

Let V be a vector space over F and S be a subset of V. The linear span of S,
denoted by L(S), is the collection of all possible finite linear combinations of
elements in S. L(S) satisfies the properties given in the theorem below.

Theorem 4.4.1: For any subset S of a vector space V,

(1) L(S) is a subspace of V.

(2) L(S) is the smallest subspace of V containing S, i.e. if W is any subspace of V
containing S then L(S) is contained in W.

Example 4.4.1: In ℝ², if S = {(2, 3)} then L(S) is the straight line passing through
(0, 0) and (2, 3), i.e. L(S) = {(x, y) : 3x − 2y = 0}. If S = {(1, 0), (0, 1)} then
L(S) = ℝ².
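
Span membership can also be tested computationally: v belongs to L(S) exactly when
the linear system whose coefficient columns are the vectors of S is consistent.
Here is a minimal NumPy sketch (our own illustration; the helper name in_span is
ours):

    import numpy as np

    # v is in L(S) iff v is a linear combination of the vectors in S,
    # i.e. iff A @ alpha = v has a solution, where the columns of A
    # are the vectors of S.
    def in_span(S, v, tol=1e-10):
        A = np.column_stack(S)
        alpha = np.linalg.lstsq(A, v, rcond=None)[0]  # least-squares solve
        return np.allclose(A @ alpha, v, atol=tol)    # consistent iff residual ~ 0

    S = [np.array([2.0, 3.0])]
    print(in_span(S, np.array([4.0, 6.0])))  # True: (4, 6) lies on 3x = 2y
    print(in_span(S, np.array([1.0, 1.0])))  # False: (1, 1) is off the line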

4.5 Linear Dependence/Independence

A vector space can often be described in terms of very few of its elements,
provided these elements span the space and satisfy a condition called linear
independence. Such a compact representation of a vector space is essential in
many subjects, such as Information and Coding Theory.

Consider a vector space V over a field F and a set S = {v1, v2, . . . , vk} of vectors
in V. S is said to be linearly dependent if there exist scalars α1, α2, . . . , αk
(in F), not all zero, such that

α1v1 + α2v2 + . . . + αkvk = 0.

If S is not linearly dependent then it is called linearly independent. In other
words, S is linearly independent if, whenever α1v1 + α2v2 + . . . + αkvk = 0, all the
scalars αi have to be zero. This suggests a method to verify linear dependence or
independence of a given finite set of vectors, as given in the next sub-section.

4.5.1 Verification of Linear Dependence/Independence

Suppose the given set of vectors is S = {v1, v2, . . . , vk}.

Step 1: Equate a linear combination of these vectors to the zero vector, that is,
α1v1 + α2v2 + . . . + αkvk = 0, where the αi’s are scalars that we have to find.

Step 2: Solve for the scalars α1, α2, . . . , αk. If all are equal to zero then S is a
linearly independent set; otherwise (i.e. at least one αi is non-zero) S is linearly
dependent.
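
This procedure can be carried out mechanically: S is linearly independent exactly
when the matrix with the vi as rows has rank k, since a rank deficiency means the
homogeneous system has a non-trivial solution. A minimal NumPy sketch (our own
illustration; the helper name is_independent is ours):

    import numpy as np

    # S is linearly independent iff the matrix with the vectors as rows
    # has rank equal to the number of vectors.
    def is_independent(vectors):
        A = np.array(vectors, dtype=float)
        return np.linalg.matrix_rank(A) == len(vectors)

    print(is_independent([(1, 0), (0, 1)]))  # True
    print(is_independent([(1, 2), (2, 4)]))  # False: (2, 4) = 2*(1, 2)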

Properties 4.5.1: Some properties of linearly dependent/independent vectors are
given below.

(1) A superset of a linearly dependent set is linearly dependent.

(2) A subset of a linearly independent set is linearly independent.

(3) Any set which contains the zero vector is linearly dependent.

Example 4.5.1: Let V = ℝ³ be the vector space (over ℝ) and let S1 = {(1, 2, 3),
(1, 0, 2), (2, 1, 5)} and S2 = {(2, 0, 6), (1, 2, −4), (3, 2, 2)} be subsets of V.
We check the linear dependence/independence of S1 and S2.

First consider the set S1. Let α1, α2, α3 be scalars such that

α1(1, 2, 3) + α2(1, 0, 2) + α3(2, 1, 5) = (0, 0, 0).

Then we have

(α1 + α2 + 2α3, 2α1 + α3, 3α1 + 2α2 + 5α3) = (0, 0, 0),

which is equivalent to the system

α1 + α2 + 2α3 = 0
2α1 + α3 = 0
3α1 + 2α2 + 5α3 = 0.

On solving this system we get α1 = α2 = α3 = 0, so S1 is linearly independent.


Next for S2, we can take α1 = α2 = 1 and α3 = − 1 and get

α1(2, 0, 6) + α2(1, 2, − 4) + α3(3, 2, 2) = 0.

So S2 is a linearly dependent set.
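
Since the number of vectors here equals the number of coordinates, the coefficient
matrix of the homogeneous system is square, so one may equivalently check its
determinant: a non-zero determinant forces the trivial solution. A short NumPy
sketch (our own illustration) for S1 and S2:

    import numpy as np

    # Columns are the vectors of S1 (respectively S2).
    A1 = np.array([[1, 1, 2],
                   [2, 0, 1],
                   [3, 2, 5]], dtype=float)
    A2 = np.array([[2, 1, 3],
                   [0, 2, 2],
                   [6, -4, 2]], dtype=float)

    print(np.linalg.det(A1))  # -1.0: non-zero, so S1 is linearly independent
    print(np.linalg.det(A2))  # 0.0 (up to rounding), so S2 is linearly dependent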

We can also test linear dependence/independence of vectors in Fn (in particular
in ℝn) using the echelon form of a matrix. This method is explained in the
example below.

Example 4.5.2: Let V = ℝ⁴ and let S = {(1, 2, 1, −2), (2, 1, 3, −1), (2, 0, 1, 4)}
and S1 = {(0, 1, 2, −1), (1, 2, 0, 3), (1, 3, 2, 2), (0, 1, 1, 1)} be subsets of V.
We will check the linear dependence/independence of S and S1.

We consider S first. We write the vectors in S as the rows of a matrix, then apply
elementary row operations to convert it to echelon form. If there is a zero row in
the echelon form then the set is linearly dependent; otherwise it is linearly
independent.

 1 2 1 −2   1 2 1 −2 
  R 2 →R 2 − 2R1  
 2 1 3 −1   →  0 −3 1 3 
2 0 1 4  2 0 1 4 
   

 1 2 1 −2   1 2 1 −2 

R 3 → R 3 − 2R1  R 3 →3R 3 − 2R1  

→  0 −3 1 3   →  0 −3 1 3 
 0 −2 −1 8   0 0 −5 18 
   

The last matrix is in echelon form and all the rows are non-zero. Hence S is
linearly independent.

Next we consider

S1 = {(0, 1, 2, − 1), (1, 2, 0, 3), (1, 3, 2, 2), (0, 1, 1, 1)}.

While forming the matrix we do not have to take the 1st vector in S1 as the 1st
row, the 2nd vector as the 2nd row and so on. Since we have to convert the matrix
into echelon form, we may take as the 1st row a vector in S1 whose 1st entry is
non-zero. So let the matrix be

1 2 0 3
 
0 1 1 1
.
0 1 2 −1
 
1 3 2 2

We convert this to echelon form by applying elementary row operations, obtaining

[ 1  2  0   3 ]
[ 0  1  1   1 ]
[ 0  0  1  −2 ]
[ 0  0  0   0 ]

There is a zero row in the echelon form so S1 is linearly dependent.
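
For comparison, the echelon form computations in this example can be reproduced
with a computer algebra system; a minimal SymPy sketch (our own illustration):

    import sympy as sp

    # Rows are the vectors of S and S1; a zero row in the echelon form
    # signals linear dependence.
    A  = sp.Matrix([[1, 2, 1, -2], [2, 1, 3, -1], [2, 0, 1, 4]])
    A1 = sp.Matrix([[1, 2, 0, 3], [0, 1, 1, 1], [0, 1, 2, -1], [1, 3, 2, 2]])

    print(A.echelon_form())   # no zero row: S is linearly independent
    print(A1.echelon_form())  # last row is zero: S1 is linearly dependent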

4.6 Conclusions

Vector spaces are the main ingredients of the subject of linear algebra. Here we
have studied an important property of vectors, namely linear
dependence/independence. This property will be used in almost all the lectures. In
the next lecture we discuss some more basic terminology associated with a vector
space.

Keywords: Vectors, scalars, vector spaces, subspaces, linearly dependent or
independent vectors.

Suggested Readings:

Linear Algebra, Kenneth Hoffman and Ray Kunze, PHI Learning Pvt. Ltd., New
Delhi, 2009.

Linear Algebra, A. R. Rao and P. Bhimasankaram, Hindustan Book Agency, New
Delhi, 2000.

Linear Algebra and Its Applications, Fourth Edition, Gilbert Strang, Thomson
Books/Cole, 2006.

Matrix Methods: An Introduction, Second Edition, Richard Bronson, Academic
Press, 1991.
