Lesson 4
Vector Spaces, Linear Dependence and Independence
4.1 Introduction
In this lecture we discuss the basic algebraic structure underlying linear
algebra. This structure is known as the vector space. A vector space is a non-empty
set that satisfies certain conditions with respect to addition and scalar multiplication.
Recall that by a scalar we mean a real or a complex number. The set ℝ of all real
numbers is called the real field and the set ℂ of all complex numbers is called
the complex field. From here on, by a field F we mean either the set of real or the set of
complex numbers. The elements of a vector space are usually known as vectors and
those of the field F are called scalars. In this lecture we also discuss linear
dependence and independence of vectors.
4.2 Vector Spaces
A non-empty set V together with two operations called addition (denoted by +) and
scalar multiplication (denoted by .), in short (V, +, .), is a vector space over a field
F if the following hold:
(1) V is closed under scalar multiplication, i.e. for every element α ∈ F and u ∈ V,
α.u ∈ V. (In place of α.u we usually write simply αu.)
(2) (V, +) is a commutative group, that is, (i) for every pair of elements u, v ∈ V,
u + v ∈ V; (ii) + is associative and commutative on V; (iii) V has a zero element,
denoted by 0, with respect to +, i.e. u + 0 = 0 + u = u for every element u of V;
and (iv) every element u of V has an additive inverse, i.e. there exists v ∈ V such
that u + v = v + u = 0.
(3) Scalar multiplication is compatible with the other operations, i.e. for all
α, β ∈ F and u, v ∈ V: α(u + v) = αu + αv, (α + β)u = αu + βu, (αβ)u = α(βu)
and 1.u = u.
If V is a vector space over F then the elements of V are called vectors and the
elements of F are called scalars.
For vectors v1, v2, . . . , vn in V and scalars α1, α2, . . . , αn in F the expression
α1v1 + α2v2 + . . . + αnvn is called a linear combination of v1, v2, . . . , vn. Notice that V
contains all finite linear combinations of its elements; hence it is also called a linear
space.
(1) ℂ is a vector space over ℝ. But ℝ is not a vector space over ℂ, as it is not
closed under scalar multiplication (for instance, the scalar multiple i.1 = i is not in ℝ).
(2) If F = ℝ or ℂ then Fn = {(x1, x2, . . . , xn) : xi ∈ F, 1 ≤ i ≤ n} is a vector space
over F, where addition and scalar multiplication are defined componentwise:
(x1, . . . , xn) + (y1, . . . , yn) = (x1 + y1, . . . , xn + yn) and α(x1, . . . , xn) = (αx1, . . . , αxn)
(a computational sketch of these operations follows example (4) below).
(3) The space of m × n matrices: Here Fm×n is the set of all m × n matrices over
F. Fm×n is a vector space over F with respect to matrix addition and
scalar multiplication of matrices.
(4) The space of polynomials over F: Let P(F) be the set of all polynomials over
F, i.e.,
P(F) = {a0 + a1x + . . . + anxn : n ≥ 0 and ai ∈ F for 0 ≤ i ≤ n}.
P(F) is a vector space over F with respect to addition and scalar multiplication
of polynomials, that is, writing p(x) = a0 + a1x + . . . + anxn and
q(x) = b0 + b1x + . . . + bnxn (padding the shorter polynomial with zero
coefficients) and taking α ∈ F,
(p + q)(x) = (a0 + b0) + (a1 + b1)x + . . . + (an + bn)xn and
(αp)(x) = αa0 + αa1x + . . . + αanxn.
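To make examples (2) and (4) concrete, here is a minimal Python sketch of the
componentwise operations on Fn and of the polynomial operations; the function
names and the coefficient-list convention are our own choices, not part of the text.

from itertools import zip_longest

# Example (2): componentwise operations on F^n (F = R or C).
def vec_add(u, v):
    # Add two vectors of F^n component by component.
    return tuple(x + y for x, y in zip(u, v))

def scalar_mul(alpha, u):
    # Multiply every component of u by the scalar alpha.
    return tuple(alpha * x for x in u)

print(vec_add((1, 2, 3), (4, 5, 6)))   # (5, 7, 9)
print(scalar_mul(1j, (1, 2)))          # (1j, 2j): the same code works over C

# Example (4): a polynomial is stored as its coefficient list [a0, a1, ...].
def poly_add(p, q):
    # Add coefficients position by position, padding the shorter list with 0.
    return [a + b for a, b in zip_longest(p, q, fillvalue=0)]

def poly_scale(alpha, p):
    # Multiply every coefficient by the scalar alpha.
    return [alpha * a for a in p]

print(poly_add([1, 0, 2], [3, 4]))     # [4, 4, 2], i.e. 4 + 4x + 2x^2
print(poly_scale(2, [1, 0, 2]))        # [2, 0, 4], i.e. 2 + 4x^2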
The following results can be verified easily (the proofs are left as exercises).
(a) α.0 = 0 for α ∈ F; here 0 is the additive identity of V, i.e. the zero vector.
(b) 0.u = 0 for u ∈ V; here the 0 on the left hand side is the scalar zero, i.e. the
additive identity of F, and the 0 on the right hand side is the zero vector in V.
4.3 Subspaces
A non-empty subset W of a vector space V over F is called a subspace of V if
(i) W is closed under addition and (ii) W is closed under scalar multiplication.
These two conditions can be combined and expressed in a single statement:
W is a subspace of V if and only if for all u, v ∈ W and scalars α, β ∈ F,
αu + βv ∈ W.
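As a quick numerical illustration of this combined condition, here is a sketch in
Python; the particular line W = {(x, y) ∈ ℝ² : 2x + 3y = 0} and the helper name
in_W are our own choices.

import random

def in_W(p, tol=1e-9):
    # Membership test for the sample subspace W = {(x, y) : 2x + 3y = 0}.
    x, y = p
    return abs(2 * x + 3 * y) < tol

# Points of W have the form (3t, -2t); check that alpha*u + beta*v stays in W.
for _ in range(1000):
    t, s = random.uniform(-5, 5), random.uniform(-5, 5)
    alpha, beta = random.uniform(-5, 5), random.uniform(-5, 5)
    u, v = (3 * t, -2 * t), (3 * s, -2 * s)
    w = (alpha * u[0] + beta * v[0], alpha * u[1] + beta * v[1])
    assert in_W(w)
print("alpha*u + beta*v remained in W in every sampled case")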
(1) The zero vector of the vector space V alone, i.e. {0}, and the vector space V
itself are subspaces of V. These subspaces are called the trivial subspaces of V.
(2) Let V = ℝ², the Euclidean plane, and let W be a straight line in ℝ² passing
through the origin, i.e. W = {(x, y) ∈ ℝ² : ax + by = 0} for some fixed a, b ∈ ℝ.
Then W is a subspace of ℝ², whereas the straight lines which do not pass through
the origin are not subspaces of ℝ².
(4) The set of all n × n Hermitian matrices is not a subspace of ℂn×n (the
collection of all n × n complex matrices), because if A is a Hermitian matrix
then the diagonal entries of A are real, and so, when A has a non-zero diagonal
entry (e.g. A = I), iA is not a Hermitian matrix. (However, the set of all n × n
Hermitian matrices forms a vector space over ℝ.)
4.4 Linear Span
Let V be a vector space over F and S be a subset of V. The linear span of S, denoted
by L(S), is the collection of all possible finite linear combinations of elements of S.
L(S) is a subspace of V; in fact, it is the smallest subspace of V containing S.
Example 4.4.1: In ℝ², if S = {(2, 3)} then L(S) is the straight line passing through
(0, 0) and (2, 3), i.e. L(S) = {(x, y) ∈ ℝ² : 3x − 2y = 0}. If S = {(1, 0), (0, 1)}
then L(S) = ℝ².
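Membership in a linear span can also be tested computationally: w lies in L(S)
exactly when appending w to the rows of S does not raise the rank. A minimal
sketch using the sympy library (the helper name in_span is our own):

from sympy import Matrix

def in_span(S, w):
    # w lies in the span of the rows in S iff adding w does not raise the rank.
    return Matrix(S).rank() == Matrix(S + [w]).rank()

S = [[2, 3]]                 # S = {(2, 3)} in R^2
print(in_span(S, [4, 6]))    # True: (4, 6) = 2(2, 3) lies on 3x - 2y = 0
print(in_span(S, [1, 1]))    # False: (1, 1) does not lie on that line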
A vector space can be expressed in terms of very few of its elements, provided
these elements span the space and satisfy a condition called linear
independence. Such a short representation of a vector space is essential in many
subjects, such as Information and Coding Theory.
4.5 Linear Dependence and Independence
Consider a vector space V over a field F and a set S = {v1, v2, . . . , vk} of vectors in
V. S is said to be linearly dependent if there exist scalars α1, α2, . . . , αk in F, not
all zero, such that
α1v1 + α2v2 + . . . + αkvk = 0.
Otherwise S is said to be linearly independent. To check the linear
dependence/independence of S we proceed as follows.
Step 1: Equate a linear combination of these vectors to the zero vector, that is,
α1v1 + α2v2 + . . . + αkvk = 0, where the αi's are scalars that we have to find.
Step 2: Solve for the scalars α1, α2, . . . , αk. If all of them must equal zero then S
is a linearly independent set; otherwise (i.e. if at least one αi can be non-zero) S is
linearly dependent.
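These two steps amount to asking whether a homogeneous linear system has only
the zero solution, which a rank computation decides: S is linearly independent
exactly when the matrix with the vectors of S as rows has rank equal to the number
of vectors. A minimal sketch using sympy (the function name is our own):

from sympy import Matrix

def is_linearly_independent(vectors):
    # Rank of the matrix whose rows are the given vectors equals the number
    # of vectors exactly when the homogeneous system has only the zero solution.
    return Matrix(vectors).rank() == len(vectors)

print(is_linearly_independent([[1, 0], [0, 1]]))   # True
print(is_linearly_independent([[1, 2], [2, 4]]))   # False: (2, 4) = 2(1, 2)
print(is_linearly_independent([[0, 0], [1, 5]]))   # False: contains the zero vector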
(3) Any set which contains the zero vector is linearly dependent, since the zero
vector can be given the non-zero coefficient 1 in a linear combination that is still 0.
Example 4.5.1: Let V = ℝ³ be the vector space (over ℝ) and let S1 = {(1, 2, 3), (1, 0,
2), (2, 1, 5)} and S2 = {(2, 0, 6), (1, 2, −4), (3, 2, 2)} be subsets of V. We check the
linear dependence/independence of S1 and S2.
First consider the set S1. Let α1, α2, α3 be scalars such that
α1(1, 2, 3) + α2(1, 0, 2) + α3(2, 1, 5) = (0, 0, 0).
Then we have
(α1 + α2 + 2α3, 2α1 + α3, 3α1 + 2α2 + 5α3) = (0, 0, 0).
This homogeneous system can be solved by reducing the matrix whose rows are the
vectors of S1 to echelon form:

1  2  3
1  0  2
2  1  5

R2 → R2 − R1, R3 → R3 − 2R1:

1  2  3
0 −2 −1
0 −3 −1

R3 → 2R3 − 3R2:

1  2  3
0 −2 −1
0  0  1
The last matrix is in echelon form and all of its rows are non-zero, so the system has
only the zero solution. Hence S1 is linearly independent.
Next we consider S2. Let α1(2, 0, 6) + α2(1, 2, −4) + α3(3, 2, 2) = (0, 0, 0).
Proceeding as before, we reduce the matrix whose rows are the vectors of S2:

2  0  6
1  2 −4
3  2  2

R2 → 2R2 − R1, R3 → 2R3 − 3R1:

2  0  6
0  4 −14
0  4 −14

R3 → R3 − R2:

2  0  6
0  4 −14
0  0  0

The last matrix has a zero row, so the system has a non-zero solution and S2 is
linearly dependent; indeed (2, 0, 6) = (3, 2, 2) − (1, 2, −4).
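Both conclusions of Example 4.5.1 can be cross-checked with the rank computation
sketched earlier (a sympy sketch):

from sympy import Matrix

S1 = Matrix([[1, 2, 3], [1, 0, 2], [2, 1, 5]])
S2 = Matrix([[2, 0, 6], [1, 2, -4], [3, 2, 2]])
print(S1.rank())   # 3 -> S1 is linearly independent
print(S2.rank())   # 2 -> S2 is linearly dependent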
While forming the matrix we do not need to take the 1st vector in S as the 1st row,
the 2nd vector as the 2nd row, and so on. Since we have to convert the matrix into
echelon form, we may take as the 1st row of the matrix a vector in S whose 1st
entry is non-zero. So let the matrix be
1  2  0  3
0  1  1  1
0  1  2 −1
1  3  2  2

Reducing this matrix to echelon form produces a zero row (R4 − R1 equals R3), so
the corresponding set of four vectors is linearly dependent.
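The same rank check decides this case as well (a sympy sketch):

from sympy import Matrix

M = Matrix([[1, 2, 0, 3],
            [0, 1, 1, 1],
            [0, 1, 2, -1],
            [1, 3, 2, 2]])
print(M.rank())    # 3 < 4, so the four row vectors are linearly dependent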
4.6 Conclusions
Vector spaces are the main ingredients of the subject of linear algebra. Here we have
studied an important property of sets of vectors, namely linear
dependence/independence. This property will be used in almost all of the lectures. In
the next lecture we discuss some more basic terminology associated with a
vector space.
Suggested Readings:
Linear Algebra, Kenneth Hoffman and Ray Kunze, PHI Learning Pvt. Ltd., New
Delhi, 2009.
Linear Algebra and Its Applications, Fourth Edition, Gilbert Strang, Thomson
Brooks/Cole, 2006.