Vector Space
• Sets : A set is a well-defined collection of objects. The objects in a set can be anything:
numbers, people, letters, rivers, etc. These objects are called the elements or members of
the set.
For each 𝑎 ∈ 𝐺 there exists an element 𝑎′ ∈ 𝐺 such that 𝑎 ∗ 𝑎′ = 𝑎′ ∗ 𝑎 = 𝑒; the element 𝑎′ is called the inverse of 𝑎.
Example : (𝑄, +), (𝑍, +) and (𝑅, +) are groups.
(𝑍⁺, +) is not a group.
• Abelian group : A group 𝐺 is abelian if its operation is commutative, i.e., 𝑥 ∗ 𝑦 = 𝑦 ∗ 𝑥 for all 𝑥, 𝑦 ∈ 𝐺.
Ring : A ring (𝑅, +, ·) is a set 𝑅 together with two binary operations +
and ·, which we call addition and multiplication, defined on 𝑅 such
that the following axioms are satisfied:
1. (𝑅, +) is an abelian group.
2. Multiplication is associative, i.e., 𝑎 · (𝑏 · 𝑐) = (𝑎 · 𝑏) · 𝑐.
3. For all 𝑎, 𝑏, 𝑐 ∈ 𝑅, the left distributive law, 𝑎 · (𝑏 + 𝑐) = (𝑎 · 𝑏) + (𝑎 · 𝑐),
and the right distributive law, (𝑎 + 𝑏) · 𝑐 = (𝑎 · 𝑐) + (𝑏 · 𝑐), hold.
• Field : Let 𝐹 be a set on which addition and multiplication are defined such that we can find the sum 𝑥 + 𝑦 and the product 𝑥 · 𝑦 of any two elements 𝑥 and 𝑦. Then the set 𝐹 is
called a field, if the following axioms are satisfied for all the elements of the set.
5. Identity elements : There exist in 𝐹 an element 0 and an element 1 such that for every 𝑥 in 𝐹, 𝑥 + 0 = 𝑥
and 𝑥 · 1 = 𝑥. These are called the zero element and the unity element respectively.
6. Inverse elements : (a) For each 𝑥 in 𝐹, there exists an element denoted by −𝑥, called the negative or
additive inverse of 𝑥, such that 𝑥 + (−𝑥) = 0.
(b) For each 𝑥 ≠ 0 in 𝐹, there exists an element denoted by 𝑥⁻¹, called the multiplicative inverse of 𝑥,
such that 𝑥 · 𝑥⁻¹ = 1. The element 𝑥⁻¹ can be written as 1/𝑥 when convenient.
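As a concrete illustration (a Python sketch, not part of the notes; the prime modulus p = 7 is chosen arbitrarily), the integers modulo a prime form a field, so every element has a negative and every nonzero element has a multiplicative inverse:

```python
# The integers modulo a prime p form a field: every x has an additive
# inverse -x, and every nonzero x has a multiplicative inverse x^(-1).

p = 7  # a prime modulus, so the integers mod p form a field

for x in range(p):
    neg = (-x) % p                  # additive inverse: x + (-x) = 0 (mod p)
    assert (x + neg) % p == 0
    if x != 0:
        inv = pow(x, -1, p)         # multiplicative inverse (Python 3.8+): x * x^(-1) = 1 (mod p)
        assert (x * inv) % p == 1
        print(f"x = {x}: -x = {neg}, x^(-1) = {inv}")
```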
Vector Space
Definition : Let 𝑉 be a non-empty set with two operations:
(i) Vector addition : this assigns to any 𝑢, 𝑣 ∈ 𝑉 a sum 𝑢 + 𝑣 in 𝑉.
(ii) Scalar multiplication : this assigns to any 𝑢 ∈ 𝑉 and 𝑘 ∈ 𝐾 a product 𝑘𝑢 ∈ 𝑉.
Then 𝑉 is called a vector space (over the field 𝐾) if the following axioms hold for any vectors 𝑢, 𝑣, 𝑤 ∈ 𝑉.
[A1] (𝑢 + 𝑣) + 𝑤 = 𝑢 + (𝑣 + 𝑤) (associative law)
[A2] There is a vector in 𝑉, denoted by 0 and called the zero vector, such that, for any 𝑢 ∈ 𝑉, 𝑢 + 0 = 0 + 𝑢 = 𝑢.
[A3] For each 𝑢 ∈ 𝑉, there is a vector in 𝑉, denoted by −𝑢 and called the negative of 𝑢, such that 𝑢 + (−𝑢) = (−𝑢) + 𝑢 = 0.
Example 1 : Let 𝐾 be an arbitrary field and let 𝑉 = 𝐾ⁿ, the set of all 𝑛-tuples of elements of 𝐾, with
vector addition:
(𝑥₁, 𝑥₂, …, 𝑥ₙ) + (𝑦₁, 𝑦₂, …, 𝑦ₙ) = (𝑥₁ + 𝑦₁, 𝑥₂ + 𝑦₂, …, 𝑥ₙ + 𝑦ₙ)
scalar multiplication:
𝑘 · (𝑥₁, 𝑥₂, …, 𝑥ₙ) = (𝑘𝑥₁, 𝑘𝑥₂, …, 𝑘𝑥ₙ)
(𝑉, +, ·) is a vector space over the field 𝐾, i.e., 𝑉(𝐾) is a vector space.
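A quick Python sketch (assuming 𝐾 = 𝑅 and 𝑛 = 3; the helper names vec_add and scalar_mul are introduced here for illustration) shows the componentwise operations and spot-checks axioms [A1]-[A3] on sample tuples:

```python
# Componentwise operations on n-tuples, as in Example 1.

def vec_add(x, y):
    """Componentwise sum (x1 + y1, ..., xn + yn)."""
    return tuple(xi + yi for xi, yi in zip(x, y))

def scalar_mul(k, x):
    """Componentwise scaling (k*x1, ..., k*xn)."""
    return tuple(k * xi for xi in x)

u, v, w = (1, 2, 3), (4, 5, 6), (7, 8, 9)
zero = (0, 0, 0)

# Spot-check a few vector-space axioms on these sample vectors.
assert vec_add(vec_add(u, v), w) == vec_add(u, vec_add(v, w))   # [A1] associativity
assert vec_add(u, zero) == u                                     # [A2] zero vector
assert vec_add(u, scalar_mul(-1, u)) == zero                     # [A3] negative of u
```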
Example 2 : If 𝑚 and 𝑛 are positive integers, the set of all 𝑚 × 𝑛 matrices with elements from an
arbitrary field 𝐹 forms a vector space with respect to the operations of matrix addition and scalar
multiplication.
Example 4 : Consider the linear differential equation
𝑎₀ (dⁿ𝑦/d𝑥ⁿ) + 𝑎₁ (dⁿ⁻¹𝑦/d𝑥ⁿ⁻¹) + ⋯ + 𝑎ₙ 𝑦 = 0   (1)
The set 𝑉 of all real solutions of (1) is not empty, because the zero function is a solution. Moreover, sums and scalar multiples of solutions of (1) are again solutions. Hence 𝑉 is a
vector space over 𝑅.
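As a numerical sketch (the specific equation 𝑦″ + 𝑦 = 0 is assumed here as an instance of (1), not taken from the notes), one can check that a linear combination of the solutions sin 𝑥 and cos 𝑥 is again a solution:

```python
import math

# For y'' + y = 0, both sin(x) and cos(x) are solutions, and any linear
# combination c1*sin + c2*cos is again a solution: the closure property
# that makes the solution set a vector space over R.

c1, c2 = 2.5, -1.3
y = lambda x: c1 * math.sin(x) + c2 * math.cos(x)

h = 1e-4
for x in [0.0, 0.7, 1.9, 3.1]:
    # central-difference approximation of y''(x)
    ypp = (y(x + h) - 2 * y(x) + y(x - h)) / h**2
    residual = ypp + y(x)          # should be ~0 if y solves y'' + y = 0
    assert abs(residual) < 1e-5
```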
Linear combinations, spanning sets
• Linear combinations : Let 𝑉 be a vector space over a field 𝐾. A vector 𝑣 in
𝑉 is a linear combination of vectors 𝑢₁, 𝑢₂, …, 𝑢ₙ in 𝑉 if there exist scalars
𝑎₁, 𝑎₂, …, 𝑎ₙ in 𝐾 such that 𝑣 = 𝑎₁𝑢₁ + 𝑎₂𝑢₂ + ⋯ + 𝑎ₙ𝑢ₙ.
• Spanning set : The set of all linear combinations of the vectors of a set 𝑆 = {𝑢₁, 𝑢₂, …, 𝑢ₘ} is called the linear span of 𝑆,
i.e., L(𝑆) = { 𝑎₁𝑢₁ + 𝑎₂𝑢₂ + ⋯ + 𝑎ₘ𝑢ₘ , 𝑎ᵢ ∈ 𝐾 }
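A Python sketch (the vectors u1, u2, v below are assumed sample data): testing whether 𝑣 lies in the span of given vectors amounts to solving a linear system for the coefficients 𝑎ᵢ:

```python
import numpy as np

# To test whether v is a linear combination of u1, ..., un in R^m,
# solve a1*u1 + ... + an*un = v for the scalars ai.

u1 = np.array([1.0, 0.0, 2.0])
u2 = np.array([0.0, 1.0, 1.0])
v  = np.array([3.0, 4.0, 10.0])           # v = 3*u1 + 4*u2

A = np.column_stack([u1, u2])              # columns are the u_i
coeffs, residuals, rank, _ = np.linalg.lstsq(A, v, rcond=None)

# v lies in the span of {u1, u2} exactly when A @ coeffs reproduces v.
if np.allclose(A @ coeffs, v):
    print("v is a linear combination:", coeffs)   # -> [3. 4.]
else:
    print("v is NOT in the span of {u1, u2}")
```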
Subspace
Definition : Let 𝑉 be a vector space over a field 𝐾 and let 𝑊 be a subset of V.
Then 𝑊 is a subspace of 𝑉 if 𝑊 is itself a vector space over 𝐾 with respect to
the operations of vector addition and scalar multiplication on 𝑉.
Example 2 : Let 𝑉 = 𝑀₂×₂ and 𝐾 = 𝑅. Let 𝑊 = { [𝑥 𝑧 ; 𝑧 𝑦] : 𝑥, 𝑦, 𝑧 ∈ 𝑅 }, the set of all symmetric 2×2 real matrices. Then 𝑊 is a subspace of 𝑉.
Theorem 1 : A subset 𝑊 of a vector space 𝑉(𝐾) is a subspace of 𝑉 if and only if
a) 𝑊 is not empty,
b) 𝑊 is closed under vector addition, i.e., if 𝑢 and 𝑣 are in 𝑊, then 𝑢 + 𝑣 is also in 𝑊,
c) 𝑊 is closed under scalar multiplication, i.e., for every 𝑢 in 𝑊 and 𝑘 ∈ 𝐾, 𝑘𝑢 is also in 𝑊.
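A small Python check (using Example 2 above, with sample matrices chosen here) of conditions (b) and (c) of Theorem 1 for the set 𝑊 of symmetric 2×2 matrices:

```python
import numpy as np

# W = symmetric 2x2 real matrices: closure under addition and scalar
# multiplication, checked on sample matrices.

def is_symmetric(M):
    return np.allclose(M, M.T)

A = np.array([[1.0, 2.0],
              [2.0, 5.0]])
B = np.array([[0.0, -3.0],
              [-3.0, 4.0]])
k = 2.5

assert is_symmetric(A) and is_symmetric(B)   # A, B are in W
assert is_symmetric(A + B)                   # closed under addition
assert is_symmetric(k * A)                   # closed under scalar multiplication
```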
• Sum of subspaces : If 𝑈 and 𝑊 are subspaces of 𝑉, their sum consists of all vectors 𝑢 + 𝑤 with 𝑢 ∈ 𝑈 and 𝑤 ∈ 𝑊,
i.e., 𝑈 + 𝑊 = {𝑢 + 𝑤 : 𝑢 ∈ 𝑈, 𝑤 ∈ 𝑊}
A matrix is called an echelon matrix, or is said to be in echelon form, if the following two
conditions hold (where a leading nonzero element of a row of 𝐴 is the first nonzero element in
the row):
1. All zero rows, if any, are at the bottom of the matrix
2. Each leading nonzero entry in a row is to the right of the leading nonzero entry in the
preceding row.
1 2 3 4 5
Example : 0 6 0 8 9
0 0 0 3 0
0 0 0 0 0
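A Python sketch (the function name is_echelon is introduced here for illustration) that tests the two echelon conditions directly:

```python
import numpy as np

# Check the two echelon conditions: zero rows at the bottom, and each
# leading nonzero entry strictly to the right of the one in the preceding row.

def is_echelon(A):
    prev_lead = -1
    seen_zero_row = False
    for row in np.asarray(A):
        nonzero = np.nonzero(row)[0]
        if nonzero.size == 0:            # zero row
            seen_zero_row = True
            continue
        if seen_zero_row:                # nonzero row below a zero row
            return False
        lead = nonzero[0]                # column of leading nonzero entry
        if lead <= prev_lead:
            return False
        prev_lead = lead
    return True

A = [[1, 2, 3, 4, 5],
     [0, 6, 0, 8, 9],
     [0, 0, 0, 3, 0],
     [0, 0, 0, 0, 0]]
print(is_echelon(A))   # True
```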
Remarks
1. Suppose 0 is one of the vectors 𝑣₁, 𝑣₂, …, 𝑣ₘ, say 𝑣₁ = 0. Then the vectors must be
linearly dependent.
3. Suppose two of the vectors 𝑣₁, 𝑣₂, …, 𝑣ₘ are equal, or one is a scalar multiple of the other, say
𝑣₁ = 𝑘𝑣₂. Then the vectors must be linearly dependent.
4. If the set {𝑣₁, 𝑣₂, …, 𝑣ₘ} is linearly independent, then any rearrangement of the vectors
{𝑣ᵢ₁, 𝑣ᵢ₂, …, 𝑣ᵢₘ} is also linearly independent.
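A Python sketch (sample vectors assumed; NumPy's matrix_rank is used): a set of vectors in 𝑅ⁿ is linearly independent exactly when the matrix with those vectors as columns has rank equal to the number of vectors:

```python
import numpy as np

# Vectors v1, ..., vm in R^n are linearly independent exactly when the
# matrix having them as columns has rank m.

def linearly_independent(*vectors):
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = 3 * v1                               # a scalar multiple of v1 (Remark 3)

print(linearly_independent(v1, v2))       # True
print(linearly_independent(v1, v2, v3))   # False: dependent set
```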
Suppose a vector space 𝑉 does not have a finite basis. Then 𝑉 is said
to be an infinite-dimensional vector space.
Theorems on Basis
Theorem : Let 𝑉 be a vector space of finite dimension n. Then
• Definition: Let 𝑉 be a real vector space. Suppose to each pair of vectors 𝑢, 𝑣 ∈ 𝑉 there is assigned a
real number, denoted by < 𝑢, 𝑣 >. This function is called a (real) inner product or dot product on 𝑉 if it satisfies the following axioms :
1. Linear property : < 𝑎𝑢₁ + 𝑏𝑢₂, 𝑣 > = 𝑎 < 𝑢₁, 𝑣 > + 𝑏 < 𝑢₂, 𝑣 >
2. Symmetric property : < 𝑢, 𝑣 > = < 𝑣, 𝑢 >
3. Positive definite property : < 𝑢, 𝑢 > ≥ 0; and < 𝑢, 𝑢 > = 0 if and only if 𝑢 = 0.
The vector space 𝑉 with an inner product is called a (real) inner product space.
Examples of Inner product space
• Example 1 : 𝑅²(𝑅)
< 𝑢, 𝑣 > = < (𝑢₁, 𝑢₂), (𝑣₁, 𝑣₂) > = 3𝑢₁𝑣₁ + 2𝑢₂𝑣₂ is an inner product on
𝑅²(𝑅).
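A numerical spot-check of Example 1 in Python (random test vectors are assumed here), verifying the linearity, symmetry and positivity axioms for < 𝑢, 𝑣 > = 3𝑢₁𝑣₁ + 2𝑢₂𝑣₂:

```python
import random

# The weighted product <u, v> = 3*u1*v1 + 2*u2*v2 on R^2, with the three
# inner-product axioms spot-checked on random vectors.

def ip(u, v):
    return 3 * u[0] * v[0] + 2 * u[1] * v[1]

rand_vec = lambda: (random.uniform(-5, 5), random.uniform(-5, 5))

for _ in range(100):
    u1, u2, v = rand_vec(), rand_vec(), rand_vec()
    a, b = random.uniform(-3, 3), random.uniform(-3, 3)
    au1_bu2 = (a * u1[0] + b * u2[0], a * u1[1] + b * u2[1])
    assert abs(ip(au1_bu2, v) - (a * ip(u1, v) + b * ip(u2, v))) < 1e-9  # linearity
    assert abs(ip(u1, v) - ip(v, u1)) < 1e-12                            # symmetry
    assert ip(u1, u1) >= 0                                               # positivity
```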
where 𝑝(𝑡) = 𝑎₀ + 𝑎₁𝑡 + 𝑎₂𝑡² + ⋯ + 𝑎ₙ𝑡ⁿ and 𝑞(𝑡) = 𝑏₀ + 𝑏₁𝑡 + 𝑏₂𝑡² + ⋯ + 𝑏ₙ𝑡ⁿ.
Complex inner product space
• Definition : Let 𝑉 be a complex vector space. Suppose to each pair of vectors 𝑢, 𝑣 ∈ 𝑉
there is assigned a complex number, denoted by < 𝑢, 𝑣 >. This function is called a (complex)
inner product or dot product on 𝑉 if it satisfies the following axioms :
1. Linear property : < 𝑎𝑢₁ + 𝑏𝑢₂, 𝑣 > = 𝑎 < 𝑢₁, 𝑣 > + 𝑏 < 𝑢₂, 𝑣 >
2. Conjugate symmetric property : < 𝑢, 𝑣 > is the complex conjugate of < 𝑣, 𝑢 >
3. Positive definite property : < 𝑢, 𝑢 > ≥ 0; and < 𝑢, 𝑢 > = 0 if and only if 𝑢 = 0.
The vector space 𝑉 over 𝐶 with an inner product is called a (complex) inner product space.
Distance and angle in two dimensional space
• In two dimensional space, let 𝑢 = 𝑂𝑃₁ and 𝑣 = 𝑂𝑃₂ be the vectors from the origin 𝑂 to the points 𝑃₁(𝑥₁, 𝑦₁) and 𝑃₂(𝑥₂, 𝑦₂), and let 𝜃 be the angle between them.
• The angle between the vectors 𝑢 and 𝑣 is given by
cos 𝜃 = (𝑥₁𝑥₂ + 𝑦₁𝑦₂) / (√(𝑥₁² + 𝑦₁²) · √(𝑥₂² + 𝑦₂²)) = < 𝑢, 𝑣 > / (‖𝑢‖ ‖𝑣‖)
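A Python sketch (assuming the standard dot product on 𝑅², with sample vectors chosen here) computing the angle from cos 𝜃 = < 𝑢, 𝑣 > / (‖𝑢‖ ‖𝑣‖):

```python
import math

# Angle between two vectors in R^2 via cos(theta) = <u, v> / (||u|| * ||v||)
# with the usual dot product.

def angle_between(u, v):
    dot = u[0] * v[0] + u[1] * v[1]
    norm_u = math.hypot(u[0], u[1])
    norm_v = math.hypot(v[0], v[1])
    return math.acos(dot / (norm_u * norm_v))

u, v = (1.0, 0.0), (1.0, 1.0)
print(math.degrees(angle_between(u, v)))   # 45.0
```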
Norm of a vector
• The inner product 𝑢, 𝑢 is nonnegative for any vector 𝑢. Thus its positive square root exists.
This nonnegative number is called the norm or length of 𝑢.
‖𝑢‖ = √< 𝑢, 𝑢 >
i.e., ‖𝑢‖² = < 𝑢, 𝑢 >
Remarks :
2. Every nonzero vector 𝑣 in 𝑉 can be multiplied by the reciprocal of its length to obtain the
unit vector 𝑣̂ = (1/‖𝑣‖) 𝑣
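A Python sketch (reusing the weighted inner product of Example 1 above; the vector v is sample data) computing ‖𝑣‖ = √< 𝑣, 𝑣 > and the unit vector 𝑣̂:

```python
import math

# Norm ||u|| = sqrt(<u, u>) for the weighted inner product
# <u, v> = 3*u1*v1 + 2*u2*v2, and the unit vector obtained by
# dividing v by its length.

def ip(u, v):
    return 3 * u[0] * v[0] + 2 * u[1] * v[1]

def norm(u):
    return math.sqrt(ip(u, u))

v = (3.0, 4.0)
v_hat = tuple(vi / norm(v) for vi in v)   # unit vector in the direction of v

print(norm(v))        # sqrt(3*9 + 2*16) = sqrt(59)
print(norm(v_hat))    # 1.0 (up to rounding)
```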
Properties of inner product
• If 𝑉 is an inner product space, then for any vectors 𝑢, 𝑣 in 𝑉,
2. 𝑅𝑒 < 𝑢, 𝑣 > ≤ ‖𝑢‖ ‖𝑣‖
3. ‖𝑢 + 𝑣‖ ≤ ‖𝑢‖ + ‖𝑣‖ (triangle inequality)
4. ‖𝑢 + 𝑣‖² + ‖𝑢 − 𝑣‖² = 2(‖𝑢‖² + ‖𝑣‖²) (parallelogram law)
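A numerical spot-check in Python (standard dot product and Euclidean norm on 𝑅³ assumed, with random test vectors) of the triangle inequality and the parallelogram law:

```python
import numpy as np

# Triangle inequality and parallelogram law, checked for random vectors
# in R^3 with the standard dot product and Euclidean norm.

rng = np.random.default_rng(0)

for _ in range(1000):
    u, v = rng.normal(size=3), rng.normal(size=3)
    nu, nv = np.linalg.norm(u), np.linalg.norm(v)
    assert np.linalg.norm(u + v) <= nu + nv + 1e-12     # triangle inequality
    lhs = np.linalg.norm(u + v)**2 + np.linalg.norm(u - v)**2
    rhs = 2 * (nu**2 + nv**2)
    assert abs(lhs - rhs) < 1e-9                        # parallelogram law
```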
• Orthogonality : Two vectors 𝑢, 𝑣 ∈ 𝑉 where 𝑉 is an inner product space are said
to be orthogonal (or perpendicular) if < 𝑢, 𝑣 > = 0 .
• Orthogonal set : A set {𝑢ᵢ} of vectors in 𝑉 is said to be orthogonal if all the pairs
of distinct vectors are orthogonal.
• Orthonormal set : The set is said to be orthonormal if it is orthogonal and each
𝑢ᵢ has length 1, i.e.,
(i) < 𝑢ᵢ, 𝑢ⱼ > = 0 for 𝑖 ≠ 𝑗
(ii) ‖𝑢ᵢ‖ = 1
Gram-Schmidt orthogonalization process
• Suppose {𝑣1 , 𝑣2 , … , 𝑣𝑛 } is a basis of an inner product space 𝑉. One can use this basis to
construct an orthogonal basis {𝑤1 , 𝑤2 , … , 𝑤𝑛 } of 𝑉 as follows. Set
𝑤1 = 𝑣1
….
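A Python sketch of the process (assuming the standard dot product on 𝑅ⁿ; the sample basis below is chosen for illustration), subtracting from each 𝑣ₖ its projections onto the previously constructed 𝑤's:

```python
import numpy as np

# Gram-Schmidt: each w_k is v_k with the components along the previously
# built w_1, ..., w_{k-1} removed, so the resulting w's are mutually
# orthogonal and span the same space as the v's.

def gram_schmidt(vectors):
    ws = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for u in ws:
            w -= (np.dot(w, u) / np.dot(u, u)) * u   # subtract projection onto u
        ws.append(w)
    return ws

basis = [(1.0, 1.0, 0.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0)]
w1, w2, w3 = gram_schmidt(basis)

print(np.dot(w1, w2), np.dot(w1, w3), np.dot(w2, w3))   # all ~0: orthogonal
```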