S1, S2 Linear
Vector Space
Vector space: Let V be a non-empty set of vectors and F be a scalar field. V is said to be a vector space over
the field F under the operations, vector addition (+ : V × V −→ V ), scalar multiplication (· : F × V −→ V ),
if the following axioms are satisfied.
1. Addition Axioms: For all v1 , v2 , v3 ∈ V ,
(a) v1 + v2 ∈ V (closure property),
(b) v1 + (v2 + v3 ) = (v1 + v2 ) + v3 (associativity),
(c) there exists 0 ∈ V such that v1 + 0 = v1 = 0 + v1 (additive identity),
(d) for each v1 ∈ V there exists −v1 ∈ V such that v1 + (−v1 ) = 0 = (−v1 ) + v1 (additive inverse),
(e) v1 + v2 = v2 + v1 (commutativity).
2. Multiplicative Axioms: For all v1 , v2 ∈ V and c1 , c2 ∈ F,
(a) c1 v1 ∈ V ,
(b) (c1 + c2 )v1 = c1 v1 + c2 v1 ,
(c) c1 (v1 + v2 ) = c1 v1 + c1 v2 ,
(d) 1v1 = v1 , where 1 is the multiplicative identity of F,
(e) c1 (c2 v1 ) = (c1 c2 )v1 = c2 (c1 v1 ).
Vector space - Examples:
1. R over R, C over C,
2. C over R,
3. Rn over R, Cn over C,
4. Mn (R) over R,
5. Pn (x) over R,
6. P (x) over R,
7. F(I) over R, where F(I) is the set of all real-valued functions on an interval I.
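As a quick numeric spot-check (not a proof, since the axioms must hold for all vectors and scalars), a few of the axioms above can be verified for example 2, C over R, using Python's built-in complex type; the variable names here are illustrative:

```python
# Spot-check of a few vector-space axioms for C over R:
# vectors are complex numbers, scalars are real numbers.
v1, v2, v3 = 1 + 2j, 3 - 1j, -2 + 0.5j
c1, c2 = 2.0, -3.0

assoc = (v1 + (v2 + v3)) == ((v1 + v2) + v3)              # additive associativity
comm = (v1 + v2) == (v2 + v1)                             # commutativity
distrib_scalar = ((c1 + c2) * v1) == (c1 * v1 + c2 * v1)  # (c1 + c2)v = c1 v + c2 v
distrib_vector = (c1 * (v1 + v2)) == (c1 * v1 + c1 * v2)  # c(v1 + v2) = c v1 + c v2
identity = (1 * v1) == v1                                 # 1 v = v
print(assoc, comm, distrib_scalar, distrib_vector, identity)  # True True True True True
```

The values were chosen so every operation is exact in floating point; with arbitrary floats, equality checks would need a tolerance.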
Subspace of a vector space: Let V be a vector space over F and W ⊆ V ; then W is a subspace of V if
W itself is a vector space over F with respect to the same operations as V over F.
Theorem: Let V be a vector space over a field F, W ⊆ V , W is a subspace of V iff
(i) 0 ∈ W ,
(ii) cα + β ∈ W , for all c ∈ F and α, β ∈ W .
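The criterion in the theorem can be illustrated numerically for the subspace W = {(x, 0) : x ∈ R} of R^2; the helper names `in_W` and `comb` below are illustrative, not standard:

```python
# Numeric illustration of the subspace criterion in R^2 (F = R)
# for W = {(x, 0) : x in R}: (i) 0 in W and (ii) c*a + b in W.
def in_W(v):
    """Membership test for W = {(x, 0)}."""
    return v[1] == 0

def comb(c, a, b):
    """Compute c*a + b componentwise for vectors in R^2."""
    return (c * a[0] + b[0], c * a[1] + b[1])

zero = (0, 0)
a, b, c = (3, 0), (-7, 0), 2.5
print(in_W(zero), in_W(comb(c, a, b)))   # True True
```

A full proof would quantify over all c, a, b; the check here merely samples one instance of each condition.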
• Let V be a vector space over a field F, and let V1 and V2 be two subspaces of V ; then V1 ∩ V2 and V1 + V2
are subspaces of V , where V1 + V2 = {v1 + v2 : v1 ∈ V1 , v2 ∈ V2 }.
• V1 ∪ V2 need not be a subspace of V . For instance, let V = R2 (F = R),
V1 = {(x, 0) : x ∈ R} and V2 = {(0, y) : y ∈ R}; then (1, 0), (0, 1) ∈ V1 ∪ V2 but
(1, 0) + (0, 1) = (1, 1) ∉ V1 ∪ V2 , so V1 ∪ V2 is not closed under addition.
Linear Independence and Dependence, Basis, Dimension
Linear combination: A vector v ∈ V is said to be a linear combination of the vectors v1 , v2 , . . . , vn if there
exist scalars c1 , c2 , . . . , cn ∈ F such that c1 v1 + c2 v2 + · · · + cn vn = v.
Linear dependence: Let S ⊆ V ; then S is said to be linearly dependent if there exist distinct vectors
v1 , . . . , vk ∈ S and scalars c1 , . . . , ck ∈ F, not all zero, such that c1 v1 + · · · + ck vk = 0.
Linear independence: A subset S of V is linearly independent if it is not linearly dependent.
Spanning set: Let S = {v1 , v2 , . . . , vn } be a subset of V . S is said to be a spanning set of V if every vector
v ∈ V can be expressed as a linear combination of elements of S: v = c1 v1 + c2 v2 + · · · + cn vn , for some c1 , c2 , . . . , cn ∈ F.
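As a worked illustration with a concrete (illustratively chosen) pair of spanning vectors, the coefficients of a linear combination in R^2 can be found by solving a 2 × 2 linear system, e.g. via Cramer's rule:

```python
# Express v = (3, 5) as c1*v1 + c2*v2 for the linearly independent
# vectors v1 = (1, 1), v2 = (1, -1), by Cramer's rule on the 2x2 system.
v1, v2, v = (1, 1), (1, -1), (3, 5)

det = v1[0] * v2[1] - v2[0] * v1[1]        # determinant of [v1 v2]; nonzero here
c1 = (v[0] * v2[1] - v2[0] * v[1]) / det   # Cramer's rule for c1
c2 = (v1[0] * v[1] - v[0] * v1[1]) / det   # Cramer's rule for c2

print(c1, c2)                               # 4.0 -1.0
# Verify the combination reproduces v:
assert (c1 * v1[0] + c2 * v2[0], c1 * v1[1] + c2 * v2[1]) == v
```

So v = 4 v1 − v2; since {v1, v2} is independent, these coefficients are unique.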
Linear span of a set: Let S ⊆ V ; then the linear span L(S) of S is the smallest subspace of V containing
S, i.e., the intersection of all subspaces of V containing S.
• If S = {v1 , . . . , vn } ⊆ V , then L(S) = {c1 v1 + c2 v2 + · · · + cn vn : ci ∈ F, i = 1, 2, . . . , n}, the
set of all linear combinations of the vectors in S.
• For any non-empty subset S of V , L(S) is a subspace of V .
• L(S) = S if S is itself a subspace of V .
Basis of a vector space: Let V be a vector space over a field F, and let S ⊆ V . S is called a basis of V if
(i) S is linearly independent,
(ii) S spans V , that is, L(S) = V .
• Every non-trivial vector space has a basis.
Dimension: The cardinality of a basis of a vector space is called the dimension of the vector space. If it is
finite, then the vector space is said to be finite dimensional; otherwise the vector space is of
infinite dimension.
• Let dim V = n, then
(i) Any subset of V having more than n elements is linearly dependent.
(ii) If S is a subset of V having n elements, then S is linearly independent iff L(S) = V .
• Let S be a subset of V with L(S) = V , where V is a vector space over F; then any maximal
linearly independent subset of S forms a basis for V . Equivalently, if we delete every vector that is a
linear combination of the other vectors in S, the remaining vectors form a basis for V . That is, every
spanning set contains a basis.
• Let V be a vector space over the field F with dim V = n, and let S = {v1 , v2 , . . . , vm }, m ≤ n, be a
linearly independent set of vectors; then S is part of some basis for V . That is, S can be extended to
a basis for V .
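Result (i) above can be illustrated in R^2: any three vectors there are linearly dependent, and a nonzero coefficient vector witnessing the dependence can be checked directly (the vectors below are an illustrative choice):

```python
# dim(R^2) = 2, so any 3 vectors in R^2 are linearly dependent.
# Here 2*v1 + 3*v2 - 1*v3 = 0 exhibits a nonzero dependence relation.
v1, v2, v3 = (1, 0), (0, 1), (2, 3)
c = (2, 3, -1)                       # nonzero coefficient vector
combo = tuple(c[0] * v1[i] + c[1] * v2[i] + c[2] * v3[i] for i in range(2))
print(combo)                          # (0, 0)
```

The relation arises because v3 = 2 v1 + 3 v2, i.e. v3 lies in the span of the other two.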
Dimension of subspace: Let V be a vector space over the field F with dim V = n, let W be a subspace of
V , then W is also finite dimensional and dim W ≤ dim V ; moreover, dim W = dim V iff W = V .
Hyperspace: If dim W = n − 1, where W is a subspace of V with dim V = n, then W is called a hyperspace
of V .
Linear Transformation
Linear transformations or linear maps:
Let V and W be two vector spaces over the same field F. A map T : V → W is said to be a linear map if
T (v1 + v2 ) = T (v1 ) + T (v2 ) for all v1 , v2 ∈ V , and T (cv) = cT (v) for all c ∈ F, v ∈ V ; equivalently, T (cv1 + v2 ) = cT (v1 ) + T (v2 ).
• If T : V → W is a linear transformation, then T (0) = 0.
Kernel and image of a linear transformation:
1. Let V and W be two vector spaces over the same field F and let T : V → W be a linear transformation.
The kernel of T , or the null space of T , is given by ker(T ) = {v ∈ V : T v = 0}. ker(T ) is a subspace of
V , and dim(ker(T )) is called the nullity of T .
2. The image space of T , or the range space of T , is given by {T v : v ∈ V }, that is,
{w ∈ W : w = T (v), for some v ∈ V }.
The range space is a subspace of W , and dim(range(T )) is called rank(T ).
Rank-Nullity Theorem: Let V and W be finite dimensional vector spaces over the same field F, and let
T : V → W be a linear transformation; then rank(T ) + nullity(T ) = dim(V ).
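A sketch of a numeric check of the theorem, using an illustratively chosen 3 × 4 matrix as T and a small Gaussian-elimination rank routine over the rationals (the helper `rank` is written here, not a library function):

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix via Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for col in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue                     # no pivot in this column
        m[r], m[piv] = m[piv], m[r]      # swap pivot row into place
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# T : R^4 -> R^3 given by this matrix; rank(T) + nullity(T) should equal dim V = 4.
A = [[1, 2, 0, 1],
     [0, 1, 1, 0],
     [1, 3, 1, 1]]          # row 3 = row 1 + row 2, so the rank is 2
rk = rank(A)
nullity = 4 - rk            # dim ker = n - rank for an m x n matrix
print(rk, nullity, rk + nullity)   # 2 2 4
```

Exact rational arithmetic (`Fraction`) avoids the pivot-tolerance issues a floating-point elimination would have.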
• Let V be a vector space over a field F and let T : V → V be a linear transformation; then
(i) nullity (T ) ≤ nullity (T 2 ) ≤ nullity (T 3 ) ≤ . . .
(ii) rank (T ) ≥ rank (T 2 ) ≥ rank (T 3 ) ≥ . . .
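Both chains can be observed for a concrete nilpotent operator on R^2 (the 2 × 2 helpers below are illustrative):

```python
# For the nilpotent operator T on R^2 with matrix N = [[0, 1], [0, 0]]:
# T^2 = 0, so nullity(T) = 1 <= nullity(T^2) = 2 and rank(T) = 1 >= rank(T^2) = 0.
def matmul(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def rank2(M):
    """Rank of a 2x2 matrix: 2 if det != 0, else 1 if any entry nonzero, else 0."""
    if M[0][0] * M[1][1] - M[0][1] * M[1][0] != 0:
        return 2
    return 1 if any(x != 0 for row in M for x in row) else 0

N = [[0, 1], [0, 0]]
N2 = matmul(N, N)                       # the zero matrix
ranks = (rank2(N), rank2(N2))
nullities = (2 - ranks[0], 2 - ranks[1])  # rank-nullity with n = 2
print(ranks, nullities)                 # (1, 0) (1, 2)
```

The strict jumps here are typical of nilpotent operators; for an invertible T both chains are constant.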
Singular and non-singular transformation: Let V and W be two vector spaces over the same field F,
and let T : V → W be a linear transformation. T is called non-singular if T v = 0 implies v = 0; otherwise T is singular.
(i) T is one-one or injective if v1 ≠ v2 ⇒ T (v1 ) ≠ T (v2 ).
(ii) T is onto or surjective if T (V ) = W , that is, img(T ) = W , that is, rank(T ) = dim(W ).
Results:
• T is 1 − 1 iff ker(T ) = {0}.
• T is non-singular iff ker(T ) = {0}.
• If dim V = dim W < ∞, then T is 1 − 1 iff T is onto.
• If dim V > dim W and dim V < ∞, then T is singular.
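A small illustration of these results for the operator T(x, y) = (x + y, x − y) on R^2, where the kernel is trivial and an explicit inverse shows T is also onto (the names `T` and `T_inv` are illustrative):

```python
# T : R^2 -> R^2, T(x, y) = (x + y, x - y).  Its matrix [[1, 1], [1, -1]]
# has determinant -2 != 0, so ker(T) = {0}; since dim V = dim W = 2,
# T is one-one iff onto, and here it is both.
def T(v):
    x, y = v
    return (x + y, x - y)

det = 1 * (-1) - 1 * 1        # determinant of [[1, 1], [1, -1]]

def T_inv(w):
    """Solve T(x, y) = (a, b) directly: x = (a+b)/2, y = (a-b)/2."""
    a, b = w
    return ((a + b) / 2, (a - b) / 2)

w = (7, 3)
print(det, T(T_inv(w)))       # -2 (7.0, 3.0)
```

Because an explicit `T_inv` exists, every w in R^2 is hit, confirming surjectivity for this example.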
Definition: Let V and W be two vector spaces over the same field F. V and W are said to be isomorphic if
there exists an invertible linear transformation T : V → W .
• If V and W are finite dimensional vector spaces over the same field F, then V and W are isomorphic iff
dim V = dim W .
• Let V be an n-dimensional vector space over a field F; then V ≅ F^n .
• Let V be an infinite dimensional vector space over a field F; then V ≅ V × V . In general, V ≅ V^n .
• R over Q is isomorphic to C over Q.
Definition: Let V be a vector space over a field F. Any linear transformation T : V → V is also
called a linear operator on V .
Definition: Let V be a vector space over a field F. Any linear transformation T : V → F is also
called a linear functional on V .
Matrix representation of a linear transformation: Let V and W be two vector spaces over the same
field F, with dim(V ) = n, dim(W ) = m, and let B1 = {v1 , v2 , . . . , vn }, B2 = {u1 , u2 , . . . , um } be ordered bases
for V and W respectively. Let T : V → W be a linear transformation.
For each vj , T (vj ) can be uniquely expressed as a linear combination of elements of B2 , say
T (vj ) = a1j u1 + a2j u2 + · · · + amj um . The m × n matrix [T ] = (aij ), whose j-th column holds the
B2 -coordinates of T (vj ), is called the matrix of T relative to B1 and B2 .
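A sketch of how the matrix of T is assembled column by column from the images of the basis vectors, for an illustrative map T : R^2 → R^3 with the standard bases:

```python
# Build the matrix of T : R^2 -> R^3, T(x, y) = (x + y, x - y, 2x),
# with respect to the standard bases: column j holds the
# B2-coordinates of T applied to the j-th basis vector of R^2.
def T(v):
    x, y = v
    return (x + y, x - y, 2 * x)

basis_V = [(1, 0), (0, 1)]                   # standard basis of R^2
cols = [T(v) for v in basis_V]               # images of the basis vectors
M = [[cols[j][i] for j in range(2)] for i in range(3)]   # 3 x 2 matrix of T
print(M)                                      # [[1, 1], [1, -1], [2, 0]]

# Check: applying M to (x, y) agrees with T(x, y) for a sample vector.
x, y = 4, -3
Mv = tuple(M[i][0] * x + M[i][1] * y for i in range(3))
assert Mv == T((4, -3))
```

With the standard basis for W, the coordinates of T(v_j) are just its components, so the columns can be read off directly.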
Matrices: Matrices are rectangular arrays of ‘elements’ (numbers, symbols or expressions) arranged in rows
and columns.
• The set of all n × n matrices with real entries is usually denoted as Mn (R).
• (Mn (R), +) is an Abelian group.
• (Mn (R), +, ·) is a non commutative ring with unity, which is not an integral domain.
Determinant of a square matrix: The determinant is a map det : Mn (R) → R
satisfying the following properties:
(i) det(A^T ) = det(A).
(ii) det(A′ ) = − det(A), where A′ is the matrix obtained from A by interchanging any two rows (Ri ) or
columns (Cj ).
(iii) det(A) = 0 if any two rows or columns of A are proportional.
(iv) det(kA) = k^n det(A) for k ∈ R, where A is n × n.
(v) If every element of a row or column of A can be expressed as the sum of two or more terms, then
the determinant of A can be expressed as the corresponding sum of two or more determinants.
(vi) det(A′′ ) = det(A), where A′′ is the matrix obtained from A by replacing Ri with Ri ± kRj , i ≠ j (or Ci
with Ci ± kCj , i ≠ j), where k ∈ R.
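Properties (i), (ii) and (iv) can be spot-checked for a 2 × 2 example (a check of one instance, not a proof):

```python
# Spot-check of determinant properties (i), (ii), (iv) for a 2x2 matrix.
def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[3, 1], [4, 2]]
AT = [[3, 4], [1, 2]]                 # transpose of A
A_swapped = [[4, 2], [3, 1]]          # rows of A interchanged
k = 5
kA = [[k * x for x in row] for row in A]

# det(A^T) = det(A); row swap flips the sign; det(kA) = k^2 det(A) for n = 2.
print(det2(A), det2(AT), det2(A_swapped), det2(kA))   # 2 2 -2 50
```

Note 50 = 5^2 · 2, matching det(kA) = k^n det(A) with n = 2, not the naive k · det(A).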
Rank of a Matrix
Eigenvalues/Eigenvectors