Chapter 2
Vector spaces
The axioms of a vector space
Definition (field): Let K be a set of numbers. We shall say that K, together with the operations of addition and multiplication, denoted by (K, +, ·), is a field if it satisfies the following conditions:
(a) If x and y are elements of K, then x + y and xy are also elements of K.
(b) If x is an element of K, then -x is also an element of K, and if x ≠ 0, then 1/x is an element of K.
(c) The numbers 0 and 1 are elements of K.
Remark: The essential thing about a field is that it is a set of elements which can be added and
multiplied, in such a way that addition and multiplication satisfy the ordinary rules
of arithmetic, and in such a way that one can divide by non-zero elements.
Let K, L be fields, and suppose that K is contained in L (i.e. that K is a subset of L). Then we
say that K is a subfield of L. Thus every one of the fields which we are considering is a
subfield of the complex numbers. In particular, we can say that ℚ is a subfield of ℝ, and ℝ is
a subfield of ℂ.
Let K be a field. Elements of K will also be called numbers (without specification) if the
reference to K is made clear by the context, or they will be called scalars.
We are now ready to define the central algebraic object of Linear Algebra, called a vector space
over a field K.
Definition (vector space): Let K be a field. A vector space V over the field K is a triple
(V, +, ·) consisting of a set V and two operations, addition and scalar multiplication,
satisfying the following axioms, called the vector space axioms.
Given elements u, v, w of V and scalars a, b, c in K:
1. Closure: u + v is an element of V, and cu is an element of V.
2. Commutativity: For all elements u, v of V, we have u + v = v + u.
3. Associativity: (u + v) + w = u + (v + w).
4. Existence of additive identity: There is an element of V, denoted by 0, such that for all elements u of V, u + 0 = u.
5. Existence of additive inverse: Given an element u of V, there exists an element -u in V such that u + (-u) = 0.
6. Distributivity: If c is a scalar, then c(u + v) = cu + cv. If a, b are two scalars, then (a + b)u = au + bu.
7. If a, b are two scalars, then (ab)u = a(bu).
8. For all elements u of V, we have 1u = u (here 1 is the number 1).
These eight properties are called the axioms of a vector space.
If a set fails to satisfy at least one of the above axioms then it is not a vector space.
Note:
To show that a set is not a vector space, it suffices to find at least one axiom
which is not satisfied.
Elements of the vector space V are called vectors.
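For instance (an illustrative choice of operations): take ℝ² with the usual addition but with scalar multiplication redefined as c(x, y) = (cx, 0). Then
1(x, y) = (x, 0) ≠ (x, y) whenever y ≠ 0,
so axiom 8 fails, and ℝ² with these operations is not a vector space over ℝ.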
Example 1: Consider the set ℝ² over the field ℝ, with addition (x1, y1) + (x2, y2) = (x1 + x2, y1 + y2) and scalar multiplication c(x, y) = (cx, cy). Clearly, for V = ℝ² and K = ℝ, axioms 1 and 2 hold true. The remaining axioms can be easily verified. Hence ℝ² is a vector space over ℝ.
Thus when dealing with vector spaces, we shall always specify the field over which we
take the vector space.
Example: Let V = {a + b√2 + c√3 : a, b, c ∈ ℚ}, with the usual addition of numbers and the usual multiplication by rational scalars. We verify the vector space axioms over K = ℚ.
Addition:
1. Closure:
Let u = a1 + b1√2 + c1√3 and v = a2 + b2√2 + c2√3 be elements of V. Then
u + v = (a1 + a2) + (b1 + b2)√2 + (c1 + c2)√3,
which is in V, since (a1 + a2), (b1 + b2), (c1 + c2) ∈ ℚ.
2. Commutativity:
u + v = (a1 + a2) + (b1 + b2)√2 + (c1 + c2)√3
= (a2 + a1) + (b2 + b1)√2 + (c2 + c1)√3 = v + u.
3. Associativity:
Let w = a3 + b3√2 + c3√3. Then
(u + v) + w = [(a1 + a2) + a3] + [(b1 + b2) + b3]√2 + [(c1 + c2) + c3]√3
= [a1 + (a2 + a3)] + [b1 + (b2 + b3)]√2 + [c1 + (c2 + c3)]√3
= u + (v + w).
4. Existence of identity:
Let 0 = 0 + 0√2 + 0√3. Then u + 0 = (a1 + 0) + (b1 + 0)√2 + (c1 + 0)√3 = u.
5. Existence of inverse:
Let u = a1 + b1√2 + c1√3 and -u = -a1 - b1√2 - c1√3. Then
u + (-u) = (a1 - a1) + (b1 - b1)√2 + (c1 - c1)√3 = 0.
6. Scalar multiplication:
Let k, k1, k2 ∈ ℚ. Then
ku = k[a1 + b1√2 + c1√3] = ka1 + kb1√2 + kc1√3,
which is again in V. Moreover,
(k1 + k2)u = (k1 + k2)a1 + (k1 + k2)b1√2 + (k1 + k2)c1√3
= [k1a1 + k1b1√2 + k1c1√3] + [k2a1 + k2b1√2 + k2c1√3] = k1u + k2u,
k(u + v) = k(a1 + a2) + k(b1 + b2)√2 + k(c1 + c2)√3 = ku + kv,
(k1k2)u = (k1k2)a1 + (k1k2)b1√2 + (k1k2)c1√3 = k1[k2a1 + k2b1√2 + k2c1√3] = k1(k2u),
and 1u = 1[a1 + b1√2 + c1√3] = a1 + b1√2 + c1√3 = u.
Consequently, (V, +, ·) is a vector space over the field ℚ.
Exercise 2: Let V = ℝ², and let vector addition and scalar multiplication be defined as follows:
(x1, y1) + (x2, y2) = … and c(x1, y1) = … . Verify that V is not a
vector space over ℝ.
Lemma: In any vector space V, for any u ∈ V and any scalar c, we have:
1. 0u = 0
2. c0 = 0
3. (-1)u = -u
Proof
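A short sketch of the first property, assuming the lemma lists the standard facts stated above: by axiom 6,
0u = (0 + 0)u = 0u + 0u,
and adding -(0u) to both sides gives 0u = 0. The remaining properties follow similarly from the axioms.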
Subspaces
Definition (subspace): Let V be a vector space over the field K, and let W be a non-empty subset of V. W is called a subspace of V if:
i. the zero vector 0 is an element of W;
ii. whenever u, w are elements of W, then u + w is an element of W;
iii. whenever u is an element of W and c is a scalar, then cu is an element of W.
Then, since the above eight axioms hold in V, they hold in W as well, so W itself is a vector space.
Example 1: Let V = ℝⁿ and let W be the set of vectors in V whose last coordinate is equal to
0. Then W is a subspace of V.
Solution
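A brief sketch of the verification, using the description of W above: the zero vector (0, …, 0) has last coordinate 0, so it is in W; if u = (x1, …, xn-1, 0) and w = (y1, …, yn-1, 0) are in W, then u + w = (x1 + y1, …, xn-1 + yn-1, 0) again has last coordinate 0; and for any scalar c, cu = (cx1, …, cxn-1, 0) has last coordinate 0. Hence W satisfies the three subspace conditions.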
Example 2: Let V = ℝⁿ and let A be a vector in ℝⁿ. Let W be the set of all elements B in
V such that B·A = 0, i.e. B is perpendicular to A. Then show that W is a subspace
of V.
Proof:
i. Since 0·A = 0, the zero vector 0 is in W.
ii. Suppose that B, C are perpendicular to A.
i.e. B·A = 0 and C·A = 0.
Claim: B + C is in W, i.e. (B + C)·A = 0.
Now (B + C)· A = B· A + C· A = 0, so that B + C is also perpendicular to A.
iii. Let c be a scalar. Then (cB)·A = c(B·A) = c·0 = 0, so that cB is also perpendicular to A. This
proves that W is a subspace of ℝⁿ.
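For a concrete instance (an illustrative choice of A): take A = (1, 2, 3) in ℝ³, so that W = {(x, y, z) ∈ ℝ³ : x + 2y + 3z = 0}. For example, B = (2, -1, 0) and C = (3, 0, -1) lie in W, and so do B + C = (5, -1, -1), since 5 - 2 - 3 = 0, and 4B = (8, -4, 0), since 8 - 8 + 0 = 0.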
Example 3: Let V be the set of all functions of ℝ into ℝ. Then V is a vector space over ℝ.
Let W be the set of all continuous functions. Then W is a subspace of V.
Proof:
i. The zero function is continuous. Thus, the zero vector is an element of W.
ii. If f, g are continuous functions, then f + g is continuous.
iii. If c is a real number and f is continuous, then cf is continuous.
Hence W is a subspace of the vector space of all functions of ℝ into ℝ, i.e. W is a
subspace of V.
Example 4: Let H be the set of all (x, y) in ℝ² such that x + 4y = 0. Show that H is a subspace of ℝ².
Solution: To show that H is a subspace of ℝ², it is enough to show that the above three properties hold in H. First, (0, 0) ∈ H, since 0 + 4(0) = 0. Let u = (x1, y1) and w = (x2, y2) be in H. Then x1 + 4y1 = 0 and x2 + 4y2 = 0, so
(x1 + x2) + 4(y1 + y2) = (x1 + 4y1) + (x2 + 4y2) = 0,
which shows u + w ∈ H. Finally, for any scalar c, cx1 + 4(cy1) = c(x1 + 4y1) = 0, so cu ∈ H.
Hence H is a subspace of ℝ².
Theorem: Let V be a vector space and let U, W be subspaces of V. Then the intersection U ∩ W
is a subspace of V.
Proof: exercise
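As a geometric illustration (an example chosen here, separate from the exercise): in V = ℝ³, let U = {(x, y, 0)} be the xy-plane and W = {(0, y, z)} be the yz-plane. Both are subspaces of ℝ³, and their intersection U ∩ W = {(0, y, 0)} is the y-axis, which is again a subspace of ℝ³.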
Example: Let U, W be subspaces of a vector space V. By U + W we denote the set of all
elements u + w with u ∈ U and w ∈ W. Then verify that U + W is a subspace
of V.
U + W is said to be generated by U and W, and is called the sum of U and W.
Solution
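A sketch of the verification: 0 = 0 + 0 is in U + W; if u1 + w1 and u2 + w2 are in U + W, then
(u1 + w1) + (u2 + w2) = (u1 + u2) + (w1 + w2),
which lies in U + W since U and W are closed under addition; and for any scalar c, c(u + w) = cu + cw is in U + W since U and W are closed under scalar multiplication.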
Exercise:
1. Show that the following sets of elements in form subspaces.
a) The set of all ( ) such that .
b) The set of all ( ) .
c) The set of all ( ) such that .
2. Show that the following sets of elements in form subspaces.
a) The set of all ( ) such that .
b) The set of all ( ) such that .
c) The set of all ( ) such that .
3. Let K be a subfield of a field L. Show that L is a vector space over K. In particular, ℂ and ℝ
are vector spaces over ℚ.
4. Let K be the set of all numbers which can be written in the form a + b√2, where a, b are
rational numbers. Show that K is a field.
5. Let K be the set of all numbers which can be written in the form , where a, b are
rational numbers. Show that K is a field.
6. Let A1, A2 be two given vectors. Show that the set of all vectors B such
that B is perpendicular to both A1 and A2 is a subspace.
7. Suppose V is a nonempty subset of ℝ². Addition in V is defined as
(x1, y1) + (x2, y2) = … , and scalar multiplication as c(x1, y1) = … . Is V a vector space over ℝ?
8. Show that the set of all positive real numbers, with vector addition and scalar multiplication redefined to equal the
usual product xy and the usual power x^c, respectively, is a vector space. What is the "zero vector?"
Linear combinations
Definition: Let V be a vector space over the field K, and let v1, v2, …, vn be elements of V. Any element of V of the form a1v1 + a2v2 + … + anvn, where a1, …, an ∈ K, is called a linear combination of v1, …, vn.
Example: Determine whether each of the given vectors is a linear combination of the given vectors U and V.
Solution:
i. For the first given vector to be a linear combination of U and V, there must be
scalars a and b such that it equals aU + bV. Writing this equation out component-wise and solving the resulting system gives the required scalars a and b.
ii. For the second given vector to be a linear combination of U and V, there must be scalars a and b such that
it equals aU + bV. But the resulting system of equations has no solution, so this vector is not a linear combination of U and V.
Generators (Spans)
Definition (span of a set): Let S = {v1, v2, …, vn} be a set of vectors in a vector space V. The set
of all linear combinations of the vectors in S is called the span of S, and it is denoted
by Span(S).
That is, Span(S) = {a1v1 + a2v2 + … + anvn : a1, …, an ∈ K}.
Example: Show that the vectors v1, v2, v3 span ℝ³.
Solution:
Let (x, y, z) be an arbitrary vector in ℝ³. We need to show that there are scalars a, b and c such that
(x, y, z) = av1 + bv2 + cv3.
Writing this equation out component-wise gives a system of three linear equations in a, b and c. Solving the system for a, b and c in terms of x, y and z shows that every vector of ℝ³ is a linear combination of v1, v2 and v3, i.e. Span{v1, v2, v3} = ℝ³.
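For a concrete instance of this method (an illustrative choice of vectors): take v1 = (1, 0, 0), v2 = (0, 1, 0) and v3 = (0, 0, 1). Then for any (x, y, z) in ℝ³,
(x, y, z) = x(1, 0, 0) + y(0, 1, 0) + z(0, 0, 1),
so a = x, b = y and c = z work, and Span{v1, v2, v3} = ℝ³.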
Definition: Let V be a vector space over the field K, and let v1, …, vn be elements of V. We
say that v1, …, vn are:
1. Linearly dependent over K if there exist elements a1, …, an of K, not all equal
to 0, such that a1v1 + a2v2 + … + anvn = 0.
2. Linearly independent over K if, whenever a1, …, an are numbers such that
a1v1 + a2v2 + … + anvn = 0, then ai = 0 for all i = 1, …, n.
Example 1: Consider the vectors v1 = (1, -1, 1), v2 = (2, 0, -1) and v3 = (2, -2, 2).
Setting a1v1 + a2v2 = (0, 0, 0) gives a1 + 2a2 = 0, -a1 = 0 and a1 - a2 = 0, which force a1 = 0 and a2 = 0.
Hence v1 and v2 are linearly independent.
On the other hand, v3 = 2v1, so a1v1 + a2v3 = 0 whenever a1 = -2a2 (for example a1 = -2, a2 = 1). Hence v1 and v3 are linearly dependent.
Exercise:
1. Prove that the four vectors ( ), ( ), ( ) and ( ) in … form a linearly
dependent set over …, but any three of them are linearly independent over ….
2. Prove that the three vectors ( ), ( ) and ( ) form a linearly independent set
over ….
3. Prove that the three vectors ( ), ( ) and ( ) form a linearly dependent set.
Theorem: A set S = {v1, v2, …, vn} with two or more vectors is linearly dependent if and only if at least one of the vectors in S is expressible as a linear combination of the other vectors in S.
Proof:
(⇒) Let S = {v1, v2, …, vn} be a set with two or more vectors.
If we assume that S is linearly dependent, then there are scalars a1, …, an, not all zero, such
that a1v1 + a2v2 + … + anvn = 0.
Without loss of generality assume a1 ≠ 0. Then
v1 = -(a2/a1)v2 - … - (an/a1)vn,
which expresses v1 as a linear combination of the other vectors in S. Similarly, if aj ≠ 0 for some j, then vj is expressible as a linear combination of the other vectors in S.
(⇐) Assume that at least one of the vectors, say v1, in S is expressed as a linear
combination of the other vectors in S, say v1 = c2v2 + … + cnvn.
Then v1 - c2v2 - … - cnvn = 0, where the coefficient of v1 is 1 ≠ 0, so v1, …, vn
are linearly dependent.
Example: Show that the given polynomials f1, f2 and f3 are linearly independent.
Solution:
a. Let a, b and c be scalars such that a·f1(t) + b·f2(t) + c·f3(t) = 0 for all t.
Collecting terms, the coefficient of each power of t must be zero (to get the zero polynomial).
Therefore we must have a = b = c = 0, so f1, f2 and f3 are linearly independent.
In ℝ³, a set of three vectors is linearly independent if and only if the vectors do not lie in
the same plane when they are placed with their initial points at the origin.
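For example (a choice made here for illustration): the vectors (1, 0, 0), (0, 1, 0) and (1, 1, 0) all lie in the xy-plane, and indeed
(1, 1, 0) = 1(1, 0, 0) + 1(0, 1, 0),
so they form a linearly dependent set; by contrast, (1, 0, 0), (0, 1, 0) and (0, 0, 1) do not lie in a common plane through the origin, and they are linearly independent.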
Bases
Definition: Let V be a vector space over the field K. A set of elements {v1, …, vn} of V is a basis of V if it is linearly independent and spans V.
Example 1: Show that e1 = (0, -1) and e2 = (2, 1) are a basis of ℝ².
Solution:
Given (x, y) in ℝ², we look for scalars a1, a2 such that
(x, y) = a1e1 + a2e2 = a1(0, -1) + a2(2, 1) = (2a2, -a1 + a2).
This gives 2a2 = x and -a1 + a2 = y, so
a2 = x/2 and a1 = (x - 2y)/2.
Since a1 = (x - 2y)/2 and a2 = x/2 are real numbers, for any given (x, y) we can
obtain real numbers a1 and a2 such that (x, y) can be written as a linear combination of e1 and
e2.
For example, (4, 3) = ((4 - 6)/2)(0, -1) + (4/2)(2, 1),
or (4, 3) = -(0, -1) + 2(2, 1).
Moreover, a1(0, -1) + a2(2, 1) = (0, 0) forces a2 = 0 and a1 = 0, so e1 and e2 are linearly independent. Hence {e1, e2} is a basis of ℝ².
Note that {(1, 0), (0, 1)} is also a basis of ℝ²; it is called the standard basis of ℝ².
Hence a vector space can have two or more bases. Find other bases of ℝ².
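One such basis, as a quick sketch: take f1 = (1, 1) and f2 = (1, -1). For any (x, y) we have
(x, y) = ((x + y)/2)(1, 1) + ((x - y)/2)(1, -1),
and a(1, 1) + b(1, -1) = (0, 0) gives a + b = 0 and a - b = 0, hence a = b = 0. So {(1, 1), (1, -1)} is another basis of ℝ².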
Example: Show that the set S = {v1, v2, v3} spans ℝ³.
Solution:
i) We want to show that S spans ℝ³.
Let (x, y, z) be an arbitrary vector in ℝ³. We must show the existence of
scalars a, b and c such that (x, y, z) = av1 + bv2 + cv3.
Writing this equation out component-wise gives a system of three linear equations in a, b and c,
which can be solved for a, b and c in terms of x, y and z.
Whence S spans ℝ³.
Definition: Let B = {e1, e2, …, en} be a basis of V. Since B generates V, any u in V can be
represented as u = a1e1 + a2e2 + … + anen, and since B is linearly independent, this representation is unique.
The vector (a1, a2, …, an) is called the coordinate vector of u with respect to the
basis B, and we call ai the i-th coordinate of u.
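For instance, with the basis B = {e1, e2} = {(0, -1), (2, 1)} of ℝ² from Example 1 above, we found (4, 3) = -e1 + 2e2, so the coordinate vector of (4, 3) with respect to B is (-1, 2). With respect to the standard basis {(1, 0), (0, 1)}, the coordinate vector of (4, 3) is simply (4, 3).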
Example 3: Consider the set V of all polynomial functions f: ℝ → ℝ which are of degree less
than or equal to 2.
Solution
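One natural way to complete this example (reading it as asking for a basis of V): the polynomials 1, t and t² form a basis of V. They span V, since every f in V can be written
f(t) = a0 + a1t + a2t² = a0·1 + a1·t + a2·t²,
and they are linearly independent, since a0 + a1t + a2t² = 0 for all t forces a0 = a1 = a2 = 0. Hence V has dimension 3.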
Exercise 1: Let V be the set of polynomial functions of degree less than or equal to 2. Show that the polynomials
f1(t) = …, f2(t) = … and f3(t) = … form a basis of V.
Theorem: Let V be a vector space over the field K. Let {v1, v2,…,vn} be a basis of V. If w1, w2,…,wm
are elements of V, where m > n, then w1, w2, …, wm are linearly dependent.
Theorem: Let V be a vector space and suppose that one basis B has n elements, and another basis W
has m elements. Then m = n.
Example: In ℝ³, {(1, 0, 0), (0, 1, 1), (0, 2, 1)} is a maximal set of linearly independent
elements.
We now give criteria which allow us to tell when elements of a vector space constitute a basis.
Theorem: Let V be a vector space, and let v1, …, vn be a maximal set of linearly independent elements of V. Then v1, …, vn constitute a basis of V.
Proof:
What we need to show is that v1, …, vn generate V.
We shall first prove that each element w of V is a linear combination of v1, …, vn. By hypothesis,
given w ∈ V, the elements w, v1, …, vn are linearly dependent (otherwise v1, …, vn would not be a maximal set of linearly independent elements), so there exist numbers x0, x1, …, xn, not all 0, such that
x0w + x1v1 + … + xnvn = 0.
Furthermore, x0 ≠ 0, because otherwise we would have a relation of linear
dependence for v1, …, vn. Hence we can solve for w, namely
w = -(x1/x0)v1 - … - (xn/x0)vn.
This proves that w is a linear combination of v1, …, vn, so v1, …, vn generate V. Since they are also linearly independent, they form a basis of V.
Exercise
1. Express the given vector X as a linear combination of the given vectors A, B,
and find the coordinates of X with respect to A, B.
a. ( ) ( ) ( )
b. ( ) ( ) ( )
c. ( ) ( ) ( )
d. ( ) ( ) ( )
2. Let A = (a, b) and B = (c, d) be two vectors in the plane. If ad - bc = 0, show that
they are linearly dependent. If ad - bc ≠ 0, show that they are linearly independent.
3. Express the following as linear combinations of
a.
b.
c.
d.
4. Find two different bases for the subspace of all vectors in whose first two components
are equal.
5. Suppose V is a vector space of dimension 7 and W is a subspace of dimension 4.
True or false:
a. Every basis for W can be extended to a basis for V by adding three more vectors;
b. Every basis for V can be reduced to a basis for W by removing three vectors.
Proof:
Assume that w1, …, wm are linearly independent. Since
{v1, …, vn} is a basis, there exist elements a1, …, an
in K such that w1 = a1v1 + a2v2 + … + anvn.
Theorem: Let V be a vector space and suppose that one basis has n elements, and another basis
has m elements. Then m = n.
Dimension
Definition: Let V be a vector space having a basis consisting of n elements. We say that n is the dimension of V over K, and we write dim V = n.
Example 1: Since {(1, 0, 0), (0, 1, 0), (0, 0, 1)} is a basis of ℝ³ over ℝ,
the dimension of ℝ³ over ℝ is 3.
Note: For any field K, the vector space Kⁿ has dimension n over K.
Remark: A vector space which has a basis consisting of a finite number of elements, or the
zero vector space, is called finite dimensional. Other vector spaces are called infinite
dimensional. It is possible to give a definition for an infinite basis.
Example 5: Let K be a field. Then K is a vector space over itself, and it is of dimension 1. In
fact, the element 1 of K forms a basis of K over K, because any element a of K has a
unique expression as a = a·1.
Theorem: Let V be a vector space of dimension n, and let v1, …, vn be linearly independent elements of V. Then v1, …, vn constitute a basis of V.
Proof:
By one of the previous theorems, {v1, …, vn} is a maximal set of linearly independent elements
of V (any n + 1 elements of V are linearly dependent).
Hence it is a basis by the above theorem.
Corollary: Let V be a vector space of dimension n. Let r be a positive integer with r < n, and let
v1, …, vr be linearly independent elements of V. Then one can find elements
vr+1, …, vn such that {v1, …, vn} is a basis of V.
Proof:
Theorem: Let V be a vector space having a basis consisting of n elements. Let W be a
subspace which does not consist of 0 alone. Then W has a basis, and the dimension of
W is less than or equal to n.
Let U, W be subspaces of a vector space V. Recall that the sum of U and W is
U + W = {u + w : u ∈ U and w ∈ W}.
The sum of U and W is a subspace of V:
1. Let u1 + w1, u2 + w2 ∈ U + W with u1, u2 ∈ U and w1, w2 ∈ W.
Then (u1 + w1) + (u2 + w2) = (u1 + u2) + (w1 + w2) ∈ U + W.
2. For any scalar c, c(u1 + w1) = cu1 + cw1 ∈ U + W; moreover 0 = 0 + 0 ∈ U + W.
Hence U + W is a subspace of V.
Definition: Let V be a vector space over the field K. Let U, W be subspaces of V. Then we say
that V is a direct sum of U and W if for every element v of V there exist unique
elements u ∈ U and w ∈ W such that v = u + w.
Theorem: Let V be a vector space over the field K, and let U, W be subspaces. If
i) U + W = V, and
ii) U ∩ W = {0},
then V is the direct sum of U and W.
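A sketch of why conditions i) and ii) give the direct sum: suppose v = u1 + w1 = u2 + w2 with u1, u2 ∈ U and w1, w2 ∈ W. Then
u1 - u2 = w2 - w1,
and this element lies in both U and W, hence in U ∩ W = {0}. Therefore u1 = u2 and w1 = w2, so every v in V has a unique such decomposition.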
Example: Since U and W are vector spaces and are subsets of V, they are subspaces of V. Moreover,
U + W = {(x1, x2, x3) : x1, x2, x3 ∈ ℝ} = ℝ³ = V.
Thus V = U + W.
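One concrete choice consistent with this computation (made here for illustration): in V = ℝ³, take U = {(x1, x2, 0) : x1, x2 ∈ ℝ} and W = {(0, 0, x3) : x3 ∈ ℝ}. Then every (x1, x2, x3) = (x1, x2, 0) + (0, 0, x3), so U + W = ℝ³ = V, while U ∩ W = {0}; hence ℝ³ is the direct sum of U and W.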
Theorem: Let V be a finite dimensional vector space over the field K. Let W be a subspace. Then
there exists a subspace U such that V is the direct sum of W and U.
Theorem: If V is a finite dimensional vector space over K, and V is the direct sum of subspaces
U and W, then dim V = dim U + dim W.
Definition (direct product): Suppose that U, W are arbitrary vector spaces over the
field K (i.e. not necessarily subspaces of some vector space). The direct product of U and W, denoted U × W, is the vector space whose elements are the pairs (u, w) with u ∈ U and w ∈ W.
Addition and scalar multiplication on such pairs are defined component-wise:
1. If (u1, w1), (u2, w2) ∈ U × W, then (u1, w1) + (u2, w2) = (u1 + u2, w1 + w2).
2. If c ∈ K, then c(u1, w1) is given by c(u1, w1) = (cu1, cw1).
Note that the direct product U × W of the vector spaces U and W is a vector space.
If n is a positive integer, written as a sum of two positive integers, n = r + s, then we see that
Kⁿ is the direct product K^r × K^s. We note that dim(U × W) = dim U + dim W.
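For instance, with K = ℝ and n = 3 = 2 + 1, the correspondence (x1, x2, x3) ↔ ((x1, x2), x3) identifies ℝ³ with the direct product ℝ² × ℝ, and indeed dim(ℝ² × ℝ) = 2 + 1 = 3.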
Remark: We can extend the notion of direct sum and direct product to several factors. Let
W1, …, Wr be subspaces of a vector space V. We say that V is the direct sum
of W1, …, Wr if every element v of V can be written uniquely as a sum v = w1 + … + wr with wi in Wi.
Exercises:
1. Let V = ℝ², and let W be the subspace generated by (2, 1). Let U be the subspace
generated by (0, 1). Show that V is the direct sum of W and U. If U′ is the subspace
generated by (1, 1), show that V is also the direct sum of W and U′.
2. Let V = K³ for some field K. Let W be the subspace generated by (1, 0, 0), and let U
be the subspace generated by (1, 1, 0) and (0, 1, 1). Show that V is the direct sum of
W and U.
3. Let A, B be two vectors in ℝ², and assume neither of them is O. If there is no number
c such that cA = B, show that A, B form a basis of ℝ², and that ℝ² is a direct sum of
the subspaces generated by A and B respectively.
4. Prove the last assertion of the section concerning the dimension of U × W. If
{u1, …, um} is a basis of U and {w1, …, wn} is a basis of W, what is a basis of U × W?