Vector Space
For Tech Students. Paper Name: Mathematics. Paper Code: M201. Teacher Name: Rahuldeb Das.
Lecture 1
Definition
DEFINITION (Vector Space). A vector space over $\mathbb{F}$ is a non-empty set $V$ together with two operations satisfying the following axioms:

1. VECTOR ADDITION: To every pair $u, v$ in $V$ there corresponds a unique element $u + v \in V$ such that
   1. $u + v = v + u$ (commutative law);
   2. $(u + v) + w = u + (v + w)$ (associative law);
   3. there is a unique element $\mathbf{0} \in V$ (the zero vector) such that $u + \mathbf{0} = u$ for every $u \in V$ (called the additive identity);
   4. for every $u \in V$ there is a unique element $-u \in V$ (called the additive inverse) such that $u + (-u) = \mathbf{0}$.
   The operation $+$ is called VECTOR ADDITION.
2. SCALAR MULTIPLICATION: For each element $u \in V$ and each $\alpha \in \mathbb{F}$ there corresponds a unique element $\alpha \cdot u \in V$ such that
   1. $\alpha \cdot (\beta \cdot u) = (\alpha\beta) \cdot u$ for every $\alpha, \beta \in \mathbb{F}$ and $u \in V$;
   2. $1 \cdot u = u$ for every $u \in V$, where $1 \in \mathbb{F}$.
3. DISTRIBUTIVE LAWS: RELATING VECTOR ADDITION WITH SCALAR MULTIPLICATION. For any $\alpha, \beta \in \mathbb{F}$ and $u, v \in V$,
   1. $\alpha \cdot (u + v) = \alpha \cdot u + \alpha \cdot v$;
   2. $(\alpha + \beta) \cdot u = \alpha \cdot u + \beta \cdot u$.

Note: the number $0$ is the element of $\mathbb{F}$, whereas $\mathbf{0}$ is the zero vector of $V$.

Remark. The elements of $V$ are called VECTORS. If $\mathbb{F} = \mathbb{R}$, the vector space is called a REAL VECTOR SPACE. If $\mathbb{F} = \mathbb{C}$, the vector space is called a complex vector space. We may sometimes write $\alpha u$ for $\alpha \cdot u$.

THEOREM. Let $V$ be a vector space over $\mathbb{F}$. Then
1. $u + v = u$ implies $v = \mathbf{0}$;
2. $\alpha \cdot u = \mathbf{0}$ if and only if $\alpha = 0$ or $u = \mathbf{0}$;
3. $(-1) \cdot u = -u$ for every $u \in V$.

Proof. Proof of Part 1. For $u \in V$, by Axiom 1d there exists $-u \in V$ such that $u + (-u) = \mathbf{0}$. Hence $u + v = u$ is equivalent to
$-u + (u + v) = -u + u$, i.e. $(-u + u) + v = \mathbf{0}$, i.e. $v = \mathbf{0}$.

Proof of Part 2. If $\alpha = 0$, then $0 \cdot u = (0 + 0) \cdot u = 0 \cdot u + 0 \cdot u$. Hence, using the first part, one has $0 \cdot u = \mathbf{0}$. Now suppose $\alpha \cdot u = \mathbf{0}$ (note that if $\alpha = 0$ then the proof is over). Therefore, let us assume $\alpha \neq 0$. Then $\alpha^{-1}$ exists, and
$\mathbf{0} = \alpha^{-1} \cdot \mathbf{0} = \alpha^{-1} \cdot (\alpha \cdot u) = (\alpha^{-1}\alpha) \cdot u = 1 \cdot u = u$,
as $1 \cdot u = u$. Thus we have shown that if $\alpha \neq 0$ and $\alpha \cdot u = \mathbf{0}$, then $u = \mathbf{0}$.

Proof of Part 3. We have $\mathbf{0} = 0 \cdot u = (1 + (-1)) \cdot u = u + (-1) \cdot u$, and hence $(-1) \cdot u = -u$.

EXAMPLE.
1. The set $\mathbb{R}$
of real numbers, with the usual addition and multiplication, forms a vector space over $\mathbb{R}$.
2. Let $V = \mathbb{R}^2$. For $x = (x_1, x_2)$, $y = (y_1, y_2)$ in $V$ and $\alpha \in \mathbb{R}$, define $x + y = (x_1 + y_1, x_2 + y_2)$ and $\alpha x = (\alpha x_1, \alpha x_2)$. Then $\mathbb{R}^2$ is a real vector space.
3. Let $\mathbb{R}^n = \{(a_1, \ldots, a_n) : a_i \in \mathbb{R}\}$, the set of all $n$-tuples of real numbers. For $u = (a_1, \ldots, a_n)$, $v = (b_1, \ldots, b_n)$ in $\mathbb{R}^n$ and $\alpha \in \mathbb{R}$, define $u + v = (a_1 + b_1, \ldots, a_n + b_n)$ and $\alpha u = (\alpha a_1, \ldots, \alpha a_n)$ (called componentwise or coordinatewise operations). Then $\mathbb{R}^n$ is a real vector space with addition and scalar multiplication defined as above. This vector space is denoted by $\mathbb{R}^n$, called the real vector space of $n$-tuples.
4. Let $V = \mathbb{R}^+$ (the set of positive real numbers). This is NOT A VECTOR SPACE under the usual operations of addition and scalar multiplication (why?). We now define a new vector addition and scalar multiplication as $v_1 \oplus v_2 = v_1 v_2$ and $\alpha \odot v = v^{\alpha}$ for all $v_1, v_2, v \in \mathbb{R}^+$ and $\alpha \in \mathbb{R}$. Then $\mathbb{R}^+$ is a real vector space with $1$ as its additive identity.
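The axioms for the exotic vector space of Example 4 can be checked numerically. The sketch below samples random positive "vectors" and real scalars and asserts each axiom for the operations $v_1 \oplus v_2 = v_1 v_2$ and $\alpha \odot v = v^{\alpha}$; the sampling ranges are arbitrary illustrative choices.

```python
import math
import random

# Operations on V = R+ (positive reals), as in Example 4:
# vector addition is multiplication, scalar multiplication is exponentiation.
def vadd(v1, v2):
    return v1 * v2

def smul(a, v):
    return v ** a

random.seed(0)
for _ in range(100):
    u, v, w = (random.uniform(0.1, 10.0) for _ in range(3))
    a, b = random.uniform(-3, 3), random.uniform(-3, 3)
    # commutative and associative laws for (+)
    assert math.isclose(vadd(u, v), vadd(v, u))
    assert math.isclose(vadd(vadd(u, v), w), vadd(u, vadd(v, w)))
    # the number 1 acts as the zero vector, and 1/v is the additive inverse of v
    assert math.isclose(vadd(v, 1.0), v)
    assert math.isclose(vadd(v, 1.0 / v), 1.0)
    # scalar multiplication axioms: a.(b.u) = (ab).u and 1.u = u
    assert math.isclose(smul(a, smul(b, u)), smul(a * b, u))
    assert math.isclose(smul(1.0, u), u)
    # distributive laws
    assert math.isclose(smul(a, vadd(u, v)), vadd(smul(a, u), smul(a, v)))
    assert math.isclose(smul(a + b, u), vadd(smul(a, u), smul(b, u)))
```

Note how the usual identities of exponents, e.g. $(uv)^{\alpha} = u^{\alpha}v^{\alpha}$, are exactly what make the distributive laws hold here.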
Subspaces
DEFINITION (Vector Subspace). Let $S$ be a NON-EMPTY SUBSET of a vector space $V$. $S$ is said to be a subspace of $V$ if $\alpha u + \beta v \in S$ whenever $\alpha, \beta \in \mathbb{F}$ and $u, v \in S$, where the vector addition and scalar multiplication are the same as that of $V$.

Remark. Any subspace is a vector space in its own right with respect to the vector addition and scalar multiplication that is defined for $V$.

EXAMPLE.
1. Let $V$ be a vector space. Then the set $\{\mathbf{0}\}$ consisting of the zero vector, and $V$ itself, are subspaces of $V$. These are called trivial subspaces.
2. The solution set of a homogeneous linear equation in $\mathbb{R}^3$ is a subspace of $\mathbb{R}^3$ (it is a plane passing through the origin).
3. The solution set of a non-homogeneous linear equation is not a subspace of $\mathbb{R}^3$ (it is a plane, but it doesn't pass through the origin); in particular it does not contain the zero vector, hence it is not a subspace.
4. A set may fail to be a subspace of $\mathbb{C}$ viewed as a complex vector space, but if we think of it as a subset of the real vector space $\mathbb{C}$ (componentwise addition and scalar multiplication), then it is a subspace. Check this; note that $\mathbb{C}$ is a real as well as a complex vector space.
5. Consider the set $\{(x, 0) : x \in \mathbb{R}\}$ (it represents a line passing through the origin, and contains $(0,0)$, the zero vector of $\mathbb{R}^2$). Check that this set is a subspace of $\mathbb{R}^2$.
Assignment
1. Let $V$ be the set of all real numbers. Define the addition in $V$ by $x \oplus y = \ldots$ and the scalar multiplication by $\alpha \odot x = \ldots$. Prove that $V$ is a real vector space with respect to the operations defined above.
2. Which of the following are correct statements?
   1. Let $\ldots$. Then $\ldots$ is a subspace of $\ldots$.
   2. Let $V$ be a vector space. Let $\ldots$. Then $\ldots$ is a vector subspace of $V$.
   3. Let $V$ be a real vector space, $\ldots$. Then $\ldots$ is a subspace of $V$.

Multiple Choice Questions:
1. If in a vector space $c \cdot v = \mathbf{0}$, then
   a) $c = 0$ and $v = \mathbf{0}$   b) only $c = 0$
   c) only $v = \mathbf{0}$   d) $c = 0$ or $v = \mathbf{0}$
2. If $S$ and $T$ are two subspaces of a vector space $V$, then which one of the following is also a subspace of $V$?
   a) $S \cup T$   b) $S \cap T$   c) $S - T$   d) $T - S$
3. If a finite set has 5 elements, then its power set has
   a) $5!$ elements   b) $2^5$ elements   c) $10$ elements   d) None of these
DEFINITION (Linear Span). Let $V$ be a vector space and let $S = \{u_1, \ldots, u_m\}$ be a non-empty subset of $V$. The LINEAR SPAN of $S$ is the set

(3.1.1)   $L(S) = \{\alpha_1 u_1 + \cdots + \alpha_m u_m : \alpha_i \in \mathbb{F}, \ 1 \le i \le m\}$,

the set of all linear combinations of elements of $S$.

LEMMA (Linear Span is a subspace). Let $V$ be a vector space and let $S$ be a non-empty subset of $V$. Then $L(S)$ is a subspace of $V$.

Proof. By definition, $S \subseteq L(S)$ and hence $L(S)$ is a non-empty subset of $V$. Let $u, v \in L(S)$. Then for some vectors $w_1, \ldots, w_n \in S$ and scalars $\alpha_1, \ldots, \alpha_n, \beta_1, \ldots, \beta_n$ we have $u = \alpha_1 w_1 + \cdots + \alpha_n w_n$ and $v = \beta_1 w_1 + \cdots + \beta_n w_n$. Hence, for any scalars $a, b$,
$a u + b v = (a\alpha_1 + b\beta_1) w_1 + \cdots + (a\alpha_n + b\beta_n) w_n \in L(S).$
Thus, $L(S)$ is a subspace of $V$.

Remark. Let $W$ be a subspace of $V$ and let $S \subseteq W$. Then every linear combination of elements of $S$ lies in $W$, so $L(S) \subseteq W$. To show that $L(S)$ is the smallest subspace of $V$ containing $S$, note that $L(S)$ is itself a subspace containing $S$, and hence the result follows.

Let $A$ be an $m \times n$ matrix with real entries. Then using the rows $a_1^t, \ldots, a_m^t$ of $A$ we define
Row Space$(A) = L(a_1, \ldots, a_m).$
Note that the ``column space'' of a matrix $A$ consists of exactly those vectors $b$ such that the system $Ax = b$ has a solution. Hence, Column Space$(A) = \{Ax : x \in \mathbb{R}^n\}$.

LEMMA. Let $A$ be a real $m \times n$ matrix and let $B$ be obtained from $A$ by elementary row operations. Then Row Space$(A) = $ Row Space$(B)$.

THEOREM. Let $A$ be an $m \times n$ matrix. Suppose $B$ is the row-reduced form of $A$. Then
1. Row Space$(A)$ is a subspace of $\mathbb{R}^n$;
2. the non-zero row vectors of a matrix in row-reduced form form a basis for the row-space; hence $\dim($Row Space$(A))$ equals the number of non-zero rows of $B$.

Proof. Part 1 can be easily proved. For Part 2, let $B$ be the
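The theorem above can be illustrated numerically: reduce a matrix to echelon form by row operations (which preserve the row space) and count the non-zero rows. The matrix `A` below is an assumed illustrative choice with one dependent row.

```python
import numpy as np

def row_echelon(mat, tol=1e-12):
    """Reduce a copy of `mat` to row echelon form by Gaussian elimination
    with partial pivoting; row operations do not change the row space."""
    a = np.array(mat, dtype=float)
    m, n = a.shape
    row = 0
    for col in range(n):
        if row >= m:
            break
        pivot = row + np.argmax(np.abs(a[row:, col]))
        if abs(a[pivot, col]) < tol:
            continue                      # no pivot in this column
        a[[row, pivot]] = a[[pivot, row]] # swap the pivot row into place
        for r in range(row + 1, m):
            a[r] -= (a[r, col] / a[row, col]) * a[row]
        row += 1
    return a

A = [[1, 2, 3],
     [2, 4, 6],    # = 2 * (row 1), so it contributes nothing new
     [1, 0, 1]]
E = row_echelon(A)
nonzero_rows = [r for r in E if np.linalg.norm(r) > 1e-10]

# The non-zero rows of the echelon form are a basis of Row Space(A),
# so their number equals rank(A).
assert len(nonzero_rows) == np.linalg.matrix_rank(np.array(A, dtype=float))
```

Here the dependent second row is eliminated to zero, leaving a two-element basis for the row space.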
Linear Independence
DEFINITION (Linear Independence and Dependence). Let $S = \{u_1, u_2, \ldots, u_m\}$ be any non-empty subset of $V$. If there exist some non-zero $\alpha_i$'s, $1 \le i \le m$, such that
$\alpha_1 u_1 + \alpha_2 u_2 + \cdots + \alpha_m u_m = \mathbf{0},$
then the set $S$ is called a linearly dependent set. Otherwise, the set $S$ is called linearly independent.

EXAMPLE.
1. Let $S = \{\ldots\}$. Then check that $\ldots$. Since a non-zero choice of scalars is a solution of the equation above, the set $S$ is linearly dependent.
2. Let $S = \{\ldots\}$. Then check that in this case we necessarily have $\alpha_1 = \cdots = \alpha_m = 0$, which shows that the set $S$ is a linearly independent subset of $\ldots$.

In other words, if $S = \{u_1, \ldots, u_m\}$ is a non-empty subset of a vector space $V$, then to check whether the set $S$ is linearly dependent or independent, one needs to consider the equation

(1)   $\alpha_1 u_1 + \alpha_2 u_2 + \cdots + \alpha_m u_m = \mathbf{0}.$

In case $\alpha_1 = \alpha_2 = \cdots = \alpha_m = 0$ is THE ONLY SOLUTION of (1), the set $S$ becomes a linearly independent subset of $V$. Otherwise, the set $S$ becomes a linearly dependent subset of $V$.
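This test can be carried out numerically: equation (1) has only the trivial solution exactly when the matrix whose columns are the $u_i$ has rank $m$. A minimal sketch (the test vectors are illustrative choices):

```python
import numpy as np

def is_linearly_independent(vectors):
    """u_1..u_m are independent iff a1*u1 + ... + am*um = 0 forces all
    ai = 0, i.e. iff the matrix with the u_i as columns has rank m."""
    M = np.column_stack([np.asarray(v, dtype=float) for v in vectors])
    return np.linalg.matrix_rank(M) == M.shape[1]

assert is_linearly_independent([(1, 0, 0), (0, 1, 0), (0, 0, 1)])
assert not is_linearly_independent([(1, 2, 3), (2, 4, 6)])   # u2 = 2*u1
assert not is_linearly_independent([(0, 0, 0), (1, 2, 3)])   # contains 0
```

The last assertion previews the proposition below: any set containing the zero vector is linearly dependent.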
PROPOSITION. Let $V$ be a vector space.
1. The zero vector cannot belong to a linearly independent set.
2. If $S$ is a linearly independent subset of $V$, then every subset of $S$ is also linearly independent.
3. If $S$ is a linearly dependent subset of $V$, then every set containing $S$ is also linearly dependent.

Proof. We give the proof of the first part. The reader is required to supply the proof of the other parts. Let $S = \{\mathbf{0}, u_2, \ldots, u_m\}$ be a set consisting of the zero vector. Then for any $\alpha \neq 0$, $\alpha\,\mathbf{0} + 0\,u_2 + \cdots + 0\,u_m = \mathbf{0}$. Hence, for the system $\alpha_1 \mathbf{0} + \alpha_2 u_2 + \cdots + \alpha_m u_m = \mathbf{0}$ we have a non-zero solution $(\alpha, 0, \ldots, 0)$, with the scalars NOT ALL ZERO, and hence $S$ is linearly dependent.

THEOREM. Let $S = \{u_1, \ldots, u_k\}$ be a linearly independent subset of a vector space $V$. Suppose there exists a vector $v \in V$ such that the set $S \cup \{v\}$ is linearly dependent. Then $v$ is a linear combination of $u_1, \ldots, u_k$.

Proof. Since the set $S \cup \{v\}$ is linearly dependent, there exist scalars $\alpha_1, \ldots, \alpha_k, \beta$, not all zero, such that

(2)   $\alpha_1 u_1 + \cdots + \alpha_k u_k + \beta v = \mathbf{0}.$

CLAIM: $\beta \neq 0$. Let, if possible, $\beta = 0$. Then not all $\alpha_i$ are zero, and equation (2) gives $\alpha_1 u_1 + \cdots + \alpha_k u_k = \mathbf{0}$ with the $\alpha_i$ not all zero. Hence, by the definition of linear independence, the set $S$ is linearly dependent, which is contradictory to our hypothesis. Thus, $\beta \neq 0$, and we get
$v = -\beta^{-1}(\alpha_1 u_1 + \cdots + \alpha_k u_k),$
and hence $v$ is a linear combination of $u_1, \ldots, u_k$.

We now state two important corollaries of the above theorem. We don't give their proofs as they are easy consequences of the above theorem.

COROLLARY. Let $S = \{u_1, \ldots, u_k\}$ be a linearly dependent subset of a vector space $V$. Then there exists a smallest index $\ell$, $2 \le \ell \le k$, such that $u_\ell \in L(u_1, \ldots, u_{\ell-1})$.

COROLLARY. Let $S$ be a linearly independent subset of a vector space $V$ such that $v \notin L(S)$. Then the set $S \cup \{v\}$ is also linearly independent.
Assignment
1. Show that any two row-equivalent matrices have the same row space. Give examples to show that the column space of two row-equivalent matrices need not be the same.
2. Find all the vector subspaces of $\mathbb{R}^2$.
3. Find the conditions on the real numbers $\ldots$ so that the set $\ldots$ is a subspace of $\ldots$.
4. Show that $\ldots$ is a subspace of $\ldots$. Further show that if $\ldots$ consists of all polynomials of degree $\ldots$, then $\ldots$ is not a subspace.
5. Consider the vector space given in Example 5. Determine all its vector subspaces.
6. Let $S$ and $T$ be two subspaces of a vector space $V$. Show that $S \cap T$ is a subspace of $V$. When is $S \cup T$ a subspace of $V$?
7. Show that $\ldots$.
8. Let $\ldots$, where $\ldots$. Determine all $\ldots$ such that $\ldots$.
9. Let $\ldots$. Find all choices for the vector $\ldots$. Does there exist choices for vectors $u$ and $v$ such that the set $\{u, v\}$ is a linearly independent subset of $\ldots$?
10. If none of the elements appearing along the principal diagonal of a lower triangular matrix is zero, show that the row vectors are linearly independent. The same is true for column vectors.
11. Determine whether or not the vector $\ldots$ is linearly dependent on $\ldots$.
12. Show that $\ldots$ is a linearly independent set in $\ldots$.
13. Show that $\ldots$ is a linearly independent set; in general, if $\ldots$ is a linearly independent set, then $\ldots$ is also a linearly independent set.
14. Give an example of vectors $u$, $v$ and $w$ such that $\{u, v, w\}$ is linearly dependent but any set of two vectors from $\{u, v, w\}$ is linearly independent.
15. What is the maximum number of linearly independent vectors in $\ldots$?
Multiple Choice Questions:
2. The maximum number of independent vectors among the four $(1,2,3,4)$, $(0,4,3,1)$, $(0,0,7,1)$ and $(0,0,0,0)$ is
   a) 4   b) 3   c) 2   d) 1
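The answer to the MCQ above can be verified directly: the maximum number of linearly independent vectors among a family equals the rank of the matrix having those vectors as rows.

```python
import numpy as np

# The four vectors from the MCQ, stacked as the rows of a matrix.
V = np.array([[1, 2, 3, 4],
              [0, 4, 3, 1],
              [0, 0, 7, 1],
              [0, 0, 0, 0]], dtype=float)

# The zero row contributes nothing; the other three rows are already in
# echelon position with pivots 1, 4 and 7, so the rank is 3.
assert np.linalg.matrix_rank(V) == 3
```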
Lecture 3
Bases
DEFINITION (Basis of a Vector Space).
1. A non-empty subset $\mathcal{B}$ of a vector space $V$ is called a basis of $V$ if
   1. $\mathcal{B}$ is a linearly independent set, and
   2. $L(\mathcal{B}) = V$, i.e., every vector in $V$ can be written as a linear combination of the elements of $\mathcal{B}$.
2. A vector in $\mathcal{B}$ is called a basis vector.

Remark. Let $\mathcal{B} = \{v_1, \ldots, v_n\}$ be a basis of $V$. Then any $v \in V$ is a UNIQUE linear combination of the basis vectors. Observe that if there exist scalars such that $v = \alpha_1 v_1 + \cdots + \alpha_n v_n$ and $v = \beta_1 v_1 + \cdots + \beta_n v_n$, then $(\alpha_1 - \beta_1)v_1 + \cdots + (\alpha_n - \beta_n)v_n = \mathbf{0}$, and the linear independence of $\mathcal{B}$ forces $\alpha_i = \beta_i$ for all $i$.

By convention, the linear span of an empty set is the zero vector space $\{\mathbf{0}\}$.

EXAMPLE.
1. Check that the set $\{e_1, e_2, \ldots, e_n\}$, where $e_i$ has $1$ in the $i$-th coordinate and $0$ elsewhere, forms a basis of $\mathbb{R}^n$. It is called the standard basis of $\mathbb{R}^n$.
2. A linearly dependent set, or any set containing the zero vector, cannot be a basis of $V$.
3. Recall the vector space $\mathcal{P}(\mathbb{R})$, the vector space of all polynomials with real coefficients. A basis of this vector space is the set $\{1, x, x^2, \ldots\}$. This basis has an infinite number of vectors as the degree of the polynomial can be any positive integer.

DEFINITION (Finite Dimensional Vector Space). A vector space $V$ is said to be finite dimensional if there exists a basis consisting of a finite number of elements. Otherwise, the vector space $V$ is called infinite dimensional.

Remark. We can use the above results to obtain a basis of any finite dimensional vector space $V$ as follows:
Step 1: Choose a non-zero vector, say $u_1 \in V$. Then the set $\{u_1\}$ is linearly independent.
Step 2: If $V = L(u_1)$, we have got a basis of $V$. Else, choose $u_2 \in V$ such that $u_2 \notin L(u_1)$. Then the set $\{u_1, u_2\}$ is linearly independent. Proceeding in this way, at the $k$-th step either $V = L(u_1, \ldots, u_k)$, in which case $\{u_1, \ldots, u_k\}$ is a basis of $V$, or there exists $u_{k+1} \notin L(u_1, \ldots, u_k)$, and the enlarged set remains linearly independent.
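The uniqueness of coordinates with respect to a basis amounts to solving one non-singular linear system. In the sketch below, the columns of `B` are an assumed illustrative basis of $\mathbb{R}^3$, and `np.linalg.solve` finds the unique coordinates of a vector `v` in that basis.

```python
import numpy as np

# Columns of B form an (assumed) basis of R^3: they are linearly
# independent, which we confirm by checking the rank.
B = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=float)
assert np.linalg.matrix_rank(B) == 3

v = np.array([2.0, 3.0, 5.0])
# The coordinates c of v in this basis are the unique solution of B c = v.
c = np.linalg.solve(B, v)
assert np.allclose(B @ c, v)   # v is recovered from its coordinates
```

If the columns of `B` were dependent, `solve` would fail (singular matrix), mirroring the fact that a dependent set cannot be a basis.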
Assignment
1. Let $S$ be a subset of a vector space $V$. Suppose $L(S) = V$ but $S$ is not a linearly independent set. Then prove that each vector in $V$ can be expressed in more than one way as a linear combination of vectors from $S$.
2. Show that the set $\ldots$ is a basis of $\ldots$.
3. Let $A$ be a matrix of rank $r$. Then show that the $r$ non-zero rows in the row-reduced echelon form of $A$ are linearly independent and they form a basis of the row space of $A$.
Multiple Choice Questions:
1. The value of $k$ for which the two vectors $(k, 6)$ and $(2, k)$ form a basis of $V_2$ is
   a) $2\sqrt{3}$   b) $-2\sqrt{3}$   c) any value   d) any value except $\pm 2\sqrt{3}$
2. The vectors $(1, 2, 3)$ and $(4, -2, 7)$ are
   a) linearly independent   b) linearly dependent   c) a basis of $V_3$   d) None of these
3. The dimension of the subspace $\{(x_1, x_2, x_3) : x_i \in \mathbb{R} \text{ and } 2x_1 + x_2 - x_3 = 0\}$ is
   a) 2   b) 1   c) 3   d) None of these
Lecture 4
Basic Properties
DEFINITION (Linear Transformation). Let $V$ and $W$ be vector spaces over $\mathbb{F}$. A map $T : V \to W$ is called a linear transformation if
$T(\alpha u + \beta v) = \alpha\, T(u) + \beta\, T(v)$ for all $\alpha, \beta \in \mathbb{F}$ and $u, v \in V$.
We now give a few examples of linear transformations.

EXAMPLE.
1. Define $T : \mathbb{R} \to \mathbb{R}$ by $T(x) = \ldots$ for all $x \in \mathbb{R}$. Then $T$ is a linear transformation.
2. For a fixed vector $a = (a_1, \ldots, a_n) \in \mathbb{R}^n$, define $T : \mathbb{R}^n \to \mathbb{R}$ by $T(x_1, \ldots, x_n) = a_1 x_1 + \cdots + a_n x_n$. Note that particular examples can be obtained by assigning particular values for the vector $a$.
3. Define $T : V \to W$ by $T(v) = \mathbf{0}$ for every $v \in V$. Then $T$ is a linear transformation. Such a linear transformation is called the zero transformation and is denoted by $\mathbf{0}$.

DEFINITION (Identity Transformation). Let $V$ be a vector space and let $T : V \to V$ be the map defined by $T(v) = v$ for every $v \in V$. Then $T$ is a linear transformation. Such a linear transformation is called the Identity transformation and is denoted by $I$.

THEOREM. Let $T : V \to W$ be a linear transformation and let $\{u_1, \ldots, u_n\}$ be an ordered basis of $V$. Then the linear transformation $T$ is determined by its values on the basis vectors; in other words, for any $v \in V$, $T(v)$ is a linear combination of the vectors $T(u_1), \ldots, T(u_n)$.

Proof. Since $\{u_1, \ldots, u_n\}$ is a basis of $V$, for any $v \in V$ there exist scalars $\alpha_1, \ldots, \alpha_n$ such that $v = \alpha_1 u_1 + \cdots + \alpha_n u_n$. By linearity, $T(v) = \alpha_1 T(u_1) + \cdots + \alpha_n T(u_n)$. Therefore, to know $T(v)$ for every $v \in V$, it is enough to know the images $T(u_i)$ of the basis vectors.
DEFINITION (Inverse Linear Transformation). Let $T : V \to W$ be a linear transformation. If the map $T$ is one-one and onto, then the map $T^{-1} : W \to V$ is defined as follows: $T^{-1}(w) = v$ whenever $T(v) = w$. The map $T^{-1}$ is called the inverse of the linear transformation $T$.

Note that $T^{-1} \circ T = I_V$ and $T \circ T^{-1} = I_W$, the identity transformations on $V$ and $W$ respectively. Thus, $T^{-1}(T(v)) = v$ for every $v \in V$. Verify that $T^{-1}$ is indeed the inverse of the linear transformation $T$, and that $T^{-1}$ is itself a linear transformation.
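In coordinates, a linear transformation on a finite dimensional space is a matrix, and inverting the transformation is inverting the matrix. A minimal sketch, with `A` an assumed invertible $2 \times 2$ matrix standing for $T$ in the standard basis:

```python
import numpy as np

# Matrix of an (assumed) invertible linear map T on R^2 in the standard basis.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])      # det = 1, so T is one-one and onto
A_inv = np.linalg.inv(A)        # matrix of the inverse transformation

v = np.array([3.0, -1.0])
# T^{-1}(T(v)) = v and T(T^{-1}(v)) = v: both compositions are the identity.
assert np.allclose(A_inv @ (A @ v), v)
assert np.allclose(A @ (A_inv @ v), v)
assert np.allclose(A_inv @ A, np.eye(2))
```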
Assignment
1. Which of the following are linear transformations? Justify your answers.
   1. Let $\ldots$ with $\ldots$
   2. Let $\ldots$ with $\ldots$
   3. Let $\ldots$ with $\ldots$
   4. Let $\ldots$ with $\ldots$
Lecture 5
Rank-Nullity Theorem
DEFINITION (Range and Null Space). Let $V$ and $W$ be finite dimensional vector spaces over the same set of scalars and let $T : V \to W$ be a linear transformation. We define
1. $\mathcal{R}(T) = \{T(v) : v \in V\}$, and
2. $\mathcal{N}(T) = \{v \in V : T(v) = \mathbf{0}\}$.

We now prove some results associated with the above definitions.

PROPOSITION. Let $V$ and $W$ be finite dimensional vector spaces and let $T : V \to W$ be a linear transformation. Suppose $\{v_1, \ldots, v_n\}$ is an ordered basis of $V$. Then
1. $\mathcal{R}(T) = L(T(v_1), \ldots, T(v_n))$ is a subspace of $W$;
2. $\mathcal{N}(T)$ is a subspace of $V$;
3. the following are equivalent:
   1. $T$ is one-one;
   2. $\mathcal{N}(T)$ is the zero subspace of $V$;
   3. $\{T(v_1), \ldots, T(v_n)\}$ is a basis of $\mathcal{R}(T)$.

Proof. The results about $\mathcal{R}(T)$ and $\mathcal{N}(T)$ can be easily proved. We thus leave the proof for the readers.

We now assume that $T$ is one-one. We need to show that $\mathcal{N}(T) = \{\mathbf{0}\}$. Let $u \in \mathcal{N}(T)$. Then $T(u) = \mathbf{0} = T(\mathbf{0})$; also, for any linear transformation, $T(\mathbf{0}) = \mathbf{0}$. Since $T$ is one-one, this implies $u = \mathbf{0}$. That is, $\mathcal{N}(T) = \{\mathbf{0}\}$.

Conversely, let $\mathcal{N}(T) = \{\mathbf{0}\}$; we show that $T$ is one-one. So, let us assume that $T(u) = T(v)$ for some $u, v \in V$. Then, by linearity of $T$, $T(u - v) = \mathbf{0}$. This implies $u - v \in \mathcal{N}(T) = \{\mathbf{0}\}$. This in turn implies $u = v$. Hence, $T$ is one-one. The other parts can be similarly proved.

Remark.
1. The space $\mathcal{R}(T)$ is called the RANGE SPACE of $T$, and $\mathcal{N}(T)$ is called the NULL SPACE of $T$.
2. We write $\rho(T) = \dim(\mathcal{R}(T))$ and $\nu(T) = \dim(\mathcal{N}(T))$.
3. $\rho(T)$ is called the rank of $T$ and $\nu(T)$ is called the nullity of $T$.
EXAMPLE. Determine the range space and the null space of the linear transformation $T$ given by $\ldots$.

Solution: By definition, $\mathcal{R}(T) = \{T(v) : v \in V\}$. We therefore have $\mathcal{R}(T) = \ldots$. Also, by definition, $\mathcal{N}(T) = \{v \in V : T(v) = \mathbf{0}\}$, so $\mathcal{N}(T) = \ldots$.
THEOREM (Rank-Nullity Theorem). Let $T : V \to W$ be a linear transformation and let $V$ be a finite dimensional vector space. Then
$\dim(\mathcal{R}(T)) + \dim(\mathcal{N}(T)) = \dim(V),$
or equivalently, $\rho(T) + \nu(T) = \dim(V)$.

Proof. Let $\dim(V) = n$ and $\dim(\mathcal{N}(T)) = r$. Suppose $\{u_1, \ldots, u_r\}$ is a basis of $\mathcal{N}(T)$, and extend it to a basis $\{u_1, \ldots, u_r, u_{r+1}, \ldots, u_n\}$ of $V$. Since $T(u_i) = \mathbf{0}$ for $1 \le i \le r$, we have $\mathcal{R}(T) = L(T(u_{r+1}), \ldots, T(u_n))$.

We now prove that the set $\{T(u_{r+1}), \ldots, T(u_n)\}$ is linearly independent. Suppose the set is not linearly independent. Then, there exist scalars $\alpha_{r+1}, \ldots, \alpha_n$, not all zero, such that
$\alpha_{r+1} T(u_{r+1}) + \cdots + \alpha_n T(u_n) = \mathbf{0}.$
That is, $T(\alpha_{r+1} u_{r+1} + \cdots + \alpha_n u_n) = \mathbf{0}$. So, by definition of $\mathcal{N}(T)$, there exist scalars $\beta_1, \ldots, \beta_r$ such that
$\alpha_{r+1} u_{r+1} + \cdots + \alpha_n u_n = \beta_1 u_1 + \cdots + \beta_r u_r.$
Since $\{u_1, \ldots, u_n\}$ is a basis of $V$, this forces all the scalars to be zero, a contradiction. Hence $\{T(u_{r+1}), \ldots, T(u_n)\}$ is a basis of $\mathcal{R}(T)$, so $\rho(T) = n - r$ and $\rho(T) + \nu(T) = n$.

Remark. By definition, $T$ is invertible if $T$ is one-one and onto. For $T : V \to V$ with $V$ finite dimensional, the rank-nullity theorem shows that $T$ is one-one if and only if $T$ is onto. Thus: let $V$ be a finite dimensional vector space and let $T : V \to V$ be a linear transformation. If either $T$ is one-one or $T$ is onto, then $T$ is invertible.

The following are some of the consequences of the rank-nullity theorem. The proof is left as an exercise for the reader.

COROLLARY. The following are equivalent for an $m \times n$ real matrix $A$ and a non-negative integer $k$:
1. $\operatorname{rank}(A) = k$;
2. there exist exactly $k$ rows of $A$ that are linearly independent;
3. there exist exactly $k$ columns of $A$ that are linearly independent;
4. there is a $k \times k$ submatrix of $A$ with non-zero determinant, and every $(k+1) \times (k+1)$ submatrix of $A$ has zero determinant;
5. the dimension of the range space of $A$ is $k$;
6. there is a subset of $\mathbb{R}^m$ consisting of exactly $k$ linearly independent vectors $b$ such that the system $Ax = b$ is consistent;
7. the dimension of the null space of $A$ is $n - k$.
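The rank-nullity identity is easy to check numerically: compute the rank and, independently, count the zero singular values to get the nullity. The matrix `A` below is an illustrative choice with one dependent row, viewed as a linear map from $\mathbb{R}^4$ to $\mathbb{R}^3$.

```python
import numpy as np

# An illustrative matrix, viewed as a linear map R^4 -> R^3.
A = np.array([[1, 0, 2, 1],
              [0, 1, 1, 0],
              [1, 1, 3, 1]], dtype=float)   # third row = row1 + row2

rank = np.linalg.matrix_rank(A)              # rho(T): dim of the range space

# Each zero singular value of A corresponds to one null-space direction,
# so nullity = (number of columns) - (number of non-zero singular values).
s = np.linalg.svd(A, compute_uv=False)
nullity = A.shape[1] - int(np.sum(s > 1e-10))

# Rank-Nullity: rho(T) + nu(T) = dim(V) = number of columns of A.
assert rank == 2 and nullity == 2
assert rank + nullity == A.shape[1]
```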
Assignment
1. Let $T : V \to W$ be a linear transformation and let $\{v_1, \ldots, v_k\} \subseteq V$ be linearly $\ldots$.
2. $\ldots$
3. $\ldots$
4. Let $T$ be defined by $\ldots$.
   1. Find $T(v)$ for $\ldots$.
   2. Find $\mathcal{N}(T)$ and $\mathcal{R}(T)$. Also calculate $\rho(T)$ and $\nu(T)$.
   3. Show that $\ldots$ and find the matrix of the linear transformation with respect to the standard basis.
5. Let $T : V \to W$ be a linear transformation.
   1. If $V$ is finite dimensional, then show that the null space and the range space of $T$ are also finite dimensional.
   2. If $V$ and $W$ are both finite dimensional, then show that
      1. if $\dim(V) < \dim(W)$, then $T$ cannot be onto;
      2. if $\dim(V) > \dim(W)$, then $T$ cannot be one-one.
6. Let $A$ be an $m \times n$ real matrix. Then
   1. if $\ldots$, then the system $Ax = b$ has infinitely many solutions;
   2. if $\ldots$, then $\ldots$;
   3. if $\ldots$, then $\ldots$;
   4. if $\ldots$, then there exists a non-zero vector $b$ such that the system $Ax = b$ does not have any solution.
7. Let $V$ be a vector space of dimension $n$, and let $B = \{v_1, \ldots, v_n\}$ be an ordered basis of $V$. Suppose $\ldots$. Show that $\ldots$ is a basis of $\ldots$.
Lecture 6
Definition
In $\mathbb{R}^2$, given two vectors $x = (x_1, x_2)$ and $y = (y_1, y_2)$, we know the inner product $x \cdot y = x_1 y_1 + x_2 y_2$. Note that for any $x, y, z \in \mathbb{R}^2$ and $\alpha \in \mathbb{R}$, this inner product satisfies the conditions
$x \cdot (y + \alpha z) = x \cdot y + \alpha\, (x \cdot z), \quad x \cdot y = y \cdot x, \quad x \cdot x \ge 0,$
and $x \cdot x = 0$ if and only if $x = \mathbf{0}$. Thus, we are motivated to define an inner product on an arbitrary vector space.

DEFINITION (Inner Product). Let $V$ be a vector space over $\mathbb{F}$. An inner product on $V$ is a map $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{F}$ such that for $u, v, w \in V$ and $\alpha, \beta \in \mathbb{F}$:
1. $\langle \alpha u + \beta v, w \rangle = \alpha \langle u, w \rangle + \beta \langle v, w \rangle$;
2. $\langle u, v \rangle = \overline{\langle v, u \rangle}$, the complex conjugate of $\langle v, u \rangle$; and
3. $\langle u, u \rangle \ge 0$, and equality holds if and only if $u = \mathbf{0}$.

DEFINITION (Inner Product Space). Let $V$ be a vector space with an inner product $\langle \cdot, \cdot \rangle$. Then $(V, \langle \cdot, \cdot \rangle)$ is called an inner product space.
EXAMPLE. The first two examples given below are called the STANDARD INNER PRODUCT on $\mathbb{R}^n$ and $\mathbb{C}^n$, respectively.
1. Let $V = \mathbb{R}^n$. For $u = (u_1, \ldots, u_n)$ and $v = (v_1, \ldots, v_n)$ in $V$, define $\langle u, v \rangle = u_1 v_1 + \cdots + u_n v_n$. Verify that $\langle \cdot, \cdot \rangle$ is an inner product.
2. Let $V = \mathbb{C}^n$ be a complex vector space of dimension $n$. Then for $u = (u_1, \ldots, u_n)$ and $v = (v_1, \ldots, v_n)$ in $V$, check that $\langle u, v \rangle = u_1 \overline{v_1} + \cdots + u_n \overline{v_n}$ is an inner product.
3. Let $V = \ldots$ and let $A = \ldots$. Define $\langle x, y \rangle = \ldots$. Check that $\langle \cdot, \cdot \rangle$ is an inner product. Hint: Note that $\ldots$.
4. Let $\ldots$. Show that $\ldots$ is an inner product in $\ldots$.
5. Consider the real vector space $\ldots$. In this example, we define three products that satisfy two conditions out of the three conditions for an inner product. Hence the three products are not inner products.
   1. Define $\langle x, y \rangle = \ldots$. Then it is easy to verify that the third condition is not valid whereas the first two conditions are valid.
   2. Define $\langle x, y \rangle = \ldots$. Then it is easy to verify that the first condition is not valid whereas the second and third conditions are valid.
   3. Define $\langle x, y \rangle = \ldots$. Then it is easy to verify that the second condition is not valid whereas the first and third conditions are valid.
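Inner products of the kind in item 3 arise from symmetric positive definite matrices. The sketch below uses an assumed illustrative matrix `A` and checks the three inner-product conditions numerically for $\langle x, y \rangle = x^t A y$ on $\mathbb{R}^2$.

```python
import numpy as np

# An assumed symmetric positive definite matrix (illustrative choice):
# eigenvalues of A are 1 and 3, both positive.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

def ip(x, y):
    """Candidate inner product <x, y> = x^t A y on R^2."""
    return x @ A @ y

rng = np.random.default_rng(0)
for _ in range(200):
    x, y, z = rng.normal(size=(3, 2))
    a, b = rng.normal(size=2)
    # condition 1: linearity in the first argument
    assert np.isclose(ip(a * x + b * y, z), a * ip(x, z) + b * ip(y, z))
    # condition 2: symmetry (A is symmetric)
    assert np.isclose(ip(x, y), ip(y, x))
    # condition 3: positivity for non-zero x
    if np.linalg.norm(x) > 1e-8:
        assert ip(x, x) > 0
```

Positivity is exactly positive definiteness of `A`; replacing `A` by a matrix with a negative eigenvalue would break condition 3, as in item 5.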
A very useful and fundamental inequality concerning the inner product is due to Cauchy and Schwarz. The next theorem gives the statement and a proof of this inequality. (Here $\|u\| = \sqrt{\langle u, u \rangle}$ denotes the norm of $u$.)

THEOREM (Cauchy-Schwarz inequality). Let $V$ be an inner product space. Then for any $u, v \in V$,
$|\langle u, v \rangle| \le \|u\|\,\|v\|.$
The equality holds if and only if the vectors $u$ and $v$ are linearly dependent. Further, if $u \neq \mathbf{0}$, then equality holds precisely when $v = \dfrac{\langle v, u \rangle}{\|u\|^2}\, u$.

DEFINITION 5.1.8 (Angle between two vectors). Let $V$ be a real vector space with an inner product. Then for every $u, v \in V$ with $u, v \neq \mathbf{0}$, by the Cauchy-Schwarz inequality we have
$-1 \le \dfrac{\langle u, v \rangle}{\|u\|\,\|v\|} \le 1.$
We know that $\cos : [0, \pi] \to [-1, 1]$ is one-one and onto, so there exists a unique $\theta \in [0, \pi]$ such that $\cos\theta = \dfrac{\langle u, v \rangle}{\|u\|\,\|v\|}$.
1. The real number $\theta$ with $0 \le \theta \le \pi$ is called the angle between the two vectors $u$ and $v$ in $V$.
2. The vectors $u$ and $v$ in $V$ are said to be orthogonal if $\langle u, v \rangle = 0$.
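Both the inequality and the angle formula are easy to test numerically with the standard inner product; the sampled dimension and vectors below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    u = rng.normal(size=4)
    v = rng.normal(size=4)
    # Cauchy-Schwarz: |<u, v>| <= ||u|| ||v|| (standard inner product).
    assert abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v) + 1e-12

# Equality holds exactly when u and v are linearly dependent:
u = np.array([1.0, 2.0, 2.0])
v = 3.0 * u
assert np.isclose(abs(u @ v), np.linalg.norm(u) * np.linalg.norm(v))

# Angle between two vectors from cos(theta) = <a,b> / (||a|| ||b||):
a, b = np.array([1.0, 0.0]), np.array([1.0, 1.0])
theta = np.arccos((a @ b) / (np.linalg.norm(a) * np.linalg.norm(b)))
assert np.isclose(theta, np.pi / 4)   # the angle is 45 degrees
```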
THEOREM. Let $V$ be an inner product space. Let $\{u_1, \ldots, u_n\}$ be a set of non-zero, mutually orthogonal vectors of $V$.
1. Then the set $\{u_1, \ldots, u_n\}$ is linearly independent.
2. $\left\| \sum_{i=1}^{n} \alpha_i u_i \right\|^2 = \sum_{i=1}^{n} |\alpha_i|^2 \|u_i\|^2$ for all scalars $\alpha_1, \ldots, \alpha_n$.
3. Let $\dim(V) = n$, and also let $\langle v, u_i \rangle = 0$ for all $i$, $1 \le i \le n$. Then $v = \mathbf{0}$. In particular, $\langle v, u \rangle = 0$ for all $v \in V$ if and only if $u = \mathbf{0}$.

Proof of the first part: consider the equation $\alpha_1 u_1 + \cdots + \alpha_n u_n = \mathbf{0}$. Taking the inner product with $u_j$ and using orthogonality gives $\alpha_j \|u_j\|^2 = 0$; since $u_j \neq \mathbf{0}$, we get $\alpha_j = 0$ for each $j$. Therefore, we have obtained the required result.

DEFINITION (Orthonormal Set). Let $V$ be an inner product space. A set of non-zero, mutually orthogonal vectors $\{v_1, \ldots, v_n\}$ in $V$ is called an orthonormal set if $\|v_i\| = 1$ for $1 \le i \le n$. If the set $\{v_1, \ldots, v_n\}$ is also a basis of $V$, then the set of vectors is called an orthonormal basis of $V$.

EXAMPLE. Consider $\mathbb{R}^n$ with the standard inner product. Then the standard basis $\{e_1, \ldots, e_n\}$ is an orthonormal set. Also, the basis $\ldots$ is an orthonormal set.
Assignment
1. Recall the following inner product on $\mathbb{R}^2$: for $u = (u_1, u_2)$ and $v = (v_1, v_2)$, $\langle u, v \rangle = \ldots$.
   1. Find the angle between the vectors $\ldots$ and $\ldots$.
   2. Let $\ldots$. Find $\ldots$.
   3. Find two vectors $\ldots$ such that $\ldots$.
2. Find an inner product in $\ldots$ such that $\ldots$. [Hint: Consider a symmetric matrix $A$. Define $\langle x, y \rangle = \ldots$ and solve a system of equations for the unknowns $\ldots$.]
3. Let $\ldots$. Find $\ldots$ with respect to the standard inner product.
4. Let $W$ be a subspace of a finite dimensional inner product space $V$. Prove that $\ldots$.
5. Let $V$ be the real vector space of all continuous functions with domain $\ldots$. Then show that $V$ is an inner product space with inner product $\langle f, g \rangle = \int f(t)\, g(t)\, dt$, the integral taken over the domain.
6. Let $V$ be an inner product space. Prove that
$\|u + v\| \le \|u\| + \|v\|$ for every $u, v \in V$.
This inequality is called the TRIANGLE INEQUALITY.
7. Let $\ldots$. Use the Cauchy-Schwarz inequality to prove that $\ldots$.
Lecture 7
Definitions
In this chapter, the linear transformations are from a given finite dimensional vector space $V$ to itself. Observe that in this case, the matrix of the linear transformation is a square matrix. So, in this chapter, all the matrices are square matrices, and a vector means $x \in \mathbb{F}^n$ for some positive integer $n$. Here, $\mathbb{F}^n$ stands for either the vector space $\mathbb{R}^n$ over $\mathbb{R}$ or $\mathbb{C}^n$ over $\mathbb{C}$.

Let $A$ be a matrix of order $n$. In general, we ask the question: does there exist a non-zero vector $x$ and a scalar $\lambda$ such that

(1)   $A x = \lambda x$?

Equation (1) is equivalent to the equation
$(A - \lambda I)x = \mathbf{0}.$
This homogeneous system has a non-zero solution if and only if $\det(A - \lambda I) = 0$. So, to solve (1), we are forced to choose those values of $\lambda$ for which $\det(A - \lambda I) = 0$. Observe that $\det(A - \lambda I)$ is a polynomial in $\lambda$ of degree $n$. We are therefore led to the following definition.

DEFINITION (Characteristic Polynomial). Let $A$ be a matrix of order $n$. The polynomial $\det(A - \lambda I)$ is called the characteristic polynomial of $A$ and is denoted by $p(\lambda)$. The equation $p(\lambda) = 0$ is called the characteristic equation of $A$. If $\lambda \in \mathbb{F}$ is a root of the characteristic equation, then $\lambda$ is called a characteristic value of $A$.

Some books use the term EIGEN VALUE in place of characteristic value.

THEOREM. Let $A$ be a matrix of order $n$ and suppose $\lambda$ is a root of the characteristic equation. Then there exists a non-zero vector $v$ such that $Av = \lambda v$.

Proof. Since $\lambda$ is a root of the characteristic equation, $\det(A - \lambda I) = 0$. Hence the homogeneous system $(A - \lambda I)x = \mathbf{0}$ has a non-zero solution.

Remark. Observe that the linear system $Ax = \lambda x$ has the solution $x = \mathbf{0}$ for every $\lambda$. So, we consider only those non-zero $x$ that satisfy $Ax = \lambda x$.
DEFINITION (Eigen value and Eigenvector). If the linear system $Ax = \lambda x$ has a non-zero solution $x$ for some $\lambda \in \mathbb{F}$, then
1. $\lambda$ is called an eigen value of $A$,
2. $x \neq \mathbf{0}$ is called an eigenvector of $A$ corresponding to the eigen value $\lambda$, and
3. the tuple $(\lambda, x)$ is called an eigen pair.
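Eigen pairs can be computed numerically with `numpy.linalg.eig`, which returns the eigenvalues and, as columns of the second output, corresponding eigenvectors. The matrix below is an assumed illustrative choice with characteristic polynomial $\lambda^2 - 7\lambda + 10 = (\lambda - 2)(\lambda - 5)$.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
vals, vecs = np.linalg.eig(A)   # eigenvectors are the COLUMNS of `vecs`

for lam, x in zip(vals, vecs.T):
    # Each (lam, x) is an eigen pair: x is non-zero and A x = lam x.
    assert np.linalg.norm(x) > 0
    assert np.allclose(A @ x, lam * x)

# The eigenvalues are the roots of (lambda - 2)(lambda - 5) = 0.
assert np.allclose(sorted(vals.real), [2.0, 5.0])
```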
Remark. To understand the difference between a characteristic value and an eigen value, we give the following example. Consider a real matrix whose characteristic equation is $\lambda^2 + 1 = 0$, for instance
$A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}.$
1. If $A$ is considered a COMPLEX matrix, then the roots of $\lambda^2 + 1 = 0$ in $\mathbb{C}$ are $\pm i$, and $A$ has $(i, (1, i)^t)$ and $(-i, (1, -i)^t)$ as eigen pairs.
2. If $A$ is considered a REAL matrix, that is, if we require the roots to lie in $\mathbb{R}$, then $\lambda^2 + 1 = 0$ has no solution, and $A$ has no eigen value (and hence no eigenvectors in $\mathbb{R}^2$), although it has characteristic values $\pm i$.

Remark. The eigenvectors of $A$ corresponding to the eigen value $\lambda$ are exactly the non-zero solutions of the system $(A - \lambda I)x = \mathbf{0}$. From this it is easily seen that if $x$ is an eigenvector of $A$ corresponding to the eigen value $\lambda$, then so is $cx$ for every $c \neq 0$. Suppose $A$ is singular. Then $\det(A) = 0$, and the linear system $Ax = \mathbf{0}$ has a non-zero solution $x$. That is, $Ax = \mathbf{0} = 0 \cdot x$, and hence $0$ is an eigen value of $A$ whenever $A$ is singular.
EXAMPLE.
1. Let $A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ (an illustrative choice). Then $\det(A - \lambda I) = (1 - \lambda)^2$. That is, $\lambda = 1$ is a repeated eigen value. Now check that the system $(A - I)x = \mathbf{0}$, for $x = (x_1, x_2)^t$, is equivalent to the equation $x_2 = 0$. And this has the solution $x = (x_1, 0)^t$. Hence, from the above remark, $(1, 0)^t$ is a representative for the eigenvector. Therefore, here we have two eigen values $1, 1$ but only one independent eigenvector.
2. Let $A = I_2$, the $2 \times 2$ identity matrix. The characteristic equation has roots $1, 1$, and we know that $Ax = x$ for every $x$. So we can choose any two linearly independent vectors $x, y$ from $\mathbb{R}^2$ to get $(1, x)$ and $(1, y)$ as the two eigen pairs. In general, if $x_1, \ldots, x_n$ are linearly independent vectors in $\mathbb{R}^n$, then $(1, x_1), \ldots, (1, x_n)$ are eigen pairs for the $n \times n$ identity matrix.
3. Let $A = \ldots$. The characteristic equation has two distinct roots $\ldots$. Now check that the eigen pairs are $\ldots$ and $\ldots$. In this case, we have two distinct eigen values and the corresponding eigenvectors are also linearly independent. The reader is required to prove the linear independence of the two eigenvectors.
Assignment
1. Find the eigen values of a triangular matrix.
2. Find eigen pairs over $\mathbb{C}$ for each of the following matrices: $\ldots$
3. Let $\ldots$ be a matrix such that the eigenvectors of $\ldots$ and $\ldots$ are different.
4. Let $A$ be a matrix such that $A^2 = A$ ($A$ is called an idempotent matrix). Then prove that its eigen values are either $0$ or $1$ or both.
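Exercise 4 can be checked on a concrete case. The matrix below is an assumed illustrative idempotent matrix (it projects onto a line), and its eigenvalues come out to be exactly $0$ and $1$.

```python
import numpy as np

# An (assumed) idempotent matrix: A @ A == A.
A = np.array([[2.0, -1.0],
              [2.0, -1.0]])
assert np.allclose(A @ A, A)          # confirms idempotence

vals = np.linalg.eigvals(A)
# As Exercise 4 asserts, the eigen values of an idempotent matrix are 0 or 1.
assert np.allclose(sorted(vals.real), [0.0, 1.0])
```

The algebraic reason: if $Ax = \lambda x$ with $x \neq \mathbf{0}$, then $\lambda x = Ax = A^2x = \lambda^2 x$, so $\lambda^2 = \lambda$.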
Lecture 8
THEOREM 6.1.11. Let $A$ be an $n \times n$ matrix with eigen values $\lambda_1, \ldots, \lambda_n$, not necessarily distinct. Then
$\det(A) = \prod_{i=1}^{n} \lambda_i$ and $\operatorname{tr}(A) = \sum_{i=1}^{n} \lambda_i.$

EXERCISE.
1. Let $A$ be a skew symmetric matrix of odd order. Then prove that $0$ is an eigen value of $A$.
2. Let $A$ be an orthogonal matrix such that $\ldots$. If $\ldots$, then prove that $\ldots$.

Let $A$ be an $n \times n$ matrix. Then, in the proof of the above theorem, we observed that the characteristic equation $\det(A - \lambda I) = 0$ is a polynomial equation of degree $n$ in $\lambda$. Also, for some numbers $a_0, a_1, \ldots, a_{n-1}$, it has the form
$\lambda^n + a_{n-1}\lambda^{n-1} + \cdots + a_1 \lambda + a_0 = 0.$
Note that, in the expression $\det(A - \lambda I)$, $\lambda$ is a scalar. But we can formally substitute $\lambda$ by the matrix $A$; the resulting expression
$A^n + a_{n-1}A^{n-1} + \cdots + a_1 A + a_0 I$
is an element of $M_n(\mathbb{F})$, the set of $n \times n$ matrices. It turns out that the expression
$A^n + a_{n-1}A^{n-1} + \cdots + a_1 A + a_0 I = \mathbf{0}$
holds true as a matrix identity. This is a celebrated theorem called the Cayley Hamilton Theorem. We state this theorem without proof and give some implications.

THEOREM (Cayley Hamilton Theorem). Let $A$ be a square matrix of order $n$. Then $A$ satisfies its characteristic equation. That is,
$A^n + a_{n-1}A^{n-1} + \cdots + a_1 A + a_0 I = \mathbf{0}$
holds true as a matrix identity.
Remark. Some of the implications of the Cayley Hamilton Theorem are as follows.
1. Write $p(\lambda) = \lambda^n + a_{n-1}\lambda^{n-1} + \cdots + a_0$ for the characteristic polynomial, viewed as a polynomial function; the theorem says $p(A) = \mathbf{0}$.
2. Suppose we are given a square matrix $A$ of order $n$ and we are interested in calculating $A^\ell$, where $\ell$ is large compared to $n$. Then we can use the division algorithm to find numbers $\alpha_0, \alpha_1, \ldots, \alpha_{n-1}$ and a polynomial $f(\lambda)$ such that
$\lambda^\ell = f(\lambda)\, p(\lambda) + \alpha_{n-1}\lambda^{n-1} + \cdots + \alpha_1 \lambda + \alpha_0.$
3. Hence, by the Cayley Hamilton Theorem, $p(A) = \mathbf{0}$, and therefore
$A^\ell = \alpha_{n-1}A^{n-1} + \cdots + \alpha_1 A + \alpha_0 I.$
4. That is, every power of $A$ can be written as a linear combination of $I, A, \ldots, A^{n-1}$.
5. Let $A$ be a non-singular matrix of order $n$. Then $a_0 = (-1)^n \det(A) \neq 0$, and multiplying the matrix identity by $A^{-1}$ gives
$A^{-1} = \dfrac{-1}{a_0}\left(A^{n-1} + a_{n-1}A^{n-2} + \cdots + a_1 I\right).$

EXERCISE. Find the inverse of the following matrices by using the Cayley Hamilton Theorem: $\ldots$
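For a $2 \times 2$ matrix, the characteristic polynomial is $\lambda^2 - \operatorname{tr}(A)\lambda + \det(A)$, so the Cayley Hamilton Theorem and the inverse formula above take an especially simple form. The matrix below is an illustrative choice.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
tr, det = np.trace(A), np.linalg.det(A)

# Cayley-Hamilton for a 2x2 matrix: A^2 - tr(A) A + det(A) I = 0.
assert np.allclose(A @ A - tr * A + det * np.eye(2), 0)

# Multiplying through by A^{-1} (valid since det(A) = -2 != 0) gives
# A^{-1} = (tr(A) I - A) / det(A).
A_inv = (tr * np.eye(2) - A) / det
assert np.allclose(A_inv, np.linalg.inv(A))
```

No explicit inversion routine is needed: the inverse is a polynomial in `A`, exactly as implication 5 above states.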
THEOREM. Let $\lambda_1, \ldots, \lambda_k$ be distinct eigen values of an $n \times n$ matrix $A$, with corresponding eigenvectors $x_1, \ldots, x_k$. Then the set $\{x_1, \ldots, x_k\}$ is linearly independent.

Proof. The proof is by induction on the number $m$ of eigen values. The result is obviously true if $m = 1$, as the corresponding eigenvector is non-zero and we know that any set containing exactly one non-zero vector is linearly independent. Let the result be true for $m$, $1 \le m < k$. We prove the result for $m + 1$. We consider the equation

(1)   $c_1 x_1 + c_2 x_2 + \cdots + c_{m+1} x_{m+1} = \mathbf{0}$

in the unknowns $c_1, \ldots, c_{m+1}$. Applying $A$ to (1) and using $A x_i = \lambda_i x_i$, we have

(2)   $c_1 \lambda_1 x_1 + c_2 \lambda_2 x_2 + \cdots + c_{m+1} \lambda_{m+1} x_{m+1} = \mathbf{0}.$

Multiplying (1) by $\lambda_{m+1}$ and subtracting the result from (2), we get
$c_1(\lambda_1 - \lambda_{m+1}) x_1 + \cdots + c_m(\lambda_m - \lambda_{m+1}) x_m = \mathbf{0}.$
This is an equation in the $m$ eigenvectors $x_1, \ldots, x_m$, so by the induction hypothesis $c_i(\lambda_i - \lambda_{m+1}) = 0$ for $1 \le i \le m$. Since the eigen values are distinct, $\lambda_i - \lambda_{m+1} \neq 0$, and we therefore get $c_i = 0$ for $1 \le i \le m$. Equation (1) now reduces to $c_{m+1} x_{m+1} = \mathbf{0}$, and since $x_{m+1} \neq \mathbf{0}$, we also have $c_{m+1} = 0$. Thus, we have the required result.

We are thus led to the following important corollary.

COROLLARY. The eigenvectors corresponding to distinct eigen values of an $n \times n$ matrix $A$ are linearly independent.
Assignment
1. For an $n \times n$ matrix $A$, prove the following.
   1. $A$ and $A^t$ have the same set of eigen values.
   2. If $\lambda$ is an eigen value of an invertible matrix $A$, then $1/\lambda$ is an eigen value of $A^{-1}$.
   3. If $\lambda$ is an eigen value of $A$, then $\lambda^k$ is an eigen value of $A^k$ for any positive integer $k$.
   4. If $A$ and $B$ are $n \times n$ matrices with $\ldots$, then $AB$ and $BA$ have the same set of eigen values.
2. Let $A$ and $B$ be matrices for which $\ldots$.
   1. Do $A$ and $B$ have the same set of eigen values?
   2. Give examples to show that the matrices $\ldots$ and $\ldots$ need not be $\ldots$.
3. Let $(\lambda, x)$ be an eigen pair for a matrix $A$, and let $(\mu, x)$ be an eigen pair for another matrix $B$.
   1. Then prove that $(\lambda + \mu, x)$ is an eigen pair for the matrix $A + B$.
   2. $\ldots$, where $\ldots$ are respectively the eigen values of $\ldots$.
Lecture 9
Diagonalisation
DEFINITION (Matrix Diagonalisation). A matrix $A$ is said to be diagonalisable if there exists a non-singular matrix $P$ such that $P^{-1} A P$ is a diagonal matrix.

Remark. Let $A$ be an $n \times n$ diagonalisable matrix with eigen values $\lambda_1, \ldots, \lambda_n$. By definition, $A$ is similar to a diagonal matrix $D$. Observe that $D = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$, as similar matrices have the same set of eigen values and the eigen values of a diagonal matrix are its diagonal entries.
EXAMPLE. Let $A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$.
1. $A$ has no real eigen value and hence doesn't have eigenvectors that are vectors in $\mathbb{R}^2$. Hence, there does not exist any non-singular real matrix $P$ such that $P^{-1} A P$ is a diagonal matrix.
2. In case $A$ is considered a complex matrix, the eigen pairs are $(i, (1, i)^t)$ and $(-i, (1, -i)^t)$ respectively. Define a $2 \times 2$ complex matrix by $U = \begin{pmatrix} 1 & 1 \\ i & -i \end{pmatrix}$, whose columns are the two eigenvectors. Then
$U^{-1} A U = \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}.$
THEOREM. Let $A$ be an $n \times n$ matrix. Then $A$ is diagonalisable if and only if $A$ has $n$ linearly independent eigenvectors.

COROLLARY 6.2.5. Let $A$ be an $n \times n$ matrix with $n$ distinct eigen values. Then $A$ is diagonalisable.

EXAMPLE.
1. Let $A = \ldots$. Then $\det(A - \lambda I) = \ldots$. Hence, $A$ has a repeated eigen value $\ldots$. It is easily seen that $\ldots$ and $\ldots$ are the only eigen pairs. That is, the matrix $A$ has exactly one independent eigenvector corresponding to the repeated eigen value. Hence, by the theorem above, the matrix $A$ is not diagonalisable.
2. Let $A = \ldots$. Hence, $A$ has eigen values $\ldots$. Note that the set consisting of the computed eigenvectors corresponding to the repeated eigen value is not orthogonal. This set can be replaced by an orthogonal set which still consists of eigenvectors corresponding to the same eigen value. Also, the resulting set of eigenvectors forms a basis of $\mathbb{R}^n$ whose members are mutually orthogonal. In general, for any real symmetric matrix $A$, there always exist $n$ eigenvectors and they are mutually orthogonal. This result will be proved later.
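The symmetric case can be demonstrated with `numpy.linalg.eigh`, which is designed for symmetric matrices and returns an orthonormal set of eigenvectors. The matrix `A` below is an assumed illustrative real symmetric matrix with a repeated eigen value.

```python
import numpy as np

# An (assumed) real symmetric matrix; its eigenvalues are 1, 3, 3.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
assert np.allclose(A, A.T)

# eigh returns real eigenvalues and ORTHONORMAL eigenvectors (columns of Q),
# even for the repeated eigen value.
vals, Q = np.linalg.eigh(A)
assert np.allclose(Q.T @ Q, np.eye(3))           # columns mutually orthogonal
assert np.allclose(Q @ np.diag(vals) @ Q.T, A)   # A = Q D Q^t: diagonalised
```

This is exactly the promised result: a real symmetric matrix is always diagonalisable by an orthogonal matrix.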
Assignment
1. By finding the eigen values of the following matrices, justify whether or not $A = P D P^{-1}$ for some real non-singular matrix $P$ and a real diagonal matrix $D$: $\ldots$
2. Is $\ldots$ diagonalisable, where $\ldots$ if $\ldots$?
3. $\ldots$
4. Let $A$ be an $n \times n$ matrix and $B$ an $m \times m$ matrix. Suppose $C = \begin{pmatrix} A & \mathbf{0} \\ \mathbf{0} & B \end{pmatrix}$. Then show that $C$ is diagonalisable if and only if both $A$ and $B$ are diagonalisable.
5. Let $T$ be a linear transformation with $\ldots$ and $\ldots$. Then
   1. determine the eigen values of $T$;
   2. find the number of linearly independent eigenvectors corresponding to each eigen value;
   3. is $T$ diagonalisable? Justify your answer.
6. Let $A$ be a non-zero square matrix such that $A^2 = \mathbf{0}$. Show that $A$ cannot be diagonalised.
7. Are the following matrices diagonalisable? $\ldots$