Vector Spaces: persson@berkeley.edu
Definition
A vector space V over a field F is a set with the operations of
addition and scalar multiplication, so that for each pair x, y ∈ V
there is a unique x + y ∈ V, and for each a ∈ F and x ∈ V there is
a unique ax ∈ V, such that:
(VS 1) For all x, y ∈ V, x + y = y + x.
(VS 2) For all x, y, z ∈ V, (x + y) + z = x + (y + z).
(VS 3) There exists 0 ∈ V such that x + 0 = x for each x ∈ V.
(VS 4) For each x ∈ V there exists y ∈ V such that x + y = 0.
(VS 5) For each x ∈ V, 1x = x.
(VS 6) For each a, b ∈ F and x ∈ V, (ab)x = a(bx).
(VS 7) For each a ∈ F and x, y ∈ V, a(x + y) = ax + ay.
(VS 8) For each a, b ∈ F and x ∈ V, (a + b)x = ax + bx.
n-tuples
Example
The set of all n-tuples (a1, a2, . . . , an) with a1, a2, . . . , an ∈ F is
denoted F^n. This is a vector space with the operations of
coordinatewise addition and scalar multiplication: If c ∈ F and
u = (a1, a2, . . . , an) ∈ F^n, v = (b1, b2, . . . , bn) ∈ F^n, then
u + v = (a1 + b1, a2 + b2, . . . , an + bn), cu = (ca1, ca2, . . . , can).
u, v are equal if ai = bi for i = 1, 2, . . . , n. Vectors in F^n can be
written as column vectors

    a1
    a2
    .
    .
    .
    an
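The coordinatewise operations above can be sketched in Python, taking F to be the rationals (integer entries shown); the helper names vec_add and scal_mul are illustrative, not from the notes:

```python
def vec_add(u, v):
    """Coordinatewise addition: (a1 + b1, ..., an + bn)."""
    assert len(u) == len(v)
    return tuple(a + b for a, b in zip(u, v))

def scal_mul(c, u):
    """Scalar multiplication: (c*a1, ..., c*an)."""
    return tuple(c * a for a in u)

u = (1, 2, 3)
v = (4, 5, 6)
print(vec_add(u, v))      # (5, 7, 9)
print(scal_mul(2, u))     # (2, 4, 6)
```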
Matrices
Example
An m × n matrix with entries aij ∈ F is an array of the form

    a11  a12  · · ·  a1n
    a21  a22  · · ·  a2n
     .    .           .
     .    .           .
    am1  am2  · · ·  amn

The set of all m × n matrices with entries in F is denoted Mm×n(F).
This is a vector space with the operations of entrywise addition
and scalar multiplication:
(A + B)ij = Aij + Bij, (cA)ij = cAij
for 1 ≤ i ≤ m, 1 ≤ j ≤ n.
Example
Let F(S, F) denote the set of all functions from a nonempty set S
to a field F. This is a vector space with the usual operations of
pointwise addition and scalar multiplication: If f, g ∈ F(S, F) and
c ∈ F, then
(f + g)(s) = f(s) + g(s), (cf)(s) = c f(s)
for each s ∈ S.
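The pointwise operations on F(S, F) can be sketched directly with Python functions; f_add and f_scale are illustrative names, not from the notes:

```python
def f_add(f, g):
    """(f + g)(s) = f(s) + g(s)."""
    return lambda s: f(s) + g(s)

def f_scale(c, f):
    """(cf)(s) = c * f(s)."""
    return lambda s: c * f(s)

f = lambda s: s * s      # f(s) = s^2
g = lambda s: 2 * s + 1  # g(s) = 2s + 1
h = f_add(f, f_scale(3, g))

print(h(2))  # f(2) + 3*g(2) = 4 + 3*5 = 19
```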
Corollary 1
The vector 0 described in (VS 3) is unique (the zero vector).
Corollary 2
The vector y described in (VS 4) is unique (the additive inverse).
Theorem 1.2
In any vector space V, the following statements are true:
(a) 0x = 0 for each x ∈ V
(b) (-a)x = -(ax) = a(-x) for each a ∈ F and x ∈ V
(c) a0 = 0 for each a ∈ F
Subspaces
Definition
A subset W of a vector space V over a field F is called a subspace
of V if W is a vector space over F with the operations of addition
and scalar multiplication defined on V.
Note that V and {0} are subspaces of any vector space V. {0} is
called the zero subspace of V.
Verification of Subspaces
It is clear that properties (VS 1,2,5-8) hold for any subset of
vectors in a vector space. Therefore, a subset W of a vector space
V is a subspace of V if and only if:
1 x + y ∈ W whenever x, y ∈ W
2 cx ∈ W whenever c ∈ F and x ∈ W
3 W has a zero vector
4 each x ∈ W has an additive inverse in W
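The closure conditions can be spot-checked numerically (a spot check on sample vectors, not a proof) for a candidate subspace such as W = {(x, y) : y = 2x} in F^2; all helper names are illustrative:

```python
def in_W(p):
    """Membership in the candidate subspace W = {(x, y) : y = 2x}."""
    return p[1] == 2 * p[0]

def add(u, v): return (u[0] + v[0], u[1] + v[1])
def smul(c, u): return (c * u[0], c * u[1])

u, v, c = (1, 2), (3, 6), 5
print(in_W((0, 0)))      # W has a zero vector
print(in_W(add(u, v)))   # closed under addition (for this sample)
print(in_W(smul(c, u)))  # closed under scalar multiplication (for this sample)
```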
Example
The transpose A^t of an m × n matrix A is the n × m matrix
obtained by interchanging the rows and columns of A, that is,
(A^t)ij = Aji.
A symmetric matrix A has A^t = A.
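The transpose and the symmetry test can be sketched for matrices stored as lists of rows; transpose and is_symmetric are illustrative names:

```python
def transpose(A):
    """(A^t)[i][j] = A[j][i]: rows and columns interchanged."""
    m, n = len(A), len(A[0])
    return [[A[j][i] for j in range(m)] for i in range(n)]

def is_symmetric(A):
    """A is symmetric iff A^t = A (only possible when A is square)."""
    return A == transpose(A)

A = [[1, 2, 3],
     [4, 5, 6]]                       # a 2 x 3 matrix
print(transpose(A))                   # [[1, 4], [2, 5], [3, 6]]
print(is_symmetric([[1, 7], [7, 3]])) # True
```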
Theorem 1.4
Any intersection of subspaces of a vector space V is a subspace
of V .
However, the union of subspaces is not necessarily a subspace,
since it need not be closed under addition.
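A concrete instance of this failure (an illustrative sketch): in F^2 the x-axis and y-axis are both subspaces, but the sum of a vector from each leaves their union.

```python
def on_x_axis(p): return p[1] == 0   # the subspace {(x, 0)}
def on_y_axis(p): return p[0] == 0   # the subspace {(0, y)}
def in_union(p): return on_x_axis(p) or on_y_axis(p)

u, v = (1, 0), (0, 1)
s = (u[0] + v[0], u[1] + v[1])   # u + v = (1, 1)

print(in_union(u), in_union(v))  # True True
print(in_union(s))               # False: the union is not closed under addition
```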
Linear Combinations
Definition
Let V be a vector space and S a nonempty subset of V. A vector
v ∈ V is called a linear combination of vectors of S if there exist a
finite number of vectors u1, u2, . . . , un in S and scalars
a1, a2, . . . , an in F such that
v = a1 u1 + a2 u2 + · · · + an un.
In this case we also say that v is a linear combination of
u1, u2, . . . , un and call a1, a2, . . . , an the coefficients of the linear
combination.
Note that 0v = 0 for each v ∈ V, so the zero vector is a linear
combination of any nonempty subset of V.
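Forming a linear combination is a direct computation; the helper linear_combination is an illustrative name, and F is taken to be the rationals (integer entries shown):

```python
def linear_combination(coeffs, vectors):
    """a1*u1 + a2*u2 + ... + an*un, computed coordinatewise."""
    n = len(vectors[0])
    return tuple(sum(a * u[i] for a, u in zip(coeffs, vectors))
                 for i in range(n))

u1, u2, u3 = (1, 0, 0), (0, 1, 0), (1, 1, 1)
v = linear_combination([2, 3, -1], [u1, u2, u3])
print(v)  # (2 + 0 - 1, 0 + 3 - 1, 0 + 0 - 1) = (1, 2, -1)
```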
Span
Definition
Let S be a nonempty subset of a vector space V. The span of S,
denoted span(S), is the set consisting of all linear combinations of
the vectors in S. For convenience, we define span(∅) = {0}.
Theorem 1.5
The span of any subset S of a vector space V is a subspace of V.
Moreover, any subspace of V that contains S must also contain
the span of S.
Definition
A subset S of a vector space V generates (or spans) V if
span(S) = V. In this case, we also say that the vectors of S
generate (or span) V.
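One way to test whether a finite set generates F^n is to row-reduce and count the independent vectors; this sketch uses exact arithmetic over the rationals via Fraction, and the names rank and spans are illustrative assumptions:

```python
from fractions import Fraction

def rank(vectors):
    """Row-reduce (exactly, over Q) and count nonzero rows."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    if not rows:
        return 0
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def spans(vectors, n):
    """S generates F^n exactly when rank(S) = n."""
    return rank(vectors) == n

print(spans([(1, 0, 0), (0, 1, 0), (1, 1, 1)], 3))  # True
print(spans([(1, 2, 3), (2, 4, 6)], 3))             # False
```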
Linear Dependence
Definition
A subset S of a vector space V is called linearly dependent if there
exist a finite number of distinct vectors u1 , u2 , . . . , un in S and
scalars a1 , a2 , . . . , an , not all zero, such that
a1 u1 + a2 u2 + · · · + an un = 0.
In this case we also say that the vectors of S are linearly dependent.
Definition
A subset S of a vector space that is not linearly dependent is called
linearly independent, and the vectors of S are linearly independent.
Properties of linearly independent sets
The empty set is linearly independent
A set with a single nonzero vector is linearly independent
A set is linearly independent if and only if the only representations of 0
as a linear combination of its vectors are trivial
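The trivial-representation criterion can be spot-checked for a pair of vectors in F^2: a1 u + a2 v = 0 has a nontrivial solution exactly when the coordinates of u and v are proportional. This is a small special case, not a general algorithm, and independent_pair is an illustrative name:

```python
def independent_pair(u, v):
    """True iff {u, v} in F^2 is linearly independent
    (equivalently, u and v are not proportional)."""
    return u[0] * v[1] - u[1] * v[0] != 0

print(independent_pair((1, 2), (3, 4)))  # True: only the trivial combination gives 0
print(independent_pair((1, 2), (2, 4)))  # False: 2*(1,2) - 1*(2,4) = (0,0)
```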
Theorem 1.6
Let V be a vector space, and let S1 ⊆ S2 ⊆ V. If S1 is linearly
dependent, then S2 is linearly dependent.
Basis
Definition
A basis for a vector space V is a linearly independent subset of V
that generates V. If β is a basis for V, we also say that the
vectors of β form a basis for V.
Corollary
Let V be a vector space, and let S1 ⊆ S2 ⊆ V. If S2 is linearly
independent, then S1 is linearly independent.
Theorem 1.7
Let S be a linearly independent subset of a vector space V, and let
v be a vector in V that is not in S. Then S ∪ {v} is linearly
dependent if and only if v ∈ span(S).
Properties of Bases
Theorem 1.8
Let V be a vector space and β = {u1, . . . , un} be a subset of V.
Then β is a basis for V if and only if each v ∈ V can be uniquely
expressed as a linear combination of vectors of β:
v = a1 u1 + a2 u2 + · · · + an un
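Theorem 1.8 says the coefficients a1, . . . , an are unique; in F^2 they can be computed by a hand-rolled 2×2 solve. This sketch assumes the rationals for F, and coords_2d is an illustrative name:

```python
from fractions import Fraction

def coords_2d(basis, v):
    """The unique (a1, a2) with a1*u1 + a2*u2 = v, for a basis {u1, u2} of F^2."""
    u1, u2 = basis
    det = Fraction(u1[0] * u2[1] - u1[1] * u2[0])
    assert det != 0, "not a basis"
    a1 = (v[0] * u2[1] - v[1] * u2[0]) / det
    a2 = (u1[0] * v[1] - u1[1] * v[0]) / det
    return a1, a2

basis = ((1, 1), (1, -1))
a1, a2 = coords_2d(basis, (3, 1))
print(a1, a2)  # 2 1, since 2*(1,1) + 1*(1,-1) = (3,1)
```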
Corollary 1
Let V be a vector space having a finite basis. Then every basis for
V contains the same number of vectors.
Dimension
Definition
A vector space V is called finite-dimensional if it has a basis
consisting of a finite number of vectors; this unique number
dim(V) is called the dimension of V. If V is not finite-dimensional
it is called infinite-dimensional.
Corollary 2
Let V be a vector space with dimension n.
(a) Any finite generating set for V contains at least n vectors, and
a generating set for V that contains exactly n vectors is a
basis for V.
(b) Any linearly independent subset of V that contains exactly n
vectors is a basis for V.
(c) Every linearly independent subset of V can be extended to a
basis for V.
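Corollary 2(c) can be illustrated as a greedy sketch: starting from an independent set in F^n (the rationals here), append each standard basis vector that enlarges the span. The rank helper (exact Gaussian elimination) and all names are illustrative assumptions, not a method from the notes:

```python
from fractions import Fraction

def rank(vectors):
    """Number of linearly independent vectors (Gaussian elimination over Q)."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    if not rows:
        return 0
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def extend_to_basis(indep, n):
    """Append standard basis vectors of F^n that enlarge the span."""
    basis = list(indep)
    for i in range(n):
        e = tuple(1 if j == i else 0 for j in range(n))
        if rank(basis + [e]) > rank(basis):
            basis.append(e)
    return basis

B = extend_to_basis([(1, 1, 0)], 3)
print(len(B), rank(B))  # 3 3: an independent set of size dim(Q^3) is a basis
```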
Dimension of Subspaces
Theorem 1.11
Let W be a subspace of a finite-dimensional vector space V. Then
W is finite-dimensional and dim(W) ≤ dim(V). Moreover, if
dim(W) = dim(V), then V = W.
Corollary
If W is a subspace of a finite-dimensional vector space V, then any
basis for W can be extended to a basis for V.