II. VECTOR SPACE
A vector space V over a field F is a set of vectors, together with rules for vector addition and for multiplication by scalars in F, satisfying the following properties:
1) (Commutativity) v_1 + v_2 = v_2 + v_1 (for all v_1, v_2 ∈ V)
2) (Associativity)
(v_1 + v_2) + v_3 = v_1 + (v_2 + v_3) (for all v_1, v_2, v_3 ∈ V)
(αβ)v = α(βv) (for all α, β ∈ F and v ∈ V)
3) (Distributive Properties)
α(v_1 + v_2) = αv_1 + αv_2 (for all α ∈ F and v_1, v_2 ∈ V)
(α + β)v = αv + βv (for all α, β ∈ F and v ∈ V)
4) (Additive Identity) There exists a vector 0 ∈ V such that v + 0 = v for every v ∈ V.
5) (Additive Inverse) For every v ∈ V, there exists a vector −v ∈ V such that v + (−v) = 0.
6) (Multiplicative Identity) 1v = v for every vector v ∈ V.
Note that the vector space R^n over the field R satisfies all these properties. Similarly, the set F^n is a vector space over F, where vector addition and scalar multiplication are defined entrywise via the arithmetic of F. Other examples of vector spaces:
• The vector space V over the field F (where F is any field), consisting of all countably infinite tuples (x_1, x_2, x_3, . . .), where x_i ∈ F for all entries i ∈ {1, 2, 3, . . .}, and where arithmetic is defined entrywise using arithmetic in F.
• The vector space V over the field R, where V is the set of all continuous functions of time t for t ∈ (−∞, ∞).
• The vector space V over the field R, where V is the set of all polynomial functions f(t) of degree less than or equal to n. That is, V = {f(t) | f(t) = α_0 + α_1 t + α_2 t^2 + . . . + α_n t^n, α_i ∈ R}.
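As a quick illustration (not part of the original notes), the following Python/NumPy sketch checks the six properties above numerically for F = R, with arithmetic defined entrywise on R^n; the sample vectors and scalars are chosen here for illustration only.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 4
    v1, v2, v3 = rng.standard_normal((3, n))   # three sample vectors in R^n
    alpha, beta = rng.standard_normal(2)       # two sample scalars in R

    assert np.allclose(v1 + v2, v2 + v1)                            # 1) commutativity
    assert np.allclose((v1 + v2) + v3, v1 + (v2 + v3))              # 2) associativity (vectors)
    assert np.allclose((alpha * beta) * v1, alpha * (beta * v1))    # 2) associativity (scalars)
    assert np.allclose(alpha * (v1 + v2), alpha * v1 + alpha * v2)  # 3) distributivity
    assert np.allclose((alpha + beta) * v1, alpha * v1 + beta * v1) # 3) distributivity
    zero = np.zeros(n)
    assert np.allclose(v1 + zero, v1)          # 4) additive identity
    assert np.allclose(v1 + (-v1), zero)       # 5) additive inverse
    assert np.allclose(1 * v1, v1)             # 6) multiplicative identity
    print("all six properties hold on this sample")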
III. SUBSPACES
Definition 1: Let V be a vector space over a field F, and let S ⊂ V be a subset of V. We say that S is a subspace of V if:
v_1 + v_2 ∈ S for all v_1, v_2 ∈ S
αv ∈ S for all v ∈ S, α ∈ F
where addition and scalar multiplication are the same in S as they are in V.
It is easy to prove that if S is a subspace of a vector space V over a field F, then S is itself a vector space over F. (It is important to note that −v = (−1)v in proving this... why?)
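As a concrete sketch (illustrative only; the plane S = {x ∈ R^3 : a·x = 0} is an assumed example), the following NumPy snippet samples vectors from S and confirms the two closure conditions numerically.

    import numpy as np

    rng = np.random.default_rng(1)
    a = np.array([1.0, -2.0, 3.0])   # S = {x in R^3 : a @ x = 0}, a plane through 0

    def sample_from_S():
        # project a random vector onto the plane a @ x = 0
        x = rng.standard_normal(3)
        return x - ((a @ x) / (a @ a)) * a

    v1, v2 = sample_from_S(), sample_from_S()
    alpha = rng.standard_normal()

    assert abs(a @ (v1 + v2)) < 1e-10      # closure under addition
    assert abs(a @ (alpha * v1)) < 1e-10   # closure under scalar multiplication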
Definition 2: Let {x_1, . . . , x_k} be a collection of vectors in a vector space V over a field F. A linear combination of {x_1, . . . , x_k} is any vector of the form α_1 x_1 + α_2 x_2 + . . . + α_k x_k, where α_i ∈ F for all i ∈ {1, . . . , k}.
Definition 3: Let {x_1, . . . , x_k} be a collection of vectors in a vector space V over a field F. We define Span{x_1, . . . , x_k} as the set of all linear combinations of {x_1, . . . , x_k}. Note that Span{x_1, . . . , x_k} ⊂ V.
Definition 4: Let S be a subspace of a vector space V over a field F. We say that a collection of vectors {x_1, . . . , x_k} spans S if Span{x_1, . . . , x_k} = S.
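One way to test span membership computationally (a sketch, not from the notes; the helper in_span and the example vectors are assumptions) is to solve a least-squares problem and check that the residual vanishes.

    import numpy as np

    x1 = np.array([1.0, 0.0, 1.0])
    x2 = np.array([0.0, 1.0, 1.0])
    X = np.column_stack([x1, x2])    # the columns of X span a plane in R^3

    def in_span(b, X, tol=1e-10):
        # b lies in the span of the columns of X iff the least-squares
        # residual of Xc = b is zero
        c, *_ = np.linalg.lstsq(X, b, rcond=None)
        return np.linalg.norm(X @ c - b) < tol

    print(in_span(2 * x1 - 3 * x2, X))             # True: a linear combination
    print(in_span(np.array([0.0, 0.0, 1.0]), X))   # False: not on the plane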
Definition 5: We say that a collection of vectors {x_1, . . . , x_k} in a vector space V (over a field F) is linearly independent if the equation α_1 x_1 + α_2 x_2 + . . . + α_k x_k = 0 can only be true if α_i = 0 for all i ∈ {1, . . . , k}.
Definition 6: A collection of vectors {x_1, . . . , x_k} is a basis for a subspace S if the collection {x_1, . . . , x_k} is linearly independent in S and spans S.
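A standard computational test of linear independence (a sketch under the assumption F = R; the helper name is illustrative) stacks the vectors as columns and compares the matrix rank to the number of vectors.

    import numpy as np

    def linearly_independent(vectors):
        # stack the vectors as columns; they are linearly independent iff the
        # rank equals the number of vectors (no column is a combination of others)
        X = np.column_stack(vectors)
        return np.linalg.matrix_rank(X) == X.shape[1]

    x1 = np.array([1.0, 2.0, 0.0])
    x2 = np.array([0.0, 1.0, 1.0])
    print(linearly_independent([x1, x2]))           # True
    print(linearly_independent([x1, x2, x1 + x2]))  # False: third = x1 + x2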
The following lemmas have proofs that are identical (or nearly identical) to the corresponding lemmas proven in class for the vector space R^n. The proofs are left as an exercise.
Lemma 6: A collection of vectors {x_1, . . . , x_k} is linearly independent if and only if none of the vectors can be written as a linear combination of the others.
Lemma 8: Let {x_1, . . . , x_k} be a collection of vectors that are linearly independent in a subspace S, and let {y_1, . . . , y_m} be a collection of vectors that span S. Then k ≤ m.
Lemma 9: Any two bases of a subspace S have the same size; this common size is defined to be the dimension of the subspace.
Lemma 11: Let S be a subspace of a vector space V, where V has dimension n. Then S has a finite basis, and the dimension
of S is less than or equal to n.
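As a numerical illustration of Lemma 9 (a sketch with an assumed example plane in R^3), two different bases of the same subspace both have size 2.

    import numpy as np

    x1 = np.array([1.0, 0.0, 1.0])
    x2 = np.array([0.0, 1.0, 1.0])
    B1 = np.column_stack([x1, x2])              # one basis for the plane S
    B2 = np.column_stack([x1 + x2, x1 - x2])    # another basis for the same S

    assert np.linalg.matrix_rank(B1) == 2       # B1 is linearly independent
    assert np.linalg.matrix_rank(B2) == 2       # B2 is linearly independent
    # stacking both bases adds no new directions, so they span the same S
    assert np.linalg.matrix_rank(np.column_stack([B1, B2])) == 2
    print("both bases have size 2, the dimension of S")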
Note that the standard basis for F^n is given by {e_1, . . . , e_n}, where e_i is an n-tuple with all entries equal to 0 except for entry i, which is equal to 1.
Note that the collection {1, t, t^2, . . . , t^n} is a basis for the vector space V over the field R, where V is the space of all polynomial functions of degree less than or equal to n (why is this true?). Thus, this vector space has dimension n + 1. Note also that, for any n, this vector space is a subspace of the vector space over R defined by all continuous functions. Thus, the dimension of the vector space of all continuous functions is infinite (as it contains subspaces of dimension n for arbitrarily large n).
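To see the coordinate view concretely (a sketch; the example polynomials are assumptions), identifying a polynomial with its coefficient vector in R^(n+1) makes the vector-space operations entrywise.

    import numpy as np
    from numpy.polynomial import polynomial as P

    # coordinates with respect to the basis {1, t, t^2}  (here n = 2)
    f = np.array([2.0, 3.0, -1.0])   # f(t) = 2 + 3t - t^2
    g = np.array([1.0, 0.0, 4.0])    # g(t) = 1 + 4t^2

    # addition and scaling of polynomials act entrywise on coordinates
    h = 2.0 * f + g                  # h(t) = 5 + 6t + 2t^2
    t = 1.5
    assert np.isclose(P.polyval(t, h), 2.0 * P.polyval(t, f) + P.polyval(t, g))
    print(h)                         # [5. 6. 2.], i.e. 5 + 6t + 2t^2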
V. MATRICES
Let A be an m × n matrix with elements in F. Note that the equation Ax = 0 (where 0 ∈ F^m and x ∈ F^n) has only the trivial solution x = 0 ∈ F^n if and only if the columns of A are linearly independent.
Lemma 12: A square n × n matrix A (with elements in F) is non-singular if and only if its columns are linearly independent, which in turn holds if and only if Ax = 0 has only the trivial solution.
The above lemma follows from the fact that if A is non-singular, then Ax = b has a unique solution for every b ∈ F^n (which is true by Gaussian elimination), whereas if A is singular, then Ax = b has no solution for some vectors b ∈ F^n and multiple solutions for the remaining b ∈ F^n.
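As a numerical sketch of Lemma 12 (the example matrices are assumptions), a full-rank matrix admits only the trivial solution to Ax = 0, while a rank-deficient one does not.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 1.0]])   # independent columns -> non-singular
    B = np.array([[1.0, 2.0],
                  [2.0, 4.0]])   # second column = 2 * first -> singular

    assert np.linalg.matrix_rank(A) == 2           # Ax = 0 only for x = 0
    assert not np.isclose(np.linalg.det(A), 0.0)
    x = np.array([2.0, -1.0])
    assert np.allclose(B @ x, 0.0)                 # a nontrivial solution of Bx = 0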
Lemma 13: Let A, B be square n × n matrices. If AB = I, then both A and B are invertible, with A^(-1) = B and B^(-1) = A.
The above lemma follows from the fact that AB = I implies A is non-singular (why?) and hence invertible.
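A small numerical check of Lemma 13 (the matrices here are an assumed example): one verified product AB = I already forces BA = I.

    import numpy as np

    A = np.array([[1.0,  1.0],
                  [0.0,  1.0]])
    B = np.array([[1.0, -1.0],
                  [0.0,  1.0]])

    assert np.allclose(A @ B, np.eye(2))        # AB = I ...
    assert np.allclose(B @ A, np.eye(2))        # ... and BA = I follows (Lemma 13)
    assert np.allclose(np.linalg.inv(A), B)     # so B = A^(-1)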
Lemma 14: A square n × n matrix A (with elements in F) has linearly independent columns (and hence is invertible) if and only if its transpose A^T has linearly independent columns (and hence is invertible). Thus, a square invertible matrix A has both linearly independent rows and linearly independent columns.
The above lemma follows from the fact that AA^(-1) = I, and hence (A^(-1))^T A^T = I, so A^T is invertible by Lemma 13.
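And a numerical check of Lemma 14 and the remark above (using an assumed random matrix, which is invertible with probability 1):

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((4, 4))   # a random 4x4 matrix, invertible w.p. 1

    assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)
    # (A^(-1))^T is the inverse of A^T, since (A^(-1))^T A^T = (A A^(-1))^T = I
    assert np.allclose(np.linalg.inv(A).T @ A.T, np.eye(4))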