Topic 5: Linear Combination, Linear Dependence, Spanning, Orthogonality
If one vector is equal to the sum of scalar multiples of other vectors, it is said to be a linear
combination of the other vectors.
With a = (11, 16), b = (1, 2), and c = (3, 4):

a = 2b + 3c, since (11, 16) = 2·(1, 2) + 3·(3, 4) = (2·1 + 3·3, 2·2 + 3·4).

Note that 2b and 3c are scalar multiples of b and c. Thus, a is a linear combination
of b and c.
A set of vectors is linearly independent if no vector in the set is (a) a scalar multiple of another
vector in the set or (b) a linear combination of other vectors in the set; conversely, a set of
vectors is linearly dependent if any vector in the set is (a) a scalar multiple of another vector in
the set or (b) a linear combination of other vectors in the set.
a = [1 2 3] b = [4 5 6] c = [5 7 9] d= [2 4 6] e = [0 1 0] f = [0 0 1]
▪ Vectors a and b are linearly independent, because neither vector is a scalar multiple of
the other.
▪ Vectors a and d are linearly dependent, because d is a scalar multiple of a; i.e., d = 2a.
▪ Vector c is a linear combination of vectors a and b, because c = a + b. Therefore, the set
of vectors a, b, and c is linearly dependent.
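These claims can be tested numerically: a set of vectors is linearly independent exactly when the rank of the matrix whose rows are the vectors equals the number of vectors. The sketch below (plain Python; `rank` is a hypothetical helper based on Gaussian elimination, not part of the notes) checks the examples above.

```python
def rank(rows, tol=1e-12):
    """Count the nonzero rows left after Gaussian elimination.
    The vectors are independent iff rank == number of vectors."""
    m = [list(map(float, row)) for row in rows]
    r = 0  # number of pivot rows found so far
    for col in range(len(m[0]) if m else 0):
        # find a pivot row for this column
        piv = next((i for i in range(r, len(m)) if abs(m[i][col]) > tol), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        # eliminate the entries below the pivot
        for i in range(r + 1, len(m)):
            f = m[i][col] / m[r][col]
            m[i] = [x - f * p for x, p in zip(m[i], m[r])]
        r += 1
    return r

a, b, c, d = [1, 2, 3], [4, 5, 6], [5, 7, 9], [2, 4, 6]
print(rank([a, b]))     # 2 -> a, b independent
print(rank([a, d]))     # 1 -> dependent (d = 2a)
print(rank([a, b, c]))  # 2 -> dependent (c = a + b)
```

Rank is a convenient single test because it covers both cases of the definition at once: a repeated scalar multiple and a hidden linear combination both drop the rank below the number of vectors.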
Example 5.0 Determine whether the following vectors are linearly dependent:

a = (0, 1, 2), b = (3, 2, 1), c = (3, 3, 3), d = (3, 4, 5)

Solution
The set is linearly dependent: c = a + b and d = 2a + b, so c and d are linear
combinations of a and b.
Definition. Let V be a vector space over a field F, and let S be a subset of V. The span of S is
the set of all linear combinations of finitely many vectors in S.
(Remember that, by default, a vector written in "parenthesis form" is converted to a
column vector when it is placed into a matrix.)
The last row of the reduced matrix gives an equation of the form "0 = nonzero", a
contradiction. The system is inconsistent, so there are no such numbers a and b.
Therefore, (5, −2, 6) is not in the span of S.
Thus, to determine whether a vector b is in the span of v1, v2, ..., vm in Rⁿ, form the
augmented matrix

[ v1 v2 ⋯ vm | b ]

(with the v's as columns) and row reduce it. If the system has a solution, b is in the span, and
the coefficients of a linear combination of the v's which adds up to b are given by a solution
to the system. If the system has no solutions, then b is not in the span of the v's.
(In a general vector space where vectors may not be "numbers in slots", you have to get back to
the definition of spanning set.)
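For vectors in Rⁿ, the recipe above is easy to automate. The sketch below row-reduces the augmented matrix and looks for an inconsistent row. Since the set S of the example above is not shown here, S = {(1, 2, 3), (4, 5, 6)} is used as an assumed stand-in (with that choice, (5, −2, 6) is indeed outside the span).

```python
def in_span(vs, b, tol=1e-12):
    """Row-reduce the augmented matrix [v1 ... vm | b] (v's as columns).
    b is in the span iff no row reduces to (0 ... 0 | nonzero)."""
    n = len(b)
    m = [[float(v[i]) for v in vs] + [float(b[i])] for i in range(n)]
    r = 0
    for col in range(len(vs)):
        piv = next((i for i in range(r, n) if abs(m[i][col]) > tol), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(n):
            if i != r:
                f = m[i][col] / m[r][col]
                m[i] = [x - f * p for x, p in zip(m[i], m[r])]
        r += 1
    # a row (0 ... 0 | c) with c != 0 means the system is inconsistent
    return not any(all(abs(x) <= tol for x in row[:-1]) and abs(row[-1]) > tol
                   for row in m)

v1, v2 = [1, 2, 3], [4, 5, 6]   # assumed spanning set, not from the notes
print(in_span([v1, v2], [5, 7, 9]))   # True:  (5, 7, 9) = v1 + v2
print(in_span([v1, v2], [5, -2, 6]))  # False: the system is inconsistent
```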
Example 5.2 Consider the following set of vectors in R³.
This shows that, given any vector (x, y, z), one can find a linear combination of the original
three vectors which equals (x, y, z); that is, the three vectors span R³.
5.3 Orthogonality
Finding approximate solutions of equations generally requires computing the closest vector on a
subspace to a given vector. This becomes an orthogonality problem: one needs to know which
vectors are perpendicular to the subspace.
The closest point has the property that the difference between the two points is orthogonal, or
perpendicular, to the subspace. For this reason, we need to develop notions of orthogonality,
length, and distance.
For example,

(1, 2, 3) · (4, 5, 6) = 1·4 + 2·5 + 3·6 = 4 + 10 + 18 = 32
Therefore, for any vector x, we have:
• x · x ≥ 0
• x · x = 0 ⇔ x = 0. This leads to a good definition of length.
The length of a vector x in Rⁿ is the number

‖x‖ = √(x · x)
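A minimal sketch of the dot product and the resulting length function (the helper names `dot` and `length` are hypothetical, chosen here for illustration):

```python
import math

def dot(x, y):
    """Dot product: multiply matching entries and add them up."""
    return sum(a * b for a, b in zip(x, y))

def length(x):
    """||x|| = sqrt(x . x); well defined because x . x >= 0."""
    return math.sqrt(dot(x, x))

print(dot([1, 2, 3], [4, 5, 6]))  # 32, as computed above
print(length([3, 4]))             # 5.0
print(length([0, 0, 0]))          # 0.0 -- only the zero vector has length 0
```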
Orthogonal Vectors
Two vectors x and y in Rⁿ are called orthogonal (perpendicular) if x · y = 0.
Definition: A basis B = {v1, …, vk} of a subspace S of Rⁿ (including the whole space) is said to
be orthogonal if any two vectors in it are orthogonal to each other.
If B = {v1, …, vk} is an orthogonal basis of a subspace S of Rⁿ, then any vector w of S is the
sum of its projections on each basis vector:

w = (v1·w)/(v1·v1) v1 + (v2·w)/(v2·v2) v2 + ⋯ + (vk·w)/(vk·vk) vk
Proof
We know that each vector 𝑤 has a unique set of coordinates in the given basis:
𝑤 = 𝑤1 𝑣1 + 𝑤2 𝑣2 + … + 𝑤𝑘 𝑣𝑘
To see what these coordinates are, we can take the dot product of w with v1 and use the fact that
the basis vectors are all orthogonal to each other:

v1·w = w1 (v1·v1) + w2 (v1·v2) + … + wk (v1·vk) = w1 (v1·v1),

so w1 = (v1·w)/(v1·v1), which is what was claimed. Of course, we can do the same for all the
other vectors, thus proving the above fact.
In other words, for an orthogonal basis, the linear combination produced by the coordinates splits
the vector into the sum of its projections, just as we know to happen in the standard
(𝑥, 𝑦, 𝑧) basis.
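The projection formula can be verified numerically. The sketch below uses an assumed orthogonal basis of R³, (1, 1, 0), (−1, 1, 0), (0, 0, 3), and checks that the projections of w add back up to w:

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def proj(v, w):
    """Projection of w onto v: ((v . w) / (v . v)) * v."""
    c = dot(v, w) / dot(v, v)
    return [c * vi for vi in v]

v1, v2, v3 = [1, 1, 0], [-1, 1, 0], [0, 0, 3]  # assumed orthogonal basis
w = [1, -2, 3]
parts = [proj(v, w) for v in (v1, v2, v3)]
total = [sum(col) for col in zip(*parts)]
print(total)  # [1.0, -2.0, 3.0] -- the projections sum to w
```

Note that this only works because the basis is orthogonal; for a general basis the individual projections overlap and their sum overshoots w.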
Example: B1 = {(1, 0, 0), (1, 1, 0), (1, 1, 1)}   B2 = {(1, 1, 0), (−1, 1, 0), (0, 0, 3)}
Consider these two bases for R³. Both are bases and, from a certain point of view, B1 is better, as
its vectors only contain 0's and 1's. But let's look at it from a coordinates point of view. If we
consider the vector w = (1, −2, 3), its coordinates in the first basis are (3, −5, 3):
3·(1, 0, 0) − 5·(1, 1, 0) + 3·(1, 1, 1) = (1, −2, 3)
To find these coordinates, we solve a linear system, which is not a difficult task, given the
simplicity of the numbers. But what do those coordinates tell us, beyond their meaning from the
definition? The coordinates in the second basis are (−0.5, −1.5, 1) and we can also get these
through a linear system, but their meaning is more concrete. The projection of w on the first
vector is half the first vector, in the opposite direction:
Proj_v1 w = (v1·w)/(v1·v1) v1 = ((1, 1, 0)·(1, −2, 3))/((1, 1, 0)·(1, 1, 0)) v1 = −(1/2) v1
Therefore, the coordinates in this basis are given simply by the respective dot products and still
represent the multiple of the basis vector that is used to build up the given vector.
B* = {(1/√2, 1/√2, 0), (−1/√2, 1/√2, 0), (0, 0, 1)}
If we divide each vector of the earlier basis B2 by its length, we obtain the orthonormal basis
we are considering here. The coordinates of the vector w = (1, −2, 3) in this basis are given by the
dot products:
(1, −2, 3)·(1/√2, 1/√2, 0) = −1/√2;  (1, −2, 3)·(−1/√2, 1/√2, 0) = −3/√2;  (1, −2, 3)·(0, 0, 1) = 3
−(1/√2)·(1/√2, 1/√2, 0) − (3/√2)·(−1/√2, 1/√2, 0) + 3·(0, 0, 1) = (1, −2, 3)
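The same computation, sketched in code: normalize B2 to obtain B*, read off the coordinates as plain dot products (no linear system needed), and rebuild w from them:

```python
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

s = 1 / math.sqrt(2)
u1, u2, u3 = [s, s, 0], [-s, s, 0], [0, 0, 1]  # the orthonormal basis B*
w = [1, -2, 3]

# for an orthonormal basis, the coordinates are just the dot products
coords = [dot(u, w) for u in (u1, u2, u3)]
print(coords)  # roughly [-0.707, -2.121, 3]

# rebuilding w from the coordinates and the basis vectors
rebuilt = [sum(c * u[j] for c, u in zip(coords, (u1, u2, u3)))
           for j in range(3)]
print(rebuilt)  # [1.0, -2.0, 3.0] up to rounding
```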