Topic 5: Linear Combination, Linear Dependence, Spanning, Orthogonality


5.0 Linear Combination of Vectors

If one vector is equal to the sum of scalar multiples of other vectors, it is said to be a linear
combination of the other vectors.

For example, suppose a = 2b + 3c, as shown below.

(11, 16) = 2(1, 2) + 3(3, 4) = (2·1 + 3·3, 2·2 + 3·4)

where a = (11, 16), b = (1, 2), and c = (3, 4).

Note that 2b is a scalar multiple of b and 3c is a scalar multiple of c. Thus, a is a linear
combination of b and c.
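The arithmetic above can be checked mechanically. A minimal sketch in plain Python (no external libraries), using the vectors from the example:

```python
def scale(s, v):
    """Multiply vector v by scalar s."""
    return [s * x for x in v]

def add(u, v):
    """Add two vectors component-wise."""
    return [x + y for x, y in zip(u, v)]

b = [1, 2]
c = [3, 4]
a = add(scale(2, b), scale(3, c))  # a = 2b + 3c
print(a)  # [11, 16]
```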

5.1 Linear Dependence of Vectors

A set of vectors is linearly independent if no vector in the set is (a) a scalar multiple of another
vector in the set or (b) a linear combination of other vectors in the set; conversely, a set of
vectors is linearly dependent if any vector in the set is (a) a scalar multiple of another vector in
the set or (b) a linear combination of other vectors in the set.

Consider the row vectors below.

a = [1 2 3] b = [4 5 6] c = [5 7 9] d= [2 4 6] e = [0 1 0] f = [0 0 1]

Note the following:

▪ Vectors a and b are linearly independent, because neither vector is a scalar multiple of
the other.
▪ Vectors a and d are linearly dependent, because d is a scalar multiple of a; i.e., d = 2a.
▪ Vector c is a linear combination of vectors a and b, because c = a + b. Therefore, the set
of vectors a, b, and c is linearly dependent.

Example 5.0

Consider the row vectors shown below.

a = (0 1 2)   b = (3 2 1)   c = (3 3 3)   d = (3 4 5)

Which of the following statements are true?

(A) Vectors a, b, and c are linearly dependent.


(B) Vectors a, b, and d are linearly dependent.
(C) Vectors b, c, and d are linearly dependent.
(D) All of the above.
(E) None of the above.

Solution

The correct answer is (D), as shown below.

▪ Vectors a, b, and c are linearly dependent, since a + b = c.
▪ Vectors a, b, and d are linearly dependent, since 2a + b = d.
▪ Vectors b, c, and d are linearly dependent, since 2c − b = d.
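The three dependence relations from the solution can be verified directly; a short sketch in plain Python with the vectors of Example 5.0:

```python
def scale(s, v):
    """Multiply vector v by scalar s."""
    return [s * x for x in v]

def add(u, v):
    """Add two vectors component-wise."""
    return [x + y for x, y in zip(u, v)]

a = [0, 1, 2]
b = [3, 2, 1]
c = [3, 3, 3]
d = [3, 4, 5]

assert add(a, b) == c                       # a + b = c
assert add(scale(2, a), b) == d             # 2a + b = d
assert add(scale(2, c), scale(-1, b)) == d  # 2c - b = d
print("all three dependence relations hold")
```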

5.2 Spanning Sets

Definition. Let V be a vector space over a field F, and let S be a subset of V. The span of S,
written span(S), is the set of all vectors of the form a₁v₁ + a₂v₂ + ⋯ + aₖvₖ with each aᵢ ∈ F
and each vᵢ ∈ S. That is, the span consists of all linear combinations of vectors in S.

S spans a subspace W of V if W = span(S); that is, if every element of W is a linear combination of
elements of S.

Example 5.1 Let S = {v₁, v₂} be a set of two vectors in 𝑅 3 .

(a) Prove or disprove: (3, −1, −4) is in the span of S.

(b) Prove or disprove: (5, −2, 6) is in the span of S.

(a) Find numbers a and b such that

a v₁ + b v₂ = (3, −1, −4)

(Remember that by default, when you convert a vector in "parenthesis form" to a matrix, you
convert the vector to a column vector.)

This is equivalent to the matrix equation whose coefficient matrix has v₁ and v₂ as columns.

Row reduce the augmented matrix. The solution is 𝑎 = 3, 𝑏 = −4. That is,

3 v₁ − 4 v₂ = (3, −1, −4)

The vector (3, −1, −4) is in the span of S.

(b) Find numbers a and b such that

a v₁ + b v₂ = (5, −2, 6)

This is equivalent to a matrix equation. Row reduce to solve the system:

The last row of the reduced matrix asserts a contradiction (an equation of the form 0 = c with
c ≠ 0). The system is inconsistent, so there are no such numbers a and b. Therefore, (5, −2, 6)
is not in the span of S.

Thus, to determine whether a vector b is in the span of v₁, v₂, ..., vₙ in 𝑅 𝑚 , form the
augmented matrix [v₁ v₂ ⋯ vₙ | b].

If the system has a solution, b is in the span, and the coefficients of a linear combination of the v's
which adds up to b are given by a solution to the system. If the system has no solutions, then b is
not in the span of the v's.

(In a general vector space where vectors may not be "numbers in slots", you have to go back to
the definition of spanning set.)
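The augmented-matrix test can be sketched in plain Python. This is a minimal illustration, not the book's worked example: the vectors v1, v2 below are hypothetical stand-ins, and the row reduction is a bare-bones Gaussian elimination that only checks consistency.

```python
def in_span(vectors, b, tol=1e-9):
    """Return True if b is a linear combination of the given vectors.

    Forms the augmented matrix [v1 ... vn | b], row reduces it, and
    checks for an inconsistent row [0 ... 0 | nonzero].
    """
    m, n = len(b), len(vectors)
    # Rows of the augmented matrix: vectors become columns, b is last.
    A = [[vectors[j][i] for j in range(n)] + [b[i]] for i in range(m)]
    row = 0
    for col in range(n):
        # Pick the largest pivot candidate in this column at or below `row`.
        pivot = max(range(row, m), key=lambda r: abs(A[r][col]), default=None)
        if pivot is None or abs(A[pivot][col]) < tol:
            continue
        A[row], A[pivot] = A[pivot], A[row]
        for r in range(m):
            if r != row:
                f = A[r][col] / A[row][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[row])]
        row += 1
    # Inconsistent iff some row is all zeros except the last entry.
    return not any(all(abs(x) < tol for x in r[:-1]) and abs(r[-1]) > tol
                   for r in A)

v1, v2 = [1, 0, 1], [0, 1, 2]           # hypothetical spanning set
assert in_span([v1, v2], [2, 3, 8])     # 2*v1 + 3*v2 = (2, 3, 8)
assert not in_span([v1, v2], [0, 0, 1]) # inconsistent system
```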

Example 5.2 Consider a set of three vectors {v₁, v₂, v₃} in 𝑅 3 .

Prove that the span of the set is all of 𝑅 3 .

Let (𝑥, 𝑦, 𝑧) ∈ 𝑅 3 . I must find real numbers a, b, c such that

a v₁ + b v₂ + c v₃ = (𝑥, 𝑦, 𝑧)

Set up the corresponding matrix equation and row reduce to solve:

The row reduction produces a pivot in every row, so the system has a solution for every
right-hand side. This shows that, given any vector (𝑥, 𝑦, 𝑧), one can find a linear combination
of the original three vectors which equals (𝑥, 𝑦, 𝑧).

Thus, the span of the original set of three vectors is all of 𝑅 3 .
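For three vectors in 𝑅 3 there is a shortcut to the same conclusion: the set spans 𝑅 3 exactly when the 3×3 matrix with those vectors as columns has nonzero determinant. A sketch in plain Python; the vectors used here are illustrative, not the example's original set:

```python
def det3(u, v, w):
    """Determinant of the 3x3 matrix with columns u, v, w
    (cofactor expansion along the first row)."""
    return (u[0] * (v[1] * w[2] - v[2] * w[1])
            - v[0] * (u[1] * w[2] - u[2] * w[1])
            + w[0] * (u[1] * v[2] - u[2] * v[1]))

u, v, w = [1, 0, 0], [1, 1, 0], [1, 1, 1]   # hypothetical set
assert det3(u, v, w) != 0   # nonzero determinant => the set spans R^3
```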

5.3 Orthogonality

Finding approximate solutions of equations generally requires computing the closest vector on a
subspace to a given vector. This becomes an orthogonality problem: one needs to know which
vectors are perpendicular to the subspace. It will be necessary to find the closest point on a
subspace to a given point.

The closest point has the property that the difference between the two points is orthogonal, or
perpendicular, to the subspace. For this reason, we need to develop notions of orthogonality,
length, and distance.

The Dot Product


The basic construction in this section is the dot product, which measures angles
between vectors and computes the length of a vector.
Definition. The dot product of two vectors x = (𝑥1 , 𝑥2 , … , 𝑥𝑛 ) and y = (𝑦1 , 𝑦2 , … , 𝑦𝑛 ) in 𝑅 𝑛 is

𝑥. 𝑦 = 𝑥1 𝑦1 + 𝑥2 𝑦2 + ⋯ + 𝑥𝑛 𝑦𝑛

Thinking of x, y as column vectors, this is the same as 𝑥 𝑇 𝑦 .

For example,

(1, 2, 3). (4, 5, 6) = 1·4 + 2·5 + 3·6 = 4 + 10 + 18 = 32

Notice that the dot product of two vectors is a scalar.


You can do arithmetic with dot products mostly as usual, as long as you remember
you can only dot two vectors together, and that the result is a scalar.

Properties of the Dot Product. Let x, y, z be vectors in Rn and let c be a scalar.

1. Commutativity: x · y = y · x.
2. Distributivity with addition: (x + y) · z = x · z + y · z.
3. Distributivity with scalar multiplication: (c x) · y = c (x · y).
The dot product of a vector with itself is an important special case:

𝑥. 𝑥 = 𝑥12 + 𝑥22 + ⋯ + 𝑥𝑛2

Therefore, for any vector x, we have:
• 𝑥. 𝑥 ≥ 0
• 𝑥. 𝑥 = 0 ⇔ 𝑥 = 0.

This leads to a good definition of length. The length of a vector x in Rn is the number

|𝑥| = √𝑥. 𝑥 = √𝑥12 + 𝑥22 + ⋯ + 𝑥𝑛2

For vectors in R2, this agrees with the usual notion of length, by the Pythagorean theorem.
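The dot product and length can be sketched in a few lines of plain Python, reusing the example (1, 2, 3) · (4, 5, 6) = 32 from above:

```python
import math

def dot(x, y):
    """Dot product of two vectors of equal length."""
    return sum(a * b for a, b in zip(x, y))

x = [1, 2, 3]
y = [4, 5, 6]
assert dot(x, y) == 32           # 1*4 + 2*5 + 3*6

length = math.sqrt(dot(x, x))    # |x| = sqrt(x . x) = sqrt(14)
print(length)
```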

Orthogonal Vectors

Definition: Two vectors x, y in 𝑅 𝑛 are orthogonal (perpendicular) if 𝑥. 𝑦 = 0.
Definition: A basis 𝐵 = {𝑣1 , … , 𝑣𝑘 } of a subspace S of 𝑅 𝑛 (including the whole space) is said to
be orthogonal if any two vectors in it are orthogonal to each other.
If 𝐵 = {𝑣1 , … , 𝑣𝑘 } is an orthogonal basis of a subspace S of 𝑅 𝑛 , then any vector 𝑤 of 𝑆 is the
sum of its projections on each basis vector:

𝑤 = (𝑣1 . 𝑤)/(𝑣1 . 𝑣1 ) 𝑣1 + (𝑣2 . 𝑤)/(𝑣2 . 𝑣2 ) 𝑣2 + ⋯ + (𝑣𝑘 . 𝑤)/(𝑣𝑘 . 𝑣𝑘 ) 𝑣𝑘

Proof

We know that each vector 𝑤 has a unique set of coordinates in the given basis:

𝑤 = 𝑤1 𝑣1 + 𝑤2 𝑣2 + … + 𝑤𝑘 𝑣𝑘

To see what these coordinates are, we can take the dot product of 𝑤 with 𝑣1 and use the fact that
the basis vectors are all orthogonal to each other:

𝑤. 𝑣1 = 𝑤1 𝑣1 . 𝑣1 + 𝑤2 𝑣2 . 𝑣1 + ⋯ + 𝑤𝑘 𝑣𝑘 . 𝑣1

Since 𝑣𝑖 . 𝑣1 = 0 for every i ≠ 1, all terms but the first vanish, leaving 𝑤. 𝑣1 = 𝑤1 (𝑣1 . 𝑣1 ).
But this implies that

𝑤1 = (𝑤. 𝑣1 )/(𝑣1 . 𝑣1 )

which is what was claimed. Of course, we can do the same for all other vectors, thus proving the
above fact.

In other words, for an orthogonal basis, the linear combination produced by the coordinates splits
the vector into the sum of its projections, just as we know to happen in the standard
(𝑥, 𝑦, 𝑧) basis.
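The projection formula can be verified numerically; a minimal sketch in plain Python using an illustrative pairwise-orthogonal basis of R^3:

```python
def dot(x, y):
    """Dot product of two vectors of equal length."""
    return sum(a * b for a, b in zip(x, y))

basis = [[1, 1, 0], [-1, 1, 0], [0, 0, 3]]   # pairwise orthogonal
w = [1, -2, 3]

# Sum the projections (vi . w / vi . vi) vi of w onto each basis vector.
recovered = [0.0, 0.0, 0.0]
for v in basis:
    c = dot(v, w) / dot(v, v)
    recovered = [r + c * x for r, x in zip(recovered, v)]

# The sum of the projections reproduces w itself.
assert all(abs(r - x) < 1e-12 for r, x in zip(recovered, w))
print(recovered)  # [1.0, -2.0, 3.0]
```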

Example: 𝐵1 = {(1, 0, 0), (1, 1, 0), (1, 1, 1)}   𝐵2 = {(1, 1, 0), (−1, 1, 0), (0, 0, 3)}

Consider these two bases for 𝑅 3 . Both are bases and, from a certain point of view, 𝐵1 is better, as
its vectors only contain 0’s and 1’s. But let’s look at it from a coordinates point of view. If we
consider the vector w = (1, −2, 3), its coordinates in the first basis are (3, −5, 3):

3 (1, 0, 0) − 5 (1, 1, 0) + 3 (1, 1, 1) = (1, −2, 3)

To find these coordinates, we solve a linear system, which is not a difficult task, given the
simplicity of the numbers. But what do those coordinates tell us, beyond their meaning from the
definition? The coordinates in the second basis are (−0.5, −1.5, 1) and we can also get these

through a linear system, but their meaning is more concrete. The projection of 𝑤 on the first
vector is half the first vector, in the opposite direction:

𝑃𝑟𝑜𝑗𝑣1 𝑤 = (𝑣1 . 𝑤)/(𝑣1 . 𝑣1 ) 𝑣1 = ((1, 1, 0). (1, −2, 3))/((1, 1, 0). (1, 1, 0)) 𝑣1 = −(1/2) 𝑣1

Similarly for the other two (check it).

Therefore, the coordinates in this basis are given simply by the respective dot products and still
represent the multiple of the basis vector that is used to build up the given vector.

𝐵 ∗ = {(1⁄√2 , 1⁄√2 , 0), (−1⁄√2 , 1⁄√2 , 0), (0, 0, 1)}

If we divide each vector of the earlier basis 𝐵2 by its length, we obtain the orthonormal basis 𝐵 ∗
we are considering here. The coordinates of the vector w = (1, −2, 3) in this basis are given by the
dot products:

(1, −2, 3). (1⁄√2 , 1⁄√2 , 0) = −1⁄√2 ;  (1, −2, 3). (−1⁄√2 , 1⁄√2 , 0) = −3⁄√2 ;  (1, −2, 3). (0, 0, 1) = 3

Indeed we can see that


1⁄ − 1⁄
√2 √2 0 1
− 1
− ⁄ |1 3
| − ⁄ | 1⁄ | + 3 |0| = |−2|
√2 ⁄√2 √2 √2 1 3
0 0
