
Section 6.3: Orthogonal Projections

Ben Lantz

MATH 2210 Fall 2023


In Section 6.2, we found the projection of a vector y onto a line spanned by a single vector.
Now, consider the projection of y onto a plane, W = Span{u1, u2}, denoted ŷ = projW y.

We build projW y by looking at the projections of y onto Span{u1 } and Span{u2 }.

Note that projW y = projL1 y + projL2 y, where L1 and L2 are the lines spanned by u1 and u2 respectively (this decomposition of the projection requires u1 and u2 to be orthogonal).
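For concreteness, here is a minimal NumPy sketch of this construction; the specific vectors u1, u2, and y below are illustrative choices, not taken from the notes:

import numpy as np

def proj_line(y, u):
    # Projection of y onto the line Span{u}: ((y . u) / (u . u)) u
    return (y @ u) / (u @ u) * u

# Two orthogonal vectors spanning a plane W in R^3 (illustrative choice).
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
y = np.array([3.0, 1.0, 4.0])

y_hat = proj_line(y, u1) + proj_line(y, u2)   # projW y = projL1 y + projL2 y
print(y_hat)                                  # [3. 1. 0.]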
Definition. Let W be a subspace of a vector space V. Then W⊥ is the set of all vectors in V that are orthogonal to every vector in W.

In Section 6.2, we decomposed a vector y into a part (ŷ) in the span of a vector u and a part (z = y − ŷ) that is orthogonal to u.
Now we will generalize this idea.

We want to write y = ŷ + z where ŷ is the projection of y onto a subspace W and z is a vector in W⊥.

Example 1. Let y ∈ R^5, where {u1, u2, u3, u4, u5} is an orthogonal basis for R^5 and y = c1u1 + c2u2 + c3u3 + c4u4 + c5u5.
Write y = ŷ + z, where ŷ = projW y, W = Span{u1, u2}, and z ∈ W⊥.

Solution. Take ŷ = c1u1 + c2u2 and z = c3u3 + c4u4 + c5u5, so that y = ŷ + z with ŷ ∈ W.
To see that z ∈ W⊥, show z · u1 = 0 and z · u2 = 0; then z · (c1u1 + c2u2) = 0 for any c1, c2 ∈ R.
z · u1 = (c3u3 + c4u4 + c5u5) · u1 = c3(u3 · u1) + c4(u4 · u1) + c5(u5 · u1) = 0,
since the basis vectors are mutually orthogonal. Similarly, z · u2 = 0.
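A quick numerical check of this argument (a sketch only; the orthonormal basis of R^5 is generated with NumPy's QR factorization and the coordinates ci are arbitrary choices, neither is part of the notes):

import numpy as np

rng = np.random.default_rng(0)
# An orthonormal (hence orthogonal) basis {u1, ..., u5} of R^5 from a QR factorization.
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
u1, u2, u3, u4, u5 = (Q[:, i] for i in range(5))

c1, c2, c3, c4, c5 = 2.0, -1.0, 3.0, 0.5, 4.0        # arbitrary coordinates of y
y = c1*u1 + c2*u2 + c3*u3 + c4*u4 + c5*u5

y_hat = c1*u1 + c2*u2                                # part in W = Span{u1, u2}
z = c3*u3 + c4*u4 + c5*u5                            # claimed part in W-perp

print(np.allclose(y, y_hat + z))                     # True
print(np.isclose(z @ u1, 0), np.isclose(z @ u2, 0))  # True True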
We don’t actually need an orthogonal basis for all of R^n to find this decomposition; we only need an orthogonal basis for W.

Theorem 1 (Orthogonal Decomposition Theorem). Let W be a subspace of R^n. Then each y ∈ R^n can be written as

y = ŷ + z

where ŷ ∈ W and z ∈ W⊥.
If {u1 , u2 , . . . , up } is an orthogonal basis for W , then

ŷ = projW y = ((y · u1)/(u1 · u1)) u1 + ((y · u2)/(u2 · u2)) u2 + · · · + ((y · up)/(up · up)) up
            = projL1 y + projL2 y + · · · + projLp y

and z = y − ŷ, where Li = Span{ui}.
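The formula in the theorem is easy to compute with; below is a minimal sketch (the helper name proj_W and the example vectors are my own, illustrative choices):

import numpy as np

def proj_W(y, orthogonal_basis):
    # Sum of the projections of y onto each basis vector's line,
    # valid when the basis vectors are mutually orthogonal.
    return sum((y @ u) / (u @ u) * u for u in orthogonal_basis)

u1 = np.array([1.0, 0.0, 1.0])      # orthogonal basis for a plane W in R^3
u2 = np.array([1.0, 0.0, -1.0])
y = np.array([2.0, 3.0, 4.0])

y_hat = proj_W(y, [u1, u2])         # y_hat = projW y
z = y - y_hat                       # z = y - y_hat lies in W-perp
print(y_hat, z)                     # [2. 0. 4.] [0. 3. 0.]
print(np.isclose(z @ u1, 0), np.isclose(z @ u2, 0))   # True True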

Note: if {u1, . . . , up} is instead an orthonormal basis for W, then ŷ = (y · u1)u1 + (y · u2)u2 + · · · + (y · up)up, since ui · ui = 1 for all i.


Example 2. Let u1 = (2, 5, −1), u2 = (−2, 1, 1), and y = (1, 2, 3). Note that {u1, u2} is an orthogonal basis for W = Span{u1, u2}.
Write y as the sum of a vector in W and a vector in W⊥.
Solution. y · u1 = 1(2) + 2(5) + 3(−1) = 9, u1 · u1 = 30, y · u2 = 1(−2) + 2(1) + 3(1) = 3, and u2 · u2 = 6, so
ŷ = projW y = (9/30)u1 + (3/6)u2 = (3/10)(2, 5, −1) + (1/2)(−2, 1, 1) = (−2/5, 2, 1/5)
and
z = y − ŷ = (1, 2, 3) − (−2/5, 2, 1/5) = (7/5, 0, 14/5).
Then y = ŷ + z = (−2/5, 2, 1/5) + (7/5, 0, 14/5), with ŷ ∈ W and z ∈ W⊥.
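A quick NumPy check of this computation, assuming the vectors as written above (a verification sketch, not part of the original notes):

import numpy as np

u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])
y = np.array([1.0, 2.0, 3.0])

y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - y_hat

print(u1 @ u2)                                       # 0.0, so {u1, u2} really is orthogonal
print(y_hat)                                         # [-0.4  2.   0.2]  = (-2/5, 2, 1/5)
print(z)                                             # [1.4  0.   2.8]   = (7/5, 0, 14/5)
print(np.isclose(z @ u1, 0), np.isclose(z @ u2, 0))  # True True, so z is in W-perp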
Properties of Orthogonal Projections

If y ∈ W, then projW y = y.

projW y is the closest vector to y in W.
[Sketch: y above the plane W, with ŷ = projW y in W and z = y − ŷ perpendicular to W.]
We say that projW y is the best approximation to y by elements in W.

The distance from y to W is ||y − ŷ|| = ||z||.

If {u1, u2, . . . , up} is an orthonormal basis for W, then projW y = U U^T y, where the columns of U are the orthonormal basis vectors.
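A small check of this matrix formulation (a sketch; the orthonormal vectors below are an illustrative choice):

import numpy as np

u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)    # orthonormal basis for a plane W in R^3
u2 = np.array([0.0, 0.0, 1.0])
U = np.column_stack([u1, u2])                  # columns of U are the basis vectors

y = np.array([3.0, 1.0, 4.0])
y_hat_matrix = U @ U.T @ y                     # projW y = U U^T y
y_hat_sum = (y @ u1) * u1 + (y @ u2) * u2      # orthonormal-basis formula for projW y

print(np.allclose(y_hat_matrix, y_hat_sum))    # True
print(y_hat_matrix)                            # [2. 2. 4.]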
