
MATH 304

Linear Algebra
Lecture 17:
Euclidean structure in Rn (continued).
Orthogonal complement.
Orthogonal projection.
Vectors: geometric approach

[Figure: two arrows of the same length and direction, one from A to B and one from A′ to B′.]
• A vector is represented by a directed segment.
• A directed segment is drawn as an arrow.
• Different arrows represent the same vector if
they are of the same length and direction.
Notation: →AB (= →A′B′).
Linear structure: vector addition
Given vectors a and b, their sum a + b is defined by the rule →AB + →BC = →AC.
That is, choose points A, B, C so that →AB = a and →BC = b. Then a + b = →AC.

[Figure: the triangle rule for a + b, drawn twice: with vertices A, B, C and with vertices A′, B′, C′.]
Linear structure: scalar multiplication

Let v be a vector and r ∈ R. By definition, r v is a vector whose magnitude is |r| times the magnitude of v. The direction of r v coincides with that of v if r > 0. If r < 0 then the directions of r v and v are opposite.
[Figure: a vector v and its scalar multiples 3v and −2v.]
Beyond linearity: Euclidean structure

Euclidean structure includes:
• length of a vector: |x|,
• angle between vectors: θ,
• dot product: x · y = |x| |y| cos θ.

[Figure: vectors x = →AB and y = →AC with the angle θ between them at vertex A.]
Vectors: algebraic approach
An n-dimensional coordinate vector is an element of
Rn , i.e., an ordered n-tuple (x1 , x2 , . . . , xn ) of real
numbers.
Let a = (a1 , a2 , . . . , an ) and b = (b1 , b2 , . . . , bn ) be
vectors, and r ∈ R be a scalar. Then, by definition,
a + b = (a1 + b1 , a2 + b2 , . . . , an + bn ),
r a = (ra1 , ra2 , . . . , ran ),
0 = (0, 0, . . . , 0),
−b = (−b1 , −b2 , . . . , −bn ),
a − b = a + (−b) = (a1 − b1 , a2 − b2 , . . . , an − bn ).
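These componentwise definitions are exactly how numerical libraries implement vector arithmetic. A minimal sketch in Python, assuming NumPy is available (the example vectors are arbitrary):

import numpy as np

# Coordinate vectors in R^3; all operations act componentwise.
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, -1.0, 0.5])
r = 2.0

print(a + b)  # (a1 + b1, ..., an + bn) -> [5.  1.  3.5]
print(r * a)  # (r a1, ..., r an)       -> [2. 4. 6.]
print(a - b)  # a + (-b)                -> [-3.   3.   2.5]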
Cartesian coordinates: geometric meets algebraic

[Figure: the points (−3, 2) and (2, 1) in Cartesian coordinates, shown both as points and as position vectors.]

Once we specify an origin O, each point A is associated with a position vector →OA. Conversely, every vector has a unique representative with tail at O.
Cartesian coordinates allow us to identify a line, a plane, and space with R, R2, and R3, respectively.
Length and distance

Definition. The length of a vector v = (v1, v2, . . . , vn) ∈ Rn is
‖v‖ = √(v1² + v2² + · · · + vn²).

The distance between vectors/points x and y is ‖y − x‖.

Properties of length:
‖x‖ ≥ 0, ‖x‖ = 0 only if x = 0 (positivity)
‖r x‖ = |r| ‖x‖ (homogeneity)
‖x + y‖ ≤ ‖x‖ + ‖y‖ (triangle inequality)
Scalar product
Definition. The scalar product of vectors
x = (x1 , x2 , . . . , xn ) and y = (y1 , y2 , . . . , yn ) is
x · y = x1 y1 + x2 y2 + · · · + xn yn .

Properties of scalar product:
x · x ≥ 0, x · x = 0 only if x = 0 (positivity)
x · y = y · x (symmetry)
(x + y) · z = x · z + y · z (distributive law)
(r x) · y = r (x · y) (homogeneity)
In particular, x · y is a bilinear function (i.e., it is
both a linear function of x and a linear function of y).
Relations between lengths and scalar products:
‖x‖ = √(x · x)
|x · y| ≤ ‖x‖ ‖y‖ (Cauchy-Schwarz inequality)
‖x − y‖² = ‖x‖² + ‖y‖² − 2 x · y
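These identities are easy to sanity-check numerically. A minimal sketch in Python, assuming NumPy is available (the test vectors are arbitrary, not from the lecture):

import numpy as np

x = np.array([2.0, -1.0, 3.0])
y = np.array([1.0, 4.0, 0.0])

# ||x|| = sqrt(x . x)
assert np.isclose(np.linalg.norm(x), np.sqrt(x @ x))

# Cauchy-Schwarz: |x . y| <= ||x|| ||y||
assert abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y)

# ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y
assert np.isclose(np.linalg.norm(x - y)**2,
                  np.linalg.norm(x)**2 + np.linalg.norm(y)**2 - 2 * (x @ y))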

By the Cauchy-Schwarz inequality, for any nonzero vectors x, y ∈ Rn we have
cos θ = (x · y)/(‖x‖ ‖y‖) for some 0 ≤ θ ≤ π.
θ is called the angle between the vectors x and y.
The vectors x and y are said to be orthogonal (denoted x ⊥ y) if x · y = 0 (i.e., if θ = 90°).
Problem. Find the angle θ between vectors
x = (2, −1) and y = (3, 1).
x · y = 5, ‖x‖ = √5, ‖y‖ = √10.
cos θ = (x · y)/(‖x‖ ‖y‖) = 5/(√5 √10) = 1/√2 =⇒ θ = 45°

Problem. Find the angle φ between vectors v = (−2, 1, 3) and w = (4, 5, 1).
v · w = 0 =⇒ v ⊥ w =⇒ φ = 90°
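Both computations follow the same recipe, which is easy to package as a small function. A sketch assuming NumPy (the name angle_deg is ours, not from the lecture):

import numpy as np

def angle_deg(x, y):
    # cos(theta) = x.y / (||x|| ||y||), with theta in [0, pi];
    # clip guards against round-off pushing the cosine outside [-1, 1]
    c = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

print(angle_deg(np.array([2.0, -1.0]), np.array([3.0, 1.0])))            # ≈ 45.0
print(angle_deg(np.array([-2.0, 1.0, 3.0]), np.array([4.0, 5.0, 1.0])))  # ≈ 90.0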
Orthogonality

Definition 1. Vectors x, y ∈ Rn are said to be orthogonal (denoted x ⊥ y) if x · y = 0.

Definition 2. A vector x ∈ Rn is said to be orthogonal to a nonempty set Y ⊂ Rn (denoted x ⊥ Y) if x · y = 0 for any y ∈ Y.

Definition 3. Nonempty sets X, Y ⊂ Rn are said to be orthogonal (denoted X ⊥ Y) if x · y = 0 for any x ∈ X and y ∈ Y.
Examples in R3.
• The line x = y = 0 is orthogonal to the line y = z = 0.
Indeed, if v = (0, 0, z) and w = (x, 0, 0) then v · w = 0.

• The line x = y = 0 is orthogonal to the plane z = 0.
Indeed, if v = (0, 0, z) and w = (x, y, 0) then v · w = 0.

• The line x = y = 0 is not orthogonal to the plane z = 1.
The vector v = (0, 0, 1) belongs to both the line and the plane, and v · v = 1 ≠ 0.

• The plane z = 0 is not orthogonal to the plane y = 0.
The vector v = (1, 0, 0) belongs to both planes and v · v = 1 ≠ 0.
Proposition 1 If X, Y ⊂ Rn are orthogonal sets then either they are disjoint or X ∩ Y = {0}.
Proof: v ∈ X ∩ Y =⇒ v ⊥ v =⇒ v · v = 0 =⇒ v = 0.

Proposition 2 Let V be a subspace of Rn and S be a spanning set for V. Then for any x ∈ Rn,
x ⊥ S =⇒ x ⊥ V.
Proof: Any v ∈ V is represented as v = a1v1 + · · · + akvk, where vi ∈ S and ai ∈ R. If x ⊥ S then
x · v = a1(x · v1) + · · · + ak(x · vk) = 0 =⇒ x ⊥ v.

Example. The vector v = (1, 1, 1) is orthogonal to the plane spanned by the vectors w1 = (2, −3, 1) and w2 = (0, 1, −1) (because v · w1 = v · w2 = 0).
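By Proposition 2, it suffices to test v against the spanning set. A quick check in NumPy (a sketch; the particular linear combination below stands in for an arbitrary vector of the plane):

import numpy as np

v  = np.array([1.0, 1.0, 1.0])
w1 = np.array([2.0, -3.0, 1.0])
w2 = np.array([0.0, 1.0, -1.0])

print(v @ w1, v @ w2)     # 0.0 0.0 -- orthogonal to the spanning set
print(v @ (3*w1 - 2*w2))  # 0.0     -- hence to any vector in the plane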
Orthogonal complement
Definition. Let S ⊂ Rn. The orthogonal complement of S, denoted S⊥, is the set of all vectors x ∈ Rn that are orthogonal to S. That is, S⊥ is the largest subset of Rn orthogonal to S.
Theorem 1 S⊥ is a subspace of Rn.
Note that S ⊂ (S⊥)⊥, hence Span(S) ⊂ (S⊥)⊥.
Theorem 2 (S⊥)⊥ = Span(S). In particular, for any subspace V we have (V⊥)⊥ = V.
Example. Consider a line L = {(x, 0, 0) | x ∈ R} and a plane Π = {(0, y, z) | y, z ∈ R} in R3.
Then L⊥ = Π and Π⊥ = L.
Fundamental subspaces
Definition. Given an m×n matrix A, let
N(A) = {x ∈ Rn | Ax = 0},
R(A) = {b ∈ Rm | b = Ax for some x ∈ Rn }.

R(A) is the range of a linear mapping L : Rn → Rm, L(x) = Ax. N(A) is the kernel of L.
Also, N(A) is the nullspace of the matrix A while R(A) is the column space of A. The row space of A is R(AT).
The subspaces N(A), R(AT) ⊂ Rn and R(A), N(AT) ⊂ Rm are the fundamental subspaces associated to the matrix A.
Theorem N(A) = R(AT)⊥, N(AT) = R(A)⊥.
That is, the nullspace of a matrix is the orthogonal
complement of its row space.
Proof: The equality Ax = 0 means that the vector x is orthogonal to the rows of the matrix A. Therefore N(A) = S⊥, where S is the set of rows of A. It remains to note that S⊥ = Span(S)⊥ = R(AT)⊥.

Corollary Let V be a subspace of Rn. Then dim V + dim V⊥ = n.
Proof: Pick a basis v1, . . . , vk for V. Let A be the k×n matrix whose rows are the vectors v1, . . . , vk. Then V = R(AT) and V⊥ = N(A). Consequently, dim V and dim V⊥ are the rank and the nullity of A. Therefore dim V + dim V⊥ equals the number of columns of A, which is n.
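The theorem and its corollary can be checked numerically: scipy.linalg.null_space returns an orthonormal basis of N(A), and rank plus nullity should equal the number of columns. A sketch assuming NumPy and SciPy (the matrix A is an arbitrary example):

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0]])
m, n = A.shape

N = null_space(A)             # columns form a basis of N(A)
r = np.linalg.matrix_rank(A)  # dim of the row space R(A^T)

print(np.allclose(A @ N, 0.0))  # True: N(A) is orthogonal to the rows, N(A) = R(A^T)^perp
print(r + N.shape[1] == n)      # True: dim V + dim V^perp = n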
Orthogonal projection

Theorem 1 Let V be a subspace of Rn. Then any vector x ∈ Rn is uniquely represented as x = p + o, where p ∈ V and o ∈ V⊥.

In the above expansion, p is called the orthogonal projection of the vector x onto the subspace V.

Theorem 2 ‖x − v‖ > ‖x − p‖ for any v ≠ p in V.

Thus ‖o‖ = ‖x − p‖ = min{‖x − v‖ : v ∈ V} is the distance from the vector x to the subspace V.
Orthogonal projection onto a vector
Let x, y ∈ Rn, with y ≠ 0.
Then there exists a unique decomposition x = p + o such that p is parallel to y and o is orthogonal to y.

[Figure: x = p + o, with p along y and o perpendicular to y.]

p = orthogonal projection of x onto y

We have p = αy for some α ∈ R. Then
0 = o · y = (x − αy) · y = x · y − α y · y
=⇒ α = (x · y)/(y · y) =⇒ p = ((x · y)/(y · y)) y
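The formula translates directly into code. A minimal sketch assuming NumPy (project_onto is our name; the vectors are arbitrary, with y ≠ 0):

import numpy as np

def project_onto(x, y):
    # p = (x.y / y.y) y, the orthogonal projection of x onto the line spanned by y
    return (x @ y) / (y @ y) * y

x = np.array([3.0, 4.0])
y = np.array([1.0, 2.0])
p = project_onto(x, y)
o = x - p

print(p)      # [2.2 4.4]
print(o @ y)  # ≈ 0.0 (up to round-off): the residual o = x - p is orthogonal to y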
Problem. Find the distance from the point
x = (3, 1) to the line spanned by y = (2, −1).
Consider the decomposition x = p + o, where p is parallel to
y while o ⊥ y. The required distance is the length of the
orthogonal component o.
p = ((x · y)/(y · y)) y = (5/5)(2, −1) = (2, −1),
o = x − p = (3, 1) − (2, −1) = (1, 2), ‖o‖ = √5.

Problem. Find the point on the line y = −x that is closest to the point (3, 4).
The required point is the projection p of v = (3, 4) onto the vector w = (1, −1) spanning the line y = −x.
p = ((v · w)/(w · w)) w = (−1/2)(1, −1) = (−1/2, 1/2)
Problem. Let Π be the plane spanned by vectors
v1 = (1, 1, 0) and v2 = (0, 1, 1).
(i) Find the orthogonal projection of the vector
x = (4, 0, −1) onto the plane Π.
(ii) Find the distance from x to Π.
We have x = p + o, where p ∈ Π and o ⊥ Π.
Then the orthogonal projection of x onto Π is p and
the distance from x to Π is kok.
We have p = αv1 + βv2 for some α, β ∈ R.
Then o = x − p = x − αv1 − βv2 .
 
The conditions o · v1 = 0 and o · v2 = 0 are equivalent to the system
α(v1 · v1) + β(v2 · v1) = x · v1,
α(v1 · v2) + β(v2 · v2) = x · v2.
For x = (4, 0, −1), v1 = (1, 1, 0), v2 = (0, 1, 1), the system
α(v1 · v1) + β(v2 · v1) = x · v1,
α(v1 · v2) + β(v2 · v2) = x · v2
becomes
2α + β = 4, α + 2β = −1 ⇐⇒ α = 3, β = −2.

p = 3v1 − 2v2 = (3, 1, −2)
o = x − p = (1, −1, 1)
‖o‖ = √3
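In matrix form the system above is the normal equations (AᵀA)(α, β)ᵀ = Aᵀx, where the columns of A are v1 and v2. A sketch assuming NumPy, reproducing the numbers above:

import numpy as np

x  = np.array([4.0, 0.0, -1.0])
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])

A = np.column_stack([v1, v2])  # columns span the plane Pi
G = A.T @ A                    # Gram matrix [[v1.v1, v2.v1], [v1.v2, v2.v2]] = [[2, 1], [1, 2]]
alpha, beta = np.linalg.solve(G, A.T @ x)

p = alpha * v1 + beta * v2     # orthogonal projection of x onto Pi
o = x - p

print(alpha, beta)        # ≈ 3.0 -2.0
print(p)                  # ≈ [ 3.  1. -2.]
print(np.linalg.norm(o))  # ≈ 1.732 = sqrt(3), the distance from x to Pi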
