Unit 6: Euclidean Spaces
EUCLIDEAN GEOMETRY
Contents
3 Metric properties.
4 Orthogonality.
5 Orthogonal diagonalization.
Euclidean vector space
Definition 6.1
Let $V$ be a vector space over $\mathbb{R}$. The map
$$\cdot \colon V \times V \to \mathbb{R}, \qquad (\vec v, \vec u) \mapsto \vec v \cdot \vec u$$
is an inner product (or scalar product, or dot product) if
a) (linearity in each argument)
$$(\alpha \vec v_1 + \beta \vec v_2) \cdot \vec u = \alpha\, \vec v_1 \cdot \vec u + \beta\, \vec v_2 \cdot \vec u, \qquad \vec u \cdot (\alpha \vec v_1 + \beta \vec v_2) = \alpha\, \vec u \cdot \vec v_1 + \beta\, \vec u \cdot \vec v_2 \qquad \forall \vec v_1, \vec v_2, \vec u \in V,\ \forall \alpha, \beta \in \mathbb{R};$$
b) (symmetry)
$$\vec v \cdot \vec u = \vec u \cdot \vec v \qquad \forall \vec v, \vec u \in V;$$
c) (positive-definiteness)
$$\vec v \cdot \vec v \ge 0 \quad \forall \vec v \in V, \qquad \text{and} \qquad \vec v \cdot \vec v = 0 \Rightarrow \vec v = \vec 0.$$
In that case, $(V, \cdot)$ is a Euclidean space.
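As a quick sanity check of Definition 6.1, here is a minimal numpy sketch that verifies the three axioms on randomly sampled vectors for the standard dot product on $\mathbb{R}^3$, the prototypical inner product:

```python
import numpy as np

# Numerical sanity check of Definition 6.1 for the standard dot product
# on R^3, using randomly sampled vectors.
rng = np.random.default_rng(0)
u, v1, v2 = rng.normal(size=(3, 3))
a, b = 2.0, -1.5

dot = lambda x, y: float(np.dot(x, y))

# a) linearity in the first argument (the second one is analogous)
assert np.isclose(dot(a * v1 + b * v2, u), a * dot(v1, u) + b * dot(v2, u))
# b) symmetry
assert np.isclose(dot(v1, u), dot(u, v1))
# c) positive-definiteness on a sample vector
assert dot(v1, v1) >= 0.0
print("axioms hold on the sampled vectors")
```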
Example
Example 6.1
Show that
Definition 6.2
Given a Euclidean space $(V, \cdot)$, the map
$$V \to \mathbb{R}, \qquad \vec v \mapsto \vec v^{\,2} = \vec v \cdot \vec v$$
is the scalar square associated to the inner product.
Example 6.2
Find the scalar square associated to the scalar product
Matrix form
Fix a basis $B = \{\vec v_1, \vec v_2, \dots, \vec v_n\}$ of $V$ and let $G = (g_{ij})$ be the Gram matrix of the inner product with respect to $B$, that is, $g_{ij} = \vec v_i \cdot \vec v_j$. If $X$ and $Y$ are the coordinate columns of $\vec u$ and $\vec w$ in $B$, then
$$\vec u \cdot \vec w = X^{t} G Y.$$
Properties:
Since the inner product is symmetric, $g_{ij} = \vec v_i \cdot \vec v_j = \vec v_j \cdot \vec v_i = g_{ji}$ and, therefore, $G$ is a symmetric matrix.
The inner product is positive-definite, so that $g_{ii} = \vec v_i \cdot \vec v_i > 0$.
Moreover, $(x_1, x_2, \dots, x_n) \cdot (y_1, y_2, \dots, y_n) = \sum_{i=1}^{n} \sum_{j=1}^{n} g_{ij} x_i y_j$; in other words, $g_{ij}$ is the coefficient associated to the term $x_i y_j$ in the expression of the inner product.
In terms of the Gram matrix, the scalar square is
$$(x_1, x_2, \dots, x_n)^2 = \sum_{i=1}^{n} g_{ii} x_i^2 + 2 \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} g_{ij} x_i x_j.$$
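Evaluating $\vec u \cdot \vec w = X^{t} G Y$ is a one-liner. The sketch below uses a hypothetical symmetric, positive-definite Gram matrix chosen only for illustration:

```python
import numpy as np

# Hypothetical Gram matrix (symmetric, positive-definite), for illustration only.
G = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

X = np.array([1.0, -2.0, 1.0])   # coordinates of u in the chosen basis
Y = np.array([0.0,  1.0, 3.0])   # coordinates of w

print(X @ G @ Y)                 # the value of u . w = X^t G Y
```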
Examples
Example 6.3
Given the inner product on R3
(x1 , x2 , x3 ) · (y1 , y2 , y3 ) =
Example 6.4
Consider a scalar square on $\mathbb{R}^2$ defined by
$$(x, y)^2 = 3x^2 - xy + 2y^2.$$
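Reading off the Gram matrix from a scalar square uses exactly the coefficient rule above: the diagonal entries are the coefficients of $x_i^2$, and each cross coefficient equals $2g_{ij}$. For Example 6.4 this gives $g_{12} = g_{21} = -1/2$; a short sympy check:

```python
import sympy as sp

# Example 6.4: diagonal entries are the coefficients of x^2 and y^2,
# and the cross coefficient -1 equals 2*g12, so g12 = g21 = -1/2.
x, y = sp.symbols('x y')
G = sp.Matrix([[3, sp.Rational(-1, 2)],
               [sp.Rational(-1, 2), 2]])
X = sp.Matrix([x, y])

scalar_square = sp.expand((X.T * G * X)[0, 0])
print(scalar_square)                                    # 3*x**2 - x*y + 2*y**2
assert scalar_square == sp.expand(3*x**2 - x*y + 2*y**2)
```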
Example 6.5
Decide which of the following matrices are associated to an inner product:
$$A = \begin{pmatrix} 5 & 1 & 2 \\ -1 & 1 & -1 \\ -2 & 1 & 3 \end{pmatrix}; \qquad B = \begin{pmatrix} 2 & 0 & 1 \\ 0 & 8 & 2 \\ 1 & 2 & -2 \end{pmatrix};$$
$$C = \begin{pmatrix} 1 & 1 & -1 \\ 1 & 2 & 0 \\ -1 & 0 & 3 \end{pmatrix}; \qquad D = \begin{pmatrix} 1 & 3 & 1 \\ 3 & 0 & 1 \\ 1 & 1 & 1 \end{pmatrix}.$$
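One systematic way to decide Example 6.5: a matrix is the Gram matrix of an inner product exactly when it is symmetric and positive-definite, and Sylvester's criterion (all leading principal minors positive) settles the latter. A numpy sketch:

```python
import numpy as np

# The four matrices of Example 6.5.
mats = {
    "A": np.array([[5, 1, 2], [-1, 1, -1], [-2, 1, 3]], dtype=float),
    "B": np.array([[2, 0, 1], [0, 8, 2], [1, 2, -2]], dtype=float),
    "C": np.array([[1, 1, -1], [1, 2, 0], [-1, 0, 3]], dtype=float),
    "D": np.array([[1, 3, 1], [3, 0, 1], [1, 1, 1]], dtype=float),
}

def defines_inner_product(M):
    """Symmetric and positive-definite: all leading principal minors > 0."""
    if not np.allclose(M, M.T):
        return False
    minors = [np.linalg.det(M[:k, :k]) for k in range(1, M.shape[0] + 1)]
    return all(m > 0 for m in minors)

for name, M in mats.items():
    print(name, defines_inner_product(M))
```

With these entries, only $C$ is both symmetric and positive-definite.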
Change of basis
If $P$ is the change-of-basis matrix whose columns are the coordinates of the new basis vectors written in the old basis, the Gram matrices with respect to the two bases are related by $G' = P^{t} G P$.
Example 6.6
Consider the following inner product:
Find its Gram matrices first for the standard basis and then for the basis
B = {(1, 1), (−1, 1)}.
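The rule $G' = P^{t} G P$ is easy to check numerically. In the sketch below, $G$ is a hypothetical placeholder for the Gram matrix in the standard basis (the inner product of Example 6.6 is not spelled out above), and $P$ holds the coordinates of $B = \{(1, 1), (-1, 1)\}$ as columns:

```python
import numpy as np

# Change-of-basis rule for Gram matrices: G_B = P^t G P.
# G below is a hypothetical Gram matrix used only to illustrate the rule.
G = np.array([[2.0, 1.0],
              [1.0, 3.0]])
P = np.array([[1.0, -1.0],
              [1.0,  1.0]])   # columns: coordinates of (1,1) and (-1,1)

G_B = P.T @ G @ P
print(G_B)
```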
Length
Definition 6.3
In a Euclidean space $(V, \cdot)$, the length (or norm) of a vector $\vec v \in V$ is the real number
$$\|\vec v\| = \sqrt{\vec v \cdot \vec v} = \sqrt{\vec v^{\,2}}.$$
Any vector $\vec v \in V$ whose length is 1 (i.e., $\|\vec v\| = 1$) is a unit vector.
For each vector $\vec v \ne \vec 0$, its normalized vector (or associated unit vector, or versor) is the unit vector codirectional with $\vec v$:
$$\frac{\vec v}{\|\vec v\|}.$$
Metric properties
For all $\vec v, \vec w \in V$, the Cauchy–Schwarz inequality holds:
$$|\vec v \cdot \vec w| \le \|\vec v\|\, \|\vec w\|.$$
Corollary
In the above inequality, the two sides are equal if and only if $\vec v$ and $\vec w$ are linearly dependent.
Angle between two vectors
As an immediate consequence of the Cauchy–Schwarz inequality,
$$-1 \le \frac{\vec v \cdot \vec w}{\|\vec v\|\,\|\vec w\|} \le 1 \qquad \forall \vec v, \vec w \in V \setminus \{\vec 0\}.$$
This property makes the following definition meaningful.
Definition 6.4
The angle between two vectors $\vec v, \vec w \in V \setminus \{\vec 0\}$ is the unique real number $\theta \in [0, \pi]$ such that
$$\cos(\theta) = \frac{\vec v \cdot \vec w}{\|\vec v\|\,\|\vec w\|}.$$
Remark 6.1
Multiplying both sides of the above equation by the lengths of the vectors,
$$\vec v \cdot \vec w = \|\vec v\|\,\|\vec w\| \cos(\theta) \qquad \forall \vec v, \vec w \in V.$$
The metric depends on the inner product
Example 6.7
Consider the inner product on $\mathbb{R}^3$ whose associated Gram matrix w.r.t. the natural basis is
$$G = \begin{pmatrix} 1 & 1 & -1 \\ 1 & 2 & -1 \\ -1 & -1 & 2 \end{pmatrix}.$$
Compute the angle between (1, 0, 0) and (0, 1, 0).
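A quick numerical check of Example 6.7 with numpy: the inner product of the two vectors is read off $G$, their lengths are $\sqrt{\vec v^{\,t} G\, \vec v}$, and the angle follows from Definition 6.4:

```python
import numpy as np

# Example 6.7: angle between (1,0,0) and (0,1,0) for the Gram matrix G.
G = np.array([[ 1.0,  1.0, -1.0],
              [ 1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])
v = np.array([1.0, 0.0, 0.0])
w = np.array([0.0, 1.0, 0.0])

ip = lambda a, b: a @ G @ b
cos_theta = ip(v, w) / np.sqrt(ip(v, v) * ip(w, w))
print(np.arccos(cos_theta), np.pi / 4)    # the angle comes out to pi/4
```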
Orthogonal vectors
Definition 6.5
In a Euclidean space $(V, \cdot)$, two vectors $\vec u, \vec v$ are orthogonal, $\vec u \perp \vec v$, if $\vec u \cdot \vec v = 0$.
Theorem 6.3
In a Euclidean space $(V, \cdot)$, consider an indexed set of vectors $S = \{\vec v_1, \vec v_2, \dots, \vec v_n\} \subset V \setminus \{\vec 0\}$. If $\vec v_i \perp \vec v_j$ for all $i \ne j$, then $S$ is linearly independent.
Orthogonal projection
Theorem 6.4
Consider a Euclidean space $(V, \cdot)$ and a vector $\vec v \in V \setminus \{\vec 0\}$. Then each vector $\vec u \in V$ can be written uniquely in the form $\vec u = \alpha \vec v + \vec w$ with $\alpha \in \mathbb{R}$ and $\vec w \in V$ such that $\vec w \perp \vec v$.
Example 6.8
Let us consider $\mathbb{R}^3$ endowed with the usual inner product. Denoting $\vec v = (2, -3, 1)$, write the vector $\vec u = (1, 3, 5)$ as the sum of a vector proportional to $\vec v$ and another orthogonal to it.
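Example 6.8 is Theorem 6.4 in coordinates: $\alpha = (\vec u \cdot \vec v)/(\vec v \cdot \vec v)$ and $\vec w = \vec u - \alpha \vec v$. A numpy sketch:

```python
import numpy as np

# Example 6.8 with the usual inner product on R^3.
v = np.array([2.0, -3.0, 1.0])
u = np.array([1.0,  3.0, 5.0])

alpha = (u @ v) / (v @ v)      # coefficient of the component along v
w = u - alpha * v              # component orthogonal to v
print(alpha, w)
print(np.isclose(w @ v, 0.0))  # w is indeed orthogonal to v
```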
Orthogonal and orthonormal bases
Motivation:
Let us consider a Euclidean space $(V, \cdot)$.
We look for a basis of V for which the Gram matrix associated to the
inner product is as simple as possible: a diagonal matrix or, even better,
the identity.
Thus, we search for a basis of $V$, $B = \{\vec v_1, \vec v_2, \dots, \vec v_n\}$, such that
$$\vec v_i \cdot \vec v_j = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{if } i \ne j. \end{cases}$$
Definition 6.6
In a finite-dimensional Euclidean space, a basis $B = \{\vec v_1, \vec v_2, \dots, \vec v_n\}$ is an orthogonal basis if $\vec v_i \perp \vec v_j$ for all $i \ne j$, and an orthonormal basis if, in addition, every $\vec v_i$ is a unit vector.
Such a basis can be built with the Gram–Schmidt procedure: starting from any basis $\{\vec e_1, \vec e_2, \dots, \vec e_n\}$ of $V$, set $\vec v_1 = \vec e_1$ and, for $p = 2, \dots, n$, take
$$\vec v_p = \vec e_p - \sum_{i=1}^{p-1} \alpha_i \vec v_i$$
and impose that $\vec v_p \perp \vec v_i$ for all $i \in \{1, 2, \dots, p-1\}$; or, equivalently,
$$\alpha_i = \frac{\vec e_p \cdot \vec v_i}{\vec v_i \cdot \vec v_i} \qquad \forall i = 1, 2, \dots, p-1.$$
Moreover, from the orthogonal basis $B' = \{\vec v_1, \vec v_2, \dots, \vec v_n\}$ we can obtain the orthonormal basis
$$\left\{ \frac{\vec v_1}{\|\vec v_1\|},\ \frac{\vec v_2}{\|\vec v_2\|},\ \dots,\ \frac{\vec v_n}{\|\vec v_n\|} \right\}.$$
Example
Example 6.9
Consider the vector space $\mathbb{R}^3$ endowed with the inner product whose Gram matrix w.r.t. the natural basis is
$$G = \begin{pmatrix} 1 & 1 & -1 \\ 1 & 2 & -1 \\ -1 & -1 & 2 \end{pmatrix}.$$
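The orthogonalization procedure above is easy to mechanize; the following numpy sketch applies it to the standard basis of $\mathbb{R}^3$, taking as inner product the one of Example 6.9:

```python
import numpy as np

# Gram-Schmidt w.r.t. the inner product of Example 6.9, applied to the
# standard basis of R^3.
G = np.array([[ 1.0,  1.0, -1.0],
              [ 1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])

def ip(a, b):
    return a @ G @ b

def gram_schmidt(vectors):
    """Return an orthonormal basis w.r.t. the inner product given by G."""
    basis = []
    for e in vectors:
        v = e.astype(float)
        for b in basis:                  # subtract the projections on the
            v = v - ip(v, b) * b         # previous (already unit) vectors
        basis.append(v / np.sqrt(ip(v, v)))
    return basis

for b in gram_schmidt([np.eye(3)[i] for i in range(3)]):
    print(b)
```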
Definition 6.7
Two subspaces $U, W$ of a Euclidean space are orthogonal subspaces, $U \perp W$, if $\vec u \perp \vec w$ for all $\vec u \in U$ and $\vec w \in W$.
In particular, a vector $\vec u$ is orthogonal to $W$ if $\vec u \perp \vec w$ for all $\vec w \in W$.
Theorem 6.5
Let $(V, \cdot)$ be a Euclidean space and $U$ a subspace of $V$. A vector $\vec b \in V$ is orthogonal to $U$ if and only if it is orthogonal to each vector of a basis of $U$.
The orthogonal complement
Definition 6.8
Let $U$ be a subspace of a Euclidean space $(V, \cdot)$. The orthogonal complement of $U$ is the set of all vectors that are orthogonal to those of $U$,
$$U^{\perp} = \{\vec w \in V \;;\; \vec w \perp \vec u \ \ \forall \vec u \in U\}.$$
Theorem 6.6
If $U$ is a subspace of a Euclidean space $(V, \cdot)$, then $U^{\perp}$ is also a subspace of $V$ and it satisfies
$$U \cap U^{\perp} = \{\vec 0\}.$$
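In coordinates, Theorem 6.5 turns the computation of $U^{\perp}$ into a null-space problem: $\vec w \perp U$ exactly when $M G\, \vec w = \vec 0$, where the rows of $M$ hold the coordinates of a basis of $U$. A small sympy sketch, reusing the Gram matrix of Example 6.9 with a hypothetical subspace $U$ chosen for illustration:

```python
import sympy as sp

# w is orthogonal to U = span{u_1, ..., u_k} iff u_i^t G w = 0 for every i,
# i.e. iff w lies in the null space of M*G (rows of M = basis of U).
G = sp.Matrix([[1, 1, -1],
               [1, 2, -1],
               [-1, -1, 2]])
M = sp.Matrix([[1, 0, 0],        # hypothetical U spanned by (1,0,0), (0,1,0)
               [0, 1, 0]])

print((M * G).nullspace())       # a basis of U_perp
```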
Orthogonal projection
Let $U$ be a subspace of $(V, \cdot)$ and $\vec v \in V$. The orthogonal projection of $\vec v$ onto $U$, $\mathrm{proj}_U(\vec v)$, is the unique vector $\vec u \in U$ such that
$$\vec v - \vec u \in U^{\perp}.$$
Theorem 6.7
Under the above conditions and when $\vec v \notin U$,
$$\|\vec v - \vec w\| > \|\vec v - \mathrm{proj}_U(\vec v)\| \qquad \forall \vec w \in U \text{ with } \vec w \ne \mathrm{proj}_U(\vec v).$$
Equivalently, if $\{\vec u_1, \vec u_2, \dots, \vec u_k\}$ is an orthogonal basis of $U$, then
$$\mathrm{proj}_U(\vec v) = \sum_{i=1}^{k} \frac{\vec v \cdot \vec u_i}{\vec u_i \cdot \vec u_i}\, \vec u_i.$$
Example 6.10
Consider the vector space $C([0, 1])$ endowed with the inner product
$$p \cdot q = \int_0^1 p(x)\, q(x)\, dx \qquad \forall p, q \in C([0, 1]).$$
Find the orthogonal projection of the function $f(x) = \sin(\pi x) \in C([0, 1])$ onto the subspace $\mathbb{R}_2[x]$.
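In coordinates, Example 6.10 reduces to a linear system: writing $\mathrm{proj}(f) = c_0 + c_1 x + c_2 x^2$, the condition that $f - \mathrm{proj}(f)$ be orthogonal to $1$, $x$, $x^2$ (Theorem 6.5) gives $G c = b$ with $g_{ij} = \int_0^1 x^i x^j\, dx$ and $b_i = \int_0^1 f(x)\, x^i\, dx$. A sympy sketch:

```python
import sympy as sp

# Example 6.10: solve the linear system G c = b coming from the
# orthogonality of f - proj(f) to the basis {1, x, x^2} of R_2[x].
x = sp.symbols('x')
basis = [sp.Integer(1), x, x**2]
f = sp.sin(sp.pi * x)

ip = lambda p, q: sp.integrate(p * q, (x, 0, 1))

G = sp.Matrix(3, 3, lambda i, j: ip(basis[i], basis[j]))
b = sp.Matrix(3, 1, lambda i, _: ip(f, basis[i]))
c = G.LUsolve(b)

proj = sum(ci * bi for ci, bi in zip(c, basis))
print(sp.expand(proj))           # the projection of sin(pi*x) onto R_2[x]
```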
Symmetric operators
Definition 6.9
Let $(V, \cdot)$ be a Euclidean space, and $T \colon V \to V$ an endomorphism of $V$. The operator $T$ is symmetric if
$$T(\vec u) \cdot \vec v = \vec u \cdot T(\vec v) \qquad \forall \vec u, \vec v \in V;$$
in matrix terms, if $A$ is the matrix of $T$ and $G$ the Gram matrix of the inner product with respect to the same basis, this amounts to
$$GA = A^{t} G = (GA)^{t}.$$
Example 6.11
Consider the vector space $\mathbb{R}^3$ endowed with the inner product whose Gram matrix for the natural basis is
$$G = \begin{pmatrix} 2 & 2 & 2 \\ 2 & 4 & 0 \\ 2 & 0 & 3 \end{pmatrix}.$$
Theorem 6.8
The characteristic polynomial of a symmetric endomorphism has the form $p(\lambda) = (\alpha_1 - \lambda)^{m_1} (\alpha_2 - \lambda)^{m_2} \cdots (\alpha_r - \lambda)^{m_r}$ with $\alpha_1, \dots, \alpha_r \in \mathbb{R}$; that is, all of its roots are real.
Theorem 6.9
Let $(V, \cdot)$ be a Euclidean space, and $T \colon V \to V$ a symmetric endomorphism. For any pair of different eigenvalues of $T$, the corresponding eigenspaces are orthogonal: $S(\alpha) \perp S(\beta)$ for all $\alpha \ne \beta$.
Corollary
Every symmetric endomorphism is diagonalizable in an orthonormal basis. Such a diagonalization is called an orthonormal diagonalization.
Example
Example 6.12
Consider the inner product $\cdot \colon \mathbb{R}^3 \times \mathbb{R}^3 \to \mathbb{R}$ whose Gram matrix for the standard basis is
$$G = \begin{pmatrix} 2 & -3 & 3 \\ -3 & 5 & -4 \\ 3 & -4 & 6 \end{pmatrix},$$
and let $T$ denote the endomorphism of $\mathbb{R}^3$ whose associated matrix for the standard basis is
$$M(T) = \begin{pmatrix} 3 & \alpha & 0 \\ 0 & -1 & 0 \\ -2 & 4 & -1 \end{pmatrix}.$$
Find the values of $\alpha \in \mathbb{R}$ for which $T$ is symmetric. For such values, diagonalize $T$ in an orthonormal basis.
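A symbolic sketch of Example 6.12 with sympy: first find $\alpha$ from the condition that $GA$ be symmetric (Definition 6.9), then assemble a $G$-orthonormal eigenbasis; eigenvectors of distinct eigenvalues are already orthogonal by Theorem 6.9, and Gram–Schmidt handles the repeated eigenvalue:

```python
import sympy as sp

# Example 6.12. Step 1: find alpha from the symmetry condition GA = (GA)^t.
alpha = sp.symbols('alpha', real=True)
G = sp.Matrix([[2, -3, 3], [-3, 5, -4], [3, -4, 6]])
A = sp.Matrix([[3, alpha, 0], [0, -1, 0], [-2, 4, -1]])

S = G * A
a_val = sp.solve(sp.Eq(S[0, 1], S[1, 0]), alpha)[0]
assert (S - S.T).subs(alpha, a_val) == sp.zeros(3, 3)   # full condition holds
print("alpha =", a_val)

# Step 2: diagonalize T for that value of alpha in a G-orthonormal basis.
A_sym = A.subs(alpha, a_val)
print("eigenvalues:", A_sym.eigenvals())      # all real (Theorem 6.8)

def ip(u, v):
    return (u.T * G * v)[0, 0]

basis = []
for val, mult, vecs in A_sym.eigenvects():
    for v in vecs:
        for b in basis:                 # subtract projections on the vectors
            v = v - ip(v, b) * b        # already accepted (they are unit)
        basis.append(v / sp.sqrt(ip(v, v)))

P = sp.Matrix.hstack(*basis)
print(sp.simplify(P.inv() * A_sym * P))   # diagonal matrix of eigenvalues
print(sp.simplify(P.T * G * P))           # identity: the basis is orthonormal
```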