
UNIT 6:

EUCLIDEAN GEOMETRY

Gijón Polytechnic School of Engineering


University of Oviedo
Contents

1 Euclidean vector space.

2 Matrix associated to an inner product.

3 Metric properties.

4 Orthogonality.

5 Orthogonal diagonalization.
Euclidean vector space

Definition 6.1
Let V be a vector space over R. The map
· : V × V → R
(~v, ~u) ↦ ~v · ~u
is an inner product (or scalar product, or dot product) if
a) (linearity in each argument)
(α~v1 + β~v2) · ~u = α ~v1 · ~u + β ~v2 · ~u,
~u · (α~v1 + β~v2) = α ~u · ~v1 + β ~u · ~v2,
∀~v1, ~v2, ~u ∈ V, ∀α, β ∈ R;
b) (symmetry)
~v · ~u = ~u · ~v ∀~v, ~u ∈ V;
c) (positive-definiteness)
~v · ~v ≥ 0 ∀~v ∈ V, and ~v · ~v = 0 ⇒ ~v = ~0.
In that case, (V, ·) is a Euclidean space.
Example

Example 6.1
Show that

(x1, x2) · (y1, y2) = 2x1 y1 + x2 y1 + x1 y2 + x2 y2

defines an inner product on the vector space R².
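One way to check Example 6.1 numerically is to encode the coefficients of the bilinear form in a matrix and apply the two conditions that characterize an inner product; a minimal NumPy sketch (Sylvester's criterion, stated later in the unit as Theorem 6.1, is used for positive-definiteness):

```python
import numpy as np

# Matrix of the form (x1,x2)·(y1,y2) = 2*x1*y1 + x2*y1 + x1*y2 + x2*y2:
# entry (i, j) is the coefficient of x_i * y_j.
G = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# Symmetry of the form <=> G is a symmetric matrix.
symmetric = np.allclose(G, G.T)

# Positive-definiteness <=> both leading principal minors are positive.
minors = [np.linalg.det(G[:k, :k]) for k in (1, 2)]
positive_definite = all(m > 0 for m in minors)

print(symmetric, positive_definite)  # both hold, so this is an inner product
```

The minors here are 2 and 2·1 − 1·1 = 1, both positive, so the form is indeed an inner product.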


Scalar square

Definition 6.2
Given a Euclidean space (V, ·), the map

V → R
~v ↦ ~v² = ~v · ~v

is the associated scalar square.

Example 6.2
Find the scalar square associated to the scalar product

(x1, x2) · (y1, y2) = 2x1 y1 + x2 y1 + x1 y2 + x2 y2,

and compute (1, 0)².
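The scalar square of Example 6.2 can be evaluated directly as Xᵗ G X; a short sketch reusing the same coefficient matrix:

```python
import numpy as np

G = np.array([[2.0, 1.0],
              [1.0, 1.0]])

def scalar_square(v):
    """Scalar square X^t G X for the inner product of Example 6.2."""
    v = np.asarray(v, dtype=float)
    return float(v @ G @ v)

# Expanding X^t G X gives (x1, x2)^2 = 2*x1^2 + 2*x1*x2 + x2^2.
print(scalar_square((1.0, 0.0)))  # (1, 0)^2 = 2
```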


Contents

1 Euclidean vector space.

2 Matrix associated to an inner product.

3 Metric properties.

4 Orthogonality.

5 Orthogonal diagonalization.
Matrix form

Given a Euclidean space (V, ·) and a basis B = {~v1, ~v2, . . . , ~vn} of V, we
rewrite the inner product in matrix form as

~u · ~w = Xᵗ G Y,

where ~u = x1~v1 + x2~v2 + · · · + xn~vn, ~w = y1~v1 + y2~v2 + · · · + yn~vn, and

      ⎛ ~v1·~v1  ~v1·~v2  …  ~v1·~vn ⎞
  G = ⎜ ~v2·~v1  ~v2·~v2  …  ~v2·~vn ⎟ ;   X = (x1, x2, …, xn)ᵗ ;   Y = (y1, y2, …, yn)ᵗ.
      ⎜   ⋮        ⋮      ⋱    ⋮    ⎟
      ⎝ ~vn·~v1  ~vn·~v2  …  ~vn·~vn ⎠

The matrix G is known as the Gram matrix (or Grammian matrix, or
Gramian) associated to the inner product · : V × V → R.
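In coordinates, the matrix form is a plain matrix–vector product; a minimal NumPy sketch reusing the Gram matrix of Example 6.1 and comparing against the coefficient formula:

```python
import numpy as np

# Gram matrix of Example 6.1 w.r.t. the standard basis.
G = np.array([[2.0, 1.0],
              [1.0, 1.0]])

def inner(u, w):
    """Inner product u · w = X^t G Y, with X, Y the coordinate columns."""
    return float(np.asarray(u, float) @ G @ np.asarray(w, float))

# Must match 2*x1*y1 + x2*y1 + x1*y2 + x2*y2 on any pair of vectors.
u, w = (1.0, 2.0), (3.0, -1.0)
direct = 2*u[0]*w[0] + u[1]*w[0] + u[0]*w[1] + u[1]*w[1]
print(inner(u, w), direct)  # both equal 9
```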
Gram matrix

Properties:
Since the inner product is symmetric, gij = ~vi · ~vj = ~vj · ~vi = gji and,
therefore, G is a symmetric matrix.
The inner product is positive-definite, so that gii = ~vi · ~vi > 0.
Moreover,

(x1, x2, . . . , xn) · (y1, y2, . . . , yn) = Σᵢ₌₁ⁿ Σⱼ₌₁ⁿ gij xi yj;

in other words, gij is the coefficient associated to the term xi yj in the
expression of the inner product.
In terms of the Gram matrix, the scalar square is

(x1, x2, . . . , xn)² = Σᵢ₌₁ⁿ gii xi² + 2 Σᵢ₌₁ⁿ⁻¹ Σⱼ₌ᵢ₊₁ⁿ gij xi xj.
Examples

Example 6.3
Given the inner product on R³

(x1, x2, x3) · (y1, y2, y3) = x1 y1 + 6x2 y2 + 7x3 y3 + 2x1 y2 + 2x2 y1 − 2x2 y3 − 2x3 y2,

find its Gram matrix for the standard basis.

Example 6.4
Consider a scalar square on R² defined by

(x, y)² = 3x² − xy + 2y².

Compute the Gram matrix associated to its corresponding inner product
for the standard basis.
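For a problem like Example 6.4, the diagonal of G carries the squared-term coefficients and each off-diagonal entry is half the cross-term coefficient (the factor 2 in the scalar-square formula above). A quick numerical spot-check of that reading:

```python
import numpy as np

# (x, y)^2 = 3x^2 - xy + 2y^2  =>  g11 = 3, g22 = 2, g12 = g21 = -1/2.
G = np.array([[ 3.0, -0.5],
              [-0.5,  2.0]])

def sq(x, y):
    v = np.array([x, y])
    return float(v @ G @ v)

# Compare X^t G X with the polynomial on a few sample points.
samples = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (2.0, -1.0)]
ok = all(abs(sq(x, y) - (3*x*x - x*y + 2*y*y)) < 1e-12 for x, y in samples)
print(ok)
```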
Characterization of a Gram matrix

Theorem 6.1 (Sylvester’s criterion)
A ∈ Mn(R) is the Gram matrix associated to some inner product if and
only if A is symmetric and all its leading principal minors are positive:

      ⎛ a11  a12  …  a1i ⎞
  det ⎜ a12  a22  …  a2i ⎟ > 0   ∀i = 1, 2, . . . , n.
      ⎜  ⋮    ⋮   ⋱   ⋮  ⎟
      ⎝ a1i  a2i  …  aii ⎠
Example

Example 6.5
Decide which of the following matrices are associated to an inner product:

      ⎛  5  1   2 ⎞        ⎛ 2  0   1 ⎞
  A = ⎜ −1  1  −1 ⎟ ;  B = ⎜ 0  8   2 ⎟ ;
      ⎝ −2  1   3 ⎠        ⎝ 1  2  −2 ⎠

      ⎛  1  1  −1 ⎞        ⎛ 1  3  1 ⎞
  C = ⎜  1  2   0 ⎟ ;  D = ⎜ 3  0  1 ⎟ .
      ⎝ −1  0   3 ⎠        ⎝ 1  1  1 ⎠
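Sylvester's criterion can be applied mechanically to the four matrices of Example 6.5; a sketch that decides each case numerically:

```python
import numpy as np

def is_gram(A, tol=1e-12):
    """Symmetric with all leading principal minors positive (Theorem 6.1)."""
    A = np.asarray(A, float)
    if not np.allclose(A, A.T):
        return False
    n = A.shape[0]
    return all(np.linalg.det(A[:k, :k]) > tol for k in range(1, n + 1))

A = [[5, 1, 2], [-1, 1, -1], [-2, 1, 3]]   # not symmetric
B = [[2, 0, 1], [0, 8, 2], [1, 2, -2]]     # symmetric, but det(B) = -48 < 0
C = [[1, 1, -1], [1, 2, 0], [-1, 0, 3]]    # minors 1, 1, 1: all positive
D = [[1, 3, 1], [3, 0, 1], [1, 1, 1]]      # second minor is -9 < 0

print([is_gram(M) for M in (A, B, C, D)])  # only C defines an inner product
```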
Change of basis

Let (V, ·) be a Euclidean space, and consider two bases B and B′ of V.
If G is the Gram matrix of the inner product w.r.t. B, whereas H is the
one w.r.t. B′, then

H = Pᵗ G P,

where P is the change-of-basis matrix corresponding to changing from B
to B′.

Example 6.6
Consider the following inner product:

(x1, x2) · (y1, y2) = 2x1 y1 + x2 y1 + x1 y2 + x2 y2 for (x1, x2), (y1, y2) ∈ R².

Find its Gram matrices first for the standard basis and then for the basis
B = {(1, 1), (−1, 1)}.
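The change-of-basis formula H = Pᵗ G P can be checked on Example 6.6; a sketch where the columns of P are the coordinates of the new basis vectors in the standard basis:

```python
import numpy as np

# Gram matrix w.r.t. the standard basis (read off from the coefficients).
G = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# Columns of P: coordinates of B = {(1,1), (-1,1)} in the standard basis.
P = np.array([[1.0, -1.0],
              [1.0,  1.0]])

H = P.T @ G @ P
print(H)
# Each entry of H is the inner product of the corresponding new basis
# vectors, e.g. H[0, 0] = (1,1)·(1,1) = 2 + 1 + 1 + 1 = 5.
```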
Contents

1 Euclidean vector space.

2 Matrix associated to an inner product.

3 Metric properties.

4 Orthogonality.

5 Orthogonal diagonalization.
Length

Definition 6.3
In a Euclidean space (V, ·), the length (or norm) of a vector ~v ∈ V is
the real number

||~v|| = √(~v · ~v) = √(~v²).

Any vector ~v ∈ V whose length is 1 (i.e., ||~v|| = 1) is a unit vector.
For each vector ~v ≠ ~0, its normalized vector (or associated unit vector,
or versor) is the unit vector codirectional with ~v:

~v / ||~v||.
Metric properties

Theorem 6.2 (Cauchy-Schwarz inequality)
For all vectors ~v, ~w in a Euclidean space (V, ·), it holds that

|~v · ~w| ≤ ||~v|| ||~w||.

Corollary
In the above inequality, the two sides are equal if and only if ~v and ~w are
linearly dependent.
Angle between two vectors
As an immediate consequence of the Cauchy-Schwarz inequality,

−1 ≤ (~v · ~w) / (||~v|| ||~w||) ≤ 1   ∀~v, ~w ∈ V \ {~0}.

This property gives sense to the following definition.

Definition 6.4
The angle between two vectors ~v, ~w ∈ V \ {~0} is the unique real
number θ ∈ [0, π] such that

cos(θ) = (~v · ~w) / (||~v|| ||~w||).

Remark 6.1
Multiplying both sides in the above equation by the lengths of the vectors,

~v · ~w = ||~v|| ||~w|| cos(θ)   ∀~v, ~w ∈ V.
The metric depends on the inner product

Example 6.7
Consider the inner product on R³ whose associated Gram matrix w.r.t. the
natural basis is

      ⎛  1   1  −1 ⎞
  G = ⎜  1   2  −1 ⎟ .
      ⎝ −1  −1   2 ⎠

Compute the angle between (1, 0, 0) and (0, 1, 0).
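For Example 6.7, the angle follows directly from the Gram matrix; a sketch (note that with this metric the standard basis vectors are no longer orthogonal):

```python
import numpy as np

G = np.array([[ 1.0,  1.0, -1.0],
              [ 1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])

def inner(u, w):
    return float(np.asarray(u, float) @ G @ np.asarray(w, float))

def angle(u, w):
    """theta in [0, pi] with cos(theta) = (u·w) / (||u|| ||w||)."""
    c = inner(u, w) / np.sqrt(inner(u, u) * inner(w, w))
    return float(np.arccos(np.clip(c, -1.0, 1.0)))

e1, e2 = (1, 0, 0), (0, 1, 0)
# e1·e2 = g12 = 1, ||e1|| = 1, ||e2|| = sqrt(2)  =>  cos(theta) = 1/sqrt(2).
print(angle(e1, e2))  # pi/4
```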
Contents

1 Euclidean vector space.

2 Matrix associated to an inner product.

3 Metric properties.

4 Orthogonality.

5 Orthogonal diagonalization.
Orthogonal vectors

Definition 6.5
In a Euclidean space (V, ·), two vectors ~u, ~v are orthogonal, ~u ⊥ ~v, if
~u · ~v = 0.

Theorem 6.3
In a Euclidean space (V, ·), consider an indexed set of vectors
S = {~v1, ~v2, . . . , ~vn} ⊂ V \ {~0}. If ~vi ⊥ ~vj for all i ≠ j, then S is
linearly independent.
Orthogonal projection

Theorem 6.4
Consider a Euclidean space (V, ·) and a vector ~v ∈ V \ {~0}. Then, each
vector ~u ∈ V can be written uniquely in the form ~u = α~v + ~w with α ∈ R
and ~w ∈ V such that ~w ⊥ ~v.

Example 6.8
Let us consider R³ endowed with the usual inner product. Denoting
~v = (2, −3, 1), write the vector ~u = (1, 3, 5) as the sum of a vector
proportional to ~v and another orthogonal to it.
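Example 6.8 is solved by the projection coefficient α = (~u · ~v)/(~v · ~v), after which ~w = ~u − α~v is orthogonal to ~v by construction; a sketch with the usual dot product:

```python
import numpy as np

v = np.array([2.0, -3.0, 1.0])
u = np.array([1.0,  3.0, 5.0])

alpha = (u @ v) / (v @ v)   # coefficient of the component along v
w = u - alpha * v           # remainder, orthogonal to v by construction

print(alpha, w, w @ v)
# alpha = -1/7 and w·v = 0, so u = alpha*v + w is the required decomposition.
```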
Orthogonal and orthonormal bases

Motivation:
Let us consider a Euclidean space (V, ·).
We look for a basis of V for which the Gram matrix associated to the
inner product is as simple as possible: a diagonal matrix or, even better,
the identity.
Thus, we search for a basis of V,

B = {~v1, ~v2, . . . , ~vn},

such that

~vi · ~vj = 1 if i = j,   and   ~vi · ~vj = 0 if i ≠ j.
Orthogonal and orthonormal bases

Definition 6.6
In a finite-dimensional Euclidean space, a basis

B = {~v1, ~v2, . . . , ~vn}

is an orthogonal basis if ~vi · ~vj = 0 for all i ≠ j. If, in addition, each
vector ~vi is a unit vector, B is an orthonormal basis.

Notice that the Gram matrix w.r.t. an orthogonal basis is diagonal,
whereas w.r.t. an orthonormal basis it is the identity.
Gram-Schmidt process
Let (V, ·) be a Euclidean space and B = {~e1, ~e2, . . . , ~en} a basis of V.
We can obtain an orthogonal basis B′ = {~v1, ~v2, . . . , ~vn} as follows:

For p = 1, 2, . . . , n, we write ~ep as a linear combination of ~v1, ~v2, . . . , ~vp,

~ep = α1~v1 + α2~v2 + · · · + αp−1~vp−1 + ~vp,

and impose that ~vp ⊥ ~vi for all i ∈ {1, 2, . . . , p − 1}; or, equivalently,

αi = (~ep · ~vi) / (~vi · ~vi)   ∀i = 1, 2, . . . , p − 1.

Moreover, from the orthogonal basis B′ = {~v1, ~v2, . . . , ~vn}, we can obtain
the orthonormal basis

{ ~v1/||~v1||, ~v2/||~v2||, . . . , ~vn/||~vn|| }.
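The process translates almost line-by-line into code; a sketch (a slight variant that normalizes each vector as it is produced, so the subtraction coefficients reduce to plain inner products) that works for any Gram matrix G — here the one from Example 6.9:

```python
import numpy as np

G = np.array([[ 1.0,  1.0, -1.0],
              [ 1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])

def inner(u, w):
    return float(u @ G @ w)

def gram_schmidt(basis):
    """Return a basis orthonormal w.r.t. the inner product defined by G."""
    ortho = []
    for e in basis:
        v = np.array(e, dtype=float)
        for vi in ortho:              # subtract components along the unit
            v -= inner(v, vi) * vi    # vectors already built
        ortho.append(v / np.sqrt(inner(v, v)))  # normalize
    return ortho

B = gram_schmidt(np.eye(3))  # start from the natural basis of R^3
# Orthonormality check: the pairwise inner products form the identity.
gram = np.array([[inner(u, w) for w in B] for u in B])
print(np.round(gram, 10))
```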
Example

Example 6.9
Consider the vector space R³ endowed with the inner product whose Gram
matrix w.r.t. the natural basis is

      ⎛  1   1  −1 ⎞
  G = ⎜  1   2  −1 ⎟ .
      ⎝ −1  −1   2 ⎠

Apply the Gram-Schmidt process to deduce an orthonormal basis of R³.


Orthogonal subspaces

Definition 6.7
Two subspaces U, W of a Euclidean space are orthogonal subspaces,
U ⊥ W, if ~u ⊥ ~w for all ~u ∈ U and ~w ∈ W.
In particular, a vector ~u is orthogonal to W if ~u ⊥ ~w for all ~w ∈ W.

Theorem 6.5
Let (V, ·) be a Euclidean space and U a subspace of V. A vector ~b ∈ V
is orthogonal to U if and only if it is orthogonal to each vector of a basis
of U.
The orthogonal complement

Definition 6.8
Let U be a subspace of a Euclidean space (V, ·). The orthogonal
complement of U is the set of all vectors that are orthogonal to those of
U,

U⊥ = {~w ∈ V ; ~w ⊥ ~u ∀~u ∈ U}.

Theorem 6.6
If U is a subspace of a Euclidean space (V, ·), then U⊥ is also a subspace
of V and it satisfies

U ∩ U⊥ = {~0}.
Orthogonal projection

Let us consider a Euclidean space (V, ·) (not necessarily
finite-dimensional), U a vector subspace of V with dim(U) = m, and
~v ∈ V a given vector.
We look for a vector ~u ∈ U such that ~v − ~u is orthogonal to U:

~v − ~u ∈ U⊥.

In this case, ~u is called the orthogonal projection of ~v onto U, and it is
denoted by ~u = projU(~v) or ~u = πU(~v).

Theorem 6.7
Under the above conditions and when ~v ∉ U,

||~v − ~w|| > ||~v − projU(~v)||   ∀~w ∈ U with ~w ≠ projU(~v).
Orthogonal projection

How to find projU(~v):
Let B = {~e1, ~e2, . . . , ~em} be a basis of U.
By definition, ~u = projU(~v) ∈ U, so it must be of the form

~u = c1~e1 + c2~e2 + · · · + cm~em.

Moreover, ~v − ~u should be orthogonal to U:

~v − (c1~e1 + c2~e2 + · · · + cm~em) ∈ U⊥;

equivalently,

(~v − (c1~e1 + c2~e2 + · · · + cm~em)) ⊥ ~ej   ∀j = 1, 2, . . . , m.

Imposing these m conditions leads to a linear system that characterizes
c1, c2, . . . , cm.
Example

Example 6.10
Consider the vector space C([0, 1]) endowed with the inner product

p · q = ∫₀¹ p(x) q(x) dx   ∀p, q ∈ C([0, 1]).

Find the orthogonal projection of the function f(x) = sin(πx) ∈ C([0, 1])
onto the subspace R2[x].
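A numerical sketch of Example 6.10: with the monomial basis {1, x, x²} of R2[x], the Gram matrix is the 3×3 Hilbert matrix (entries ∫₀¹ xⁱ⁺ʲ dx = 1/(i+j+1)) and the projection coefficients solve G c = b. The integrals are approximated by a midpoint rule on a fine grid, which is assumed accurate enough here:

```python
import numpy as np

N = 200_000
x = (np.arange(N) + 0.5) / N        # midpoints of a uniform grid on [0, 1]
f = np.sin(np.pi * x)
basis = [x**0, x**1, x**2]          # monomial basis of R2[x]

def ip(p, q):
    """p·q = integral_0^1 p(x) q(x) dx, midpoint rule on N subintervals."""
    return float(np.mean(p * q))

G = np.array([[ip(p, q) for q in basis] for p in basis])  # Hilbert matrix
b = np.array([ip(f, p) for p in basis])                   # moments of sin(pi x)

c = np.linalg.solve(G, b)           # coefficients of proj(f) in {1, x, x^2}
proj = c[0] + c[1] * x + c[2] * x**2

print(c)
# Since sin(pi x) is symmetric about x = 1/2 and R2[x] is closed under
# x -> 1-x, the projection satisfies c[1] = -c[2] (up to quadrature error).
```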
Example

[Figure: orthogonal projection of sin(πx) onto R2[x] on [0, 1]]


Contents

1 Euclidean vector space.

2 Matrix associated to an inner product.

3 Metric properties.

4 Orthogonality.

5 Orthogonal diagonalization.
Symmetric operators

Definition 6.9
Let (V, ·) be a Euclidean space, and T : V → V an endomorphism of V.
The operator T is symmetric if

~u · T(~v) = T(~u) · ~v   ∀~u, ~v ∈ V.

If T is a symmetric endomorphism of V, and G and A denote the matrices
associated to · and T w.r.t. a basis B, then

G A = Aᵗ G = (G A)ᵗ.

In particular, if B is orthonormal, then G = I and, therefore, A is
symmetric.
Example

Example 6.11
Consider the vector space R³ endowed with the inner product whose Gram
matrix for the natural basis is

      ⎛ 2  2  2 ⎞
  G = ⎜ 2  4  0 ⎟ .
      ⎝ 2  0  3 ⎠

Let T be the endomorphism of R³ whose matrix for the canonical basis is

      ⎛ 1  2  a ⎞
  A = ⎜ 2  1  b ⎟ .
      ⎝ 2  2  3 ⎠

Find the values of a, b ∈ R for which T is symmetric.
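The symmetry condition G A = (G A)ᵗ of Example 6.11 reduces to two linear equations in a and b (the other off-diagonal entries of GA are symmetric already); a sketch that solves and verifies them:

```python
import numpy as np

G = np.array([[2.0, 2.0, 2.0],
              [2.0, 4.0, 0.0],
              [2.0, 0.0, 3.0]])

def A_of(a, b):
    return np.array([[1.0, 2.0, a],
                     [2.0, 1.0, b],
                     [2.0, 2.0, 3.0]])

# Imposing (GA)[0,2] = (GA)[2,0] and (GA)[1,2] = (GA)[2,1] gives
#   2a + 2b + 6 = 8   and   2a + 4b = 10:
M = np.array([[2.0, 2.0],
              [2.0, 4.0]])
rhs = np.array([2.0, 10.0])
a, b = np.linalg.solve(M, rhs)

GA = G @ A_of(a, b)
print(a, b, np.allclose(GA, GA.T))  # a = -3, b = 4 make T symmetric
```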
Basic properties of symmetric operators

Theorem 6.8
The characteristic polynomial of a symmetric endomorphism has the form

p(λ) = (λ1 − λ)^n1 (λ2 − λ)^n2 · · · (λp − λ)^np   with λi ∈ R.

Consequence: every symmetric endomorphism is diagonalizable over R.

Theorem 6.9
Let (V, ·) be a Euclidean space, and T : V → V a symmetric
endomorphism. For any pair of distinct eigenvalues of T, the
corresponding eigenspaces are orthogonal: S(α) ⊥ S(β) for all α ≠ β.

Corollary
Every symmetric endomorphism is diagonalizable in an orthonormal basis.
Such a diagonalization is called orthonormal diagonalization.
Example

Example 6.12
Consider the inner product · : R³ × R³ → R whose Gram matrix for the
standard basis is

         ⎛  2  −3   3 ⎞
  G =    ⎜ −3   5  −4 ⎟ ,
         ⎝  3  −4   6 ⎠

and let T denote the endomorphism of R³ whose associated matrix for the
standard basis is

         ⎛  3   α   0 ⎞
  M(T) = ⎜  0  −1   0 ⎟ .
         ⎝ −2   4  −1 ⎠

Find the values of α ∈ R for which T is symmetric. For such values,
diagonalize T in an orthonormal basis.
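A numerical sketch for Example 6.12: imposing (GM)[0,1] = (GM)[1,0] gives 2α + 15 = −1, i.e. α = −8 (the remaining entries then match automatically). The orthonormal diagonalization can be computed by reducing the G-symmetric problem S v = λ G v, with S = GM, to an ordinary symmetric eigenproblem through the Cholesky factor G = L Lᵗ:

```python
import numpy as np

G = np.array([[ 2.0, -3.0,  3.0],
              [-3.0,  5.0, -4.0],
              [ 3.0, -4.0,  6.0]])

alpha = -8.0   # the value making G @ M symmetric
M = np.array([[ 3.0, alpha,  0.0],
              [ 0.0,  -1.0,  0.0],
              [-2.0,   4.0, -1.0]])

S = G @ M
assert np.allclose(S, S.T)   # T is symmetric w.r.t. this inner product

# S v = lam G v  <=>  (L^-1 S L^-t) y = lam y  with  v = L^-t y.
L = np.linalg.cholesky(G)
Linv = np.linalg.inv(L)
eigvals, Y = np.linalg.eigh(Linv @ S @ Linv.T)
V = Linv.T @ Y               # columns: eigenvectors of T, G-orthonormal

print(np.round(eigvals, 10))                 # eigenvalues of T
print(np.allclose(V.T @ G @ V, np.eye(3)))   # basis orthonormal w.r.t. G
print(np.allclose(M @ V, V @ np.diag(eigvals)))
```

Here `np.linalg.eigh` diagonalizes the reduced symmetric matrix, and the back-substitution v = L⁻ᵗ y makes the eigenvector columns orthonormal for the metric G rather than the standard one.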
