Module 6 - Inner Product Spaces
Inner Product
Gram-Schmidt Procedure
QR-Decomposition
1. Do you think of vectors as mathematical objects that can be drawn
as directed line segments?
• Is a·b = b·a?
• Write two different formulas for a·b.
• Note:
An example helps make an abstract concept more concrete.
An example may help you think more generally, but it is not enough to
validate a general claim.
Dot Product in Rn
• Find a·a.
• In which cases does a·a = 0?
• Symmetry
• Homogeneity
• Additivity
• Positivity
Norm of a Vector, Angle between Two Vectors
• Find the norm of a vector a in Rn:
a = (a₁, a₂), (a₁, a₂, a₃), (a₁, a₂, a₃, a₄), …, (a₁, a₂, …, aₙ)
[Figure: the spaces R, R², R³, R⁴, …, Rⁿ, each with vector addition (+) and scalar multiplication over R.]
Inner Product Space
The dot product u·v on Rn generalizes to an inner product <u, v> on an abstract vector space, and the notions built from it, such as the projection proj_y x, carry over as well.
An inner product <·,·> on a vector space V over R assigns a real number to each pair of vectors and satisfies four axioms: symmetry, additivity, homogeneity, and positivity.
A vector space over real numbers with an inner product is called an inner product space
Euclidean Inner Product
Let u and v be any two vectors in R³, and define <u, v> = u·v (the dot product).
Proving properties 3 and 4 is left as an exercise/homework. R³ with this inner
product is an inner product space.
Euclidean Inner Product
The dot product is an inner product on Rn, and it is called the Euclidean inner
product. Rn with the Euclidean inner product is called the Euclidean n-space.
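As a quick illustration, here is a minimal NumPy sketch of the Euclidean inner product and the norm it induces (the vector v is an arbitrary example, not from the slides):

```python
import numpy as np

# Euclidean inner product on R^n: <u, v> = u . v
u = np.array([2.0, 3.0, 1.0])
v = np.array([1.0, 0.0, 4.0])           # arbitrary example vector

dot = np.dot(u, v)                      # <u, v> = 2*1 + 3*0 + 1*4 = 6
norm_u = np.sqrt(np.dot(u, u))          # ||u|| = sqrt(<u, u>) = sqrt(14)
print(dot, norm_u)
```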
Weighted Inner Product
If u and v are any two vectors in R³, define:
<u, v> = 3u₁v₁ + 2u₂v₂ + 5u₃v₃ (a weighted dot product)
Try to find whether these 4 axioms are satisfied or not.
Inner product space (over R)
1. <u, v> = <v, u>
2. <(u + v), w> = <u, w> + <v, w>
3. <k u, v> = k <u, v>
4. <v, v> > 0 where <v, v> = 0 if and only if v = 0
Proof: Positively Weighted Inner Product
<u, v> = 3u₁v₁ + 2u₂v₂ + 5u₃v₃
1. <u, v> = 3u₁v₁ + 2u₂v₂ + 5u₃v₃ = 3v₁u₁ + 2v₂u₂ + 5v₃u₃ = <v, u>
In general, a weighted inner product on Rn has the form
<u, v> = w₁u₁v₁ + w₂u₂v₂ + … + wₙuₙvₙ, with positive weights wᵢ.
Inner Product Generated by a Matrix
Let A be an invertible n×n matrix and define <u, v> = Au·Av. The 4 properties are satisfied, i.e.:
1. Au·Av = Av·Au
2. A(u + v)·Aw = Au·Aw + Av·Aw
3. A(ku)·Av = k(Au·Av)
4. Av·Av > 0, where Av·Av = 0 if and only if v = 0
Note:
Au and Av are vectors in Rn, so Au·Av is a scalar. It is easy to show that
the 4 axioms are satisfied by this inner product.
The vector space Rn with this inner product is said to be the inner product space generated by
matrix A.
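A small sketch of this construction (the matrix A below is an arbitrary invertible example of mine, chosen diagonal so that <u, v> = Au·Av reduces to the weighted inner product from the previous slide):

```python
import numpy as np

def matrix_inner(A, u, v):
    """Inner product on R^n generated by an invertible matrix A: <u, v> = (Au).(Av)."""
    return np.dot(A @ u, A @ v)

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Diagonal A with entries sqrt(w_i) reproduces the weighted inner product
# <u, v> = 3*u1*v1 + 2*u2*v2 + 5*u3*v3.
A = np.diag(np.sqrt([3.0, 2.0, 5.0]))
weighted = 3*u[0]*v[0] + 2*u[1]*v[1] + 5*u[2]*v[2]
assert np.isclose(matrix_inner(A, u, v), weighted)
```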
Exercise 2
2. If all weights (in the weighted Euclidean inner product) are 1, what kind
of inner product will you get?
Inner Product on P3
For any two vectors in P3,
p = a₀ + a₁x + a₂x² + a₃x³ and q = b₀ + b₁x + b₂x² + b₃x³,
define <p, q> = a₀b₀ + a₁b₁ + a₂b₂ + a₃b₃.
Conclusion:
The function <p, q> is an inner product on P3.
Not an Inner Product on P3
Not every formula built from the coefficients of p = a₀ + a₁x + a₂x² + a₃x³ and
q = b₀ + b₁x + b₂x² + b₃x³ defines an inner product on P3.
Conclusion:
A function <p, q> that fails any one of the four axioms (symmetry, additivity,
homogeneity, or positivity) is not an inner product on P3.
The Definite Integral: an Inner Product on C[a, b]
For continuous functions p = p(x) and q = q(x) on [a, b], define
<f, g> = ∫ₐᵇ f(x) g(x) dx.
C[a, b] with this inner product is an inner product space.
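A numerical sketch of this integral inner product on C[−1, 1] (the helper name integral_inner and the grid size are my choices; it uses a simple midpoint rule), checked against the norm ||x² + 1|| = √(56/15) from Example 2 below and the orthogonality of x and x² from Example 6 below:

```python
import numpy as np

def integral_inner(f, g, a=-1.0, b=1.0, n=100000):
    """<f, g> = integral over [a, b] of f(x)g(x) dx, via the midpoint rule."""
    h = (b - a) / n
    x = np.linspace(a, b, n, endpoint=False) + h / 2
    return np.sum(f(x) * g(x)) * h

p = lambda x: x**2 + 1
print(np.sqrt(integral_inner(p, p)), np.sqrt(56/15))   # ||p|| from Example 2

# x and x^2 are orthogonal on [-1, 1] (Example 6):
print(integral_inner(lambda x: x, lambda x: x**2))     # ~0
```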
Inner Product on Mnxn
For n×n matrices, define <A, B> = trace(AᵀB). Then Mnxn with this inner product is an inner product space.
Example 1:
(a) u = (2, 3, 1) in the Euclidean space R³:
||u|| = √(2² + 3² + 1²) = √14
(b) u = (2, 3, 1) in R³ with a weighted inner product, with weights 2, 1, 1 respectively:
||u|| = √(2·2² + 1·3² + 1·1²) = √18
Distance between Two Vectors in the Euclidean Vector Space
The distance between two vectors is the norm of the vector that is the difference
between the two vectors.
[Figure: vectors a = (a₁, a₂) and b = (b₁, b₂) in the plane, with a − b drawn between their endpoints.]
d(a, b) = ||a − b|| = √((b₁ − a₁)² + (b₂ − a₂)²)
Example 2: norm of a polynomial
p = p(x) = x² + 1 in P3, with the inner product defined as the definite integral
from −1 to 1. Then the norm of p is
||p|| = √( ∫₋₁¹ (x² + 1)² dx )
      = √( [x⁵/5 + (2/3)x³ + x] evaluated from −1 to 1 )
      = √(56/15)
Example 4: distance between two polynomials
Based on the definition of the inner product on P3:
p = a₀ + a₁x + a₂x² + a₃x³
q = b₀ + b₁x + b₂x² + b₃x³
<p, q> = a₀b₀ + a₁b₁ + a₂b₂ + a₃b₃
||p − q|| = √(<p − q, p − q>) = √(1 + 4 + 1 + 1) = √7
Properties of the Norm
(3) ||ku|| = |k| ||u||
(4) ||u + v|| ≤ ||u|| + ||v||
The angle α between two vectors u and v in Rn:
cos α = (u·v) / (||u|| ||v||)
The angle between two vectors in a general inner product space:
cos α = <u, v> / (||u|| ||v||)
Example: cosine of the angle between two vectors
Let u = (0, 1, 0) and v = (2, 1, 2) in the Euclidean space R³.
Answer:
u·v = 0·2 + 1·1 + 0·2 = 1
cos α = (u·v)/(||u|| ||v||) = 1/(1·3) = 1/3
Example 4: cosine of the angle between two vectors
Let u = (0, 1, 0), v = (2, 1, 2), where the weighted inner product on R³ is
defined as <u, v> = 1u₁v₁ + 2u₂v₂ + 3u₃v₃.
Answer:
<u, v> = 1·0·2 + 2·1·1 + 3·0·2 = 2
cos α = <u, v>/(||u|| ||v||) = 2/(√2 · √18) = 1/3
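A quick check of Example 4 (a minimal NumPy sketch; the helper name weighted_cos is mine):

```python
import numpy as np

def weighted_cos(u, v, w):
    """Cosine of the angle between u and v under <u, v> = sum_i w_i * u_i * v_i."""
    ip = lambda a, b: np.sum(w * a * b)
    return ip(u, v) / np.sqrt(ip(u, u) * ip(v, v))

u = np.array([0.0, 1.0, 0.0])
v = np.array([2.0, 1.0, 2.0])
w = np.array([1.0, 2.0, 3.0])
print(weighted_cos(u, v, w))   # 1/3
```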
Example 5: cosine of the angle between two matrices
An inner product is defined on Mnxn by <A, B> = trace(AᵀB); under it, ||A||² is the
sum of the squares of the entries of A.
Note:
||A|| = √(2² + 3² + 1² + 1²) = √15
||B|| = √(2² + 1² + 1² + 3²) = √15
Orthogonal Vectors
Example 6:
Let p = p(x) = x and q = q(x) = x² in P3, where <p, q> is defined as:
<p, q> = ∫₋₁¹ p(x) q(x) dx
Then
<p, q> = ∫₋₁¹ x · x² dx = [x⁴/4] evaluated from −1 to 1 = 0,
so p and q are orthogonal.
The Pythagorean Theorem also applies to inner product spaces.
Let u and v be orthogonal vectors, <u, v> = 0. Then
||u + v||² = <u + v, u + v>                        (norm definition)
           = <u + v, u> + <u + v, v>               (inner product property)
           = <u, u> + <v, u> + <u, v> + <v, v>     (inner product property)
           = ||u||² + 2<u, v> + ||v||²
           = ||u||² + ||v||²                       (<u, v> = 0 because they are orthogonal)
Example:
u = (0, 1, 0, 0), v = (1, 0, 1, 2). Then
||u + v||² = <u + v, u + v> = <(1, 1, 1, 2), (1, 1, 1, 2)> = 7
||u||² = <u, u> = 0 + 1 + 0 + 0 = 1
||v||² = <v, v> = 1 + 0 + 1 + 4 = 6
It is shown that ||u + v||² = ||u||² + ||v||² = 7.
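The same check in NumPy (a minimal sketch of the verification above):

```python
import numpy as np

u = np.array([0.0, 1.0, 0.0, 0.0])
v = np.array([1.0, 0.0, 1.0, 2.0])

assert np.isclose(np.dot(u, v), 0)    # u and v are orthogonal
lhs = np.dot(u + v, u + v)            # ||u + v||^2 = 7
rhs = np.dot(u, u) + np.dot(v, v)     # ||u||^2 + ||v||^2 = 1 + 6
print(lhs, rhs)                       # both 7.0
```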
Orthogonal Complement
Definition 6.6:
Let W be a subspace of the inner product space V.
A vector v is orthogonal to W if v is orthogonal to every vector in W.
The set of all vectors that are orthogonal to W is called the orthogonal
complement of W, and it is denoted by W⊥ (read: W perp).
[Figure: a line L = W⊥ through the origin, perpendicular to the plane W.]
Orthogonal Complement (cont.)
Example:
1. If V is an inner product space, determine the orthogonal complement of {0}.
Answer: V.
2. Let r₁, r₂, …, r_m be the rows of an m×n matrix A. A vector v satisfies
r₁·v = 0, r₂·v = 0, …, r_m·v = 0
exactly when Av = 0, i.e. when v solves the linear system Ax = 0; so
v is an element of Null(A).
Null(A) and Row(A) are Orthogonal Complements of Each Other
Proof:
1. Let v be any vector that is orthogonal to every vector in Row(A); then show
that Av = 0, i.e. that v is an element of Null(A).
2. If Av = 0, show that v is orthogonal to every vector in Row(A):
Av = 0 means r₁·v = r₂·v = … = r_m·v = 0.
• Coll(Aᵀ) = Row(A)
• Based on the previous explanation, substituting A with Aᵀ, we obtain
that Coll(A) and Null(Aᵀ) are orthogonal complements of each other.
Relation between Row(A), Coll(A), Null(A)
In Rn: Null(A) = [Row(A)]⊥, so Row(A) and Null(A) intersect only in 0.
In Rm: Null(Aᵀ) = [Coll(A)]⊥, so Coll(A) and Null(Aᵀ) intersect only in 0.
[Figure: Row(A) and Null(A) as perpendicular subspaces through 0 in Rn; Coll(A) and Null(Aᵀ) as perpendicular subspaces through 0 in Rm.]
Example 8:
1. {i, j, k} is an orthogonal basis of R³, where i = (1, 0, 0), j = (0, 1, 0), k = (0, 0, 1).
2. {1, x, x², x³} is an orthogonal basis of P3.
Given a basis of a finite-dimensional vector space, we can obtain an orthonormal basis using the
Gram-Schmidt procedure.
Example 9: orthonormal basis in P3
B = {1, x, x², x³} is an orthonormal basis of P3 under the inner product
<p, q> = a₀b₀ + a₁b₁ + a₂b₂ + a₃b₃.
• Proof:
Write B = {v₁, v₂, v₃, v₄} with
v₁ = 1 + 0x + 0x² + 0x³
v₂ = 0 + 1x + 0x² + 0x³
v₃ = 0 + 0x + 1x² + 0x³
v₄ = 0 + 0x + 0x² + 1x³
Every pair of its vectors is orthogonal:
<v₁, v₂> = 1·0 + 0·1 + 0·0 + 0·0 = 0
<v₁, v₃> = 1·0 + 0·0 + 0·1 + 0·0 = 0
<v₁, v₄> = <v₂, v₃> = <v₂, v₄> = <v₃, v₄> = 0
The norm of every vector is 1:
||v₁|| = √(<v₁, v₁>) = √(1·1 + 0 + 0 + 0) = 1
||v₂|| = √(<v₂, v₂>) = √(0 + 1·1 + 0 + 0) = 1, and likewise ||v₃|| = ||v₄|| = 1.
Coordinates Relative to an Orthonormal Basis
Example 10:
Let
u₁ = (0, √2/2, √2/2)
u₂ = (√6/3, −√6/6, √6/6)
u₃ = (√3/3, √3/3, −√3/3)
a = (1, 1, 1)
B = {u₁, u₂, u₃} is an orthonormal basis of V, and a is a vector in V,
so a can be expressed as a linear combination of the basis vectors:
a = <a, u₁>u₁ + <a, u₂>u₂ + <a, u₃>u₃ = √2 u₁ + (√6/3) u₂ + (√3/3) u₃
Coordinate matrix of a relative to B: [a]_B = (√2, √6/3, √3/3)
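Example 10 can be verified numerically (a minimal NumPy sketch: for an orthonormal basis, the coordinates are just the inner products <a, uᵢ>):

```python
import numpy as np

u1 = np.array([0, np.sqrt(2)/2, np.sqrt(2)/2])
u2 = np.array([np.sqrt(6)/3, -np.sqrt(6)/6, np.sqrt(6)/6])
u3 = np.array([np.sqrt(3)/3, np.sqrt(3)/3, -np.sqrt(3)/3])
a = np.array([1.0, 1.0, 1.0])

coords = np.array([np.dot(a, u) for u in (u1, u2, u3)])
print(coords)                                        # [sqrt(2), sqrt(6)/3, sqrt(3)/3]
print(coords[0]*u1 + coords[1]*u2 + coords[2]*u3)    # reconstructs [1. 1. 1.]
```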
Norm, Angle and Distance Formula
Theorem 6.5:
If B is an orthonormal basis of an inner product space, and if
[u]_B = (u₁, u₂, …, uₙ) and [v]_B = (v₁, v₂, …, vₙ),
then
(a) ||u|| = √(u₁² + u₂² + … + uₙ²)
Theorem 6.6:
If B = {u₁, u₂, …, uₙ} is an orthogonal basis of the inner product space V,
and a is an element of V, then
a = (<a, u₁>/||u₁||²) u₁ + (<a, u₂>/||u₂||²) u₂ + … + (<a, uₙ>/||uₙ||²) uₙ
Example 12:
B = {(0, 1, 1), (1, −1/2, 1/2), (2/3, 2/3, −2/3)}, a = (1, 1, 1)
B is an orthogonal basis of V and a is a vector in V, so a can be expressed as a linear
combination of the basis vectors:
a = 1·(0, 1, 1) + (2/3)(1, −1/2, 1/2) + (1/2)(2/3, 2/3, −2/3), so [a]_B = (1, 2/3, 1/2)
Projection Theorem
[Figure: u decomposed as w₁ + w₂ relative to a line W and relative to a plane W through the origin O.]
If W is a line or plane through the origin, then every vector u in
the Euclidean vector space can be expressed as u = w₁ + w₂,
where w₁ lies in (is inside of) W and w₂ is perpendicular to W, with w₁ = proj_W u.
Gram-Schmidt Procedure
If B = {u₁, u₂, …, uₙ} is a basis, take:
v₁ = u₁
v₂ = u₂ − proj_{W₁} u₂ = u₂ − (<u₂, v₁>/||v₁||²) v₁
v₃ = u₃ − proj_{W₂} u₃ = u₃ − (<u₃, v₁>/||v₁||²) v₁ − (<u₃, v₂>/||v₂||²) v₂
⋮
vₙ = uₙ − proj_{W_{n−1}} uₙ = uₙ − Σᵢ₌₁ⁿ⁻¹ (<uₙ, vᵢ>/||vᵢ||²) vᵢ
where W₁ = span({v₁}), W₂ = span({v₁, v₂}), and so on.
[Figure: v₂ = u₂ − proj_{W₁} u₂, with v₁ = u₁ spanning the line W₁.]
B' = {v₁, v₂, …, vₙ} is an orthogonal basis.
B'' = {v₁/||v₁||, v₂/||v₂||, …, vₙ/||vₙ||} is an orthonormal basis.
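A compact implementation of the procedure (a sketch in NumPy; the function name gram_schmidt is mine, and it assumes the input vectors are linearly independent). Applied to the basis of Example 13 below, it reproduces the orthonormal basis computed there:

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors into an orthonormal list."""
    ortho = []
    for u in vectors:
        v = u.astype(float)
        for w in ortho:              # subtract the projection onto each earlier direction;
            v = v - np.dot(u, w) * w # w is already unit length, so proj_w u = <u, w> w
        ortho.append(v / np.linalg.norm(v))
    return ortho

basis = [np.array([0, 1, 1]), np.array([1, 0, 1]), np.array([1, 1, 0])]
for q in gram_schmidt(basis):
    print(q)   # (0, sqrt2/2, sqrt2/2), (sqrt6/3, -sqrt6/6, sqrt6/6), (sqrt3/3, sqrt3/3, -sqrt3/3)
```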
Example 13: Gram-Schmidt Process
Given a basis of R³, B = {(0, 1, 1), (1, 0, 1), (1, 1, 0)}. Change the basis into an
orthonormal basis using the Gram-Schmidt process.
1. Take v₁ = u₁:
v₁ = (0, 1, 1)
2. v₂ = u₂ − proj_{W₁} u₂ = u₂ − (<u₂, v₁>/||v₁||²) v₁:
v₂ = (1, 0, 1) − ((0·1 + 1·0 + 1·1)/(0 + 1 + 1)) (0, 1, 1)
   = (1, 0, 1) − (1/2)(0, 1, 1) = (1, −1/2, 1/2)
3. v₃ = u₃ − proj_{W₂} u₃ = u₃ − (<u₃, v₁>/||v₁||²) v₁ − (<u₃, v₂>/||v₂||²) v₂:
v₃ = (1, 1, 0) − (1/2)(0, 1, 1) − (1/3)(1, −1/2, 1/2) = (2/3, 2/3, −2/3)
4. The orthogonal basis is obtained: B' = {v₁, v₂, v₃}
5. Orthonormal basis: B'' = {v₁/||v₁||, v₂/||v₂||, v₃/||v₃||}
   = {(0, √2/2, √2/2), (√6/3, −√6/6, √6/6), (√3/3, √3/3, −√3/3)}
6.5 QR-Decomposition
Gram-Schmidt Process on Matrices with Linearly Independent Columns
• Let matrix A have linearly independent columns u₁, u₂, …, uₙ, so that they form a
basis for Coll(A): B = {u₁, u₂, …, uₙ}. Let {p₁, p₂, …, pₙ} be the orthonormal basis
obtained from B by Gram-Schmidt. Each column can be expanded in this basis:
u₁ = <u₁, p₁>p₁ + <u₁, p₂>p₂ + … + <u₁, pₙ>pₙ
u₂ = <u₂, p₁>p₁ + <u₂, p₂>p₂ + … + <u₂, pₙ>pₙ
⋮
uₙ = <uₙ, p₁>p₁ + <uₙ, p₂>p₂ + … + <uₙ, pₙ>pₙ
In matrix form, A = [u₁ u₂ … uₙ] = [p₁ p₂ … pₙ] R, with
R =
[ <u₁, p₁>  <u₂, p₁>  …  <uₙ, p₁> ]
[ <u₁, p₂>  <u₂, p₂>  …  <uₙ, p₂> ]
[    ⋮          ⋮              ⋮   ]
[ <u₁, pₙ>  <u₂, pₙ>  …  <uₙ, pₙ> ]
Because pᵢ is orthogonal to span{u₁, …, uᵢ₋₁} by construction, <uⱼ, pᵢ> = 0 for i > j,
so the entries below the diagonal vanish and R is upper triangular.
Theorem 6.9:
If A is a matrix with linearly independent column vectors, then A can be
decomposed into a product of two matrices, A = QR, where Q is a matrix with
orthonormal columns and R is an invertible upper triangular matrix.
Example: QR-decomposition of
A =
[ 1 0 0 ]
[ 1 1 0 ]
[ 1 1 1 ]
[ 1 1 1 ]
The Gram-Schmidt process & normalization applied to the columns of A give
v₁ = (1/2)(1, 1, 1, 1), v₂ = (1/√12)(−3, 1, 1, 1), v₃ = (1/√6)(0, −2, 1, 1),
so
Q =
[ 1/2  −3/√12    0    ]
[ 1/2   1/√12  −2/√6  ]
[ 1/2   1/√12   1/√6  ]
[ 1/2   1/√12   1/√6  ]
Since the columns of Q are orthonormal, QᵀQ = I, and therefore
R = IR = (QᵀQ)R = Qᵀ(QR) = QᵀA:
R =
[ 2   3/2     1     ]
[ 0   3/√12  2/√12  ]
[ 0   0      2/√6   ]
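This decomposition can be checked against a library routine (a sketch; note that np.linalg.qr may return Q and R with some columns and rows negated, since the signs in a QR decomposition are only fixed up to that choice):

```python
import numpy as np

A = np.array([[1, 0, 0],
              [1, 1, 0],
              [1, 1, 1],
              [1, 1, 1]], dtype=float)

Q, R = np.linalg.qr(A)   # reduced QR: Q is 4x3 with orthonormal columns, R is 3x3 upper triangular
print(Q)
print(R)
assert np.allclose(Q @ R, A)
assert np.allclose(Q.T @ Q, np.eye(3))
```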
QR-Decomposition Procedure
• The set consisting of the columns of A, i.e. {c1, c2, …, cn}, forms a basis for
Coll(A).
• If A is an invertible n×n matrix, then Coll(A) = Rn.
• After applying the G-S process to B and normalizing, we obtain an
orthonormal basis for Rn.
Orthogonal Matrix
Definition 6.8:
An n×n matrix A is orthogonal if and only if Aᵀ = A⁻¹,
which means AᵀA = I. Writing A = [c1 c2 … cn],
(I)ij = ciᵀcj = ci·cj
so ci·cj = 0 if i ≠ j, and
ci·cj = 1 if i = j, where ci·ci = ||ci||².
Conclusion:
The set consisting of the column vectors of A is orthonormal; the set
consisting of the rows of A is also orthonormal.
Orthogonal Matrix Properties
If A is orthogonal, then det(A) = ±1 and ||Ax|| = ||x||.
Proof:
• det(A) = det(Aᵀ) = det(A⁻¹) = 1/det(A), so det(A) = 1 or −1.
• ||Ax|| = √((Ax)ᵀ(Ax)) = √(xᵀAᵀAx) = √(xᵀIx) = √(xᵀx) = ||x||
Example 15: Orthogonal Matrices
[ 0 1 0 ]
[ 1 0 0 ]     (a permutation matrix)
[ 0 0 1 ]

[ cos θ  −sin θ ]
[ sin θ   cos θ ]     (a rotation matrix)

(1/√2) ·
[  1  0  1 ]
[  0  √2 0 ]
[ −1  0  1 ]
In each case the columns form an orthonormal set.
Properties of Orthogonal Matrices
A = [a₁ a₂ … aₙ]
[I]ij = [AᵀA]ij = aᵢᵀaⱼ = 1 if i = j, and 0 if i ≠ j.
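A quick numerical check of these properties (a sketch; the rotation angle below is an arbitrary choice of mine):

```python
import numpy as np

theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation matrix from Example 15

assert np.allclose(A.T @ A, np.eye(2))            # A^T = A^{-1}
print(np.linalg.det(A))                           # +1 (a reflection would give -1)
x = np.array([3.0, 4.0])
print(np.linalg.norm(A @ x), np.linalg.norm(x))   # ||Ax|| = ||x|| = 5
```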
Straight Line and Parabola
[Figure: points (0, 0), (1, 2), (2, 7) with a fitted straight line y = ax + b, and points (0, 1), (2, 0), (3, 1), (3, 2) with a fitted parabola y = ax² + bx + c.]
Fitting a Straight Line to Data
Let there be points on a plane: (0, 0), (1, 2), (2, 7). Find the line equation
that is most suitable to describe the data.
We obtain the following linear system:
0 = a·0 + b
2 = a·1 + b
7 = a·2 + b
with matrix equation:
[ 0 1 ]  [ a ]   [ 0 ]
[ 1 1 ]  [ b ] = [ 2 ]
[ 2 1 ]          [ 7 ]
The linear system does not have a solution (it is inconsistent) because the points are
not on the same line. Question: how do we find the most suitable line that describes
the data?
Brief Review
1. Null(A) = [Row(A)]⊥
2. Null(Aᵀ) = [Coll(A)]⊥
Least Square Solution
[Figure: b outside W = Coll(A), its projection Ax inside W, and the error e = b − Ax perpendicular to W.]
Given A and b, finding the best approximation with the least error means finding x that satisfies:
Ax = proj_Coll(A) b   (shown in the picture above)
b − Ax = b − proj_Coll(A) b
b − Ax is orthogonal to W, and W⊥ = Null(Aᵀ), the set of solutions of Aᵀx = 0; so b − Ax lies in
the nullspace of Aᵀ.
Hence Aᵀ(b − Ax) = 0, or Aᵀb = AᵀAx, which is known as the normal system of Ax = b.
Thus, finding the best approximate solution of Ax = b is the same as finding the
solution of a consistent linear system, namely the normal system.
Definition: Least Square Solution
If A is an m×n matrix, a least square solution of Ax = b is a solution x* of the normal
system AᵀAx* = Aᵀb.
Normal System (cont'd)
Given a linear system Ax = b with m equations in n unknowns, then:
Aᵀb = AᵀAx is the normal system of this linear system.
1. The normal system consists of n equations in n unknowns.
2. The normal system is guaranteed to be consistent, and a solution of the normal
system is called a least square solution.
3. If the vector x is a least square solution, then proj_Coll(A) b = Ax.
4. The normal system can have infinitely many solutions.
If the columns of A are linearly independent, then the least square solution is
unique: AᵀA has an inverse, and the least square solution is obtained as follows:
AᵀAx = Aᵀb                      (multiply both sides by (AᵀA)⁻¹)
(AᵀA)⁻¹AᵀAx = (AᵀA)⁻¹Aᵀb
x = (AᵀA)⁻¹Aᵀb                  (the least square solution)
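A sketch of this recipe on the line-fitting data from above (NumPy; in practice np.linalg.lstsq is usually preferred over solving the normal system explicitly, which is used here only to mirror the derivation):

```python
import numpy as np

# Fit y = a*x + b to the points (0, 0), (1, 2), (2, 7)
A = np.array([[0, 1],
              [1, 1],
              [2, 1]], dtype=float)
b = np.array([0.0, 2.0, 7.0])

x = np.linalg.solve(A.T @ A, A.T @ b)   # solve the normal system A^T A x = A^T b
print(x)                                # [a, b] = [3.5, -0.5]
```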
Example 16: a straight line (1)
Given points (0, 0), (1, 2), and (2, 7), we will find the line equation that best represents these
data. We construct a linear system from the data we have.
A line equation has the general form y = ax + b. We need to find a and b.
If the points all lie on the line, then a and b satisfy the following linear system:
0 = a·0 + b
2 = a·1 + b
7 = a·2 + b
with matrix equation:
[ 0 1 ]  [ a ]   [ 0 ]
[ 1 1 ]  [ b ] = [ 2 ]
[ 2 1 ]          [ 7 ]
[Figure: the data points (0, 1), (2, 0), (3, 1), (3, 2) for the parabola example below.]
Substitute the variables in the equation based on the data points we have to
form a linear system, then find the normal system.
Solution: parabola (cont'd)
y₁ = ax₁² + bx₁ + c    →  1 = a(0)² + b(0) + c
y₂ = ax₂² + bx₂ + c    →  0 = a(2)² + b(2) + c
y₃ = ax₃² + bx₃ + c    →  1 = a(3)² + b(3) + c
y₄ = ax₄² + bx₄ + c    →  2 = a(3)² + b(3) + c
Written as:
[ 0 0 1 ]  [ a ]   [ 1 ]
[ 4 2 1 ]  [ b ] = [ 0 ]
[ 9 3 1 ]  [ c ]   [ 1 ]
[ 9 3 1 ]          [ 2 ]
Normal system AᵀAx = Aᵀb:
[ 0 4 9 9 ]  [ 0 0 1 ]  [ a ]   [ 0 4 9 9 ]  [ 1 ]
[ 0 2 3 3 ]  [ 4 2 1 ]  [ b ] = [ 0 2 3 3 ]  [ 0 ]
[ 1 1 1 1 ]  [ 9 3 1 ]  [ c ]   [ 1 1 1 1 ]  [ 1 ]
             [ 9 3 1 ]                       [ 2 ]
The solution of the normal system:
[ a ]                [  2/3  ]
[ b ] = (AᵀA)⁻¹Aᵀb = [ −11/6 ]
[ c ]                [   1   ]
Parabola (cont'd)
Given data points (0, 1), (2, 0), (3, 1), and (3, 2).
The general form of a parabola is y = ax² + bx + c.
The solution:
a = 2/3, b = −11/6, c = 1, so y = (2/3)x² − (11/6)x + 1.
[Figure: the fitted parabola through the data points (0, 1), (2, 0), (3, 1), (3, 2).]
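The parabola fit, verified numerically (a minimal sketch via the normal system):

```python
import numpy as np

# Fit y = a*x^2 + b*x + c to (0, 1), (2, 0), (3, 1), (3, 2)
xs = np.array([0.0, 2.0, 3.0, 3.0])
ys = np.array([1.0, 0.0, 1.0, 2.0])
A = np.column_stack([xs**2, xs, np.ones_like(xs)])

coef = np.linalg.solve(A.T @ A, A.T @ ys)   # normal system A^T A x = A^T y
print(coef)                                 # approximately [2/3, -11/6, 1]
```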
Answer:
It is the same as the solution of that linear system.
Answer:
When the columns of A are linearly independent, i.e. when (AᵀA)⁻¹ exists.
Reflection
Post-test module