Multiple View Geometry: Exercise Sheet 2
Part I: Theory
1. Which groups have you seen in the lecture? Write down the names and the correct inclusions!
(e.g.: group A ⊂ group B)
3. Let A ∈ R^{n×n} be a symmetric matrix with the orthonormal basis of eigenvectors v1, ..., vn and eigenvalues λ1 ≥ ... ≥ λn. Find all vectors x that minimize the following term:

    min_{||x|| = 1} x^T A x

Hint: Use the expression x = Σ_{i=1}^{n} αi vi with coefficients αi ∈ R and compute the appropriate coefficients!
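Sketch of the computation the hint suggests (an outline only): since the vi are orthonormal, ||x||^2 = Σ_{i=1}^{n} αi^2 = 1 and x^T A x = Σ_{i=1}^{n} λi αi^2 ≥ λn. Equality holds exactly when all the weight lies on eigenvectors belonging to λn, so the minimizers are the unit vectors in the eigenspace of the smallest eigenvalue λn (in particular x = ±vn if λn is simple).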
Part II: Practical Exercises
The Moore-Penrose pseudo-inverse
To solve the linear system Ax = b for an arbitrary (not necessarily square) matrix A ∈ R^{m×n} of rank r ≤ min(m, n), one can define a (generalized) inverse, also called the Moore-Penrose pseudo-inverse (refer to Chapter 1, last slide).
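For instance, the pseudo-inverse is available in Matlab as pinv, and a least-squares solution of Ax = b can be computed as follows (the matrix sizes below are arbitrary and only serve as an illustration):

    A = rand(6, 3);            % some non-square matrix, here 6x3
    b = rand(6, 1);
    x = pinv(A) * b;           % least-squares solution of A*x = b with smallest norm
    res = norm(A * x - b);     % residual of this solution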
In this exercise we want to solve the linear system Dx = b, with D ∈ R^{m×4} and b ∈ R^m a vector whose components are all equal to 1, such that x∗ = [4, −3, 2, −1]^T ∈ R^4 is one possible solution of the linear system, i.e. for any row [d1, d2, d3, d4] of D:

    4 d1 − 3 d2 + 2 d3 − d4 = 1
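One way to build rows satisfying this constraint (a sketch; the use of rand and the value m = 10 are arbitrary choices for illustration):

    m = 10;                                           % any m >= 4 works
    D = rand(m, 4);                                   % random rows
    D(:, 4) = 4*D(:,1) - 3*D(:,2) + 2*D(:,3) - 1;     % enforce 4*d1 - 3*d2 + 2*d3 - d4 = 1
    b = ones(m, 1);
    norm(D * [4; -3; 2; -1] - b)                      % should be (numerically) zero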
3. Repeat the two previous questions, by setting m to a higher value. How is the precision impacted?

For the following questions, the matrix D is chosen such that rank(D) = 3 and thus dim(kernel(D)) = 1. We recall that the set of all possible solutions is then given by S = {x∗ + v | v ∈ kernel(D)}.

(a) Solve again the linear system using questions (1) and (2).

(b) Use the function null to get a vector v ∈ kernel(D). The set of all possible solutions is S = {x + λv | λ ∈ R}.
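Both steps can be carried out with built-in functions (a sketch, assuming a rank-3 matrix D and b = ones(m, 1) as above; pinv is used here as one way to solve the system):

    x = pinv(D) * b;     % a particular solution: the minimum-norm least-squares solution
    v = null(D);         % orthonormal basis of kernel(D); a single 4x1 vector when rank(D) = 3
    norm(D * v)          % should be (numerically) zero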
(c) According to the last slide of Chapter 1, we know that the following statement holds: xmin = A^+ b is, among all minimizers of ||Ax − b||^2, the one with the smallest norm ||x||.
Let λ ∈ R, and let xλ = x + λv be one possible solution and eλ = ||Dxλ − b||^2 the associated error. Using the function plot, display the graphs of both ||xλ|| and eλ as functions of λ ∈ {−100, ..., 100}, and observe that the statement indeed holds.
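A possible way to produce these plots (a sketch, reusing D, b, x and v from the snippet above; the loop is only one of several ways to evaluate the two curves):

    lambda = -100:100;                     % values of lambda
    norms  = zeros(size(lambda));          % will hold ||x_lambda||
    errors = zeros(size(lambda));          % will hold e_lambda = ||D*x_lambda - b||^2
    for i = 1:length(lambda)
        x_l = x + lambda(i) * v;           % one possible solution x_lambda
        norms(i)  = norm(x_l);
        errors(i) = norm(D * x_l - b)^2;
    end
    figure; plot(lambda, norms);  xlabel('\lambda'); ylabel('||x_\lambda||');
    figure; plot(lambda, errors); xlabel('\lambda'); ylabel('e_\lambda');
    % ||x_lambda|| is minimal at lambda = 0 (the pseudo-inverse solution),
    % while e_lambda is the same for every lambda, since v lies in kernel(D).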