GAUSSIAN FILTERS
Pierre-Paul TACHER
1
1.1. We can define the state vector to include position and velocity:
Y_t = (x_t, ẋ_t)^T
1.2. The evolution of the state will be approximated to second and first order, for x and ẋ respectively:
x_{t+1} = x_t + ∆t·ẋ_t + ½(∆t)²·ẍ_t + o((∆t)²)
⇒ x_{t+1} ≈ x_t + ẋ_t + ½ẍ_t   (taking ∆t = 1)
and, to first order, ẋ_{t+1} ≈ ẋ_t + ẍ_t.
In matrix form, Y_{t+1} = A Y_t + G ẍ_t with A = [1 1; 0 1] and G = (1/2, 1)^T, so we have
G G^T = [1/4 1/2; 1/2 1]
Note that since we do not incorporate measurements for now, we have, using the notation of the book,
µ̄_{t+1} = µ_{t+1},  Σ̄_{t+1} = Σ_{t+1}.
The recursion Σ_{t+1} = A Σ_t A^T + G G^T then gives:

t     Σ_t
0     [0 0; 0 0]
1     [0.25 0.5; 0.5 1]
2     [2.5 2; 2 2]
3     [8.75 4.5; 4.5 3]
4     [21 8; 8 4]
5     [41.25 12.5; 12.5 5]
6     [71.5 18; 18 6]
7     [113.75 24.5; 24.5 7]
8     [170 32; 32 8]
9     [242.25 40.5; 40.5 9]
10    [332.5 50; 50 10]
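For reference, a minimal sketch (in Python/NumPy, whereas the figures of this document were produced in Matlab) that reproduces the table above, assuming unit variance for the acceleration ẍ_t as implied by Σ_1 = G G^T:

```python
import numpy as np

# State transition of the constant-acceleration model with dt = 1:
# Y_{t+1} = A Y_t + G * xddot_t, acceleration xddot_t ~ N(0, 1).
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
G = np.array([[0.5],
              [1.0]])
R = G @ G.T  # process noise covariance [[0.25, 0.5], [0.5, 1.0]]

Sigma = np.zeros((2, 2))  # Sigma_0 = 0: the initial state is known exactly
for t in range(1, 11):
    Sigma = A @ Sigma @ A.T + R  # prediction-only update (no measurements)
    print(t, Sigma)
```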
0
The uncertainty ellipses of figure 0 are centered at µt = , have axes whose directions are given
0
by the eigenvectors of Σt , and the semi-minor and semi-major axis are scaled by eigenvalues; for
detailed derivation, cf. A.1. Σt is a symetric real matrix, it can be diagonalized in an orthonormal
basis, with real eigenvalues
Σt = Pt Dt PtT
where Pt is an orthogonal matrix and Dt is diagonal. Note the first ellipse is degenerated since Σ1
has a null eigenvalue.
For t = 1 and t = 2 respectively:
P = [0.45 −0.89; 0.89 0.45],  D = [1.25 0; 0 0]
P = [−0.75 0.66; −0.66 −0.75],  D = [4.27 0; 0 0.23]
For t = 3:
P = [−0.88 0.48; −0.48 −0.88],  D = [11.22 0; 0 0.54]
For t = 4:
P = [−0.93 0.37; −0.37 −0.93],  D = [24.17 0; 0 0.83]
2
2.1. We can model the noisy observation of the position by:
z_t = x_t + ξ_t = C (x_t, ẋ_t)^T + ξ_t,  where C = [1 0]
For t = 5:
P = [−0.95 0.30; −0.30 −0.95],  D = [45.14 0; 0 1.11]
Figure 0. 0.95 uncertainty ellipses for the gaussian posteriors at dates t ∈ ⟦1, 5⟧
where the random variable ξ_t is gaussian, ξ_t ↪ N(0, σ²); σ² is denoted Q in the notation of the book.
K_5 = Σ̄_5 C^T (C Σ̄_5 C^T + Q)^{−1} = (0.80, 0.24)^T
µ_5 = µ̄_5 + K_5 (z_5 − C µ̄_5) = (4.02, 1.22)^T
Σ_5 = (I − K_5 C) Σ̄_5 = [8.05 2.44; 2.44 1.95]
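This update can be checked numerically; note that the measurement z_5 = 5 and the noise variance Q = 10 are not stated above, but they are the values consistent with the numbers obtained:

```python
import numpy as np

# Predicted moments at t = 5 (from the table in section 1) and
# the measurement model z_t = C x_t + xi_t, xi_t ~ N(0, Q).
mu_bar = np.array([[0.0], [0.0]])        # mu_bar_5: predicted mean is 0 here
Sigma_bar = np.array([[41.25, 12.5],
                      [12.5,   5.0]])    # Sigma_bar_5
C = np.array([[1.0, 0.0]])               # we observe the position only
Q = np.array([[10.0]])                   # assumed measurement noise variance
z5 = np.array([[5.0]])                   # assumed measurement value

K = Sigma_bar @ C.T @ np.linalg.inv(C @ Sigma_bar @ C.T + Q)
mu = mu_bar + K @ (z5 - C @ mu_bar)
Sigma = (np.eye(2) - K @ C) @ Sigma_bar
print(K)      # approx [[0.80], [0.24]]
print(mu)     # approx [[4.02], [1.22]]
print(Sigma)  # approx [[8.05, 2.44], [2.44, 1.95]]
```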
3
Let us recall the definition of the characteristic function of a random variable (a bit of measure theory is needed; ⟨·, ·⟩ denotes the scalar product in R^d).
P = [0.94 −0.33; 0.33 0.94],  D = [8.90 0; 0 1.10]
Figure 1. 0.95 uncertainty ellipse for the gaussian posterior after measurement of the position at date t = 5
Definition. Let X be a random variable taking its values in R^d. The characteristic function of X is the function from R^d to C defined by
∀t ∈ R^d, φ_X(t) = E[exp(i⟨t, X⟩)]
Note the characteristic function of X is the Fourier transform of its probability distribution. We will also use the following properties to derive the prediction update of the Kalman filter:
Theorem. Let two r.v. X_1 and X_2 be independent and take their values in R^d. The characteristic function of their sum is given by
∀t ∈ R^d, φ_{X_1 + X_2}(t) = φ_{X_1}(t) φ_{X_2}(t)
Theorem. The random variable X is Gaussian, X ↪ N(µ, Λ), if and only if its characteristic function φ_X is given by
∀t ∈ R^d, φ_X(t) = exp(i⟨µ, t⟩) exp(−½⟨Λt, t⟩)
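As a quick numerical sanity check of this characterization in dimension d = 1, one can compare the empirical characteristic function of a Gaussian sample with the closed form (a sketch; the parameters µ = 1.5, σ = 2 and the sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 2.0
x = rng.normal(mu, sigma, size=200_000)

for t in [0.3, 1.0]:
    empirical = np.exp(1j * t * x).mean()                     # E[exp(i t X)] by Monte Carlo
    closed_form = np.exp(1j * mu * t - 0.5 * sigma**2 * t**2)  # exp(i mu t) exp(-sigma^2 t^2 / 2)
    print(t, empirical, closed_form)
```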
x_t = A x_{t−1} + B u_t + ε_t
We assume x_{t−1} is Gaussian, x_{t−1} ↪ N(µ_{t−1}, Σ_{t−1}), and ε_t is Gaussian, ε_t ↪ N(0, R_t). x_{t−1} and ε_t are assumed to be independent; u_t is deterministic. By linearity we know y_t = A x_{t−1} + B u_t is Gaussian, y_t ↪ N(A µ_{t−1} + B u_t, A Σ_{t−1} A^T). By the above property, we can compute the characteristic function of x_t = y_t + ε_t:
∀s ∈ R^d, φ_{x_t}(s) = φ_{y_t}(s) φ_{ε_t}(s) = exp(i⟨A µ_{t−1} + B u_t, s⟩) exp(−½⟨(A Σ_{t−1} A^T + R_t)s, s⟩)
Figure 2. 0.95 uncertainty ellipses for the gaussian N((0, 0)^T, [0.01 0; 0 0.01])
Using again the property of gaussians in the other direction, this shows the r.v. x_t is Gaussian, x_t ↪ N(A µ_{t−1} + B u_t, A Σ_{t−1} A^T + R_t); its mean and covariance are precisely µ̄_t and Σ̄_t.
4
4.1. The system evolves according to
(x', y', θ')^T = (x + cos θ, y + sin θ, θ)^T
We will assume x, y and θ are gaussian r.v.: x ↪ N(0, σ_1²), y ↪ N(0, σ_1²), θ ↪ N(0, σ_2²), where σ_1² = 0.01 and σ_2² = 10000. We could compute the expectation and variance of cos θ and sin θ (cf. A.2), but since the variability of θ is high with regard to 2π, and
∀θ ∈ R, cos θ = cos(θ − ⌊θ/(2π)⌋ × 2π) and sin θ = sin(θ − ⌊θ/(2π)⌋ × 2π),
it seems reasonable to assume α = θ − ⌊θ/(2π)⌋ × 2π is a uniform r.v., α ↪ U([0, 2π]). (x, y)^T is gaussian, (x, y)^T ↪ N((0, 0)^T, [σ_1² 0; 0 σ_1²]). To draw the posterior, we will first draw the corresponding 0.95 uncertainty ellipse (cf. figure 2). We will use the following notation for this disk:
D = {z ∈ C, |z| ≤ r}
Figure 3. 0.95 uncertainty ring for the random variable (x', y')^T
We can show (cf. A.3) that f(D × [0, 2π]) = E, where E = {z ∈ C, 1 − r ≤ |z| ≤ 1 + r}. It can then be asserted that the r.v. (x', y')^T belongs to E with probability at least 0.95. We represent E in figure 3.
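Since here Σ = σ_1² I_2, the ellipse of A.1 is a disk and its radius r = σ_1 √c can be computed directly (a sketch using SciPy's χ² quantile):

```python
import numpy as np
from scipy.stats import chi2

sigma1_sq = 0.01
c = chi2.ppf(0.95, df=2)    # approx 5.99, the c of A.1 for d = 2
r = np.sqrt(c * sigma1_sq)  # radius of the 0.95 disk D, approx 0.245
print(r, 1 - r, 1 + r)      # the ring E is 1 - r <= |z| <= 1 + r
```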
4.2. In order to linearize the state evolution equation, we have to linearize cos θ and sin θ about Eθ = 0, which we know already makes no sense because of the high variability of θ:
cos θ = 1 + o(θ)  (θ → 0) ⇒ cos θ ≈ 1
sin θ = θ + o(θ)  (θ → 0) ⇒ sin θ ≈ θ
Figure 4. 0.95 uncertainty area for the random variable (x', y')^T after linearization of the system about θ = 0, σ_2² = 10000
Figure 5. 0.95 uncertainty areas for the random variable (x', y')^T after linearization of the system about θ = 0, using respectively σ_2² = 1, σ_2² = 0.5 and σ_2² = 0.25
If σ_2² = 0.25,
(x', y')^T ↪ N((1, 0)^T, [0.01 0; 0 0.26])
In figures 6 and 7, Matlab simulations of 1000 realisations of the random variable (x + cos θ, y + sin θ)^T are superposed on the relevant uncertainty areas.
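A sketch of such a simulation (in Python/NumPy, whereas the original figures were produced in Matlab); for small σ_2² the sample covariance should approach the linearized prediction diag(σ_1², σ_1² + σ_2²), while for σ_2² = 10000 the points spread over the ring of figure 3 instead:

```python
import numpy as np

def simulate(sigma2_sq, n=1000, sigma1_sq=0.01, seed=0):
    """Draw n realisations of (x + cos(theta), y + sin(theta)) with
    x, y ~ N(0, sigma1_sq) and theta ~ N(0, sigma2_sq), all independent."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, np.sqrt(sigma1_sq), n)
    y = rng.normal(0.0, np.sqrt(sigma1_sq), n)
    theta = rng.normal(0.0, np.sqrt(sigma2_sq), n)
    return np.column_stack([x + np.cos(theta), y + np.sin(theta)])

for s2 in [10000, 1.0, 0.5, 0.25]:
    pts = simulate(s2)
    print(s2, pts.mean(axis=0), np.cov(pts.T))
    # For s2 = 0.25 the sample moments should be close to the linearized
    # posterior N((1, 0)^T, diag(0.01, 0.26)) derived above.
```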
Figure 6. Matlab simulation of the random variable (x', y')^T, with σ_2² = 10000
Figure 7. Matlab simulations of the random variable (x', y')^T, with σ_2² = 1, σ_2² = 0.5 and σ_2² = 0.25
5
Handling an additive constant D_1 in the motion model is straightforward:
x_t = A x_{t−1} + B u_t + D_1 + ε_t,  where D_1 is a deterministic constant.
Since B u_t is already deterministic in our model, it suffices to replace B u_t by B u_t + D_1 in the derivation of the prediction step to get
µ̄_t = A µ_{t−1} + B u_t + D_1
Σ̄_t = A Σ_{t−1} A^T + R_t
If the measurement model is now
z_t = C x_t + D_2 + ξ_t,
the same argument applies to the measurement step: the predicted measurement becomes C µ̄_t + D_2, so the innovation is z_t − (C µ̄_t + D_2), while the gain and covariance updates are unchanged.
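A sketch of one full filter step with both constants, making explicit that D_1 and D_2 only shift the corresponding means (shapes are assumed conformable; this mirrors the derivation above rather than any particular library API):

```python
import numpy as np

def kalman_step(mu, Sigma, u, z, A, B, C, R, Q, D1, D2):
    """One Kalman filter step for the model
    x_t = A x_{t-1} + B u_t + D1 + eps_t,   eps_t ~ N(0, R)
    z_t = C x_t + D2 + xi_t,                xi_t ~ N(0, Q)."""
    # Prediction: the deterministic constant D1 only shifts the mean.
    mu_bar = A @ mu + B @ u + D1
    Sigma_bar = A @ Sigma @ A.T + R
    # Correction: the predicted measurement is C mu_bar + D2, so D2
    # only shifts the innovation; gain and covariance are unchanged.
    K = Sigma_bar @ C.T @ np.linalg.inv(C @ Sigma_bar @ C.T + Q)
    mu_new = mu_bar + K @ (z - (C @ mu_bar + D2))
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new
```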
6
I am not sure what exactly is expected in this exercise.
Appendix A
A.1. Uncertainty ellipse. Let X : Ω → R^d be a Gaussian random variable, X ↪ N(µ, Σ). We suppose Σ is positive definite. Let Y be a random variable having a chi-squared distribution with d degrees of freedom, Y ↪ χ²(d), and let c ∈ R^+ be such that P(Y ≤ c) = 0.95. We then have, with E = {x ∈ R^d, (x − µ)^T Σ^{−1} (x − µ) ≤ c},
P(X ∈ E) = 0.95
Proof. Let
T : R^d → R, x ↦ (x − µ)^T Σ^{−1} (x − µ)
and let Y' = T(X). Since Σ is a symmetric positive definite real matrix, we can write
Σ = U diag(λ_1, λ_2, ..., λ_d) U^T
where U = [U_1 U_2 ... U_d] is orthogonal and (λ_1, λ_2, ..., λ_d) ∈ (R^{+*})^d. We consider the random variable
Z = D_1 U^T (X − µ),  where D_1 = diag(1/√λ_1, 1/√λ_2, ..., 1/√λ_d),
from which
Y' = T(X) = (X − µ)^T U D_1² U^T (X − µ) = Z^T Z = Σ_{i=1}^{d} Z_i².
Z is a linear transform of a Gaussian vector, hence Gaussian, centered, with covariance D_1 U^T Σ U D_1 = I_d; this shows that Y' is the sum of d independent squared standard centered normal r.v., so it is known that Y' ↪ χ²(d). But then
P(X ∈ E) = ∫_Ω 1_{X∈E} dP = ∫_Ω 1_{X∈T^{−1}([0,c])} dP = ∫_Ω 1_{Y'∈[0,c]} dP = P(Y' ≤ c) = 0.95
For d = 2, the equation of E in the orthonormal frame (µ, u_1, u_2) becomes
E : (x − µ)^T U D_1² U^T (x − µ) ≤ c
⇔ (U_1^T(x − µ))²/λ_1 + (U_2^T(x − µ))²/λ_2 ≤ c
⇔ x'_1²/λ_1 + x'_2²/λ_2 ≤ c
⇔ x'_1²/(√(λ_1 c))² + x'_2²/(√(λ_2 c))² ≤ 1,
an ellipse with semi-axes √(λ_1 c) and √(λ_2 c).
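In practice the axis directions and semi-axis lengths of the 0.95 ellipse can be obtained directly from an eigendecomposition, e.g. for Σ̄_5 (a sketch; the numbers should match the P and D given for t = 5 up to signs and column order):

```python
import numpy as np
from scipy.stats import chi2

Sigma5_bar = np.array([[41.25, 12.5],
                       [12.5,   5.0]])
c = chi2.ppf(0.95, df=2)  # approx 5.99

# Sigma = P D P^T: eigh returns eigenvalues in ascending order,
# with the orthonormal eigenvectors as the columns of P.
eigvals, P = np.linalg.eigh(Sigma5_bar)
semi_axes = np.sqrt(eigvals * c)  # semi-axis lengths sqrt(lambda_i * c)
print(P)                          # directions of the ellipse axes
print(semi_axes)                  # lengths along those directions
```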
A.2. For θ ↪ N(0, σ²), we then have
E cos²θ = (1/(σ√(2π))) ∫_R cos²t · e^{−t²/(2σ²)} dt
= (1/(σ√(2π))) ∫_R (1/2 + (cos 2t)/2) e^{−t²/(2σ²)} dt
= (1/(σ√(2π))) [ (1/2) ∫_R cos 2t · e^{−t²/(2σ²)} dt + (1/2) ∫_R e^{−t²/(2σ²)} dt ]
= (1/2) · (1/(2σ√(2π))) ∫_R cos u · e^{−u²/(8σ²)} du + 1/2   (substituting u = 2t)
= (1/2) ∫_R cos u · (1/(2σ√(2π))) e^{−u²/(2(2σ)²)} du + 1/2
= (1/2) e^{−(2σ)²/2} + 1/2,
so that
E cos²θ = (1/2)(1 + e^{−2σ²})
E sin²θ = 1 − E cos²θ = (1/2)(1 − e^{−2σ²}).
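A quick Monte Carlo check of these closed forms (a sketch; sample size and values of σ² are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
for sigma_sq in [0.25, 1.0]:
    theta = rng.normal(0.0, np.sqrt(sigma_sq), 500_000)
    print(sigma_sq,
          np.mean(np.cos(theta)**2),          # Monte Carlo estimate of E cos^2(theta)
          0.5 * (1 + np.exp(-2 * sigma_sq)))  # closed form derived above
```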
A.3. Consider
f : D × [0, 2π] → C, (z, θ) ↦ z + e^{iθ}.
[Figure: the points z and z' = z + e^{iθ} on the complex plane, with the moduli |z|, 1 and |z + e^{iθ}| marked.]