Course 1
M2 E3A SAAS
Fabien Bonardi
[email protected]
September 2024
1 Probability basics
Reminders about probabilities
Bayesian filtering
2 Practical approach of bayesian filters
Usual definition of a bayesian filtering algorithm
Hypothesis to specify
3 Nonparametric filters
Discrete Bayesian filter
Particle filter
4 Gaussian filters
Kalman filter
Kalman filters variants
In probabilistic methods, the states of a system, the controls and the sensor measurements are modeled as random variables. A random variable can take on multiple values according to its probability law. Probabilistic inference is the process of deducing the laws of random variables from those of related variables and from the observed data. We denote by X a random variable and by x a specific value that X might assume;
p(x) = p(X = x)
is the probability that the random variable X takes the value x. The space of all possible values for a random variable can be discrete or continuous (for continuous random variables, we speak of a probability density function, PDF).
Discrete case :
∑_x p(x) = 1
Continuous case :
∫ p(x) dx = 1
If X and Y are independent, then
p(x, y) = p(x) p(y)
Conditional probability :
p(x|y) = p(x, y) / p(y)
If X and Y are independent, then
p(x|y) = p(x) p(y) / p(y) = p(x)
Law of total probability, discrete case :
p(x) = ∑_y p(x|y) p(y)
Continuous case :
p(x) = ∫ p(x|y) p(y) dy
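The law of total probability can be checked numerically on a small discrete example (the weather/sensor values below are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical discrete example: Y is a weather state, X a binary sensor reading.
p_y = np.array([0.7, 0.3])            # p(y): P(sunny), P(rainy)
p_x_given_y = np.array([[0.9, 0.1],   # p(x | y = sunny)
                        [0.2, 0.8]])  # p(x | y = rainy)

# Discrete analogue of p(x) = ∫ p(x|y) p(y) dy:
# marginalize Y out by summing p(x|y) p(y) over all y
p_x = p_y @ p_x_given_y               # ≈ [0.69, 0.31]
print(p_x, p_x.sum())                 # the marginal still sums to 1
```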
Bayes rule :
Discrete case :
p(x|y) = p(y|x) p(x) / p(y) = p(y|x) p(x) / ∑_{x′} p(y|x′) p(x′)
Continuous case :
p(x|y) = p(y|x) p(x) / p(y) = p(y|x) p(x) / ∫ p(y|x′) p(x′) dx′
p(x|y) = p(y|x) p(x) / p(y)
We suppose that x models a system state (the state of the environment or of a robot, for instance) and that we want to infer this state from a value y returned by a sensor:
• p(x) is the prior information we have about the state (prior probability distribution)
• p(y) is the information we get from the sensor (data)
• the probability distribution p(y|x) models the relation between the true state and the information returned by the sensor (generative model).
The probability p(x|y) is called the posterior probability distribution.
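This prior/likelihood/posterior decomposition can be sketched numerically. The door-state scenario and its probabilities below are assumptions for illustration, not taken from the course:

```python
import numpy as np

# Hypothetical binary state x: door open (index 0) or closed (index 1).
prior = np.array([0.5, 0.5])       # p(x): no prior information
likelihood = np.array([0.6, 0.3])  # p(y | x) for the sensor value actually observed

# Bayes rule: posterior ∝ p(y|x) p(x); p(y) is recovered by normalization
unnormalized = likelihood * prior
posterior = unnormalized / unnormalized.sum()  # ≈ [0.667, 0.333]
print(posterior)
```

Note that p(y) never needs to be modeled explicitly: it is just the normalization constant of the numerator.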
In the general case, the emergence of the state xt might be conditioned on all past states, measurements and controls:
p(xt |x0:t−1 , z1:t−1 , u1:t )
If the state is complete, that is to say, if the system is modeled by a Markov chain:
p(xt |x0:t−1 , z1:t−1 , u1:t ) = p(xt |xt−1 , ut )
and
p(zt |x0:t , z1:t−1 , u1:t ) = p(zt |xt )
The system state at time t is stochastically dependent on the state at time t − 1 and on the control action at time t. The measurement at time t is stochastically dependent on the system state at time t.
Such a model is named a hidden Markov model (HMM) or a dynamic Bayesian network.
[Diagram: dynamic Bayesian network with states x0, x1, …, xt, xt+1, controls u1, …, ut, ut+1 and measurements z1, …, zt, zt+1]
Prediction :
bel⁻(xt ) = p(xt |z1:t−1 , u1:t )
In an algorithm, the process of computing bel(xt ) = p(xt |z1:t , u1:t ) from bel⁻(xt ) is named correction or measurement update.
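Over a finite state space, one prediction/correction cycle can be sketched as follows (a sketch, not the course's own code; the argument layout of `trans` and `lik_z` is an assumption):

```python
import numpy as np

def bayes_filter_step(bel, trans, lik_z):
    """One cycle of the generic Bayes filter over a finite state space.
    trans[i, k] = p(X_t = x_k | u_t, X_{t-1} = x_i)  for the chosen control u_t
    lik_z[k]    = p(z_t | X_t = x_k)                  for the observed z_t
    """
    bel_bar = bel @ trans           # prediction: sum out x_{t-1}
    bel_new = lik_z * bel_bar       # measurement update (unnormalized)
    return bel_new / bel_new.sum()  # normalization constant eta

# Two-state toy example (values are illustrative assumptions)
bel = bayes_filter_step(np.array([0.5, 0.5]),
                        np.array([[0.8, 0.2], [0.2, 0.8]]),
                        np.array([0.9, 0.1]))
print(bel)  # the belief concentrates on the state favored by the measurement
```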
Inputs : bel(xt−1 ), ut , zt
• bel⁻(xt ) = ∫ p(xt |ut , xt−1 ) bel(xt−1 ) dxt−1   (prediction)
• bel(xt ) = η p(zt |xt ) bel⁻(xt )   (correction)
Outputs : bel(xt )
Histogram filter
With a finite number of possible states, the algorithm of the usual Bayesian filter is straightforward.
Inputs : {pk,t−1 }, ut , zt
At time t, for each state xk ∈ ΩX with probability pk,t :
• p̄k,t = ∑i p(Xt = xk |ut , Xt−1 = xi ) pi,t−1   (prediction)
• pk,t = η p(zt |Xt = xk ) p̄k,t   (correction)
Outputs : {pk,t }
Example : localization on an image database. ΩX is the index set of the image dataset; we define bel(xt ) = p(xt |z1:t , u1:t ), xt ∈ ΩX , the probability of being, at time t, at the location of the image with the same index in the database.
bel(x0 ) = p(x0 ) is defined as a uniform PDF over the database indices (no prior information on the vehicle position).
Inputs : {pk,t−1 }, zt
At time t, for each xk ∈ ΩX with probability pk,t :
• p̄k,t = ∑i p(Xt = xk |Xt−1 = xi ) pi,t−1   (prediction)
Prediction :
p̄k,t = (pk−1,t−1 + pk,t−1 + pk+1,t−1 ) / 3
Measurement update :
η = 1 / ∑k p̃k,t
pk,t = η p̃k,t
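The prediction and measurement-update rules above can be sketched in Python. Circular indexing at the boundaries is an assumption here; the slides do not specify boundary handling:

```python
import numpy as np

def predict(p):
    """3-tap smoothing: p_bar[k] = (p[k-1] + p[k] + p[k+1]) / 3.
    np.roll gives circular boundaries (an assumption, not from the slides)."""
    return (np.roll(p, 1) + p + np.roll(p, -1)) / 3.0

def update(p_bar, likelihood):
    """Measurement update: p_tilde[k] = p(z | x_k) * p_bar[k], then
    normalize with eta = 1 / sum_k p_tilde[k]."""
    p_tilde = likelihood * p_bar
    return p_tilde / p_tilde.sum()

# A belief fully concentrated on one index spreads to its neighbors
p = predict(np.array([1.0, 0.0, 0.0, 0.0]))
print(p)  # mass shared between index 0 and its two (circular) neighbors
```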
The belief bel(xt ) is a set Ψt of m particles ψt[i] at time step t:
Ψt ≜ { ψt[1] , ψt[2] , . . . , ψt[i] , . . . , ψt[m] }
ψt[i] = { xt[i] , wt[i] } = [ x1,t[i] , x2,t[i] , . . . , xn,t[i] , wt[i] ]ᵀ
Inputs : Ψt−1 , ut , zt
At time t :
• Ψ̄t = Ψt = ∅
For i = 1 to m :
• Sample xt[i] ∼ p(xt |ut , xt−1[i] )   (prediction)
• wt[i] = p(zt |xt[i] )   (measurement update)
• Ψ̄t = Ψ̄t + { xt[i] , wt[i] }
For i = 1 to m :
• Draw an index i with probability proportional to wt[i]   (resampling)
• Add ψt[i] to Ψt
Outputs : Ψt
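One cycle of this algorithm can be sketched for a 1-D state. The motion and measurement models below (Gaussian noise, Gaussian likelihood) are illustrative assumptions supplied by the user, not part of the course material:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, u, z, sample_motion, likelihood):
    """One prediction / update / resampling cycle of the particle filter.
    sample_motion and likelihood are hypothetical user-supplied models."""
    # Prediction: propagate each particle through the motion model
    predicted = np.array([sample_motion(x, u) for x in particles])
    # Measurement update: importance weights w[i] = p(z | x[i])
    w = np.array([likelihood(z, x) for x in predicted])
    w = w / w.sum()
    # Resampling: draw m indices with probability proportional to w
    idx = rng.choice(len(predicted), size=len(predicted), p=w)
    return predicted[idx]

# Toy models: move by u with Gaussian noise, Gaussian measurement likelihood
motion = lambda x, u: x + u + rng.normal(0.0, 0.1)
meas = lambda z, x: np.exp(-0.5 * ((z - x) / 0.5) ** 2)

particles = np.zeros(1000)  # all particles start at x = 0
particles = particle_filter_step(particles, u=1.0, z=1.1,
                                 sample_motion=motion, likelihood=meas)
print(particles.mean())  # cloud has moved toward the prediction/measurement
```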
Fabien Bonardi (UFR-ST & IBISC) Sensors fusion September 2024 40 / 47
System states and sensor models are defined by Gaussian probability density functions, one-dimensional or multidimensional.
Gaussian probability density function :
p(x) = 1/(σ√(2π)) exp( −(1/2) ((x − µ)/σ)² )
The state model is linear :
xt = At xt−1 + Bt ut + εt
The sensor model is linear too :
zt = Ct xt + δt
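For this linear-Gaussian model, the Bayes filter cycle has the well-known closed form of the Kalman filter. A minimal sketch follows; calling the process and measurement noise covariances R and Q is a notational assumption (conventions differ between references):

```python
import numpy as np

def kalman_step(mu, Sigma, u, z, A, B, C, R, Q):
    """One prediction/correction cycle for the linear-Gaussian model
    x_t = A x_{t-1} + B u_t + eps_t,  z_t = C x_t + delta_t,
    where R is the process noise covariance and Q the measurement
    noise covariance (a notational assumption)."""
    # Prediction: propagate mean and covariance through the state model
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + R
    # Correction: Kalman gain, then update mean and covariance
    K = Sigma_bar @ C.T @ np.linalg.inv(C @ Sigma_bar @ C.T + Q)
    mu_new = mu_bar + K @ (z - C @ mu_bar)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new

# 1-D toy example: move by u = 1, then observe z = 1.2
mu, Sigma = kalman_step(np.array([0.0]), np.array([[1.0]]),
                        np.array([1.0]), np.array([1.2]),
                        A=np.array([[1.0]]), B=np.array([[1.0]]),
                        C=np.array([[1.0]]), R=np.array([[0.01]]),
                        Q=np.array([[1.0]]))
print(mu, Sigma)  # mean between prediction (1.0) and measurement (1.2)
```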
• IF, Information Filter : Kalman filter with an alternative parametrization of the Gaussian PDF : (µ, Σ) → Ω = Σ−1 and ξ = Σ−1 µ
• EKF, Extended Kalman Filter : non-linear system → Taylor expansion at each time step in the neighborhood of the current state estimate
• ESKF, Error-State Kalman Filter : estimation of the error state instead of the state itself; needs an additional step in the recursive algorithm to “reset” the error state to zero.
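To illustrate the EKF idea, a nonlinear measurement model h(x) is replaced at each step by its Jacobian evaluated at the current estimate, which then plays the role of Ct in the linear equations. The range-to-origin sensor below is a hypothetical example, not from the course:

```python
import numpy as np

# Hypothetical nonlinear measurement model: range to the origin
def h(x):
    return np.array([np.sqrt(x[0] ** 2 + x[1] ** 2)])

# First-order Taylor expansion of h around the estimate: the Jacobian
def jacobian_h(x):
    r = np.sqrt(x[0] ** 2 + x[1] ** 2)
    return np.array([[x[0] / r, x[1] / r]])

mu = np.array([3.0, 4.0])   # current state estimate
H = jacobian_h(mu)          # plays the role of C_t in the Kalman equations
print(H)                    # [[0.6, 0.8]]
```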