
Sensor fusion and Bayesian filtering

M2 E3A SAAS

Fabien Bonardi
[email protected]

UFR Sciences et Technologies & Laboratoire IBISC

September 2024

Fabien Bonardi (UFR-ST & IBISC) Sensors fusion September 2024 1 / 47


Presentation

1 Probability basics
Reminders about probabilities
Bayesian filtering
2 Practical approach of Bayesian filters
Usual definition of a Bayesian filtering algorithm
Hypotheses to specify
3 Nonparametric filters
Discrete Bayesian filter
Particle filter
4 Gaussian filters
Kalman filter
Kalman filter variants



Introduction

• Several fields of application (tracking with computer vision systems, robotics, meteorology, etc.)
• Signal processing with a large variety of sensors, some providing redundant information
• Proprioceptive and exteroceptive sensors



A practical example
Industrial robotics



A practical example
Outdoor robotics



Presentation

1 Probability basics
Reminders about probabilities
Bayesian filtering
2 Practical approach of Bayesian filters
Usual definition of a Bayesian filtering algorithm
Hypotheses to specify
3 Nonparametric filters
Discrete Bayesian filter
Particle filter
4 Gaussian filters
Kalman filter
Kalman filter variants



Probability basics : reminders
Notations

In probabilistic methods, the states of a system, its controls and its sensor measurements are
modeled as random variables. A random variable can take on multiple values according to
its probability law. Probabilistic inference is the way to deduce the laws of random
variables that are related to other variables and to the observed data. We denote by X a random
variable and by x a specific value that X might assume ;

p(x) = p(X = x)

is the probability that the random variable X takes the value x. The space of all possible values
of a random variable can be discrete or continuous (for continuous random variables,
we speak of a probability density function).

Discrete case :
Σx p(x) = 1

Continuous case :
∫ p(x) dx = 1



Probability basics
Notations

Joint distribution of two random variables X and Y :

p(x, y) = p(X = x and Y = y)

Two random variables are independent if

p(x, y) = p(x)p(y)



Probability basics
Notations : conditional probability

Conditional probability :

p(x|y) = p(x, y) / p(y)

If X and Y are independent, then

p(x|y) = p(x) p(y) / p(y) = p(x)
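These definitions can be checked on a tiny discrete example; the joint table below is made up for illustration:

```python
# Joint, marginal and conditional probabilities over X ∈ {0,1}, Y ∈ {0,1}.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}   # p(x, y)

# Marginal p(Y = 0) by summing the joint over x (total probability)
p_y0 = sum(p for (x, y), p in joint.items() if y == 0)

# Conditional p(X = 0 | Y = 0) = p(x, y) / p(y)
p_x0_given_y0 = joint[(0, 0)] / p_y0
```

Here p(Y = 0) = 0.1 + 0.2 = 0.3 and p(X = 0 | Y = 0) = 0.1 / 0.3 = 1/3, while p(X = 0) = 0.4: X and Y are not independent in this table.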



Probability basics
Theorem of total probability

Discrete case :
p(x) = Σy p(x|y) p(y)

Continuous case :
p(x) = ∫ p(x|y) p(y) dy



Presentation

1 Probability basics
Reminders about probabilities
Bayesian filtering
2 Practical approach of Bayesian filters
Usual definition of a Bayesian filtering algorithm
Hypotheses to specify
3 Nonparametric filters
Discrete Bayesian filter
Particle filter
4 Gaussian filters
Kalman filter
Kalman filter variants



Probabilistic approaches : Bayesian filtering
Bayes rule

Bayes rule :
Discrete case :

p(x|y) = p(y|x) p(x) / p(y) = p(y|x) p(x) / Σx′ p(y|x′) p(x′)

Continuous case :

p(x|y) = p(y|x) p(x) / p(y) = p(y|x) p(x) / ∫ p(y|x′) p(x′) dx′



Probabilistic approaches : Bayesian filtering
Why the Bayes rule is useful

p(x|y) = p(y|x) p(x) / p(y)

Suppose that x models a system state (the state of the environment, or a robot state for
instance) and that we want to infer that state from a value y returned by a sensor :
• p(x) is the prior information we have about the state (the prior probability distribution)
• p(y) is the information we get from the sensor (the data)
• the probability distribution p(y|x) models the relation between the true state and the
information fetched by the sensor (the generative model).
The probability p(x|y) is called the posterior probability distribution.
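As a quick numerical check of the rule, the two-state example below infers a door state from one sensor reading; the states and all the numbers are illustrative, not from the course:

```python
# Bayes rule on a discrete state space: posterior ∝ likelihood × prior.
prior = {"door_open": 0.5, "door_closed": 0.5}        # p(x)
likelihood = {"door_open": 0.6, "door_closed": 0.3}   # p(y|x) for the observed reading y

# p(y) by the theorem of total probability
evidence = sum(likelihood[x] * prior[x] for x in prior)

# p(x|y) = p(y|x) p(x) / p(y)
posterior = {x: likelihood[x] * prior[x] / evidence for x in prior}
```

With these numbers the evidence is 0.45 and the posterior probability of the door being open rises from 0.5 to 2/3.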



Probabilistic approaches : Bayesian filtering
Introduction of time

We now add time to the notations.

• The random variable xt represents the system state at time t (for example : the robot
position or speed).
• The random variable ut represents the control action applied by the system at time t (for
example : a go-forward command, manipulating an object from the environment, etc.).
• The random variable zt represents the measurements of the environment returned by the
system's sensors at time t (for example : the measured distance from a landmark, a
rotation estimation, etc.).



Probabilistic approaches : Bayesian filtering
About notations

p(xt1:t2) = p(xt1, xt1+1, . . . , xt2)

with t1 ≤ t2



Probabilistic approaches : Bayesian filtering
Recursive state estimation

In the usual case, and according to the hypotheses already made, the emergence of the state
xt may be conditioned on all past states, measurements and controls :

p(xt | x0:t−1, z1:t−1, u1:t)



Probabilistic approaches : Bayesian filtering
Additional hypotheses

The state is conditionally independent of the past measurements :

p(xt | x0:t−1, z1:t−1, u1:t) = p(xt | x0:t−1, u1:t)

The state is complete, that is to say, the system is modeled by a Markov chain :

p(xt | x0:t−1, u1:t) = p(xt | xt−1, ut)

And :

p(zt | x0:t, z1:t−1, u1:t) = p(zt | xt)



Probabilistic approaches : Bayesian filtering
To summarize

The system state at time t is stochastically dependent on the state at time t − 1 and on the
control action at time t.
The measurement at time t is stochastically dependent on the system state at time t.
Such a model is named a hidden Markov model (HMM) or a dynamic Bayesian network.



Probabilistic approaches : Bayesian filtering
Hidden Markov model

[Graphical model : state chain x0 → x1 → · · · → xt → xt+1, where each transition is driven by the control ut and each state xt emits the measurement zt]



Probabilistic approaches : Bayesian filtering
Belief

In the literature : belief, state of knowledge or information state :

bel(xt) = p(xt | z1:t, u1:t)

Prediction :

bel⁻(xt) = p(xt | z1:t−1, u1:t)

In an algorithm, the process of computing bel(xt) from bel⁻(xt) is named correction or
measurement update.



Presentation

1 Probability basics
Reminders about probabilities
Bayesian filtering
2 Practical approach of Bayesian filters
Usual definition of a Bayesian filtering algorithm
Hypotheses to specify
3 Nonparametric filters
Discrete Bayesian filter
Particle filter
4 Gaussian filters
Kalman filter
Kalman filter variants



Bayesian filters
Usual definition

bel(xt) = p(xt | z1:t, u1:t)

bel⁻(xt) = p(xt | z1:t−1, u1:t)

Inputs : bel(xt−1), ut, zt
At time t, for each state xt ∈ ΩX :
• bel⁻(xt) = ∫ p(xt | ut, xt−1) bel(xt−1) dxt−1   (prediction)
• bel(xt) = η p(zt | xt) bel⁻(xt)   (measurement update)
with η such that ∫ bel(xt) dxt = 1
Outputs : bel(xt)



Presentation

1 Probability basics
Reminders about probabilities
Bayesian filtering
2 Practical approach of Bayesian filters
Usual definition of a Bayesian filtering algorithm
Hypotheses to specify
3 Nonparametric filters
Discrete Bayesian filter
Particle filter
4 Gaussian filters
Kalman filter
Kalman filter variants



Bayesian filters
An illustration



Bayesian filters
How to implement

To implement a Bayesian filter, there are three hypotheses to define :

• the state transition probability : p(xt | xt−1, ut)
• the measurement probability : p(zt | xt)
• the initial belief : bel(x0) (i.e. p(x0))



Bayesian filters
How to implement

Bayesian filters

• Gaussian filters : Kalman filter, Information filter, EKF, UKF
• Nonparametric filters : Discrete Bayes filter, Histogram filter, Particle filter



Presentation

1 Probability basics
Reminders about probabilities
Bayesian filtering
2 Practical approach of Bayesian filters
Usual definition of a Bayesian filtering algorithm
Hypotheses to specify
3 Nonparametric filters
Discrete Bayesian filter
Particle filter
4 Gaussian filters
Kalman filter
Kalman filter variants



Bayesian filters
Discrete Bayesian filter

With a finite number of possible states, the algorithm follows directly from the general
Bayesian filter.

Inputs : {pk,t−1}, ut, zt
At time t, for each state xk ∈ ΩX with probability pk,t :
• p⁻k,t = Σi p(Xt = xk | ut, Xt−1 = xi) pi,t−1   (prediction)
• pk,t = η p(zt | Xt = xk) p⁻k,t   (measurement update)
with η such that Σk pk,t = 1
Outputs : {pk,t}
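The two steps above can be sketched in a few lines; the `transition` and `likelihood` tables are placeholders that the caller must supply for a given model:

```python
# One step of a discrete Bayes filter.
# transition[i][k] stands for p(X_t = x_k | u_t, X_{t-1} = x_i),
# likelihood[k] stands for p(z_t | X_t = x_k).
def discrete_bayes_step(belief, transition, likelihood):
    n = len(belief)
    # Prediction: p⁻_{k,t} = Σ_i p(x_k | u_t, x_i) p_{i,t-1}
    predicted = [sum(transition[i][k] * belief[i] for i in range(n)) for k in range(n)]
    # Measurement update: p_{k,t} = η p(z_t | x_k) p⁻_{k,t}
    unnorm = [likelihood[k] * predicted[k] for k in range(n)]
    eta = 1.0 / sum(unnorm)
    return [eta * u for u in unnorm]

# Illustrative two-state model (made-up numbers)
belief = [0.5, 0.5]
transition = [[0.9, 0.1], [0.2, 0.8]]
likelihood = [0.7, 0.1]   # the sensor reading strongly favors state 0
belief = discrete_bayes_step(belief, transition, likelihood)
```

After one step the belief concentrates on state 0, and the η normalization keeps the probabilities summing to 1.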



Practical example
Model choices : possible states space and initial belief

ΩX is the set of indices of the image database. We define bel(xt) = p(xt | z1:t, u1:t), the
probability that, at time t, the vehicle is at the location of the database image with index ω.

bel(x0) = p(x0) is defined as a uniform PDF over the database indices (no prior information
on the vehicle position).



Practical example
Model choices : state transition probability

For each t > 0 and ω ∈ ΩX :


p(Xt = ω + 1 | Xt−1 = ω) = 1/3
p(Xt = ω − 1 | Xt−1 = ω) = 1/3
p(Xt = ω | Xt−1 = ω) = 1/3



Practical example
Model choices : measurement probability

For each t > 0 and ω ∈ ΩX :


p(Zt = ω | Xt = ω) = 0.7
p(Zt ≠ ω | Xt = ω) = 0.3



Practical example
Discrete Bayesian filter

Inputs : {pk,t−1}, zt
At time t, for each state xk ∈ ΩX with probability pk,t :
• p⁻k,t = Σi p(Xt = xk | Xt−1 = xi) pi,t−1   (prediction)
• pk,t = η p(zt | Xt = xk) p⁻k,t   (measurement update)
with η such that Σk pk,t = 1
Outputs : {pk,t}



Practical example
Discrete Bayesian filter

Prediction :

p⁻k,t = (pk−1,t−1 + pk,t−1 + pk+1,t−1) / 3

Measurement update :

p̃k,t = 0.7 p⁻k,t if zt = k
p̃k,t = 0.3 p⁻k,t otherwise

η = 1 / Σk p̃k,t

pk,t = η p̃k,t
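Putting the two model choices together gives a runnable sketch of this localization filter; the database size, the observed index and the circular boundary handling are assumptions made for the example:

```python
# Discrete Bayes filter for localization over database image indices,
# with the 1/3 neighbor motion model and the 0.7/0.3 measurement model.
N = 10                        # number of database images (assumption)
belief = [1.0 / N] * N        # bel(x0): uniform, no prior on the vehicle position

def step(belief, z):
    n = len(belief)
    # Prediction: p⁻_{k,t} = (p_{k-1,t-1} + p_{k,t-1} + p_{k+1,t-1}) / 3
    # (indices wrap around; the slides do not specify the boundary handling)
    pred = [(belief[(k - 1) % n] + belief[k] + belief[(k + 1) % n]) / 3
            for k in range(n)]
    # Measurement update: weight 0.7 where the measurement matches, 0.3 elsewhere
    unnorm = [(0.7 if k == z else 0.3) * pred[k] for k in range(n)]
    eta = 1.0 / sum(unnorm)
    return [eta * p for p in unnorm]

belief = step(belief, z=4)    # the sensor matches database image 4
```

Starting from a uniform belief, a single matching measurement makes index 4 the most probable location while all other indices keep a residual probability.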



Practical example
IR and visible images



Practical example
Confusion matrix
[Figure : confusion matrix between the query images and the database images ; values range from 0.0 to about 0.6]



Presentation

1 Probability basics
Reminders about probabilities
Bayesian filtering
2 Practical approach of Bayesian filters
Usual definition of a Bayesian filtering algorithm
Hypotheses to specify
3 Nonparametric filters
Discrete Bayesian filter
Particle filter
4 Gaussian filters
Kalman filter
Kalman filter variants



Bayesian filters
Usual definition

bel(xt) = p(xt | z1:t, u1:t)

bel⁻(xt) = p(xt | z1:t−1, u1:t)

Inputs : bel(xt−1), ut, zt
At time t, for each state xt ∈ ΩX :
• bel⁻(xt) = ∫ p(xt | ut, xt−1) bel(xt−1) dxt−1   (prediction)
• bel(xt) = η p(zt | xt) bel⁻(xt)   (measurement update)
with η such that ∫ bel(xt) dxt = 1
Outputs : bel(xt)



Bayesian filtering
Bayesian filters implementation



Bayesian filtering
Particle filter

The belief bel(xt) is a set Ψt of m particles ψt[i] at time step t :

Ψt = {ψt[1], ψt[2], . . . , ψt[i], . . . , ψt[m]}

Each particle is a possible state vector together with its weight wt[i] :

ψt[i] = {xt[i], wt[i]} = {[x1,t[i], x2,t[i], . . . , xn,t[i]]ᵀ, wt[i]}



Bayesian filters
Particle filter

Inputs : Ψt−1, ut, zt
At time t :
• Ψ⁻t = Ψt = ∅
For i = 1 to m :
• sample ψt[i] ∼ p(xt | ut, xt−1[i])   (prediction)
• wt[i] = p(zt | xt[i])   (measurement update)
• Ψ⁻t = Ψ⁻t + {xt[i], wt[i]}
For i = 1 to m :
• draw an index i with probability proportional to wt[i]   (resampling)
• add ψt[i] to Ψt
Outputs : Ψt
Bayesian filters
Particle filter

In this algorithm, the prediction consists in applying the evolution to each particle according to
the state transition model. The measurement update defines the weighting of each particle
according to the sensor model and the data. An additional "resampling" step deletes
particles with a low weight and duplicates particles with a high one.
There are different ways to resample particles (variable vs. fixed number of particles,
random samples to avoid particle starvation, etc.)
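The loop above can be sketched for a one-dimensional state; the Gaussian motion and measurement noise models, their standard deviations, and the simple multinomial resampling are illustrative assumptions:

```python
# Minimal particle filter for a 1-D state: sample from the motion model,
# weight by the measurement model, then resample in proportion to the weights.
import math
import random

random.seed(0)

def gauss_pdf(x, mu, sigma):
    # 1-D Gaussian density used as the measurement likelihood p(z | x)
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def particle_filter_step(particles, u, z, motion_sigma=0.5, meas_sigma=1.0):
    # Prediction: sample x_t[i] ~ p(x_t | u_t, x_{t-1}[i])
    predicted = [x + u + random.gauss(0, motion_sigma) for x in particles]
    # Measurement update: w_t[i] = p(z_t | x_t[i])
    weights = [gauss_pdf(z, x, meas_sigma) for x in predicted]
    # Resampling: draw particles with probability proportional to their weight
    return random.choices(predicted, weights=weights, k=len(particles))

particles = [random.uniform(-10, 10) for _ in range(500)]
for z in [1.0, 2.0, 3.0]:           # measurements as the state moves by u = 1 each step
    particles = particle_filter_step(particles, u=1.0, z=z)

estimate = sum(particles) / len(particles)   # point estimate from the particle cloud
```

Starting from particles spread uniformly over [−10, 10], three prediction/update/resampling rounds concentrate the cloud near the last measurement.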



Presentation

1 Probability basics
Reminders about probabilities
Bayesian filtering
2 Practical approach of Bayesian filters
Usual definition of a Bayesian filtering algorithm
Hypotheses to specify
3 Nonparametric filters
Discrete Bayesian filter
Particle filter
4 Gaussian filters
Kalman filter
Kalman filter variants



Gaussian filters
Gaussian probability density function

System states and sensor models are defined by Gaussian probability density functions,
one-dimensional or multidimensional.

One-dimensional Gaussian PDF :

p(x) = (1 / (σ√(2π))) exp(−(1/2) ((x − µ)/σ)²)

Multivariate Gaussian PDF :

p(x) = (1 / ((2π)^(N/2) |Σ|^(1/2))) exp(−(1/2) (x − µ)ᵀ Σ⁻¹ (x − µ))
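For reference, the one-dimensional density can be evaluated directly with the standard library:

```python
# 1-D Gaussian probability density function.
import math

def gaussian_pdf(x, mu, sigma):
    """N(x; mu, sigma^2) = exp(-((x - mu)/sigma)^2 / 2) / (sigma * sqrt(2*pi))"""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# The density peaks at the mean, with value 1 / (sigma * sqrt(2*pi)).
peak = gaussian_pdf(0.0, 0.0, 1.0)
```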



Gaussian filters
Kalman filter

To apply a Kalman filter, the system has to be linear.

The state transition model is linear :

xt = At xt−1 + Bt ut + εt

and the sensor model is linear too :

zt = Ct xt + δt



Gaussian filters
Kalman filter

Inputs : µt−1, Σt−1, ut, zt

At time t :
• Prediction :
  • µ⁻t = At µt−1 + Bt ut
  • Σ⁻t = At Σt−1 Atᵀ + Rt
• Correction :
  • Kt = Σ⁻t Ctᵀ (Ct Σ⁻t Ctᵀ + Qt)⁻¹
  • µt = µ⁻t + Kt (zt − Ct µ⁻t)
  • Σt = (I − Kt Ct) Σ⁻t
Outputs : µt, Σt
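In the scalar (1-D) case the matrix equations reduce to ordinary arithmetic, which makes the algorithm easy to check by hand; the constants A, B, C, R and Q below are illustrative:

```python
# One prediction/correction cycle of a scalar Kalman filter.
def kalman_step(mu, sigma2, u, z, A=1.0, B=1.0, C=1.0, R=0.1, Q=0.5):
    # Prediction
    mu_bar = A * mu + B * u                  # µ⁻_t = A_t µ_{t-1} + B_t u_t
    sigma2_bar = A * sigma2 * A + R          # Σ⁻_t = A_t Σ_{t-1} A_tᵀ + R_t
    # Correction
    K = sigma2_bar * C / (C * sigma2_bar * C + Q)   # Kalman gain K_t
    mu = mu_bar + K * (z - C * mu_bar)       # µ_t = µ⁻_t + K_t (z_t − C_t µ⁻_t)
    sigma2 = (1.0 - K * C) * sigma2_bar      # Σ_t = (I − K_t C_t) Σ⁻_t
    return mu, sigma2

mu, sigma2 = 0.0, 1.0
for z in [1.1, 2.0, 2.9]:                    # noisy observations of a +1-per-step motion
    mu, sigma2 = kalman_step(mu, sigma2, u=1.0, z=z)
```

After three steps the estimate tracks the true state (≈ 3) and the variance shrinks as measurements accumulate.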



Presentation

1 Probability basics
Reminders about probabilities
Bayesian filtering
2 Practical approach of Bayesian filters
Usual definition of a Bayesian filtering algorithm
Hypotheses to specify
3 Nonparametric filters
Discrete Bayesian filter
Particle filter
4 Gaussian filters
Kalman filter
Kalman filter variants



Gaussian filters
Kalman filter variants

• IF, Information Filter : a Kalman filter with an alternative parametrization of the Gaussian
PDF : (µ, Σ) → Ω = Σ⁻¹ and ξ = Σ⁻¹ µ
• EKF, Extended Kalman Filter : for a non-linear system → Taylor expansion at each time
step in the neighborhood of the state estimate
• ESKF, Error-State Kalman Filter : estimation of the error state instead of the state itself ;
needs an additional step in the recursive algorithm to reset the error state to zero.
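The Information Filter parametrization is easy to illustrate in the scalar case, where the information matrix and vector are plain numbers:

```python
# Moments form (µ, Σ) ↔ canonical/information form (ξ, Ω) of a Gaussian,
# in 1-D where Σ is a scalar variance.
mu, sigma2 = 2.0, 0.5

omega = 1.0 / sigma2     # Ω = Σ⁻¹ (information "matrix")
xi = omega * mu          # ξ = Σ⁻¹ µ (information vector)

# Recovering the moments form inverts the mapping
mu_back = xi / omega
sigma2_back = 1.0 / omega
```

The canonical form makes measurement updates additive, which is the practical motivation for this parametrization.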

