Lecture 7 Tracking
Dr Christos Kyrkou
KIOS Research and Innovation Center of Excellence,
Electrical and Computer Engineering Department, University of Cyprus
[email protected]
ECE 627
Applications
❑Motion capture
❑Augmented Reality
❑Action Recognition
❑Security, traffic monitoring
❑Video Compression
❑Video Summarization
❑Medical Screening
Tracking Examples
❑Traffic: https://www.youtube.com/watch?v=DiZHQ4peqjg
❑Soccer: http://www.youtube.com/watch?v=ZqQIItFAnxg
❑Face: http://www.youtube.com/watch?v=i_bZNVmhJ2o
❑Body: https://www.youtube.com/watch?v=_Ahy0Gh69-M
❑Eye: http://www.youtube.com/watch?v=NCtYdUEMotg
❑Gaze: http://www.youtube.com/watch?v=-G6Rw5cU-1c
Steps of tracking
❑Prediction: What is the next state of the object given past
measurements?
P(X_t | Y_0 = y_0, …, Y_{t−1} = y_{t−1})
Simplifying assumptions
❑Only the immediate past matters (the Markov assumption):
P(X_t | X_0, …, X_{t−1}) = P(X_t | X_{t−1})  ← dynamics model
❑Measurements depend only on the current state:
P(Y_t | X_0, Y_0, …, X_{t−1}, Y_{t−1}, X_t) = P(Y_t | X_t)  ← observation model
[Figure: hidden Markov model — a chain of hidden states emitting observations Y_0, Y_1, Y_2, …, Y_{t−1}, Y_t]
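Taken together, the two assumptions define a generative model that can be simulated directly. A minimal sketch, assuming a hypothetical 1-D random-walk state with Gaussian process and measurement noise (the function name and noise levels are illustrative, not from the lecture):

```python
import random

# Hypothetical 1-D model illustrating the two assumptions:
#   dynamics model     P(X_t | X_{t-1}):  x_t = x_{t-1} + process noise
#   observation model  P(Y_t | X_t):      y_t = x_t + measurement noise
def simulate(steps=5, q=0.1, r=0.5, seed=0):
    rng = random.Random(seed)
    x = 0.0
    states, measurements = [], []
    for _ in range(steps):
        x = x + rng.gauss(0.0, q)  # next state depends only on the previous state
        y = x + rng.gauss(0.0, r)  # measurement depends only on the current state
        states.append(x)
        measurements.append(y)
    return states, measurements

states, measurements = simulate()
```

The loop body makes the assumptions concrete: each new `x` is computed from the previous `x` alone, and each `y` from the current `x` alone.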
Problem statement
❑We have models for
▪ Likelihood of next state given current state: P(X_t | X_{t−1})
▪ Likelihood of observation given the state: P(Y_t | X_t)
Probabilistic tracking
❑Base case:
▪ Start with an initial prior that predicts the state in the absence of any
evidence: P(X_0)
▪ For the first frame, correct this given the first measurement Y_0 = y_0:
P(X_0 | Y_0 = y_0) = P(y_0 | X_0) P(X_0) / P(y_0) ∝ P(y_0 | X_0) P(X_0)
(posterior = likelihood × prior / normalizing factor)
Probabilistic tracking (induction step)
❑Given the corrected estimate for frame t−1:
▪ Predict for frame t ➔ P(X_t | y_0, …, y_{t−1})
▪ Observe y_t; correct for frame t ➔ P(X_t | y_0, …, y_{t−1}, y_t)
Prediction
❑Prediction involves representing P(X_t | y_0, …, y_{t−1})
given P(X_{t−1} | y_0, …, y_{t−1})
P(X_t | y_0, …, y_{t−1})
= ∫ P(X_t, X_{t−1} | y_0, …, y_{t−1}) dX_{t−1}  (law of total probability)
= ∫ P(X_t | X_{t−1}, y_0, …, y_{t−1}) P(X_{t−1} | y_0, …, y_{t−1}) dX_{t−1}  (conditioning on X_{t−1})
= ∫ P(X_t | X_{t−1}) P(X_{t−1} | y_0, …, y_{t−1}) dX_{t−1}  (independence assumption)
i.e. dynamics model × corrected estimate from the previous step
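With a finite state space, the prediction integral becomes a sum, which makes the step easy to check numerically. A minimal sketch assuming a hypothetical 3-state dynamics model (the transition matrix values are made up for illustration):

```python
# Discrete-state prediction step:
#   P(X_t = j | y_{0:t-1}) = sum_i P(X_t = j | X_{t-1} = i) * P(X_{t-1} = i | y_{0:t-1})
def predict(belief, transition):
    n = len(belief)
    return [sum(transition[i][j] * belief[i] for i in range(n)) for j in range(n)]

T = [
    [0.8, 0.2, 0.0],  # row i gives P(X_t = j | X_{t-1} = i)
    [0.1, 0.8, 0.1],
    [0.0, 0.2, 0.8],
]
corrected_prev = [1.0, 0.0, 0.0]  # corrected estimate from the previous step
predicted = predict(corrected_prev, T)  # [0.8, 0.2, 0.0]
```

Note how the dynamics model spreads a certain belief (all mass on state 0) across neighboring states, exactly as the integral does in the continuous case.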
Correction
❑Correction involves computing P(X_t | y_0, …, y_t)
given the predicted value P(X_t | y_0, …, y_{t−1})
P(X_t | y_0, …, y_t)
= P(y_t | X_t, y_0, …, y_{t−1}) P(X_t | y_0, …, y_{t−1}) / P(y_t | y_0, …, y_{t−1})  (Bayes' rule)
= P(y_t | X_t) P(X_t | y_0, …, y_{t−1}) / P(y_t | y_0, …, y_{t−1})  (independence assumption: the observation depends only on the current state X_t)
= P(y_t | X_t) P(X_t | y_0, …, y_{t−1}) / ∫ P(y_t | X_t) P(X_t | y_0, …, y_{t−1}) dX_t  (conditioning on X_t)
i.e. observation model × predicted estimate / normalization factor
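In the same finite-state setting, the correction step is a pointwise product with the observation likelihood, followed by normalization. A minimal sketch with hypothetical numbers (the likelihood values are made up for illustration):

```python
# Discrete-state correction step: observation model * predicted estimate,
# divided by the normalization factor.
def correct(predicted, likelihood):
    unnorm = [l * p for l, p in zip(likelihood, predicted)]
    z = sum(unnorm)  # sum_i P(y_t | X_t = i) * P(X_t = i | y_{0:t-1})
    return [u / z for u in unnorm]

predicted = [0.8, 0.2, 0.0]   # predicted estimate from the prediction step
likelihood = [0.1, 0.9, 0.5]  # hypothetical P(y_t | X_t = i) for the observed y_t
posterior = correct(predicted, likelihood)  # approx [0.308, 0.692, 0.0]
```

Even though the prediction favored state 0, a measurement that is much more likely under state 1 pulls most of the posterior mass onto state 1 — the data can override the dynamics.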
[Figures: 1-D tracking example — ground truth vs. observation vs. correction (comparison); uncertainty after observation and correction]
Particle filtering
❑Represent the state distribution non-parametrically, as a set of weighted samples (particles)
❑Prediction: Sample possible values X_{t−1} of the previous state and propagate them through the dynamics model
❑Correction: Reweight the samples for X_t by the observation likelihood P(y_t | X_t)
Particle filtering
▪ Start with weighted samples from the previous time step
▪ Sample and shift according to the dynamics model
▪ Spread due to randomness; this is the predicted density P(X_t | y_0, …, y_{t−1})
▪ Weight the samples according to the observation density
▪ Arrive at the corrected density estimate P(X_t | y_0, …, y_t)
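The steps above can be sketched as a minimal 1-D particle filter. This is an illustrative sketch, assuming random-walk dynamics and a Gaussian observation density (the function names and noise levels are not from the lecture):

```python
import math
import random

def gaussian_pdf(y, mu, sigma):
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def particle_filter_step(particles, weights, y, q=0.2, r=0.5, rng=random):
    n = len(particles)
    # start with weighted samples from the previous time step (resample)
    particles = rng.choices(particles, weights=weights, k=n)
    # sample and shift according to the dynamics model; the added noise
    # spreads the particles out -> predicted density P(X_t | y_{0:t-1})
    particles = [x + rng.gauss(0.0, q) for x in particles]
    # weight the samples according to the observation density P(y_t | X_t)
    weights = [gaussian_pdf(y, x, r) for x in particles]
    total = sum(weights)
    weights = [w / total for w in weights]  # corrected density estimate
    return particles, weights

rng = random.Random(0)
particles = [rng.gauss(0.0, 1.0) for _ in range(500)]
weights = [1.0 / 500] * 500
particles, weights = particle_filter_step(particles, weights, y=1.0, rng=rng)
estimate = sum(w * x for w, x in zip(weights, particles))  # posterior mean
```

Because the density is represented by samples rather than a parametric form, nothing in this loop requires linear dynamics or Gaussian noise — which is exactly the selling point noted below.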
❑Particle filter:
▪ Handles non-linear dynamics and non-Gaussian noise
❑Kalman filter:
▪ Assumes linear dynamics and Gaussian noise
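In the linear-Gaussian case, the predict/correct recursion has a closed form: the Kalman filter. A minimal 1-D sketch, assuming random-walk dynamics with process variance q and a Gaussian observation model with variance r (the values below are illustrative):

```python
# Minimal 1-D Kalman filter: the state belief stays Gaussian, so predict and
# correct reduce to updating a mean and a variance.
def kalman_step(mean, var, y, q=0.1, r=0.5):
    # predict: push the Gaussian through the (random-walk) dynamics model
    mean_pred, var_pred = mean, var + q
    # correct: Bayes' rule with the Gaussian observation model
    k = var_pred / (var_pred + r)  # Kalman gain
    mean_new = mean_pred + k * (y - mean_pred)
    var_new = (1.0 - k) * var_pred
    return mean_new, var_new

mean, var = 0.0, 1.0  # initial prior P(X_0)
for y in [0.9, 1.1, 1.0]:
    mean, var = kalman_step(mean, var, y)
# the estimate moves toward the measurements (~1.0) and the variance shrinks
```

The gain k plays the role of the prediction-vs-correction trade-off discussed next: small r (a trusted observation model) pushes k toward 1 and the filter follows the data; small q (a trusted dynamics model) pushes k toward 0 and the filter follows its prediction.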
Tracking issues
❑Initialization
▪ Manual
▪ Background subtraction
▪ Detection
❑Getting observation and dynamics models
❑Uncertainty of prediction vs. correction
▪ If the dynamics model is too strong, tracking will end up ignoring the data
▪ If the observation model is too strong, tracking is reduced to repeated detection
❑Data association
▪ When tracking multiple objects, need to assign the right measurements to the right tracks (particle filters are well suited for this)
❑Drift
▪ Errors can accumulate over time
Things to remember
❑Tracking objects = detection + prediction
❑Probabilistic framework
▪ Predict next state
▪ Update current state based on observation