Unit 2-4
PROBABILISTIC REASONING II
UNIT II
Contents
Kalman filters
But even subtle phenomena, such as sensor drift, sudden de-calibration, and
the effects of exogenous conditions (such as weather) on sensor readings, can
be handled by explicit representation within dynamic Bayesian networks.
DYNAMIC BAYESIAN NETWORK - Inference
• Dynamic Bayesian networks are Bayesian networks, so we can apply the inference algorithms already developed for ordinary Bayesian networks
• Given a sequence of observations, one can construct the full Bayesian network
representation of a DBN by replicating slices until the network is large enough
to accommodate the observations. This construction is called unrolling.
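The unrolling step can be sketched as follows. The representation here (a two-slice template given as prior, transition, and sensor CPTs, replicated once per time step) is a minimal illustrative assumption, not a standard library API.

```python
# Minimal sketch of unrolling a DBN: the two-slice template (prior,
# transition model, sensor model) is replicated once per observed time
# step, yielding an ordinary "static" Bayesian network.

def unroll_dbn(prior, transition, sensor, num_steps):
    """Return the node list of the unrolled network.

    prior, transition, sensor: CPTs of the template slices (hypothetical
    dict representation); num_steps: number of observed time steps.
    """
    nodes = [("X0", prior)]                      # slice 0: prior on the state
    for t in range(1, num_steps + 1):
        nodes.append((f"X{t}", transition))      # X_t depends on X_{t-1}
        nodes.append((f"E{t}", sensor))          # E_t depends on X_t
    return nodes

# Unrolling for 3 observations yields 1 + 2*3 = 7 nodes.
network = unroll_dbn(prior={True: 0.5, False: 0.5},
                     transition={True: 0.7, False: 0.3},
                     sensor={True: 0.9, False: 0.2},
                     num_steps=3)
print(len(network))  # 7
```

Once unrolled, the network is just a Bayesian network, so any standard inference algorithm applies to it.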
One can use any of the standard inference algorithms on the unrolled network: variable elimination, clustering methods, and so on.
Summing out variables is exactly what the variable elimination algorithm
(Figure 14.11) does, and it turns out that running variable elimination with the
variables in temporal order exactly mimics the operation of the recursive
filtering update.
As the variable elimination proceeds, the factors grow to include all the state
variables: the maximum factor size is O(d^(n+k)) and the total update cost per
step is O(n d^(n+k)), where n is the number of state variables, d is the maximum
domain size, and k is the maximum number of parents of any state variable.
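The recursive filtering update that variable elimination mimics can be sketched directly. The numbers below (transition 0.7, sensor 0.9/0.2 over a Boolean "rain" state with umbrella observations) are illustrative assumptions.

```python
def forward(belief, transition, sensor, evidence):
    """One recursive filtering update: predict, then weight and normalize.

    belief: dict mapping state -> P(X_t = state | e_1:t)
    transition[x0][x1] = P(X_{t+1} = x1 | X_t = x0)
    sensor[x1][e]      = P(E_{t+1} = e | X_{t+1} = x1)
    """
    # Prediction step: sum out the previous state variable X_t.
    predicted = {x1: sum(transition[x0][x1] * belief[x0] for x0 in belief)
                 for x1 in belief}
    # Update step: weight by the evidence likelihood, then normalize.
    unnorm = {x1: sensor[x1][evidence] * predicted[x1] for x1 in predicted}
    z = sum(unnorm.values())
    return {x1: p / z for x1, p in unnorm.items()}

# Illustrative rain/umbrella model (assumed numbers).
transition = {True: {True: 0.7, False: 0.3}, False: {True: 0.3, False: 0.7}}
sensor = {True: {True: 0.9, False: 0.1}, False: {True: 0.2, False: 0.8}}
belief = {True: 0.5, False: 0.5}                 # prior P(X_0)
belief = forward(belief, transition, sensor, True)
print(round(belief[True], 3))  # 0.818
```

Running this update once per time step keeps only a constant-size belief state, which is exactly why filtering avoids the factor blow-up described above.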
Approximate Inference
• Likelihood weighting
• Markov chain Monte Carlo
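As a sketch of how likelihood weighting applies to an unrolled DBN: the state variables are sampled forward from the model, while the evidence variables are never sampled and instead multiply the sample's weight. All model numbers below are illustrative assumptions (the same rain/umbrella values as before).

```python
import random

def lw_filter(num_samples, evidence, rng):
    """Estimate P(X_T = True | e_1:T) by likelihood weighting."""
    prior = 0.5                        # P(X_0 = True), assumed
    trans = {True: 0.7, False: 0.3}    # P(X_t = True | X_{t-1})
    sens = {True: 0.9, False: 0.2}     # P(E_t = True | X_t)
    total_w = true_w = 0.0
    for _ in range(num_samples):
        x = rng.random() < prior
        w = 1.0
        for e in evidence:
            # Sample the next state; the evidence only adjusts the weight.
            x = rng.random() < trans[x]
            w *= sens[x] if e else 1.0 - sens[x]
        total_w += w
        true_w += w * x
    return true_w / total_w

estimate = lw_filter(20000, [True], random.Random(0))
print(estimate)  # typically within a few hundredths of the exact 0.818
```

A known weakness of likelihood weighting on long sequences is that the weights degenerate as the evidence accumulates, which motivates the resampling step of particle filtering discussed next.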
The number of samples reaching state x_(t+1) from each state x_t is the
transition probability P(x_(t+1) | x_t) times the population of samples in x_t;
summing over x_t gives the expected sample count for x_(t+1).
Is a constant number of samples enough to keep the approximation accurate
over time? In practice, the answer seems to be yes: particle filtering
maintains a good approximation to the true posterior using a constant
number of samples.
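The propagate-weight-resample cycle behind this claim can be sketched as follows, again with illustrative rain/umbrella numbers:

```python
import random

def particle_filter_step(particles, trans, sens, evidence, rng):
    """One particle-filtering update with a fixed population size."""
    # 1) Propagate: sample each particle's next state from the transition model.
    moved = [rng.random() < trans[x] for x in particles]
    # 2) Weight: each particle is weighted by the likelihood of the evidence.
    weights = [sens[x] if evidence else 1.0 - sens[x] for x in moved]
    # 3) Resample: draw N new particles in proportion to the weights.
    return rng.choices(moved, weights=weights, k=len(particles))

rng = random.Random(0)
trans = {True: 0.7, False: 0.3}   # P(X_{t+1} = True | X_t), assumed
sens = {True: 0.9, False: 0.2}    # P(E = True | X), assumed
particles = [rng.random() < 0.5 for _ in range(5000)]   # sample the prior
particles = particle_filter_step(particles, trans, sens, True, rng)
estimate = sum(particles) / len(particles)
print(estimate)  # approximates the exact filtered value 0.818
```

Note that the population size stays constant across updates: the resampling step concentrates particles in high-probability states instead of letting the weights degenerate.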