Computer Vision: Models, Learning and Inference
Computer vision: models, learning and inference. ©2011 Simon J.D. Prince 2
Conditional independence
• The variable x1 is said to be conditionally
independent of x3 given x2 when x1 and x3 are
independent for fixed x2.
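This definition can be checked numerically. A minimal sketch, assuming an arbitrary discrete joint built from the factorization Pr(x1)Pr(x2|x1)Pr(x3|x2) (all tables hypothetical, filled with random values):

```python
import numpy as np

# Hypothetical example: a joint built from the factorization
# Pr(x1, x2, x3) = Pr(x1) Pr(x2 | x1) Pr(x3 | x2),
# which implies x1 is conditionally independent of x3 given x2.
rng = np.random.default_rng(0)

def random_dist(shape, axis):
    """Random conditional probability table, normalized along `axis`."""
    t = rng.random(shape)
    return t / t.sum(axis=axis, keepdims=True)

p1 = random_dist((4,), 0)       # Pr(x1)
p2_1 = random_dist((3, 4), 0)   # Pr(x2 | x1), one column per value of x1
p3_2 = random_dist((2, 3), 0)   # Pr(x3 | x2), one column per value of x2

# joint[a, b, c] = Pr(x1=a, x2=b, x3=c) = p1[a] * p2_1[b, a] * p3_2[c, b]
joint = p1[:, None, None] * p2_1.T[:, :, None] * p3_2.T[None, :, :]

# For each fixed x2 = b, check Pr(x1, x3 | x2) = Pr(x1 | x2) Pr(x3 | x2)
for b in range(3):
    cond = joint[:, b, :] / joint[:, b, :].sum()   # Pr(x1, x3 | x2=b)
    outer = np.outer(cond.sum(axis=1), cond.sum(axis=0))
    assert np.allclose(cond, outer)                # independence holds
print("x1 and x3 are independent for every fixed x2")
```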
Graphical models
• A graphical model is a graph-based
representation that makes both factorization
and conditional independence relations easy
to establish.
Directed graphical models
• A directed graphical model represents a
probability distribution that factorizes as a
product of conditional probability distributions.
Directed graphical models
• To visualize a graphical model from a factorization
– add one node per random variable and draw an arrow to each
variable from each of its parents.
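The construction above can be sketched in code. Assuming a hypothetical factorization Pr(x1)Pr(x2|x1)Pr(x3|x1,x2), the arrows of the graph follow directly from the parent lists:

```python
# Hypothetical helper: given, for each variable, its list of parents in the
# factorization Pr(x1..xN) = prod_n Pr(xn | parents(xn)), emit the arrows of
# the directed graphical model (one node per variable, one arrow from each
# parent to its child). Dict insertion order is preserved in Python 3.7+.
parents = {              # assumed example factorization:
    "x1": [],            # Pr(x1)
    "x2": ["x1"],        # Pr(x2 | x1)
    "x3": ["x1", "x2"],  # Pr(x3 | x1, x2)
}

edges = [(p, child) for child, ps in parents.items() for p in ps]
print(edges)  # [('x1', 'x2'), ('x1', 'x3'), ('x2', 'x3')]
```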
Example 1
General rule: Pr(x1...N) = prod_n Pr(xn | x_pa[n]),
where pa[n] denotes the set of parents of variable n.
Example 2
General rule:
Algebraic proof:
4 + 3x4 + 2x3 = 22 entries (factorized model)
versus 4 x 3 x 2 = 24 entries (full joint table)
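The count can be verified directly (variable state counts 4, 3, and 2, as in the slide):

```python
# Parameter-count comparison for discrete variables with 4, 3, and 2 states.
# Full joint table: one entry per joint configuration.
full = 4 * 3 * 2                 # 24 entries
# Factorized as Pr(x1) Pr(x2|x1) Pr(x3|x2): one column per parent setting.
factorized = 4 + 3 * 4 + 2 * 3   # 22 entries
print(full, factorized)  # 24 22
```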
Partition function (normalization constant):
for large systems, intractable to compute
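A brute-force illustration of why the partition function is intractable: for an assumed chain of n binary variables with hypothetical pairwise potentials, computing Z requires summing over all 2^n states:

```python
import itertools
import math

# Hypothetical undirected model: n binary variables with pairwise potentials
# along a chain. The partition function Z sums an exponential number of
# terms, which is why it is intractable for large systems.
def partition_function(n, coupling=1.0):
    """Brute-force Z: sum over all 2^n states of the product of potentials."""
    z = 0.0
    for state in itertools.product([0, 1], repeat=n):
        # potential for each chain edge: exp(coupling) if neighbors agree
        energy = sum(coupling * (state[i] == state[i + 1])
                     for i in range(n - 1))
        z += math.exp(energy)
    return z

print(partition_function(10))  # sums 2^10 = 1024 terms; 2^100 would be hopeless
```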
Alternative form
Undirected graphical models
• To visualize a graphical model from a factorization
– Sketch one node per random variable
– For every clique, sketch a connection from every node to
every other node
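A small numeric sketch of the undirected factorization: a product of clique potentials divided by the partition function Z, assuming hypothetical potentials over three binary variables with cliques {x1,x2} and {x2,x3}:

```python
import itertools

# Assumed example: Pr(x1,x2,x3) = phi12(x1,x2) * phi23(x2,x3) / Z,
# with arbitrary positive clique potentials (values are hypothetical).
def phi12(a, b):
    return 2.0 if a == b else 1.0

def phi23(b, c):
    return 3.0 if b == c else 1.0

def unnorm(a, b, c):
    """Unnormalized probability: product of clique potentials."""
    return phi12(a, b) * phi23(b, c)

# Partition function: sum of the unnormalized terms over all states
Z = sum(unnorm(*s) for s in itertools.product([0, 1], repeat=3))

def pr(a, b, c):
    return unnorm(a, b, c) / Z

# probabilities sum to 1 (up to float rounding)
print(sum(pr(*s) for s in itertools.product([0, 1], repeat=3)))
```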
Conditional independence
• Much simpler than for directed models:
Example 1
Represents factorization:
Algebraically:
Example 2
Or could be....
Executive summary:
Comparing directed and undirected models
Graphical models in computer vision
Inference in models with many unknowns
Marginal posterior distributions
Maximum marginals
Sampling the posterior
Drawing samples - directed
Ancestral sampling example
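Ancestral sampling can be sketched as follows for an assumed chain x1 -> x2 -> x3 with hypothetical conditional tables: each variable is drawn from its conditional once its parents have already been sampled.

```python
import numpy as np

# Sketch of ancestral sampling for the assumed directed chain x1 -> x2 -> x3.
# Variables are visited in an order where parents come before children.
rng = np.random.default_rng(1)

p1 = np.array([0.5, 0.5])                  # Pr(x1), hypothetical values
p2_1 = np.array([[0.9, 0.2],
                 [0.1, 0.8]])              # Pr(x2 | x1), one column per x1
p3_2 = np.array([[0.7, 0.3],
                 [0.3, 0.7]])              # Pr(x3 | x2), one column per x2

def ancestral_sample():
    x1 = rng.choice(2, p=p1)               # no parents
    x2 = rng.choice(2, p=p2_1[:, x1])      # parent x1 already fixed
    x3 = rng.choice(2, p=p3_2[:, x2])      # parent x2 already fixed
    return x1, x2, x3

samples = [ancestral_sample() for _ in range(10000)]
# empirical Pr(x1=0) should be close to 0.5
print(sum(s[0] == 0 for s in samples) / len(samples))
```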
Gibbs sampling
To generate a new sample x in the chain
– Sample each dimension in turn (any order)
– To update the nth dimension xn
• Fix the other N-1 dimensions
• Draw from the conditional distribution Pr(xn | x1...N\n)
Get samples by selecting from the chain
– Needs a burn-in period
– Choose samples spaced well apart, so they are not correlated
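The procedure above can be sketched for a small discrete model (the target joint here is an arbitrary random table, purely illustrative):

```python
import numpy as np

# Gibbs sampling sketch. Assumed target: an arbitrary positive joint over
# three binary variables. Each step resamples one dimension from its
# conditional given the other dimensions held fixed.
rng = np.random.default_rng(2)

joint = rng.random((2, 2, 2))
joint /= joint.sum()                 # arbitrary normalized joint

def gibbs_step(state):
    for n in range(3):               # update each dimension in turn
        idx = list(state)
        probs = np.empty(2)
        for v in range(2):           # conditional Pr(xn | all others)
            idx[n] = v
            probs[v] = joint[tuple(idx)]
        probs /= probs.sum()
        state[n] = rng.choice(2, p=probs)
    return state

state = [0, 0, 0]
for _ in range(200):                 # burn-in period
    gibbs_step(state)

samples = []
for _ in range(5000):
    for _ in range(5):               # space samples apart to reduce correlation
        gibbs_step(state)
    samples.append(tuple(state))
# empirical frequency of each state approaches its joint probability
```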
Gibbs sampling example: bivariate normal distribution
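For the bivariate normal case, both conditionals are themselves normal, so each Gibbs update is an exact draw. A sketch assuming a standardized bivariate normal with correlation rho = 0.8:

```python
import numpy as np

# Gibbs sampling from a standardized bivariate normal with correlation rho,
# alternating the two exact conditional distributions:
#   x1 | x2 ~ Norm(rho * x2, 1 - rho^2)
#   x2 | x1 ~ Norm(rho * x1, 1 - rho^2)
rng = np.random.default_rng(3)
rho = 0.8
sd = np.sqrt(1 - rho ** 2)           # std dev of each conditional

x1, x2 = 0.0, 0.0
samples = []
for t in range(20000):
    x1 = rng.normal(rho * x2, sd)    # update dimension 1, dimension 2 fixed
    x2 = rng.normal(rho * x1, sd)    # update dimension 2, dimension 1 fixed
    if t > 500:                      # discard burn-in
        samples.append((x1, x2))

samples = np.array(samples)
print(np.corrcoef(samples.T)[0, 1])  # should be close to rho = 0.8
```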
Learning in directed models
Learning in undirected models
Contrastive divergence
Some algebraic manipulation
Now approximate:
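A minimal contrastive-divergence sketch, assuming a toy undirected model Pr(x1,x2) proportional to exp(theta * x1 * x2) over binary variables (model, data, and learning rate are all hypothetical). The exact log-likelihood gradient is E_data[x1 x2] - E_model[x1 x2]; CD-1 replaces the intractable model expectation with statistics from one Gibbs sweep started at the data:

```python
import numpy as np

# CD-1 for the toy model Pr(x) ~ exp(theta * x1 * x2), x1, x2 in {0, 1}.
rng = np.random.default_rng(4)

def gibbs_sweep(x, theta):
    """One Gibbs sweep: Pr(xn = 1 | other) = sigmoid(theta * other)."""
    for n in (0, 1):
        other = x[:, 1 - n]
        p1 = 1.0 / (1.0 + np.exp(-theta * other))
        x[:, n] = (rng.random(len(x)) < p1).astype(float)
    return x

# Hypothetical training data: each variable 1 with probability 0.8
data = (rng.random((500, 2)) < 0.8).astype(float)

theta, lr = 0.0, 0.5
for _ in range(200):
    pos = np.mean(data[:, 0] * data[:, 1])    # data statistics
    recon = gibbs_sweep(data.copy(), theta)   # one Gibbs sweep from the data
    neg = np.mean(recon[:, 0] * recon[:, 1])  # approximate model statistics
    theta += lr * (pos - neg)                 # CD-1 update
print(theta)  # positive: the learned model favors x1 == x2 == 1
```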
Conclusions
Can characterize joint distributions as
– Graphical models
– Sets of conditional independence relations
– Factorizations
Two types of graphical model represent different
but overlapping subsets of the possible
conditional independence relations
– Directed (learning easy, sampling easy)
– Undirected (learning hard, sampling hard)