SP14 CS188 Lecture 13 - Markov Models
Markov Models
Instructors: Dan Klein and Pieter Abbeel --- University of California, Berkeley
[These slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley. All CS188 materials are available at http://ai.berkeley.edu.]
Reasoning over Time or Space
X1 → X2 → X3 → X4
Joint distribution:
P(X1, X2, X3, X4) = P(X1) P(X2|X1) P(X3|X2) P(X4|X3)

More generally:
P(X1, X2, ..., XT) = P(X1) P(X2|X1) P(X3|X2) ... P(XT|XT-1)
Questions to be resolved:
Does this indeed define a joint distribution?
Can every joint distribution be factored this way, or are we making assumptions about the joint distribution by using this factorization?
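The first question can be checked numerically: as long as the initial distribution and every CPT row sum to 1, the Markov factorization sums to 1 over all joint assignments. A minimal sketch in Python (the state names and probabilities are the weather values used later in the lecture):

```python
from itertools import product

# Two-state chain with the weather example's numbers
states = ["sun", "rain"]
p_x1 = {"sun": 1.0, "rain": 0.0}              # initial distribution P(X1)
p_trans = {"sun": {"sun": 0.9, "rain": 0.1},  # P(Xt | Xt-1 = sun)
           "rain": {"sun": 0.3, "rain": 0.7}}  # P(Xt | Xt-1 = rain)

def joint(seq):
    """P(X1..XT) = P(X1) * product over t of P(Xt | Xt-1): the Markov factorization."""
    p = p_x1[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        p *= p_trans[prev][cur]
    return p

# Summing over all length-4 sequences gives 1, so this is a valid joint distribution.
total = sum(joint(seq) for seq in product(states, repeat=4))
print(total)  # 1.0 (up to floating-point error)
```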
Chain Rule and Markov Models
X1 → X2 → X3 → X4
From the chain rule, every joint distribution over X1, X2, X3, X4 can be written as:
P(X1, X2, X3, X4) = P(X1) P(X2|X1) P(X3|X1, X2) P(X4|X1, X2, X3)

Assuming that X3 is independent of X1 given X2, i.e. P(X3|X1, X2) = P(X3|X2),
and that X4 is independent of X1, X2 given X3, i.e. P(X4|X1, X2, X3) = P(X4|X3),
we obtain the Markov model factorization:
P(X1, X2, X3, X4) = P(X1) P(X2|X1) P(X3|X2) P(X4|X3)
CPT P(Xt | Xt-1): Two new ways of representing the same CPT

Xt-1   Xt     P(Xt|Xt-1)
sun    sun    0.9
sun    rain   0.1
rain   sun    0.3
rain   rain   0.7

[State-transition diagram: self-loop sun→sun 0.9, sun→rain 0.1, rain→sun 0.3, self-loop rain→rain 0.7]
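The table and the diagram encode the same function. As a minimal sketch, the CPT can be stored as a nested dict and used to sample a successor state (state names and probabilities are from the slide; the sampling helper is illustrative):

```python
import random

# P(Xt | Xt-1) from the slide, as a nested dict: outer key = Xt-1, inner key = Xt
cpt = {"sun": {"sun": 0.9, "rain": 0.1},
       "rain": {"sun": 0.3, "rain": 0.7}}

def sample_next(prev_state, rng=random):
    """Sample Xt given Xt-1 by inverting the cumulative distribution of the CPT row."""
    r = rng.random()
    cumulative = 0.0
    for state, p in cpt[prev_state].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point round-off

# Each row of the CPT is itself a distribution over the next state:
for prev, row in cpt.items():
    assert abs(sum(row.values()) - 1.0) < 1e-12
```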
Example Markov Chain: Weather

Initial distribution: 1.0 sun

[State-transition diagram: sun→sun 0.9, sun→rain 0.1, rain→sun 0.3, rain→rain 0.7]

Xt-1   Xt     P(Xt|Xt-1)
sun    sun    0.9
sun    rain   0.1
rain   sun    0.3
rain   rain   0.7
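Repeatedly pushing the current distribution through the CPT (a mini forward simulation; variable names are illustrative) shows P(Xt) drifting from the initial 1.0 sun toward a stationary distribution:

```python
# Transition model P(Xt | Xt-1) from the table above
p_trans = {"sun": {"sun": 0.9, "rain": 0.1},
           "rain": {"sun": 0.3, "rain": 0.7}}

dist = {"sun": 1.0, "rain": 0.0}  # initial distribution: 1.0 sun

for t in range(2, 52):
    # P(Xt = x) = sum over x' of P(Xt = x | Xt-1 = x') * P(Xt-1 = x')
    dist = {x: sum(p_trans[prev][x] * p for prev, p in dist.items())
            for x in ("sun", "rain")}
    if t in (2, 3, 4):
        print(t, round(dist["sun"], 4))  # 0.9, then 0.84, then 0.804

print("limit:", dist["sun"])  # approaches 3/4
```

The successive values of P(Xt = sun) contract toward 3/4 no matter where the chain starts, which previews the stationary-distribution calculation below.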
Random variable X in {a, b, c}
Transition matrix: [shown as a figure; the entries into a, used below, are P(a|a) = 2/5, P(a|b) = 1/5, P(a|c) = 1/5]
P∞(a) = P(a|a) · P∞(a) + P(a|b) · P∞(b) + P(a|c) · P∞(c)
⇒ P∞(a) = (2/5) P∞(a) + (1/5) P∞(b) + (1/5) P∞(c)
⇒ 5 P∞(a) = 2 P∞(a) + P∞(b) + P∞(c)
⇒ 3 P∞(a) − P∞(b) − P∞(c) = 0        (i)
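Equation (i) uses only the probabilities into state a, so it constrains the stationary distribution of any transition matrix with those entries. A sketch that checks this by power iteration; the text gives only the entries into a, so the remaining rows of the matrix below are assumed purely for illustration:

```python
# Only the probabilities into state a are given in the text:
# P(a|a) = 2/5, P(a|b) = 1/5, P(a|c) = 1/5.
# The remaining entries are ASSUMED for illustration (each row sums to 1).
T = {"a": {"a": 2/5, "b": 2/5, "c": 1/5},
     "b": {"a": 1/5, "b": 2/5, "c": 2/5},
     "c": {"a": 1/5, "b": 2/5, "c": 2/5}}

# Power iteration: push a starting distribution through T until it stops changing.
pi = {"a": 1.0, "b": 0.0, "c": 0.0}
for _ in range(200):
    pi = {x: sum(T[prev][x] * p for prev, p in pi.items()) for x in "abc"}

# The limiting distribution satisfies equation (i): 3 P∞(a) − P∞(b) − P∞(c) = 0
print(3 * pi["a"] - pi["b"] - pi["c"])
```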