W 10 Markov Model

The key takeaways are that Markov models can be used to model sequential or temporal data by representing the dependencies between current and previous states. They make the Markov assumption that future states only depend on the current state. Some applications include weather prediction, speech recognition, and modeling user behavior on websites.

A Markov model represents a system as being in one of a set of states. It models the probability of transitioning between these states over time. The probabilities of transitioning to the next state depend only on the current state, not on the sequence of events that preceded it.

The Markov assumption states that the probability of moving to the next state depends only on the current state, not on the sequence of past states. This allows complex sequential problems to be modeled more simply. It is important because it allows probabilities to be estimated from limited data by ignoring longer histories.

Markov Model

ISS3102 – Kecerdasan Buatan (Artificial Intelligence)


Motivation

What is the word at the end of this ________?

Markov Chain

[Figure: a Markov chain over the words of the question above ("What is the word at the end of this ___"). From the state "this", the chain transitions to candidate completions: "sentence" with probability 0.6, "message" with 0.3, "paragraph" with 0.05, and "line" with 0.05.]
Markov Chain: Weather Prediction Example
• Design a Markov chain to predict tomorrow's weather using information about the weather on past days.

• We have three types of weather: sunny, rainy, and cloudy.

• So our model has 3 states, S = {S1, S2, S3}, where S1 = Sunny, S2 = Rainy, and S3 = Cloudy.

• Assume that the weather lasts all day, i.e., it doesn't change from rainy to sunny in the middle of the day.

Markov Chain: Weather Prediction Example
• Assume a simplified model of weather prediction:

• Collect statistics on what the weather was like today based on what the weather was like yesterday, the day before, and so forth, to obtain the probabilities:

P(w_n | w_{n-1}, w_{n-2}, …, w_1)

• With this expression, we can give probabilities for the types of weather tomorrow and on following days using an n-day history.

Markov Chain: Weather Prediction Example
• For example, if the weather for the past three days was {sunny, sunny, cloudy}, the probability that tomorrow would be rainy is given by:

P(w_4 = Rainy | w_3 = Cloudy, w_2 = Sunny, w_1 = Sunny)

• Problem: the larger n is, the more statistics we must collect. Suppose n = 5; then we must collect statistics for 3^5 = 243 past histories. Therefore, we make the Markov assumption:

• In a sequence w_1, w_2, …, w_n:

P(w_n | w_{n-1}, w_{n-2}, …, w_1) ≈ P(w_n | w_{n-1})

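The cost of conditioning on longer histories can be made concrete with a quick count; the variable names below are illustrative, using the 3 weather types and n = 5 from this example:

```python
# k = number of weather types, n = history length (values from the example).
k, n = 3, 5

# Conditioning on the full n-day history: one statistic per possible history.
full_histories = k ** n      # 3^5 = 243

# First-order Markov assumption: condition only on yesterday's weather.
markov_histories = k         # 3
```

The exponential growth in `full_histories` is exactly why the Markov assumption is worth making.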
Markov Assumption
• P(w_n | w_{n-1}, w_{n-2}, …, w_1) ≈ P(w_n | w_{n-1}) is called a first-order Markov assumption, since we say that the probability of an observation at time n depends only on the observation at time n − 1.

• A second-order Markov assumption would have the observation at time n depend on the observations at times n − 1 and n − 2.

• We can express the joint probability using the Markov assumption:

P(w_1, …, w_n) = ∏_{i=1}^{n} P(w_i | w_{i-1})

(with the convention that P(w_1 | w_0) = P(w_1))

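The factored joint probability can be sketched as a small Python function. The helper name `joint_prob` and the toy probabilities used to exercise it are illustrative, not from the slides; `initial` plays the role of P(w_1):

```python
def joint_prob(seq, initial, trans):
    """P(w1, ..., wn) under the first-order Markov assumption:
    P(w1) * product of P(wi | wi-1) for i = 2..n.

    `initial` maps state -> P(w1);
    `trans` maps (prev, cur) -> P(cur | prev).
    """
    p = initial[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        p *= trans[(prev, cur)]
    return p
```

Note that only pairwise statistics P(cur | prev) are needed, no matter how long the sequence is.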
Markov Chain: Weather Prediction Example
• Let's arbitrarily pick some numbers for P(w_tomorrow | w_today), expressed in Table 1:

Table 1. Probabilities of tomorrow's weather based on today's weather

                     Tomorrow's Weather
  Today's Weather    Sunny   Rainy   Cloudy
  Sunny              0.8     0.05    0.15
  Rainy              0.2     0.6     0.2
  Cloudy             0.2     0.3     0.5

• For first-order Markov models, we can use these probabilities to draw a probabilistic finite-state automaton.

Markov Chain: Weather Prediction Example
• For the weather domain, you would have three states (Sunny, Rainy, Cloudy), and every day you would transition to a (possibly) new state based on the probabilities in Table 1.

• Such an automaton would look like this (each state's outgoing probabilities sum to 1):

P(Sunny | Sunny) = 0.8    P(Rainy | Sunny) = 0.05    P(Cloudy | Sunny) = 0.15
P(Sunny | Rainy) = 0.2    P(Rainy | Rainy) = 0.6     P(Cloudy | Rainy) = 0.2
P(Sunny | Cloudy) = 0.2   P(Rainy | Cloudy) = 0.3    P(Cloudy | Cloudy) = 0.5

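The automaton can be walked forward to sample weather sequences. A sketch assuming the Table 1 probabilities; `simulate` is an illustrative helper, not part of the slides:

```python
import random

# P(tomorrow | today) from Table 1.
TRANS = {
    "Sunny":  {"Sunny": 0.8, "Rainy": 0.05, "Cloudy": 0.15},
    "Rainy":  {"Sunny": 0.2, "Rainy": 0.6,  "Cloudy": 0.2},
    "Cloudy": {"Sunny": 0.2, "Rainy": 0.3,  "Cloudy": 0.5},
}

def simulate(start, days, rng=random):
    """Sample a weather sequence: each day, draw tomorrow's state
    from the transition distribution of today's state."""
    seq = [start]
    for _ in range(days):
        row = TRANS[seq[-1]]
        states = list(row)
        seq.append(rng.choices(states, weights=[row[s] for s in states])[0])
    return seq
```

Passing a seeded `random.Random` as `rng` makes the sampled sequence reproducible.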
Markov Chain: Weather Prediction Example
• Exercise 1

• Given that today is Sunny, what's the probability that tomorrow is Sunny and the day after is Rainy?

P(w_2, w_3 | w_1) = P(w_3 | w_2, w_1) · P(w_2 | w_1)
                  = P(w_3 | w_2) · P(w_2 | w_1)
                  = P(Rainy | Sunny) · P(Sunny | Sunny)
                  = (0.05)(0.8)
                  = 0.04

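The Exercise 1 arithmetic can be checked numerically; `TRANS` below simply re-encodes Table 1:

```python
# P(tomorrow | today) from Table 1.
TRANS = {
    "Sunny":  {"Sunny": 0.8, "Rainy": 0.05, "Cloudy": 0.15},
    "Rainy":  {"Sunny": 0.2, "Rainy": 0.6,  "Cloudy": 0.2},
    "Cloudy": {"Sunny": 0.2, "Rainy": 0.3,  "Cloudy": 0.5},
}

# P(w2 = Sunny, w3 = Rainy | w1 = Sunny)
#   = P(Rainy | Sunny) * P(Sunny | Sunny)
p = TRANS["Sunny"]["Rainy"] * TRANS["Sunny"]["Sunny"]  # 0.05 * 0.8
```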
Markov Chain: Weather Prediction Example
• Exercise 2

• Given that today is Cloudy, what's the probability that it will be Rainy two days from now?

• There are three ways to get from Cloudy today to Rainy two days from now: {Cloudy, Cloudy, Rainy}, {Cloudy, Rainy, Rainy}, and {Cloudy, Sunny, Rainy}.

P(w_3 = R | w_1 = C) = P(w_2 = C, w_3 = R | w_1 = C)
                     + P(w_2 = R, w_3 = R | w_1 = C)
                     + P(w_2 = S, w_3 = R | w_1 = C)
                     = P(w_3 = R | w_2 = C) · P(w_2 = C | w_1 = C)
                     + P(w_3 = R | w_2 = R) · P(w_2 = R | w_1 = C)
                     + P(w_3 = R | w_2 = S) · P(w_2 = S | w_1 = C)
                     = (0.3)(0.5) + (0.6)(0.3) + (0.05)(0.2)
                     = 0.34
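Exercise 2 is a marginalization over tomorrow's weather, so the three-path sum can be written as one loop over all states; `TRANS` re-encodes Table 1:

```python
# P(tomorrow | today) from Table 1.
TRANS = {
    "Sunny":  {"Sunny": 0.8, "Rainy": 0.05, "Cloudy": 0.15},
    "Rainy":  {"Sunny": 0.2, "Rainy": 0.6,  "Cloudy": 0.2},
    "Cloudy": {"Sunny": 0.2, "Rainy": 0.3,  "Cloudy": 0.5},
}

# P(w3 = Rainy | w1 = Cloudy): sum over every possible state w2 tomorrow.
p = sum(TRANS["Cloudy"][w2] * TRANS[w2]["Rainy"] for w2 in TRANS)
```

Written this way, the computation generalizes to any number of states; it is one entry of the squared transition matrix.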
What is a Markov Model?

• A Markov model is a stochastic model of temporal or sequential data, i.e., data that are ordered.

• It provides a way to model the dependencies of current information (e.g., today's weather) on previous information.

• It is composed of states, a transition scheme between the states, and emission of outputs (discrete or continuous).

• Several goals can be accomplished by using Markov models:
  • Learn statistics of sequential data.
  • Do prediction or estimation.
  • Recognize patterns.

References
• E. Fosler-Lussier, "Markov Models and Hidden Markov Models: A Brief Tutorial," International Computer Science Institute, 1998.

• S. Russell and P. Norvig, Artificial Intelligence: A Modern Approach, Pearson Education, Inc., 2010.

