Hidden Markov Model

Submitted by: Sawera Yaseen
Roll No: 1010
Submitted to: Mr. Hassan
University of Okara
MARKOV MODEL OVERVIEW
A Markov Model is a statistical model that describes a system
that transitions between different states over time, where the
probability of moving to a future state depends only on the
current state, not on past states. This assumption is known as
the Markov Property and forms the core of these models.
Example
Consider a weather model with three states: Sunny, Cloudy,
and Rainy. The probability of tomorrow’s weather only
depends on today’s weather. If it’s sunny today, there might
be a 70% chance it remains sunny, a 20% chance it becomes
cloudy, and a 10% chance of rain.
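As a rough sketch, the following Python snippet simulates this weather chain. The Sunny row uses the 70/20/10 probabilities above; the Cloudy and Rainy rows are assumptions, borrowed from the transition matrix given later in this report.

import random

# Transition probabilities: transitions[today][tomorrow]
transitions = {
    "Sunny":  {"Sunny": 0.7, "Cloudy": 0.2, "Rainy": 0.1},
    "Cloudy": {"Sunny": 0.3, "Cloudy": 0.5, "Rainy": 0.2},  # assumed (see matrix below)
    "Rainy":  {"Sunny": 0.2, "Cloudy": 0.5, "Rainy": 0.3},  # assumed (see matrix below)
}

def simulate(start, days):
    """Walk the chain: each day depends only on the previous day (Markov Property)."""
    state, path = start, [start]
    for _ in range(days):
        state = random.choices(list(transitions[state]),
                               weights=list(transitions[state].values()))[0]
        path.append(state)
    return path

print(simulate("Sunny", 7))  # e.g. ['Sunny', 'Sunny', 'Cloudy', 'Cloudy', ...]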
States and Transitions
A Markov model is typically drawn as a diagram of states connected by arrows, where each arrow is labeled with the probability of moving from one state to the other.
Markov Property
The Markov Property states that the probability of moving to the next state depends solely on the present state, not on the sequence of states that preceded it. This is often called memorylessness.
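In symbols, writing X(t) for the state at time t, the Markov Property can be stated as:

P( X(t+1) = s | X(t), X(t-1), ..., X(1) ) = P( X(t+1) = s | X(t) )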
Types of Markov Models
• Discrete-Time Markov Chain (DTMC): used when transitions happen at regular, discrete intervals.
• Continuous-Time Markov Chain (CTMC): useful when transitions can happen at any point in time.
• Hidden Markov Model (HMM): in this type, the states themselves are not directly visible, but their effects are observed, making them "hidden."
Initial Probabilities
Cloudy 0.4, Sunny 0.4, Rainy 0.2
These are the probabilities of each weather condition on the first day, before any transition or observation is taken into account.
Transition Probability Matrix

From \ To   Cloudy   Sunny   Rainy
Cloudy       0.5      0.3     0.2
Sunny        0.4      0.4     0.2
Rainy        0.5      0.2     0.3

Explanation: This matrix shows the probabilities of transitioning from one weather condition to another:
• If the current day is Cloudy, there's a 50% chance it will remain Cloudy the next day, a 30% chance it will become Sunny, and a 20% chance of turning Rainy.
• If the day is Sunny, it has a 40% chance of staying Sunny, a 40% chance of becoming Cloudy, and a 20% chance of changing to Rainy.
• If the day is Rainy, it has a 30% chance of staying Rainy, a 20% chance of becoming Sunny, and a 50% chance of becoming Cloudy.
These transition probabilities help the model estimate the likelihood of the weather changing from one state to another from one day to the next.
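As a small sketch, the same matrix can be written as a nested Python dictionary, which makes it easy to check that each row sums to 1 and to chain transitions over two days:

# A[today][tomorrow], copied from the transition matrix above
A = {
    "Cloudy": {"Cloudy": 0.5, "Sunny": 0.3, "Rainy": 0.2},
    "Sunny":  {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2},
    "Rainy":  {"Cloudy": 0.5, "Sunny": 0.2, "Rainy": 0.3},
}

# Each row is a probability distribution over tomorrow's weather, so it sums to 1.
for row in A.values():
    assert abs(sum(row.values()) - 1.0) < 1e-9

# Probability it is Sunny two days from now, given Cloudy today:
# sum over every possible state tomorrow.
p = sum(A["Cloudy"][tomorrow] * A[tomorrow]["Sunny"] for tomorrow in A)
print(p)  # 0.5*0.3 + 0.3*0.4 + 0.2*0.2 = 0.31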
Emission Probability Matrix

Weather Condition   Umbrella   Normal   Raincoat
Cloudy                0.7        0.2      0.1
Sunny                 0.3        0.6      0.1
Rainy                 0.5        0.1      0.4

Explanation: This matrix shows the probability of each observed behavior given the day's hidden weather condition.
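As a sketch, the emission matrix can be stored the same way; each row is the distribution of observed behaviors given that day's hidden weather state:

# B[weather][behavior], copied from the emission matrix above
B = {
    "Cloudy": {"Umbrella": 0.7, "Normal": 0.2, "Raincoat": 0.1},
    "Sunny":  {"Umbrella": 0.3, "Normal": 0.6, "Raincoat": 0.1},
    "Rainy":  {"Umbrella": 0.5, "Normal": 0.1, "Raincoat": 0.4},
}

print(B["Rainy"]["Raincoat"])  # P(Raincoat | Rainy) = 0.4
assert all(abs(sum(row.values()) - 1.0) < 1e-9 for row in B.values())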
Example Walkthrough
Imagine you observe a sequence of behaviors over three days:
•Day 1: People are carrying umbrellas.
•Day 2: People go out normally.
•Day 3: People are wearing raincoats.
Using the matrices, we can estimate the most likely weather sequence.
1. Day 1: The observation is Umbrella. From the Emission Probability Matrix, Cloudy (0.7) and Rainy (0.5) are the likely weather conditions, as both have relatively high probabilities for umbrellas.
2. Day 2: The observation is Normal. Sunny is the most probable weather, with a 0.6 chance, according to the Emission Probability Matrix.
3. Day 3: The observation is Raincoat. Rainy has the highest probability for Raincoat (0.4), suggesting a rainy condition.
By considering initial probabilities, transition probabilities, and emission probabilities across the days, the model calculates the most likely weather sequence that fits the observed behaviors, which in this case turns out to be Cloudy, Sunny, Rainy (a worked calculation for this path follows below).
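As a sketch of the arithmetic, the probability of the single path Cloudy -> Sunny -> Rainy under these observations is the product of the initial, emission, and transition probabilities along the path, using the two matrices above and the initial probabilities listed earlier (Cloudy 0.4, Sunny 0.4, Rainy 0.2):

pi = {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2}   # initial probabilities
A  = {"Cloudy": {"Cloudy": 0.5, "Sunny": 0.3, "Rainy": 0.2},   # transition matrix
      "Sunny":  {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2},
      "Rainy":  {"Cloudy": 0.5, "Sunny": 0.2, "Rainy": 0.3}}
B  = {"Cloudy": {"Umbrella": 0.7, "Normal": 0.2, "Raincoat": 0.1},   # emission matrix
      "Sunny":  {"Umbrella": 0.3, "Normal": 0.6, "Raincoat": 0.1},
      "Rainy":  {"Umbrella": 0.5, "Normal": 0.1, "Raincoat": 0.4}}

path = ["Cloudy", "Sunny", "Rainy"]
obs  = ["Umbrella", "Normal", "Raincoat"]

# Day 1: initial probability times emission; later days: transition times emission.
p = pi[path[0]] * B[path[0]][obs[0]]
for t in range(1, len(obs)):
    p *= A[path[t - 1]][path[t]] * B[path[t]][obs[t]]
print(p)  # 0.4*0.7 * 0.3*0.6 * 0.2*0.4 = 0.004032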
Explanation of the 81 Possibilities
1. Three hidden states and three observations: If we consider three time steps, with three possible hidden states (Cloudy, Sunny, Rainy) at each step, we need to explore every combination of these hidden states across the sequence. Each time step has 3 choices, so over three days we get 3 × 3 × 3 = 27 combinations of states.
2. Two layers of observation and state combinations: For each sequence of hidden states, we also consider the possible observable states (Umbrella, Normal, Raincoat) associated with each hidden state. Counting 3 possible observations for each of the 27 hidden state sequences gives 27 × 3 = 81 total combinations of hidden states and observations.
81 Probabilities: Calculation Example
Given three days, we need to account for:
•All possible paths of hidden states over three days, each with three options.
•All possible sequences of observations that align with these hidden state paths.
Each possible sequence of hidden states has a probability based on the transition matrix, initial
probability, and emission probabilities. In the HMM, we would calculate the total probability
for each of these 81 paths and then choose the one with the highest probability to predict the
likely sequence of hidden states.
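A brute-force sketch of this idea: enumerate every hidden-state path over the three observed days, score each one with the initial, transition, and emission probabilities, and keep the best (pi, A, and B as in the earlier sketches):

from itertools import product

pi = {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2}
A  = {"Cloudy": {"Cloudy": 0.5, "Sunny": 0.3, "Rainy": 0.2},
      "Sunny":  {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2},
      "Rainy":  {"Cloudy": 0.5, "Sunny": 0.2, "Rainy": 0.3}}
B  = {"Cloudy": {"Umbrella": 0.7, "Normal": 0.2, "Raincoat": 0.1},
      "Sunny":  {"Umbrella": 0.3, "Normal": 0.6, "Raincoat": 0.1},
      "Rainy":  {"Umbrella": 0.5, "Normal": 0.1, "Raincoat": 0.4}}
obs = ["Umbrella", "Normal", "Raincoat"]

def score(path):
    """Joint probability of a hidden-state path and the fixed observations."""
    p = pi[path[0]] * B[path[0]][obs[0]]
    for t in range(1, len(obs)):
        p *= A[path[t - 1]][path[t]] * B[path[t]][obs[t]]
    return p

# All 3*3*3 = 27 hidden-state paths for the three days.
best = max(product(A, repeat=len(obs)), key=score)
print(best, score(best))  # ('Cloudy', 'Sunny', 'Rainy') 0.004032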
Practical Use
In real applications, calculating probabilities for all 81 combinations is computationally
expensive. Algorithms like the Viterbi algorithm help to efficiently find the most probable
sequence without calculating every possible path. The Viterbi algorithm uses dynamic
programming to reduce the computational complexity by focusing on the most likely paths at
each step.
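A minimal Viterbi sketch for the weather example above: instead of scoring every path separately, it keeps, for each day and each state, only the best path ending in that state.

def viterbi(obs, pi, A, B):
    # best[s] = (probability, path) of the most likely path ending in state s
    best = {s: (pi[s] * B[s][obs[0]], [s]) for s in A}
    for o in obs[1:]:
        best = {s: max(((p * A[prev][s] * B[s][o], path + [s])
                        for prev, (p, path) in best.items()),
                       key=lambda t: t[0])
                for s in A}
    return max(best.values(), key=lambda t: t[0])

pi = {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2}
A  = {"Cloudy": {"Cloudy": 0.5, "Sunny": 0.3, "Rainy": 0.2},
      "Sunny":  {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2},
      "Rainy":  {"Cloudy": 0.5, "Sunny": 0.2, "Rainy": 0.3}}
B  = {"Cloudy": {"Umbrella": 0.7, "Normal": 0.2, "Raincoat": 0.1},
      "Sunny":  {"Umbrella": 0.3, "Normal": 0.6, "Raincoat": 0.1},
      "Rainy":  {"Umbrella": 0.5, "Normal": 0.1, "Raincoat": 0.4}}

prob, path = viterbi(["Umbrella", "Normal", "Raincoat"], pi, A, B)
print(path, prob)  # ['Cloudy', 'Sunny', 'Rainy'] 0.004032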