HIDDEN MARKOV MODEL

SUBMITTED BY
SAWERA YASEEN
ROLL NO 1010

SUBMITTED TO
MR. HASSAN

UNIVERSITY OF OKARA
MARKOV MODEL OVERVIEW
A Markov Model is a statistical model that describes a system
that transitions between different states over time, where the
probability of moving to a future state depends only on the
current state, not on past states. This assumption is known as
the Markov Property and forms the core of these models.

Example
Consider a weather model with three states: Sunny, Cloudy,
and Rainy. The probability of tomorrow’s weather only
depends on today’s weather. If it’s sunny today, there might
be a 70% chance it remains sunny, a 20% chance it becomes
cloudy, and a 10% chance of rain.
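A transition structure like this is straightforward to sketch as data. In the snippet below, only the Sunny row uses the numbers from the example above; the Cloudy and Rainy rows are illustrative placeholders, not values from the text:

```python
import random

# Transition probabilities: keys are today's weather, values give the
# chances for tomorrow's weather. Only the Sunny row comes from the
# example in the text; the other two rows are made-up placeholders.
transitions = {
    "Sunny":  {"Sunny": 0.7, "Cloudy": 0.2, "Rainy": 0.1},
    "Cloudy": {"Sunny": 0.3, "Cloudy": 0.5, "Rainy": 0.2},  # illustrative
    "Rainy":  {"Sunny": 0.2, "Cloudy": 0.3, "Rainy": 0.5},  # illustrative
}

def next_state(today):
    """Sample tomorrow's weather from today's row only (the Markov Property)."""
    row = transitions[today]
    return random.choices(list(row), weights=list(row.values()), k=1)[0]

print(next_state("Sunny"))  # prints one of: Sunny, Cloudy, Rainy
```

Note that `next_state` looks at nothing but `today` — no history is kept anywhere, which is exactly the memorylessness discussed next.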
States and Transitions

• A state represents a condition or position in the system. In a Markov Model, the system is always in one of a finite set of states.
• Transitions are the movements from one state to another. In a basic Markov Model, the probability of transitioning from one state to another depends only on the current state.

Markov Property

• The Markov Property states that the probability of moving to the next state depends solely on the present state, not on the sequence of states that preceded it. This is often called memorylessness.
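In standard textbook notation (not from the slides), memorylessness says that conditioning on the whole history collapses to conditioning on the present state alone:

```latex
P(X_{t+1} = s \mid X_t, X_{t-1}, \ldots, X_1) = P(X_{t+1} = s \mid X_t)
```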
Types of Markov Models

• Discrete-Time Markov Chain (DTMC): Used when transitions happen at regular intervals.
• Continuous-Time Markov Chain (CTMC): Useful when transitions can happen at any time.
• Hidden Markov Model (HMM): In this type, the states themselves are not directly visible, but their effects are observed, making them "hidden."

Applications

• Finance: Stock price predictions, credit scoring.
• Natural Language Processing: Speech recognition, text generation.
• Biology: Modeling DNA sequences, gene expression.
• Robotics: Path planning and navigation.
HIDDEN MARKOV MODEL
A Hidden Markov Model (HMM) is a statistical model used to represent systems that
have hidden states which can only be observed indirectly through observable events. It is
commonly used in situations where we want to infer hidden information from visible patterns.
Simple Example:
Imagine you want to predict the weather (sunny,
cloudy, or rainy) based on people's behavior,
like whether they carry an umbrella, wear a
raincoat, or go out as usual (normal). You can't
directly observe the weather conditions every
time, but you can observe people's behavior.
Based on their actions, you can estimate the
most likely weather condition.
For instance:
• If you see people carrying umbrellas, it's likely to be cloudy or rainy.
• If people are going out without any special gear, it's likely sunny.
Using an HMM, we estimate the hidden states (weather) from the observable behaviors (what people carry).
Important Terms:
1. Hidden States: The conditions we want to infer. In our example, the hidden states are Cloudy, Sunny, and Rainy weather.
2. Observable States: The behaviors or actions we can directly observe. Here, the observable states are Umbrella, Normal, and Raincoat.
3. Initial Probabilities: The probability distribution of the weather at the start of the model. For instance, there's a 0.4 probability of starting with Cloudy weather, 0.4 for Sunny, and 0.2 for Rainy.
4. Transition Probabilities: The probability of moving from one weather condition to another. For example, if today is Cloudy, there's a 50% chance tomorrow will also be Cloudy, a 30% chance for Sunny, and 20% for Rainy.
5. Emission Probabilities: The probability of observing certain behaviors based on the weather. For example, if it's Cloudy, there's a 70% chance people will carry umbrellas, 20% for going out normally, and 10% for wearing raincoats.
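Taken together, these five ingredients fully specify the model. A minimal sketch of them as plain Python data (the variable names are mine; the numbers are the ones given in this document's tables):

```python
# The three probability tables that define the weather HMM in this document.
states = ["Cloudy", "Sunny", "Rainy"]             # hidden states
observables = ["Umbrella", "Normal", "Raincoat"]  # observable states

initial = {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2}

transition = {
    "Cloudy": {"Cloudy": 0.5, "Sunny": 0.3, "Rainy": 0.2},
    "Sunny":  {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2},
    "Rainy":  {"Cloudy": 0.3, "Sunny": 0.2, "Rainy": 0.5},
}

emission = {
    "Cloudy": {"Umbrella": 0.7, "Normal": 0.2, "Raincoat": 0.1},
    "Sunny":  {"Umbrella": 0.3, "Normal": 0.6, "Raincoat": 0.1},
    "Rainy":  {"Umbrella": 0.5, "Normal": 0.1, "Raincoat": 0.4},
}

# Sanity check: every probability row must sum to 1.
for row in [initial, *transition.values(), *emission.values()]:
    assert abs(sum(row.values()) - 1.0) < 1e-9
print("all probability rows sum to 1")
```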
Additional Details:
• HMM Matrices:
  • Initial Probability Matrix: Shows the initial probabilities for each weather condition (Cloudy: 0.4, Sunny: 0.4, Rainy: 0.2).
  • Transition Probability Matrix: Indicates the probabilities of transitioning between weather states (e.g., Cloudy to Sunny has a probability of 0.3).
  • Emission Probability Matrix: Indicates the likelihood of different observable states given a weather condition (e.g., the probability of seeing people with umbrellas if it's Cloudy is 0.7).
• Applications: HMMs are widely used in speech recognition, natural language processing, bioinformatics, and other fields where hidden patterns are inferred from observable data.
VISUAL REPRESENTATION OF PROBABILITY MATRICES
1. Initial Probability Matrix
Explanation:
The Initial Probability Matrix represents the probability of starting the day
in each weather condition. According to this table, there's a 40% chance that the
day begins as either Cloudy or Sunny, and a 20% chance it starts as Rainy.

Weather Condition    Initial Probability
Cloudy               0.4
Sunny                0.4
Rainy                0.2
2. Transition Probability Matrix
Explanation: This matrix shows the probabilities of transitioning from one weather condition to another:
• If the current day is Cloudy, there's a 50% chance it will remain Cloudy the next day, a 30% chance it will become Sunny, and a 20% chance of turning Rainy.
• If the day is Sunny, it has a 40% chance of staying Sunny, a 40% chance of becoming Cloudy, and a 20% chance of changing to Rainy.
• If the day is Rainy, it has a 50% chance of staying Rainy, a 20% chance of becoming Sunny, and a 30% chance of becoming Cloudy.
These transition probabilities let the model estimate how likely the weather is to change from one state to another from one day to the next.

From / To   Cloudy   Sunny   Rainy
Cloudy      0.5      0.3     0.2
Sunny       0.4      0.4     0.2
Rainy       0.3      0.2     0.5
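One direct use of this matrix is forecasting more than one day ahead by applying it repeatedly. A small sketch (the helper name `step` is mine, not from the slides):

```python
# Two-day forecast by applying the Transition Probability Matrix twice.
states = ["Cloudy", "Sunny", "Rainy"]
T = {
    "Cloudy": {"Cloudy": 0.5, "Sunny": 0.3, "Rainy": 0.2},
    "Sunny":  {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2},
    "Rainy":  {"Cloudy": 0.3, "Sunny": 0.2, "Rainy": 0.5},
}

def step(dist):
    """Push a probability distribution over today's weather one day forward."""
    return {s: sum(dist[p] * T[p][s] for p in states) for s in states}

today = {"Cloudy": 1.0, "Sunny": 0.0, "Rainy": 0.0}  # today is known to be Cloudy
tomorrow = step(today)      # exactly the Cloudy row: 0.5 / 0.3 / 0.2
day_after = step(tomorrow)  # roughly Cloudy 0.43, Sunny 0.31, Rainy 0.26
print(tomorrow)
print(day_after)
```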
3. Emission Probability Matrix
Explanation: This matrix indicates the likelihood of observing specific behaviors given the current weather condition:
• If the weather is Cloudy, there's a 70% chance people will carry umbrellas, a 20% chance they will go out normally, and a 10% chance of wearing raincoats.
• If the weather is Sunny, people are most likely to go out normally (60%), with lower chances of carrying an umbrella (30%) or wearing a raincoat (10%).
• If the weather is Rainy, people have a 50% chance of carrying an umbrella, a 40% chance of wearing a raincoat, and only a 10% chance of going out normally.
The emission probabilities allow the model to interpret observed behaviors (like carrying an umbrella) as clues for inferring the hidden weather conditions.

Weather Condition   Umbrella   Normal   Raincoat
Cloudy              0.7        0.2      0.1
Sunny               0.3        0.6      0.1
Rainy               0.5        0.1      0.4
Example Walkthrough
Imagine you observe a sequence of behaviors over three days:
• Day 1: People are carrying umbrellas.
• Day 2: People go out normally.
• Day 3: People are wearing raincoats.
Using the matrices, we can estimate the most likely weather sequence.
1. Day 1: The observation is Umbrella. From the Emission Probability Matrix, Cloudy (0.7) and Rainy (0.5) are the likely weather conditions, as both have relatively high probabilities for umbrellas.
2. Day 2: The observation is Normal. Sunny is the most probable weather, with a 0.6 chance, according to the Emission Probability Matrix.
3. Day 3: The observation is Raincoat. Rainy has the highest probability for Raincoat (0.4), suggesting a rainy condition.
By combining initial probabilities, transition probabilities, and emission probabilities across the days, the model calculates the most likely weather sequence that fits the observed behaviors — in this case, Cloudy, Sunny, Rainy.
Explanation of the 81 Calculations
1. Three Hidden States over Three Time Steps:
  • With three time steps and three possible hidden states (Cloudy, Sunny, Rainy) at each step, a brute-force approach must explore every combination of hidden states across the sequence.
  • Each time step offers 3 choices, so over three days there are 3 × 3 × 3 = 27 candidate state sequences.
2. From 27 Paths to 81 Calculations:
  • Scoring a path against the observed behaviors requires a probability term at each of its 3 time steps: an initial or transition probability multiplied by an emission probability.
  • Across all 27 hidden-state sequences, this gives roughly 27 × 3 = 81 individual calculations for a three-day sequence.
81 PROBABILITIES IN PRACTICE
Calculation Example:
Given three days, we need to account for:
• All possible paths of hidden states over three days, with three options at each step (27 paths in total).
• For each path, the probability terms that tie it to the observed behaviors at every step.
Each candidate sequence of hidden states gets a probability built from the initial probability, the transition matrix, and the emission probabilities. The HMM evaluates all of these paths (roughly 81 individual calculations in total) and chooses the one with the highest probability as the predicted sequence of hidden states.
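The exhaustive scoring described above can be sketched directly (matrices are the ones from this document; the variable names are mine):

```python
from itertools import product

states = ["Cloudy", "Sunny", "Rainy"]
initial = {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2}
transition = {
    "Cloudy": {"Cloudy": 0.5, "Sunny": 0.3, "Rainy": 0.2},
    "Sunny":  {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2},
    "Rainy":  {"Cloudy": 0.3, "Sunny": 0.2, "Rainy": 0.5},
}
emission = {
    "Cloudy": {"Umbrella": 0.7, "Normal": 0.2, "Raincoat": 0.1},
    "Sunny":  {"Umbrella": 0.3, "Normal": 0.6, "Raincoat": 0.1},
    "Rainy":  {"Umbrella": 0.5, "Normal": 0.1, "Raincoat": 0.4},
}

obs = ["Umbrella", "Normal", "Raincoat"]
paths = list(product(states, repeat=len(obs)))  # 3**3 = 27 candidate paths

best_path, best_p = None, 0.0
for path in paths:
    # Initial * emission for day 1, then transition * emission for each later day.
    p = initial[path[0]] * emission[path[0]][obs[0]]
    for t in range(1, len(obs)):
        p *= transition[path[t - 1]][path[t]] * emission[path[t]][obs[t]]
    if p > best_p:
        best_path, best_p = path, p

print(best_path, round(best_p, 6))  # ('Cloudy', 'Sunny', 'Rainy') 0.004032
```

This confirms the walkthrough's answer, but the loop visits every path, which is exactly the exponential cost the next section addresses.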
Practical Use
In real applications, enumerating every possible path in this way quickly becomes computationally expensive, because the number of paths grows exponentially with the length of the sequence. Algorithms like the Viterbi algorithm efficiently find the most probable sequence without scoring every path. The Viterbi algorithm uses dynamic programming: at each step it keeps only the best path leading into each state, reducing the cost from exponential in the sequence length to linear in the sequence length (and quadratic in the number of states).
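A sketch of that dynamic-programming idea, using the matrices from this document (the function structure and names are mine, not from the slides):

```python
states = ["Cloudy", "Sunny", "Rainy"]
initial = {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2}
transition = {
    "Cloudy": {"Cloudy": 0.5, "Sunny": 0.3, "Rainy": 0.2},
    "Sunny":  {"Cloudy": 0.4, "Sunny": 0.4, "Rainy": 0.2},
    "Rainy":  {"Cloudy": 0.3, "Sunny": 0.2, "Rainy": 0.5},
}
emission = {
    "Cloudy": {"Umbrella": 0.7, "Normal": 0.2, "Raincoat": 0.1},
    "Sunny":  {"Umbrella": 0.3, "Normal": 0.6, "Raincoat": 0.1},
    "Rainy":  {"Umbrella": 0.5, "Normal": 0.1, "Raincoat": 0.4},
}

def viterbi(obs):
    """Return the most probable hidden-state path for obs, and its probability."""
    # delta[s]: probability of the best path that ends in state s so far.
    delta = {s: initial[s] * emission[s][obs[0]] for s in states}
    backpointers = []
    for o in obs[1:]:
        back, new_delta = {}, {}
        for s in states:
            # Keep only the best predecessor of each state (dynamic programming).
            prev = max(states, key=lambda p: delta[p] * transition[p][s])
            back[s] = prev
            new_delta[s] = delta[prev] * transition[prev][s] * emission[s][o]
        backpointers.append(back)
        delta = new_delta
    # Trace the best path backwards from the most probable final state.
    last = max(states, key=delta.get)
    path = [last]
    for back in reversed(backpointers):
        path.append(back[path[-1]])
    return list(reversed(path)), delta[last]

path, prob = viterbi(["Umbrella", "Normal", "Raincoat"])
print(path, prob)  # ['Cloudy', 'Sunny', 'Rainy'], prob ≈ 0.004032
```

For this three-day example the answer matches brute-force enumeration, but Viterbi examines only 3 × 3 predecessor choices per step instead of all 27 paths.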
