HMM Models in NLP
Malki Aman
Roll No:10
Markov Model
• A Markov Model is a model for systems that move from one state to
another, where the next state depends only on the current state, not on
the past.
This is called the Markov Property.
• "Memoryless" process — the future only depends on the present, not the
history.
Example:
Imagine weather:
• If today is Rainy, there's a 70% chance tomorrow is Rainy, and 30% chance
it’s Sunny.
• If today is Sunny, there’s a 60% chance tomorrow is Sunny, and 40% it’s
Rainy.
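The weather chain above can be sketched in a few lines of Python. This is a minimal simulation, assuming only the two states and the transition percentages given on this slide:

```python
import random

# Transition probabilities from the slide: a two-state Rainy/Sunny chain.
transitions = {
    "Rainy": {"Rainy": 0.7, "Sunny": 0.3},
    "Sunny": {"Sunny": 0.6, "Rainy": 0.4},
}

def next_state(current):
    """Sample tomorrow's weather given only today's (the Markov property)."""
    states = list(transitions[current])
    weights = [transitions[current][s] for s in states]
    return random.choices(states, weights=weights)[0]

# Simulate a week of weather starting from a Rainy day.
day = "Rainy"
week = [day]
for _ in range(6):
    day = next_state(day)
    week.append(day)
print(week)
```

Note that `next_state` never looks at past days, only at `current`, which is exactly the memoryless property described above.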
Diagram Of MM
Continued…
P(Today Rainy) × P(Cloudy | Rainy) × P(Sunny | Cloudy) × P(Rainy | Sunny)
= 0.2 × 0.4 × 0.2 × 0.1
= 0.0016
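The product above can be verified directly. The four numbers are taken from the slide's worked example (the underlying state diagram is not reproduced here):

```python
# Probabilities from the slide's worked example.
p_today_rainy = 0.2         # P(Today = Rainy)
p_cloudy_given_rainy = 0.4  # P(Cloudy | Rainy)
p_sunny_given_cloudy = 0.2  # P(Sunny | Cloudy)
p_rainy_given_sunny = 0.1   # P(Rainy | Sunny)

# By the Markov property, the sequence probability is a simple chain product.
p_sequence = (p_today_rainy * p_cloudy_given_rainy
              * p_sunny_given_cloudy * p_rainy_given_sunny)
print(round(p_sequence, 6))  # 0.0016
```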
Hidden Markov Models
A Hidden Markov Model (HMM) is a statistical model used to
represent systems that follow a Markov process with hidden
(unobservable) states.
Commonly used in natural language processing, speech
recognition, bioinformatics, and more.
Key Concepts
• States: Hidden part of the model (e.g., weather: sunny, rainy).
• Observations: What we actually see (e.g., someone carrying an
umbrella).
• Transition Probabilities: Likelihood of moving from one state
to another.
• Emission Probabilities: Probability of an observation given a
state.
Structure of HMMs
Components:
• N: number of states
• M: number of observation symbols
• A: state transition probability matrix
• B: observation (emission) probability matrix
• π: initial state distribution
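The components above map directly onto arrays. This is a toy weather/umbrella HMM with N = 2 hidden states and M = 2 observation symbols; all the probability values are illustrative assumptions, not taken from the slides:

```python
import numpy as np

states = ["Rainy", "Sunny"]                # N = 2 hidden states
observations = ["Umbrella", "NoUmbrella"]  # M = 2 observation symbols

A = np.array([[0.7, 0.3],    # A[i, j] = P(next state j | current state i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],    # B[i, k] = P(observation k | state i)
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])    # initial state distribution

# Each row of A and B is a probability distribution, so rows sum to 1.
assert np.allclose(A.sum(axis=1), 1.0)
assert np.allclose(B.sum(axis=1), 1.0)
assert np.isclose(pi.sum(), 1.0)
```

The triple (A, B, π) fully specifies the HMM; N and M are implied by the array shapes.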
Assumptions of HMMs
Markov Assumption: The current state depends only on the previous state.
Output Independence: The observation depends only on the current hidden
state.
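These two assumptions are what make HMM computations tractable: the probability of an observation sequence can be built up one step at a time. Below is a minimal sketch of the forward algorithm for the toy weather/umbrella HMM (all probability values are illustrative assumptions):

```python
A  = [[0.7, 0.3], [0.4, 0.6]]   # P(next state j | current state i)
B  = [[0.9, 0.1], [0.2, 0.8]]   # P(observation k | state i); 0 = Umbrella
pi = [0.5, 0.5]                 # initial state distribution

def forward(obs):
    """Return P(obs), summing over all hidden state paths."""
    n = len(pi)
    # Initialization: initial distribution times the first emission
    # (output independence: the emission depends only on the state).
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # Induction: the Markov assumption means each step needs only alpha.
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)

print(forward([0, 0, 1]))  # P(Umbrella, Umbrella, NoUmbrella)
```

Without the two assumptions, computing this probability would require summing over every possible hidden path explicitly, which grows exponentially with sequence length.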
Diagram: initial probabilities, state transition probabilities, and observation (emission) probabilities
Advantages:
• Powerful for modeling time series or sequence data.
• Well understood, with established algorithms.
Limitations:
• Assumes independence between observations.
• Training can be computationally intensive.
Summary
• HMMs model systems where the state is hidden but
observations are visible.
• Used in many fields for sequence prediction and analysis.
• Understanding algorithms like Viterbi and Forward-Backward is
key to working with HMMs.
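To make the Viterbi algorithm mentioned above concrete, here is a minimal sketch for the toy weather/umbrella HMM used earlier (probability values are illustrative assumptions). It recovers the most likely hidden state sequence from the observations:

```python
states = ["Rainy", "Sunny"]
A  = [[0.7, 0.3], [0.4, 0.6]]   # transition probabilities
B  = [[0.9, 0.1], [0.2, 0.8]]   # emission probabilities; 0 = Umbrella
pi = [0.5, 0.5]                 # initial state distribution

def viterbi(obs):
    """Return the most probable hidden state path for the observations."""
    n = len(states)
    # delta[i]: probability of the best path ending in state i so far;
    # psi stores backpointers for reconstructing that path.
    delta = [pi[i] * B[i][obs[0]] for i in range(n)]
    psi = []
    for o in obs[1:]:
        step, new_delta = [], []
        for j in range(n):
            best_i = max(range(n), key=lambda i: delta[i] * A[i][j])
            step.append(best_i)
            new_delta.append(delta[best_i] * A[best_i][j] * B[j][o])
        psi.append(step)
        delta = new_delta
    # Backtrack from the best final state.
    last = max(range(n), key=lambda i: delta[i])
    path = [last]
    for step in reversed(psi):
        last = step[last]
        path.append(last)
    return [states[i] for i in reversed(path)]

print(viterbi([0, 0, 1]))  # → ['Rainy', 'Rainy', 'Sunny']
```

Unlike the forward algorithm, which sums over paths, Viterbi takes the maximum at each step, so it yields a single best explanation of the observations.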
Thank you for your precious time 😊