HMM Models in NLP

The document provides an overview of Hidden Markov Models (HMMs), which are statistical models used to represent systems with hidden states and observable outputs, commonly applied in fields like natural language processing and speech recognition. It discusses key concepts such as states, observations, and transition probabilities, along with algorithms like the Forward and Viterbi algorithms used for evaluation and decoding. The document also highlights the advantages and limitations of HMMs, emphasizing their effectiveness in modeling sequence data.


Introduction

Malki Aman
Roll No:10
HMM Models in NLP
Markov Model
• A Markov Model is a model for systems that move from one state to another, where the next state depends only on the current state, not on the past. This is called the Markov Property.
• A "memoryless" process: the future depends only on the present, not on the history.
Example:
Imagine weather:
• If today is Rainy, there is a 70% chance tomorrow is Rainy and a 30% chance it is Sunny.
• If today is Sunny, there is a 60% chance tomorrow is Sunny and a 40% chance it is Rainy.
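The two-state weather chain above can be simulated directly. A minimal Python sketch, using only the transition probabilities stated on the slide (the state names and function names are illustrative):

```python
import random

# Transition probabilities from the slide:
# Rainy -> Rainy 0.7, Rainy -> Sunny 0.3
# Sunny -> Sunny 0.6, Sunny -> Rainy 0.4
transitions = {
    "Rainy": {"Rainy": 0.7, "Sunny": 0.3},
    "Sunny": {"Sunny": 0.6, "Rainy": 0.4},
}

def next_state(current):
    """Sample tomorrow's weather given only today's (the Markov property)."""
    states, probs = zip(*transitions[current].items())
    return random.choices(states, weights=probs)[0]

def simulate(start, days):
    """Generate a weather sequence of the given length from a start state."""
    chain = [start]
    for _ in range(days - 1):
        chain.append(next_state(chain[-1]))
    return chain

print(simulate("Rainy", 5))
```

Note that `next_state` looks only at the current state, never at the earlier history — exactly the memoryless behavior described above.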
Diagram of the Markov Model
Continued

P(Today Rainy) × P(Cloudy | Rainy) × P(Sunny | Cloudy) × P(Rainy | Sunny)
= 0.2 × 0.4 × 0.2 × 0.1
= 0.0016
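The sequence-probability calculation above can be reproduced in code. A small sketch, assuming the initial and transition probabilities shown on the slide's diagram (only the entries used in the example are included):

```python
# Probabilities read off the slide's example (assumed to come from the diagram).
init_probs = {"Rainy": 0.2}
trans_probs = {
    ("Rainy", "Cloudy"): 0.4,
    ("Cloudy", "Sunny"): 0.2,
    ("Sunny", "Rainy"): 0.1,
}

def sequence_probability(sequence):
    """P(s1) * product of P(s_t | s_{t-1}) over the chain."""
    p = init_probs[sequence[0]]
    for prev, cur in zip(sequence, sequence[1:]):
        p *= trans_probs[(prev, cur)]
    return p

# Rainy -> Cloudy -> Sunny -> Rainy, approximately 0.0016
print(sequence_probability(["Rainy", "Cloudy", "Sunny", "Rainy"]))
```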
Hidden Markov Models
A Hidden Markov Model (HMM) is a statistical model used to
represent systems that follow a Markov process with hidden
(unobservable) states.
Commonly used in natural language processing, speech
recognition, bioinformatics, and more.
Key Concepts
• States: Hidden part of the model (e.g., weather: sunny, rainy).
• Observations: What we actually see (e.g., someone carrying an
umbrella).
• Transition Probabilities: Likelihood of moving from one state
to another.
• Emission Probabilities: Probability of an observation given a
state.
Structure of HMMs

Components:
N: Number of states
M: Number of observation symbols
A: State transition probability matrix
B: Observation probability matrix
π: Initial state distribution
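The five components listed above can be collected into a single structure. A sketch with illustrative weather values (the specific probabilities here are assumed, not taken from the slides' tables):

```python
# The five HMM components: states (N), observation symbols (M), A, B, and pi.
hmm = {
    "states": ["Rainy", "Sunny"],            # N = 2 hidden states
    "observations": ["Umbrella", "Normal"],  # M = 2 observation symbols
    # A: state transition probability matrix, rows sum to 1
    "A": {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
          "Sunny": {"Rainy": 0.4, "Sunny": 0.6}},
    # B: observation (emission) probability matrix, rows sum to 1
    "B": {"Rainy": {"Umbrella": 0.9, "Normal": 0.1},
          "Sunny": {"Umbrella": 0.2, "Normal": 0.8}},
    # pi: initial state distribution
    "pi": {"Rainy": 0.5, "Sunny": 0.5},
}

# Sanity check: every probability row must sum to 1.
for row in [hmm["pi"], *hmm["A"].values(), *hmm["B"].values()]:
    assert abs(sum(row.values()) - 1.0) < 1e-9
print("all probability rows sum to 1")
```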
Assumption of HMMs

Markov Assumption: The current state depends only on the previous state.
Output Independence: The observation depends only on the current hidden
state.
Diagram

[Diagram: initial probabilities, state transition probabilities, and observation (emission) probabilities]
Example

P(Raincoat | Rainy) × P(Normal | Sunny) × P(Umbrella | Cloudy) × P(Raincoat | Rainy)
× P(Today Rainy) × P(Sunny | Rainy) × P(Cloudy | Sunny) × P(Rainy | Cloudy)
= 0.5 × 0.6 × 0.4 × 0.5 × 0.2 × 0.1 × 0.3 × 0.3
= 0.000108
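The joint probability above (emissions times transitions) can be computed programmatically. A sketch using the slide's numbers; the pairing of each numeric factor with its term is assumed from the order in which the slide lists them:

```python
# Values assumed from the slide's example.
pi = {"Rainy": 0.2}
A = {("Rainy", "Sunny"): 0.1, ("Sunny", "Cloudy"): 0.3, ("Cloudy", "Rainy"): 0.3}
B = {("Rainy", "Raincoat"): 0.5, ("Sunny", "Normal"): 0.6,
     ("Cloudy", "Umbrella"): 0.4}

def joint_probability(states, obs, pi, A, B):
    """P(obs, states) = pi(s1)*B(o1|s1) * product of A(s_t|s_{t-1})*B(o_t|s_t)."""
    p = pi[states[0]] * B[(states[0], obs[0])]
    for t in range(1, len(states)):
        p *= A[(states[t - 1], states[t])] * B[(states[t], obs[t])]
    return p

states = ["Rainy", "Sunny", "Cloudy", "Rainy"]
obs = ["Raincoat", "Normal", "Umbrella", "Raincoat"]
# Approximately 0.000108, matching the slide
print(joint_probability(states, obs, pi, A, B))
```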
Algorithms Used
Forward Algorithm: for evaluation (computing the likelihood of an observation sequence)
Viterbi Algorithm: for decoding (finding the most probable hidden-state sequence)
Baum-Welch Algorithm: for training (a type of Expectation-Maximization)
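The decoding step can be sketched in plain Python. This is a minimal Viterbi implementation over dictionary-based A, B, and π; the weather probabilities in the usage example are illustrative assumptions, not values from the slides:

```python
def viterbi(obs, states, pi, A, B):
    """Return the most probable hidden-state sequence for obs, and its probability."""
    # delta[t][s]: probability of the best path ending in state s at time t
    delta = [{s: pi[s] * B[s][obs[0]] for s in states}]
    backptr = [{}]
    for t in range(1, len(obs)):
        delta.append({})
        backptr.append({})
        for s in states:
            # Best predecessor state for s at time t
            best_prev = max(states, key=lambda r: delta[t - 1][r] * A[r][s])
            delta[t][s] = delta[t - 1][best_prev] * A[best_prev][s] * B[s][obs[t]]
            backptr[t][s] = best_prev
    # Backtrack from the best final state
    last = max(states, key=lambda s: delta[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(backptr[t][path[-1]])
    return list(reversed(path)), delta[-1][last]

# Illustrative two-state weather HMM (assumed probabilities)
states = ["Rainy", "Sunny"]
pi = {"Rainy": 0.5, "Sunny": 0.5}
A = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
     "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
B = {"Rainy": {"Umbrella": 0.9, "Normal": 0.1},
     "Sunny": {"Umbrella": 0.2, "Normal": 0.8}}

path, prob = viterbi(["Umbrella", "Umbrella", "Normal"], states, pi, A, B)
print(path)  # → ['Rainy', 'Rainy', 'Sunny']
```

The key design choice is the `backptr` table: `delta` alone gives the probability of the best path, but only the stored back-pointers let us recover which state sequence achieved it.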
Applications

• Speech Recognition
• Finance / Stock Market Analysis
• Bioinformatics / DNA Sequencing
• Gesture or Motion Tracking
• Predictive Text / Spell Correction
• Natural Language Processing (NLP)
• Activity Recognition (Wearables)
• Anomaly Detection / Fraud Detection
Advantages & Limitations

Advantages:
• Powerful for modeling time series or sequence data.
• Well-understood, with established algorithms.

Limitations:
• Assumes independence between observations.
• Training can be computationally intensive.
Summary
• HMMs model systems where the state is hidden but
observations are visible.
• Used in many fields for sequence prediction and analysis.
• Understanding algorithms like Viterbi and Forward-Backward is
key to working with HMMs.
Thank you for your Precious
Time 😊
