What Is the Necessity of HMM?
• Limitation: so far we have studied the problem of
estimating the parameters of class-conditional
densities needed to make a single decision. For
sequential data, where successive observations are
not independent, we need:
• 1) Markov models
• 2) Hidden Markov models
HMM Uses
• Uses
– Speech recognition
• Recognizing spoken words and phrases
– Text processing
• Parsing raw records into structured records
– Bioinformatics
• Protein sequence prediction
– Financial
• Stock market forecasts (price pattern prediction)
• Comparison shopping services
Speech Production Model
• The essential difference between a Markov
chain and a hidden Markov model is that in a
hidden Markov model there is no one-to-one
correspondence between the states and the
symbols (Why hidden?).
• It is no longer possible to tell which state the
model was in when a symbol x_i was generated
just by looking at x_i.
First order Hidden Markov model
Observable Markov Model
• We consider a sequence of states at successive
times; the state at any time t is denoted ω(t).
• A particular sequence of length T is denoted by
ω^T = {ω(1), ω(2), ..., ω(T)}.
• For instance we might have ω^6 = {ω1, ω4, ω2,
ω2, ω1, ω4}. Note that the system can revisit a
state at different steps, and not every state
need be visited.
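For reference (this formula is not shown explicitly on the slides), a first-order Markov model assigns a probability to such a state sequence that factorizes over successive transitions:

    P(\omega^T) = P(\omega(1)) \prod_{t=1}^{T-1} P(\omega(t+1) \mid \omega(t))

so the whole sequence is scored using only pairwise transition probabilities.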
• We have been discussing a Markov model or,
technically speaking, a first-order discrete-time
Markov model, since the probability of the state
at t + 1 depends only on the state at t.
• For instance, in a Markov model for the
production of spoken words, we might have states
representing phonemes.
• Such a Markov model for the word “cat” would
have states for /k/, /a/ and /t/, with transitions
from /k/ to /a/, from /a/ to /t/, and transitions
from /t/ to a final silent state (a minimal sketch
follows below).
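The phoneme chain for “cat” is small enough to write out directly. The Python sketch below follows the slide's transition structure; making each transition deterministic (probability 1.0) is an assumed simplification, not something the slide states:

    import random

    # Markov chain for the word "cat": phoneme states plus a final silent state.
    # Deterministic transitions (probability 1.0) are an assumed simplification.
    transitions = {
        "/k/": {"/a/": 1.0},
        "/a/": {"/t/": 1.0},
        "/t/": {"<silence>": 1.0},
    }

    def generate(start="/k/"):
        """Walk the chain from the start state until the final silent state."""
        state, path = start, [start]
        while state in transitions:
            options = transitions[state]
            state = random.choices(list(options), weights=list(options.values()))[0]
            path.append(state)
        return path

    print(generate())  # ['/k/', '/a/', '/t/', '<silence>']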
Markov Chain Example
• Based on today’s weather, what will the
weather be tomorrow?
• Assume only four possible weather states:
° Sunny
° Cloudy
° Rainy
° Snowing
Markov Chain Structure
(diagram: four fully connected states – S1 Sunny, S2 Cloudy, S3 Rainy, S4 Snowing)
Markov Chain Transition Probabilities
• Transition probability matrix (rows: state at time t; columns: state at time t + 1):

              S1    S2    S3    S4    Total
        S1    a11   a12   a13   a14   1
        S2    a21   a22   a23   a24   1
        S3    a31   a32   a33   a34   1
        S4    a41   a42   a43   a44   1

• a_ij = P(q_{t+1} = S_j | q_t = S_i)
Markov Chain Transition Probabilities
• Probabilities for tomorrow’s weather based on today’s weather (rows: today, columns: tomorrow):

                  Sunny   Cloudy   Rainy   Snowing
        Sunny     0.6     0.3      0.1     0.0
        Cloudy    0.2     0.4      0.3     0.1
        Rainy     0.1     0.2      0.5     0.2
        Snowing   0.0     0.3      0.2     0.5
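As a quick check of how this table is used, the sketch below (assuming NumPy) multiplies today's weather distribution by the matrix to get tomorrow's, and applies the matrix twice for the day after:

    import numpy as np

    states = ["Sunny", "Cloudy", "Rainy", "Snowing"]
    P = np.array([[0.6, 0.3, 0.1, 0.0],
                  [0.2, 0.4, 0.3, 0.1],
                  [0.1, 0.2, 0.5, 0.2],
                  [0.0, 0.3, 0.2, 0.5]])

    today = np.array([1.0, 0.0, 0.0, 0.0])   # it is Sunny today
    print(dict(zip(states, today @ P)))       # distribution for tomorrow
    print(dict(zip(states, today @ P @ P)))   # distribution for the day after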
Example: Coke vs. Pepsi
Given that a person’s last cola purchase was Coke™,
there is a 90% chance that her next cola purchase will
also be Coke™.
If that person’s last cola purchase was Pepsi™, there
is an 80% chance that her next cola purchase will also
be Pepsi™.
(diagram: two states, Coke and Pepsi; self-transitions 0.9 and 0.8; Coke→Pepsi 0.1, Pepsi→Coke 0.2)
Coke vs. Pepsi
Given that a person is currently a Pepsi purchaser,
what is the probability that she will purchase Coke
two purchases from now?
The one-purchase-ahead transition matrix is:

        P = | 0.9  0.1 |
            | 0.2  0.8 |

(rows/columns ordered Coke, Pepsi)
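The question is answered by squaring the one-step matrix: the two-purchase-ahead probabilities are the entries of P². A short sketch:

    import numpy as np

    # Row/column order: [Coke, Pepsi]
    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    P2 = P @ P
    print(P2[1, 0])  # Pepsi -> Coke in two purchases: 0.2*0.9 + 0.8*0.2 = 0.34

So a current Pepsi purchaser buys Coke two purchases from now with probability 0.34.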
Hidden Markov Models
• A hidden Markov model (HMM) is a statistical
Markov model in which the system being
modeled is assumed to be a Markov process
with unobserved (hidden) states.
HMM Assumptions: dependence
• The Markov Assumption
° Next state only dependent on current state
• The stationary assumption
° Transition probabilities independent of the time
the transition takes place
• The output independence assumption
° Observations independent of previous
observations
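Taken together, these assumptions imply the standard factorization of the joint probability of a hidden state sequence Q = q1 … qT and an observation sequence O = o1 … oT (this formula is not on the slides but follows directly from the three assumptions; a_ij are transition probabilities, b_j(o) emission probabilities, and π the initial distribution):

    P(O, Q \mid \lambda) = \pi_{q_1}\, b_{q_1}(o_1) \prod_{t=2}^{T} a_{q_{t-1} q_t}\, b_{q_t}(o_t)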
Markov vs. HMM: Seen and Unseen Sequences
• A Markov model has an observed state sequence
° S1, S2, S3, S4, S5, S6, …
• An HMM has an unseen state sequence
° S1, S2, S3, S4, S5, S6, …
• and an observed event sequence
° O1, O2, O3, O4, O5, O6, …
• The HMM’s unseen state sequence can only be
inferred from the observed event sequence
Hidden Markov Models
(probabilistic finite state automata)
Often we face scenarios where states cannot be
directly observed.
We need an extension: Hidden Markov Models.
(diagram: a chain of states with self-transition probabilities a11, a22, a33, a44)
HMM Weather Example
• Predicting tomorrow’s weather based on today’s
• BUT the visible weather is determined by unseen
meteorological conditions
• These hidden conditions are classified as:
° Good
° Variable
° Bad
HMM Model – Markov States
(diagram: three hidden states – Good, Variable, Bad – with transitions among them)
• States are hidden
° e.g. you are stuck in a windowless room and cannot observe the weather directly
HMM Model – Linked Events
(diagram: each hidden state emits one of the observable events Sunny, Cloudy, Rainy, Snowy)
• Bakis (left-right) topology:
– As time increases, states proceed
from left to right
HMM Components
1) Evaluation
2) Decoding
3) Learning
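These three items are conventionally stated as the three basic HMM problems; in symbols, with model λ = (A, B, π) and observation sequence O:

    \text{Evaluation:} \quad P(O \mid \lambda)

    \text{Decoding:} \quad Q^{*} = \arg\max_{Q} P(Q \mid O, \lambda)

    \text{Learning:} \quad \lambda^{*} = \arg\max_{\lambda} P(O \mid \lambda)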
(HMM Backward)
(the derivation slides here were rendered as images in the original and are not recoverable)
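As a hedged reconstruction of what such forward/backward derivation slides typically contain, the standard recursions are:

    \alpha_1(j) = \pi_j\, b_j(o_1), \qquad
    \alpha_t(j) = \Big[ \sum_i \alpha_{t-1}(i)\, a_{ij} \Big] b_j(o_t)

    \beta_T(i) = 1, \qquad
    \beta_t(i) = \sum_j a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)

with the evaluation answer P(O | λ) = Σ_i α_T(i).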
Example: The Dishonest Casino
Game:
1. You bet $1
2. You roll (always with a fair die)
3. Casino player rolls (maybe with a fair die, maybe with a loaded die)
4. Highest number wins $2

The dishonest casino model:
(diagram: two states, FAIR and LOADED; each stays in its state with probability 0.95 and switches with probability 0.05)
Question # 1 – Decoding
GIVEN
1245526462146146136136661664661636616366163616515615115146123562344
QUESTION
What portion of the sequence was generated with the fair die, and what portion
with the loaded die?
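Decoding is solved with the Viterbi algorithm. The sketch below uses the slide's transition probabilities (0.95 stay, 0.05 switch); the loaded die's emission probabilities (P(6) = 0.5, 0.1 for each other face) and the uniform start distribution are assumptions, since the slide does not state them:

    import numpy as np

    rolls = "1245526462146146136136661664661636616366163616515615115146123562344"
    obs = [int(c) - 1 for c in rolls]         # faces 1..6 -> indices 0..5

    A = np.log([[0.95, 0.05], [0.05, 0.95]])  # transitions (from the slide)
    B = np.log([[1/6] * 6,                    # fair die
                [0.1] * 5 + [0.5]])           # loaded die (assumed)
    pi = np.log([0.5, 0.5])                   # assumed uniform start

    # Viterbi: best log-probability of any state path ending in each state.
    V = pi + B[:, obs[0]]
    back = []
    for o in obs[1:]:
        scores = V[:, None] + A               # scores[i, j]: from state i to j
        back.append(scores.argmax(axis=0))
        V = scores.max(axis=0) + B[:, o]

    # Trace back the most probable path.
    path = [int(V.argmax())]
    for ptr in reversed(back):
        path.append(int(ptr[path[-1]]))
    path.reverse()
    print("".join("F" if s == 0 else "L" for s in path))

The printed string marks each roll F (fair) or L (loaded), answering which portion of the sequence each die most plausibly generated.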
Question # 2 – Evaluation
GIVEN
1245526462146146136136661664661636616366163616515615115146123562344
QUESTION
How likely is this sequence, given our model of how the casino works?
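Evaluation is solved with the forward algorithm. The sketch below reuses the same assumed emission and start probabilities as the decoding sketch, and rescales at each step to avoid numerical underflow:

    import numpy as np

    rolls = "1245526462146146136136661664661636616366163616515615115146123562344"
    obs = [int(c) - 1 for c in rolls]

    A = np.array([[0.95, 0.05], [0.05, 0.95]])  # transitions (from the slide)
    B = np.array([[1/6] * 6,
                  [0.1] * 5 + [0.5]])           # loaded-die emissions assumed
    pi = np.array([0.5, 0.5])                   # assumed uniform start

    alpha = pi * B[:, obs[0]]                   # forward variables at t = 1
    log_p = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]           # forward recursion
        log_p += np.log(alpha.sum())            # accumulate the scale factors
        alpha /= alpha.sum()
    print("log P(rolls | model) =", log_p)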
Question # 3 – Learning
GIVEN
1245526462146146136136661664661636616366163616515615115146123562344
Prob(6) = 64%
QUESTION
How “loaded” is the loaded die? How “fair” is the fair die? How often
does the casino player change from fair to loaded, and back?
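Learning is solved with the Baum–Welch (EM) algorithm. As a hedged sketch of its re-estimation step (not shown on the slides), with γ_t(i) ∝ α_t(i) β_t(i) and ξ_t(i, j) ∝ α_t(i) a_ij b_j(o_{t+1}) β_{t+1}(j):

    \hat{a}_{ij} = \frac{\sum_{t=1}^{T-1} \xi_t(i, j)}{\sum_{t=1}^{T-1} \gamma_t(i)},
    \qquad
    \hat{b}_j(k) = \frac{\sum_{t :\, o_t = k} \gamma_t(j)}{\sum_{t=1}^{T} \gamma_t(j)}

Iterating these updates estimates how loaded the loaded die is, how fair the fair die is, and how often the player switches between them.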