I2ml3e Chap15
INTRODUCTION
TO
MACHINE
LEARNING
3RD EDITION
ETHEM ALPAYDIN
The MIT Press, 2014
[email protected]
http://www.cmpe.boun.edu.tr/~ethem/i2ml3e
CHAPTER 15: HIDDEN MARKOV MODELS
Transition probabilities
$a_{ij} \equiv P(q_{t+1} = S_j \mid q_t = S_i)$, with $a_{ij} \ge 0$ and $\sum_{j=1}^{N} a_{ij} = 1$
Initial probabilities
$\pi_i \equiv P(q_1 = S_i)$, with $\sum_{i=1}^{N} \pi_i = 1$
Stochastic Automaton
$P(O = Q \mid A, \Pi) = P(q_1) \prod_{t=2}^{T} P(q_t \mid q_{t-1}) = \pi_{q_1}\, a_{q_1 q_2} \cdots a_{q_{T-1} q_T}$
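The chain-rule factorization above can be sketched directly in code; the 2-state chain below uses hypothetical parameter values purely for illustration:

```python
import numpy as np

# Hypothetical 2-state Markov chain; the values are illustrative, not from the text.
A = np.array([[0.7, 0.3],    # a_ij = P(q_{t+1} = S_j | q_t = S_i)
              [0.4, 0.6]])
pi = np.array([0.5, 0.5])    # pi_i = P(q_1 = S_i)

def sequence_prob(Q, A, pi):
    """P(Q | A, pi) = pi_{q1} * a_{q1 q2} * ... * a_{q_{T-1} q_T}."""
    p = pi[Q[0]]
    for t in range(1, len(Q)):
        p *= A[Q[t - 1], Q[t]]
    return p

# Probability of the observable path S1 -> S1 -> S2
print(sequence_prob([0, 0, 1], A, pi))  # 0.5 * 0.7 * 0.3 = 0.105
```

Each factor multiplies in one transition probability, so the product telescopes exactly as in the formula.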
Example: Balls and Urns
Given $K$ example sequences of length $T$, the maximum likelihood estimates are

$\hat{\pi}_i = \dfrac{\#\{\text{sequences starting with } S_i\}}{\#\{\text{sequences}\}} = \dfrac{\sum_k 1(q_1^k = S_i)}{K}$

$\hat{a}_{ij} = \dfrac{\#\{\text{transitions from } S_i \text{ to } S_j\}}{\#\{\text{transitions from } S_i\}} = \dfrac{\sum_k \sum_{t=1}^{T-1} 1(q_t^k = S_i \text{ and } q_{t+1}^k = S_j)}{\sum_k \sum_{t=1}^{T-1} 1(q_t^k = S_i)}$
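These counting estimators are straightforward to implement; a minimal sketch with made-up observed sequences (states coded 0..N-1):

```python
import numpy as np

# Hypothetical fully observed state sequences; K = 3 sequences, N = 2 states.
sequences = [[0, 0, 1, 1], [0, 1, 1, 0], [1, 1, 0, 0]]
N = 2

# pi_hat_i = #{sequences starting in S_i} / K
pi_hat = np.zeros(N)
for q in sequences:
    pi_hat[q[0]] += 1
pi_hat /= len(sequences)

# a_hat_ij = #{transitions S_i -> S_j} / #{transitions out of S_i}
counts = np.zeros((N, N))
for q in sequences:
    for t in range(len(q) - 1):
        counts[q[t], q[t + 1]] += 1
A_hat = counts / counts.sum(axis=1, keepdims=True)

print(pi_hat)  # [2/3, 1/3]: two of three sequences start in S1
print(A_hat)   # each row sums to 1
```

Because the states are observed, no EM is needed here; estimation reduces to normalized counts.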
Hidden Markov Models
N: Number of states
M: Number of observation symbols
A = [aij]: N by N state transition probability matrix
B = [bj(m)]: N by M observation probability matrix
Π = [πi]: N by 1 initial state probability vector
(Rabiner, 1989)
Evaluation
Forward variable:
$\alpha_t(i) \equiv P(O_1 \cdots O_t, q_t = S_i \mid \lambda)$

Initialization: $\alpha_1(i) = \pi_i\, b_i(O_1)$

Recursion: $\alpha_{t+1}(j) = \left[\sum_{i=1}^{N} \alpha_t(i)\, a_{ij}\right] b_j(O_{t+1})$

$P(O \mid \lambda) = \sum_{i=1}^{N} \alpha_T(i)$
Backward variable:
$\beta_t(i) \equiv P(O_{t+1} \cdots O_T \mid q_t = S_i, \lambda)$

Initialization: $\beta_T(i) = 1$

Recursion: $\beta_t(i) = \sum_{j=1}^{N} a_{ij}\, b_j(O_{t+1})\, \beta_{t+1}(j)$
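The forward and backward recursions can be sketched as below; the toy HMM parameters are hypothetical, chosen only to exercise the code:

```python
import numpy as np

# Toy HMM (hypothetical values): N = 2 states, M = 2 observation symbols.
A  = np.array([[0.7, 0.3], [0.4, 0.6]])   # transition matrix
B  = np.array([[0.9, 0.1], [0.2, 0.8]])   # B[j, m] = b_j(m)
pi = np.array([0.6, 0.4])
O  = [0, 1, 0]                            # observed symbol indices

def forward(O, A, B, pi):
    T, N = len(O), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, O[0]]                       # alpha_1(i) = pi_i b_i(O_1)
    for t in range(T - 1):
        alpha[t + 1] = (alpha[t] @ A) * B[:, O[t + 1]]
    return alpha

def backward(O, A, B):
    T, N = len(O), A.shape[0]
    beta = np.ones((T, N))                           # beta_T(i) = 1
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, O[t + 1]] * beta[t + 1])
    return beta

alpha = forward(O, A, B, pi)
beta = backward(O, A, B)
# P(O | lambda) from the alphas, and the same value recomputed at t = 1
# from the betas as a cross-check
print(alpha[-1].sum(), (pi * B[:, O[0]] * beta[0]).sum())
```

Both expressions evaluate the same evidence $P(O \mid \lambda)$, which is a useful sanity check when implementing the two recursions.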
Finding the State Sequence
$\gamma_t(i) \equiv P(q_t = S_i \mid O, \lambda) = \dfrac{\alpha_t(i)\, \beta_t(i)}{\sum_{j=1}^{N} \alpha_t(j)\, \beta_t(j)}$

Choose the state that has the highest probability at each time step:
$q_t^* = \arg\max_i \gamma_t(i)$
No! The individually most likely states need not form a feasible sequence: some chosen transition $q_t^* \to q_{t+1}^*$ may have zero probability.
Viterbi's Algorithm
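The Viterbi algorithm finds the single most likely state path by dynamic programming over $\delta_t(i) = \max_{q_1 \cdots q_{t-1}} P(q_1 \cdots q_{t-1}, q_t = S_i, O_1 \cdots O_t \mid \lambda)$, keeping backpointers for the final backtrace. A minimal sketch (toy HMM parameters are hypothetical):

```python
import numpy as np

# Toy HMM with hypothetical parameters.
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.6, 0.4])
O  = [0, 1, 0]

def viterbi(O, A, B, pi):
    """Most likely state path Q* = argmax_Q P(Q, O | lambda)."""
    T, N = len(O), len(pi)
    delta = np.zeros((T, N))             # best score ending in S_i at time t
    psi = np.zeros((T, N), dtype=int)    # backpointers
    delta[0] = pi * B[:, O[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] * A   # scores[i, j] = delta_{t-1}(i) a_ij
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, O[t]]
    # backtrack from the best final state
    q = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        q.append(int(psi[t][q[-1]]))
    return q[::-1]

print(viterbi(O, A, B, pi))  # -> [0, 1, 0]
```

Unlike the per-step $\arg\max_i \gamma_t(i)$ choice, the path returned here is always feasible, since only nonzero-probability transitions can win the max.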
Learning

$\xi_t(i,j) \equiv P(q_t = S_i, q_{t+1} = S_j \mid O, \lambda) = \dfrac{\alpha_t(i)\, a_{ij}\, b_j(O_{t+1})\, \beta_{t+1}(j)}{\sum_k \sum_l \alpha_t(k)\, a_{kl}\, b_l(O_{t+1})\, \beta_{t+1}(l)}$

Baum-Welch algorithm (EM), with indicator variables:

$z_i^t = \begin{cases} 1 & \text{if } q_t = S_i \\ 0 & \text{otherwise} \end{cases} \qquad z_{ij}^t = \begin{cases} 1 & \text{if } q_t = S_i \text{ and } q_{t+1} = S_j \\ 0 & \text{otherwise} \end{cases}$
Baum-Welch (EM)
E-step: $E[z_i^t] = \gamma_t(i)$, $\quad E[z_{ij}^t] = \xi_t(i,j)$

M-step:

$\hat{\pi}_i = \dfrac{\sum_{k=1}^{K} \gamma_1^k(i)}{K}$

$\hat{a}_{ij} = \dfrac{\sum_{k=1}^{K} \sum_{t=1}^{T_k-1} \xi_t^k(i,j)}{\sum_{k=1}^{K} \sum_{t=1}^{T_k-1} \gamma_t^k(i)}$

$\hat{b}_j(m) = \dfrac{\sum_{k=1}^{K} \sum_{t=1}^{T_k-1} \gamma_t^k(j)\, 1(O_t^k = v_m)}{\sum_{k=1}^{K} \sum_{t=1}^{T_k-1} \gamma_t^k(j)}$
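One EM iteration for a single observation sequence can be sketched as follows; the toy parameters are hypothetical, and the $\hat{b}_j(m)$ update here sums $\gamma$ over all $T$ steps (a common variant of the formula above):

```python
import numpy as np

# Toy HMM (hypothetical values) and one observed symbol sequence.
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.6, 0.4])
O  = [0, 1, 1, 0]
T, N = len(O), 2

# E step: forward-backward, then gamma_t(i) and xi_t(i, j).
alpha = np.zeros((T, N)); beta = np.ones((T, N))
alpha[0] = pi * B[:, O[0]]
for t in range(T - 1):
    alpha[t + 1] = (alpha[t] @ A) * B[:, O[t + 1]]
for t in range(T - 2, -1, -1):
    beta[t] = A @ (B[:, O[t + 1]] * beta[t + 1])
evidence = alpha[-1].sum()                      # P(O | lambda)
gamma = alpha * beta / evidence                 # gamma[t, i] = E[z_i^t]
xi = np.array([alpha[t][:, None] * A * B[:, O[t + 1]] * beta[t + 1]
               for t in range(T - 1)]) / evidence

# M step: re-estimate pi, A, B from the expected counts.
pi_new = gamma[0]
A_new = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
B_new = np.zeros_like(B)
for m in range(B.shape[1]):
    mask = np.array([o == m for o in O])
    B_new[:, m] = gamma[mask].sum(axis=0) / gamma.sum(axis=0)
```

In practice the iteration repeats until $P(O \mid \lambda)$ stops improving; each re-estimated row remains a proper distribution.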
Continuous Observations
Discrete observations:

$P(O_t \mid q_t = S_j, \lambda) = \prod_{m=1}^{M} b_j(m)^{r_m^t}$ where $r_m^t = \begin{cases} 1 & \text{if } O_t = v_m \\ 0 & \text{otherwise} \end{cases}$

Gaussian mixture components may be used, $\mathcal{G}_l \sim \mathcal{N}(\mu_l, \Sigma_l)$, $l = 1, \ldots, L$.

Continuous observations:

$P(O_t \mid q_t = S_j, \lambda) \sim \mathcal{N}(\mu_j, \sigma_j^2)$, with the EM re-estimate $\hat{\mu}_j = \dfrac{\sum_t \gamma_t(j)\, O_t}{\sum_t \gamma_t(j)}$
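With Gaussian emissions, $b_j(O_t)$ is simply the normal density evaluated at the observation; a small sketch with hypothetical per-state parameters:

```python
import numpy as np

# Gaussian emission density b_j(O_t) = N(O_t; mu_j, sigma_j^2); values hypothetical.
mu = np.array([0.0, 3.0])       # mu_j for states S1, S2
sigma = np.array([1.0, 0.5])    # sigma_j for states S1, S2

def emission(o, j):
    """Normal pdf of observation o under state S_{j+1}."""
    return np.exp(-0.5 * ((o - mu[j]) / sigma[j]) ** 2) / (sigma[j] * np.sqrt(2 * np.pi))

# An observation near mu_2 is far more likely under S2 than under S1
print(emission(2.9, 1) > emission(2.9, 0))  # True
```

These densities slot directly into the forward-backward and Viterbi recursions wherever $b_j(O_t)$ appears.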
HMM with Input
Input-dependent observations:

$P(O_t \mid q_t = S_j, x^t, \lambda) \sim \mathcal{N}\!\left(g_j(x^t \mid \theta_j), \sigma_j^2\right)$

Time-delay input:

$x^t = f(O_{t-\tau}, \ldots, O_{t-1})$
HMM as a Graphical Model
Model Selection in HMM
Left-to-right HMMs:
$A = \begin{pmatrix} a_{11} & a_{12} & a_{13} & 0 \\ 0 & a_{22} & a_{23} & a_{24} \\ 0 & 0 & a_{33} & a_{34} \\ 0 & 0 & 0 & a_{44} \end{pmatrix}$
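A left-to-right topology is imposed simply by setting the disallowed entries to zero; since Baum-Welch re-estimation multiplies by the current $a_{ij}$, zero entries stay zero. A sketch with hypothetical values:

```python
import numpy as np

# Left-to-right transition matrix: from S_i only to S_i, S_{i+1}, S_{i+2}.
# The nonzero values are hypothetical placeholders.
A = np.array([[0.5, 0.3, 0.2, 0.0],
              [0.0, 0.5, 0.3, 0.2],
              [0.0, 0.0, 0.6, 0.4],
              [0.0, 0.0, 0.0, 1.0]])

assert np.allclose(A.sum(axis=1), 1.0)   # each row is still a distribution
assert np.allclose(A, np.triu(A))        # no backward transitions
```

The last state is absorbing ($a_{44} = 1$), which is typical for left-to-right models of sequences with a definite end, e.g. in speech recognition.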