Applications of Hidden Markov Model
Abstract
This paper performs a state-of-the-art literature review to classify and interpret the ongoing and emerging issues associated with the Hidden Markov Model (HMM) in the last decade. HMM is a commonly used method in many scientific areas. It is a temporal probabilistic model in which the state of the process is described by a single discrete random variable. The theory of HMMs was developed in the late 1960s. Now, it is especially known for its application in temporal pattern recognition, i.e. speech, handwriting, and bioinformatics. After a brief description of the study methodology, this paper comprehensively compares the most important HMM publications by field of interest, most cited authors, authors' nationalities, and scientific journals. The comparison is based on papers indexed in the Institute for Scientific Information (ISI) Web of Knowledge and ScienceDirect databases.

Keywords: Markov Chains, Hidden Markov Model, application areas, literature review.
1. Introduction
Hidden Markov Model (HMM) is a statistical model named after the Russian mathematician Andrey Markov. It is a large and useful class of stochastic processes. It is characterized by the Markov property, which means that the future state of the process depends only upon the present state, not on the sequence of events that preceded it. It is especially known for its application in temporal pattern recognition such as speech, handwriting, gesture recognition, part-of-speech tagging, musical score following, partial discharges, and bioinformatics. A repository has been established based on a classification scheme, which includes 73 papers published in 42 scholarly journals from 2003 to 2012.

The rest of the paper is set out as follows: in Section 2, we present a basic description of the HMM method, introducing its main concepts but without detailed mathematical definitions; we show only the fundamental mathematical formulas which are necessary to introduce the HMM method. Section 3 presents the methodology used for paper selection; in this part, we present basic bibliographic parameters and statistics. Afterwards, in Section 4, we show a set of the most important selected publications with respect to their primary application areas and bibliographic parameters. Section 5 contains some concluding remarks.

2. Markov Model description
Consider a system which consists of a set of N distinct states S_1, S_2, ..., S_N. At each discrete time moment t the system is in exactly one state; we denote the state at time t as q_t. In the general case the current state q_t depends on the previous state q_{t-1} and on the whole history of previous states. In the case of a Markov Chain the history is truncated to just the predecessor state. Moreover, we consider a system in which the transition probabilities between states are constant in time:

a_{ij} = P(q_t = S_j \mid q_{t-1} = S_i), \quad 1 \le i, j \le N    (1)

The initial state distribution is defined as

\pi_i = P(q_1 = S_i), \quad 1 \le i \le N    (2)
A Markov Model is therefore fully defined by the transition matrix A = {a_ij} and the initial distribution π, written as the pair

\lambda = (A, \pi)    (3)
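To make the pair (A, π) of (1)-(3) concrete, the following minimal Python sketch simulates a state sequence from a Markov chain. The three-state transition matrix, the random seed, and all variable names are hypothetical illustrations introduced here; they do not come from the paper.

import numpy as np

# Minimal sketch: sampling a state sequence q_1, ..., q_T from a Markov chain
# defined by the pair (A, pi) of equations (1)-(3). States are indexed 0..N-1.
rng = np.random.default_rng(0)

A = np.array([[0.8, 0.1, 0.1],   # a_ij = P(q_t = S_j | q_{t-1} = S_i); each row sums to 1
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
pi = np.array([0.5, 0.3, 0.2])   # pi_i = P(q_1 = S_i)

def sample_chain(A, pi, T):
    """Draw a state sequence of length T from the chain (A, pi)."""
    states = [rng.choice(len(pi), p=pi)]                         # q_1 ~ pi
    for _ in range(T - 1):
        states.append(rng.choice(A.shape[1], p=A[states[-1]]))   # q_t ~ A[q_{t-1}, :]
    return states

print(sample_chain(A, pi, T=10))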
For example, what is the probability that the system stays in state S_i for exactly d time moments, given that the system is currently in that state and given the model (where the observation sequence is defined as O = {S_i, S_i, ..., S_i, S_j \ne S_i}, observed at times 1, 2, ..., d, d+1)?
The paper only briefly describes the method. Answers to the above questions and solutions to the basic problems (presented later in this section) will not be described here, but they can be found in the appropriate literature.
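The standard answer to the duration question above, assuming it asks for the probability of staying in state S_i for exactly d time moments before moving to another state, follows directly from (1): the first d-1 transitions are self-transitions and the d-th transition leaves S_i, so

P(O \mid \text{model}, q_1 = S_i) = (a_{ii})^{d-1} (1 - a_{ii}),

i.e. the state duration in a Markov chain is geometrically distributed in the self-transition probability a_{ii}.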
In a Markov Model, the states of the model correspond to observable events. In a Hidden Markov Model the states are hidden and not observable; we can only see the sequence of observations. The set of observation symbols is finite and contains M distinct elements, and the observation symbols correspond to the physical output of the system being modelled. The HMM is thus a doubly stochastic process. The first process determines the transitions from one state to another and is identical to the process described above. The second stochastic process produces the sequence of observations: the probability of emitting symbol v_k while the system is in state S_j is collected in the matrix B = {b_j(k)}, where

b_j(k) = P(o_t = v_k \mid q_t = S_j), \quad 1 \le j \le N, \; 1 \le k \le M    (4)

The HMM is defined as a triplet:

\lambda = (A, B, \pi)    (5)
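As an illustration of the two stochastic processes behind the triplet in (5), the short sketch below generates a hidden state sequence together with an emitted observation sequence from a small discrete HMM. The two-state, three-symbol parameters and all names are invented for this example and are not taken from the paper.

import numpy as np

# Minimal sketch of the doubly stochastic process: one random process moves
# between hidden states, a second draw emits a visible symbol in each state.
rng = np.random.default_rng(1)

A  = np.array([[0.7, 0.3],                         # a_ij: hidden state transitions
               [0.4, 0.6]])
B  = np.array([[0.5, 0.4, 0.1],                    # b_j(k) = P(o_t = v_k | q_t = S_j)
               [0.1, 0.3, 0.6]])
pi = np.array([0.6, 0.4])                          # pi_i = P(q_1 = S_i)

def sample_hmm(A, B, pi, T):
    """Return (hidden states, observation indices), both of length T."""
    states, obs = [], []
    q = rng.choice(len(pi), p=pi)                  # first process: initial state
    for _ in range(T):
        states.append(q)
        obs.append(rng.choice(B.shape[1], p=B[q])) # second process: emit a symbol
        q = rng.choice(A.shape[1], p=A[q])         # first process: next state
    return states, obs

print(sample_hmm(A, B, pi, T=8))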
There are three basic problems of interest that must be solved for the model to be useful in real-world applications:
1. Given the observation sequence O = o_1, o_2, ..., o_T and the model λ = (A, B, π), how do we efficiently compute P(O | λ)?
2. Given the observation sequence O and the model λ, how do we choose a state sequence which best explains the observations?
3. How do we adjust the parameters of the triplet λ = (A, B, π) in order to maximize P(O | λ)?

The above problems can be solved with the following methods, respectively:
1. Forward-Backward Algorithm
2. Viterbi Algorithm
3. Baum-Welch reestimation procedure
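Since the paper does not reproduce these algorithms, a minimal sketch of the Forward pass for problem 1 is given below. The function name, the array layout, and the toy parameters are assumptions made for illustration only; they are not taken from the paper.

import numpy as np

def forward(A, B, pi, obs):
    """Return P(O | lambda) for a sequence `obs` of observation symbol indices."""
    N, T = A.shape[0], len(obs)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]                     # initialization: alpha_1(i) = pi_i * b_i(o_1)
    for t in range(1, T):
        # induction: alpha_t(j) = [sum_i alpha_{t-1}(i) * a_ij] * b_j(o_t)
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    return alpha[-1].sum()                           # termination: P(O | lambda) = sum_i alpha_T(i)

# Toy 2-state, 3-symbol model (hypothetical numbers; each row sums to 1)
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
pi = np.array([0.6, 0.4])
print(forward(A, B, pi, [0, 1, 2]))                  # likelihood of the symbol sequence 0, 1, 2

Replacing the sum in the induction step by a maximum (and keeping back-pointers) turns the same recursion into the Viterbi Algorithm for problem 2.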
The HMM described above can be called a non-parametric discrete HMM. Instead of the probabilities defined by the matrix B, we can use almost any parametric probability distribution, e.g. binomial, Gaussian, Poisson, etc. For example, the observation emission probability for a Poisson Discrete Hidden Markov Model is

b_j(n) = \frac{e^{-\lambda_j} \lambda_j^{n}}{n!}, \quad n \in \mathbb{N}    (6)

where b_j(n) is a Poisson model for state j with parameter λ_j. For continuous observations a Gaussian emission density can be used:

b_j(x) = \mathcal{N}(x, \mu_j, \sigma_j)    (7)

In discrete and continuous Hidden Markov Models the use of mixture probabilities is also possible. For example, the Gaussian Mixture HMM has the following form:
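As the section breaks off here, a standard form of the Gaussian mixture emission density is given as an assumption consistent with (7), with M_j components in state j and non-negative weights c_{jm} that sum to one:

b_j(x) = \sum_{m=1}^{M_j} c_{jm} \, \mathcal{N}(x, \mu_{jm}, \sigma_{jm}), \qquad \sum_{m=1}^{M_j} c_{jm} = 1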