
Markov Chain Model

A Markov chain is a mathematical system that undergoes transitions from one state to another. The next state depends only on the current state and not on the sequence of events that preceded it. Markov chains have many applications as statistical models of real-world processes.


Guided By: Malaram Sir | Prepared By: Chinmay Patel [09BCE038]

 A Markov chain, named after Andrey Markov, is a mathematical system that undergoes transitions from one state to another, between a finite or countable number of possible states.
 It is a random process characterized as memoryless: the next state depends only on the current state and not on the sequence of events that preceded it. This specific kind of "memorylessness" is called the Markov property.
 Markov chains have many applications as statistical models of real-world processes.

 The state of the system at time t+1 depends only on the state of the system at time t.

X1 → X2 → X3 → X4 → X5 (a sequence of states forming a chain)

 Since the system changes randomly, it is generally impossible to predict with certainty the state of a Markov chain at a given point in the future. However, the statistical properties of the system's future can be predicted. In many applications, it is these statistical properties that are important.

 There are two ways of describing Markov chains: through state transition diagrams or as simple graphical models.
 The changes of state of the system are called transitions, and the probabilities associated with the various state changes are called transition probabilities.
 A transition diagram is a directed graph over the possible states, where the arcs between states specify all allowed transitions (those occurring with non-zero probability).
 The chain can also be represented as a transition matrix.
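As a minimal sketch, a transition matrix can be stored as a nested list, with one row per state; the two states and their probabilities below are hypothetical, chosen only to illustrate the representation. Every row of a valid transition matrix must sum to 1.

```python
# Hypothetical two-state chain; the states and probabilities are illustrative.
states = ["A", "B"]
P = [
    [0.7, 0.3],  # transition probabilities out of state A
    [0.1, 0.9],  # transition probabilities out of state B
]

def is_stochastic(matrix):
    """Check the defining property of a transition matrix: each row sums to 1."""
    return all(abs(sum(row) - 1.0) < 1e-9 for row in matrix)

print(is_stochastic(P))  # → True
```

The row index is "current state" and the column index is "next state", so `P[i][j]` reads directly as the probability of the arc from state i to state j in the transition diagram.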

 Weather example:
   - Raining today → 40% rain tomorrow, 60% no rain tomorrow
   - Not raining today → 20% rain tomorrow, 80% no rain tomorrow

Transition matrix (rows: today's state; columns: tomorrow's state; order: rain, no rain):

    P = | 0.4  0.6 |
        | 0.2  0.8 |

A simple two-state Markov chain represented by a transition diagram: "rain" stays with probability 0.4 and moves to "no rain" with probability 0.6; "no rain" stays with probability 0.8 and moves to "rain" with probability 0.2.
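The weather chain above can be iterated numerically: starting from "raining today", repeatedly multiplying the state distribution by the transition matrix gives the probability of rain on each future day. This is a small sketch using the matrix from the example; the 50-day horizon is an arbitrary choice to show convergence.

```python
P = [[0.4, 0.6],   # from "rain":    40% rain, 60% no rain
     [0.2, 0.8]]   # from "no rain": 20% rain, 80% no rain

def step(dist, P):
    """Advance one day: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]        # it is raining today (probability 1 on "rain")
for _ in range(50):      # iterate many days forward
    dist = step(dist, P)

print([round(p, 3) for p in dist])  # → [0.25, 0.75]
```

Regardless of the starting state, the distribution settles to [0.25, 0.75]: the long-run (stationary) probabilities of rain and no rain for this chain, which is exactly the kind of statistical prediction mentioned earlier.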

 In graphical models, on the other hand, one focuses on explicating variables and their dependencies.
 At each time point the random walk is in a particular state X(t). This is a random variable, and its value is affected only by the random variable X(t−1), which specifies the state of the random walk at the previous time point.
 Graphically, we can therefore write a sequence of random variables in which arcs specify how the values of the variables are influenced by (dependent on) others: X(t−1) → X(t) → X(t+1)

 A game of snakes and ladders, or any other game whose moves are determined entirely by dice, is a Markov chain.
 In these dice games, the only thing that matters is the current state of the board. The next state of the board depends on the current state and the next roll of the dice; it does not depend on how things got to their current state.
 But in a game such as blackjack, a player can gain an advantage by remembering which cards have already been shown (and hence which cards are no longer in the deck), so the next state (or hand) of the game is not independent of the past states.

 A famous Markov chain is the so-called "drunkard's walk", a random walk on the number line where, at each step, the position may change by +1 or −1 with equal probability.
 For example, the transition probabilities from 5 to 4 and from 5 to 6 are both 0.5, and all other transition probabilities from 5 are 0. These probabilities are independent of whether the system was previously in 4 or 6.
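The drunkard's walk is easy to simulate: a sketch with a fixed seed for reproducibility (the 100-step length is arbitrary).

```python
import random

def drunkards_walk(steps, seed=0):
    """Random walk on the integers: each step is +1 or -1 with equal probability."""
    rng = random.Random(seed)
    position = 0
    for _ in range(steps):
        position += rng.choice([+1, -1])  # the next position depends only on the current one
    return position

final = drunkards_walk(100)
```

Note two invariants the Markov structure guarantees: after an even number of steps the position is always even, and the position can never exceed the number of steps taken.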

 Discrete-time Markov chain: the system evolves through discrete time steps, so changes to the system can only happen at one of those discrete time values, e.g., snakes and ladders.
 Continuous-time Markov chain: changes to the system can happen at any time along a continuous interval. An example is the number of cars that have visited a drive-through at a local fast-food restaurant during the day: a car can arrive at any time t rather than at discrete time intervals.
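The drive-through example can be sketched as a Poisson arrival process, the simplest continuous-time Markov chain: waiting times between cars are exponentially distributed (and hence memoryless), so arrivals can land at any real-valued time. The rate of 10 cars per hour and the 8-hour day are made-up numbers for illustration.

```python
import random

def arrival_times(rate, horizon, seed=0):
    """Simulate Poisson arrivals up to time `horizon`.
    Gaps between arrivals are exponential with the given rate, so a car
    can arrive at any continuous time t, not just on a discrete grid."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)  # memoryless exponential waiting time
        if t > horizon:
            return times
        times.append(t)

day = arrival_times(rate=10.0, horizon=8.0)  # ~10 cars/hour over an 8-hour day
```

The state of the system (number of cars seen so far) jumps by one at each arrival time, and the exponential gaps are what make the process Markovian in continuous time.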

 Ergodic (or irreducible) Markov chain: a Markov chain with the property that the complete set of states S is itself irreducible. Equivalently, one can go from any state in S to any other state in S in a finite number of steps.
 Absorbing Markov chain: a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that, once entered, cannot be left.
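In matrix terms, a state i is absorbing exactly when P[i][i] = 1, i.e. its row keeps all probability on itself. A small sketch over a hypothetical three-state chain:

```python
def absorbing_states(P):
    """A state i is absorbing when P[i][i] == 1: once entered, it cannot be left."""
    return [i for i, row in enumerate(P) if row[i] == 1.0]

# Hypothetical 3-state chain in which state 2 is absorbing.
P = [[0.5, 0.4, 0.1],
     [0.3, 0.3, 0.4],
     [0.0, 0.0, 1.0]]  # all probability stays on state 2

print(absorbing_states(P))  # → [2]
```

Since states 0 and 1 both reach state 2 with positive probability, this chain is absorbing in the sense defined above: every state can reach the absorbing state.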

 Markov chains are applied in a number of ways to many different fields. Often they are used as a mathematical model of some random physical process.
 Markovian systems appear extensively in thermodynamics and statistical mechanics, whenever probabilities are used to represent unknown or unmodelled details of the system.
 Markov chain methods have also become very important for generating sequences of random numbers that accurately reflect very complicated desired probability distributions, via a process called Markov chain Monte Carlo (MCMC).
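A minimal MCMC sketch: a Metropolis sampler over three states whose target weights are made up for illustration. The chain proposes a neighbouring state and accepts it with probability min(1, target ratio); its long-run visit frequencies approximate the normalized target distribution.

```python
import random

def metropolis(target, n_samples, seed=0):
    """Metropolis sampler over states 0..len(target)-1 with a symmetric +-1
    (wrap-around) proposal. `target` holds unnormalized probabilities; the
    samples approximate target / sum(target) in the long run."""
    rng = random.Random(seed)
    n = len(target)
    x = 0
    samples = []
    for _ in range(n_samples):
        proposal = (x + rng.choice([-1, 1])) % n
        # Accept with probability min(1, target[proposal] / target[x]).
        if rng.random() < min(1.0, target[proposal] / target[x]):
            x = proposal
        samples.append(x)
    return samples

weights = [1.0, 2.0, 1.0]  # unnormalized target: state 1 twice as likely
samples = metropolis(weights, 10000)
freq = [samples.count(s) / len(samples) for s in range(3)]
```

Here the normalized target is [0.25, 0.5, 0.25], and `freq` lands close to it: the Markov chain itself is the random-number generator.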

 Markov chains are the basis for the analytical treatment of queues (queueing theory). This makes them critical for optimizing the performance of telecommunications networks, where messages must often compete for limited resources (such as bandwidth).
 Markov chains are employed in algorithmic music composition, particularly in software programs such as CSound, Max or SuperCollider.
 Markov chains are used in finance and economics to model a variety of different phenomena, including asset prices and market crashes.

 Markov chains can be used to project population in smaller geopolitical areas.
 They are used for forecasting election results from current conditions.
 The ranking of webpages generated by Google is defined via a random-surfer algorithm (a Markov process).
 Markov models have also been used to analyze the web navigation behavior of users. A user's web link transitions on a particular website can be modeled using Markov models and can be used to make predictions regarding future navigation and to personalize the web page for an individual user.
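The random-surfer idea can be sketched as power iteration over a tiny made-up web of three pages: with probability d the surfer follows a random outgoing link, otherwise she jumps to a uniformly random page. This is an illustrative sketch of the general technique, not Google's actual implementation.

```python
def pagerank(links, damping=0.85, iterations=100):
    """Power iteration for the random-surfer model.
    `links[i]` lists the pages that page i links to."""
    n = len(links)
    rank = [1.0 / n] * n
    for _ in range(iterations):
        new = [(1.0 - damping) / n] * n  # teleportation share
        for i, outs in enumerate(links):
            if outs:
                share = damping * rank[i] / len(outs)
                for j in outs:
                    new[j] += share
            else:  # dangling page: spread its rank evenly
                for j in range(n):
                    new[j] += damping * rank[i] / n
        rank = new
    return rank

# Tiny hypothetical web: page 0 -> 1, page 1 -> 2, page 2 -> 0 and 1.
ranks = pagerank([[1], [2], [0, 1]])
```

The ranks always sum to 1 (they form a probability distribution over pages), and page 1, which receives links from both other pages, ends up ranked highest.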

References:
   - Markov Chains: Models, Algorithms and Applications, by Wai Ki Ching and Michael K. Ng
   - en.wikipedia
   - ocw.mit
   - math.ucf
   - math.colgate
   - math.stackexchange
