IE325 Recitation2

Questions on Discrete Time Markov Chains


IE 325 STOCHASTIC MODELS

RECITATION 2

FALL 2024

1. Cedric practices penalty shootouts every day, kicking the ball only once per day. Suppose that the
chance of scoring a goal tomorrow depends on the result of today's shootout. If Cedric scores today, then
he will score a goal tomorrow with probability a; if he is unable to score today, then he will score a goal
tomorrow with probability b. Initially, Cedric is equally likely to score a goal or not.
(a) Define an appropriate Markov chain and compute the one-step and two-step transition probability
matrices.
(b) What is the probability that Cedric will score a goal the day after tomorrow?
(c) Suppose Cedric scores a goal the day after tomorrow. What is the probability that he does not
score today but scores tomorrow?
(d) Suppose now the chance of scoring a goal tomorrow depends on the results of the shootouts today
and yesterday. Describe how you would define a Markov chain and write its state space. (You
are NOT asked to write the one-step transition probability matrix P.)
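As a numerical check for parts (a) and (b), the two-state chain can be sketched as follows; the values a = 0.7 and b = 0.4 are illustrative only, since the exercise keeps them symbolic.

```python
import numpy as np

# Two-state chain: state 0 = Cedric scores today, state 1 = he misses today.
# a and b are illustrative values; the exercise keeps them symbolic.
a, b = 0.7, 0.4

# One-step transition matrix.
P = np.array([[a, 1 - a],
              [b, 1 - b]])

# Two-step transition matrix.
P2 = P @ P

# Initial distribution: scoring or not scoring is equally likely.
pi0 = np.array([0.5, 0.5])

# Probability of a goal the day after tomorrow: first entry of pi0 @ P^2.
p_score_in_two_days = (pi0 @ P2)[0]
print(P2)                     # [[0.61 0.39] [0.52 0.48]]
print(p_score_in_two_days)    # 0.565
```

Part (c) is then a conditional probability: divide the probability of the path miss → score → score by `p_score_in_two_days`.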
2. Two players, A and B , play the game of matching pennies: at each time n, each player has a penny
and must secretly turn the penny to heads or tails. The players then reveal their choices simultaneously.
If the pennies match (both heads or both tails), Player A wins the penny. If the pennies do not match
(one heads and one tails), Player B wins the penny. Suppose the players have between them a total of 5
pennies. If at any time one player has all of the pennies, then, to keep the game going, he gives one back
to the other player and the game continues. Show that this game can be formulated as a Markov chain
and write its one-step transition probability matrix.
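One way to build the matrix numerically is sketched below. The state is the number of pennies Player A holds after the give-back rule is applied, so the chain lives on {1, 2, 3, 4}. The assumption that A wins each round with probability 1/2 (e.g. both players choose heads or tails uniformly at random) is not stated in the exercise and is made here only for illustration.

```python
import numpy as np

# State = number of pennies Player A holds, in {1, 2, 3, 4}.
# Assumption (illustrative, not stated in the exercise): each round is a
# fair match, so A wins with probability p = 1/2.
p = 0.5
P = np.zeros((4, 4))
for idx in range(4):
    # Winning from state 4 would reach 5; one penny is handed back, so stay at 4.
    P[idx, min(idx + 1, 3)] += p
    # Losing from state 1 would reach 0; one penny is handed back, so stay at 1.
    P[idx, max(idx - 1, 0)] += 1 - p
print(P)
```

The reflecting boundaries at states 1 and 4 encode the give-back rule: the chain never actually sits in state 0 or 5.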

3. Lee has r umbrellas which he uses in going from his home to his office, and vice versa. If he is at home
(the office) at the beginning (end) of a day and it is raining, then he will take an umbrella with him to the
office (home), provided there is one to be taken. If it is not raining, then he does not take an umbrella.
Assume that, independent of the past, it rains at the beginning (end) of a day with probability p. Define
a Markov chain with r + 1 states which will help us determine the proportion of time that Lee gets
wet. (Note: He gets wet if it is raining and all umbrellas are at his other location.)
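A common choice of state is the number of umbrellas at Lee's current location. The sketch below builds that chain and finds its stationary distribution numerically; the values r = 3 and p = 0.4 are illustrative, since the exercise keeps them symbolic.

```python
import numpy as np

# State = number of umbrellas at Lee's current location (0, 1, ..., r).
r, p = 3, 0.4  # illustrative values; the exercise keeps r and p symbolic

P = np.zeros((r + 1, r + 1))
P[0, r] = 1.0                # no umbrella here, so all r are at the other location
for i in range(1, r + 1):
    P[i, r - i + 1] = p      # raining: he carries one umbrella over
    P[i, r - i] = 1 - p      # dry: the umbrellas stay where they are

# Stationary distribution: solve pi P = pi together with sum(pi) = 1.
A = np.vstack([P.T - np.eye(r + 1), np.ones(r + 1)])
b = np.zeros(r + 2)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Lee gets wet when he is in state 0 and it is raining.
print(pi[0] * p)             # p(1 - p) / (r + 1 - p) ≈ 0.0667 here
```

This matches the known closed form for the umbrella chain: the long-run fraction of trips on which Lee gets wet is p(1 − p)/(r + 1 − p).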

4. Markov chains are used to model the completion time of projects that have several stages. Each stage
can take a short or a long time depending on the project team's capabilities. Consider a project with four
stages corresponding to Preparation (P), Development (D), Implementation (I) and Completion (C). To
model the progress of this project, consider a Markov chain on the state space E = {P, D, I, C} with the
following transition matrix between the states:
      P     D     I     C
P   0.9   0.1   0      0
D   0     0.99  0.01   0
I   0     0     0.95   0.05
C   0     0     0      1

(a) What is the expected amount of time it takes to complete the project?
(b) What is the probability that after three units of time, you have not yet finished the Preparation
stage?
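Both parts can be checked numerically. One standard route for (a) is the fundamental matrix N = (I − Q)⁻¹ over the transient states {P, D, I}: its row sums give the expected time to absorption in C. The sketch below follows that route.

```python
import numpy as np

# Transition matrix over states P, D, I, C (C is absorbing).
P = np.array([[0.90, 0.10, 0.00, 0.00],
              [0.00, 0.99, 0.01, 0.00],
              [0.00, 0.00, 0.95, 0.05],
              [0.00, 0.00, 0.00, 1.00]])

# (a) Fundamental matrix over the transient states {P, D, I}.
# Row sums of N give the expected time until absorption in C.
Q = P[:3, :3]
N = np.linalg.inv(np.eye(3) - Q)
expected_completion = N.sum(axis=1)[0]
print(expected_completion)                   # 1/0.1 + 1/0.01 + 1/0.05 = 130

# (b) Probability of still being in Preparation after three units of time.
print(np.linalg.matrix_power(P, 3)[0, 0])    # 0.9**3 = 0.729
```

Since the stages are visited in order and each stage's sojourn time is geometric, (a) also follows directly as 1/0.1 + 1/0.01 + 1/0.05 = 130.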
5. A miner is trapped in a mine containing three doors. The first door leads to a tunnel that takes him
to safety after two hours of travel. The second door leads to a tunnel that returns him to the mine after
three hours of travel. The third door leads to a tunnel that returns him to the mine after five hours.
Assuming that the miner is at all times equally likely to choose any one of the doors,
(a) what is the expected number of trips he makes through tunnels?
(b) what is the expected length of time until the miner reaches safety?
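A quick check of both answers: the number of trips until the safe door is chosen is geometric with success probability 1/3, and the expected time follows from first-step analysis. Exact fractions keep the arithmetic transparent.

```python
from fractions import Fraction

third = Fraction(1, 3)

# (a) Trips until safety: geometric with success probability 1/3.
expected_trips = 1 / third
print(expected_trips)    # 3

# (b) First-step analysis:
# E[T] = (1/3)*2 + (1/3)*(3 + E[T]) + (1/3)*(5 + E[T])
# => E[T]*(1 - 2/3) = (2 + 3 + 5)/3  =>  E[T] = 10.
ET = (third * 2 + third * 3 + third * 5) / (1 - 2 * third)
print(ET)                # 10
```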

6. In a good weather year the number of storms is Poisson distributed with mean 1; in a bad year it is
Poisson distributed with mean 3. Suppose that the weather in any year depends on past years only
through the previous year's weather. Suppose that a good year is equally likely to be followed by either
a good or a bad year, and that a bad year is twice as likely to be followed by a bad year as by a good
year. Suppose that last year, call it year 0, was a good year.
(a) Find the expected total number of storms in the next two years (that is, in years 1 and 2).
(b) Find the probability that there are no storms in year 3.
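Both parts condition on the weather state, which evolves as a two-state chain with rows (1/2, 1/2) from a good year and (1/3, 2/3) from a bad year. A numerical sketch:

```python
import numpy as np

# States: 0 = good year, 1 = bad year.
# Good -> good/bad equally likely; bad -> bad is twice as likely as bad -> good.
P = np.array([[1/2, 1/2],
              [1/3, 2/3]])
mean_storms = np.array([1.0, 3.0])    # Poisson means in good and bad years

pi0 = np.array([1.0, 0.0])            # year 0 was a good year

# (a) Expected storms in years 1 and 2: condition on each year's weather.
pi1 = pi0 @ P
pi2 = pi1 @ P
total = pi1 @ mean_storms + pi2 @ mean_storms
print(total)                          # 25/6 ≈ 4.1667

# (b) P(no storms in year 3) = sum over states of P(state) * e^{-mean}.
pi3 = pi2 @ P
print(pi3 @ np.exp(-mean_storms))     # (29/72)e^{-1} + (43/72)e^{-3} ≈ 0.1779
```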
