
Tutorial Sheet 1

MTL 725 (Stochastic Processes & its Applications)

1. Three white and three black balls are distributed in two urns in such a way that each
contains three balls. We say that the system is in state i, i = 0, 1, 2, 3, if the first urn
contains i white balls. At each step, we draw one ball from each urn and place the ball
drawn from the first urn into the second, and conversely with the ball from the second urn.
Let Xn denote the state of the system after the nth step. Explain why {Xn |n = 0, 1, 2, . . .}
is a Markov chain and calculate its transition probability matrix.
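As a sanity check on the matrix you derive, here is a minimal Python sketch (not part of the problem) that tabulates the one-step probabilities implied by the two-urn drawing rule and verifies that each row sums to 1.

```python
import numpy as np

# State i = number of white balls in the first urn; each urn always holds
# three balls, so urn 1 has i white and 3 - i black, urn 2 the complement.
P = np.zeros((4, 4))
for i in range(4):
    if i > 0:
        # i decreases iff we draw white from urn 1 and black from urn 2.
        P[i, i - 1] = (i / 3) * (i / 3)
    if i < 3:
        # i increases iff we draw black from urn 1 and white from urn 2.
        P[i, i + 1] = ((3 - i) / 3) * ((3 - i) / 3)
    P[i, i] = 1 - P[i].sum()   # like-for-like swaps leave the state unchanged

print(P)
assert np.allclose(P.sum(axis=1), 1.0)
```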

2. Let X0 be an integer-valued random variable with P (X0 = 0) = 1 that is independent of the i.i.d.
sequence Z1 , Z2 , . . ., where P (Zn = 1) = p, P (Zn = −1) = q, and P (Zn = 0) = 1 − p − q.
Let Xn = max(0, Xn−1 + Zn ), n = 1, 2, . . .. Prove that {Xn | n = 0, 1, 2, . . .} is a discrete
time Markov chain. Write the one-step transition probability matrix.
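A simulation can make the matrix concrete: the empirical one-step frequencies of the recursion should match the entries you write down. A minimal sketch (the values of p and q below are illustrative, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
p, q = 0.3, 0.5                     # illustrative values with p + q <= 1
steps = 200_000

Z = rng.choice([1, -1, 0], size=steps, p=[p, q, 1 - p - q])

X = np.zeros(steps + 1, dtype=int)  # X_0 = 0 with probability 1
for n in range(steps):
    X[n + 1] = max(0, X[n] + Z[n])  # the reflected-walk recursion

# From state 0 the chain stays at 0 unless Z_n = 1, so P(0 -> 0) = 1 - p.
at0 = X[:-1] == 0
print("P(0 -> 0) estimate:", (X[1:][at0] == 0).mean(), " exact:", 1 - p)
```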

3. Suppose that a machine can be in two states on any given day: 0 = working and 1 = out
of order. The probability that the machine is working on a particular day depends on its
state during the two previous days. Specifically, assume that P (Xn+1 = 0 | Xn−1 =
j, Xn = k) = qjk , j, k = 0, 1, where Xn is the state of the machine on day n.

a. Show that {Xn | n = 1, 2, . . .} is not a discrete time Markov chain.


b. Define a new state space for the problem by taking the pairs (j, k) where j and k are
0 or 1. We say that the machine is in state (j, k) on day n if the machine is in state j
on day (n − 1) and in state k on day n. Show that with this changed state space the
system is a discrete time Markov chain.
c. Suppose the machine was working on Monday and Tuesday. What is the probability
that it will be working on Thursday?
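For part (c), the pair chain from part (b) reduces the question to a two-step transition probability starting from state (0, 0) on Tuesday. A minimal sketch, with illustrative numeric values for the qjk (the problem leaves them symbolic):

```python
import numpy as np

# Illustrative values only; the problem keeps the q_jk symbolic.
q = {(0, 0): 0.9, (0, 1): 0.6, (1, 0): 0.7, (1, 1): 0.4}

states = [(0, 0), (0, 1), (1, 0), (1, 1)]   # (state yesterday, state today)
idx = {s: i for i, s in enumerate(states)}

P = np.zeros((4, 4))
for (j, k) in states:
    # From pair (j, k) the machine works tomorrow w.p. q_jk,
    # so the pair moves to (k, 0) w.p. q_jk and to (k, 1) otherwise.
    P[idx[(j, k)], idx[(k, 0)]] = q[(j, k)]
    P[idx[(j, k)], idx[(k, 1)]] = 1 - q[(j, k)]

P2 = P @ P                                  # Tuesday -> Thursday is two steps
start = idx[(0, 0)]                         # working on Monday and Tuesday
print(P2[start, idx[(0, 0)]] + P2[start, idx[(1, 0)]])
# Symbolically this is q00*q00 + (1 - q00)*q01.
```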

4. Suppose that (Xn )n≥0 is Markov(λ, P ). If Yn = Xkn for some fixed k, show that (Yn )n≥0
is Markov(λ, P^k ).

5. Suppose that Z0 , Z1 , . . . are i.i.d. random variables such that Zi = 1 with probability p
and Zi = 0 with probability 1 − p. Set S0 = 0 and Sn = Z1 + · · · + Zn . In each of the following
cases determine whether (Xn )n≥0 is a Markov chain: (i) Xn = Zn , (ii) Xn = Sn , (iii)
Xn = S0 + S1 + · · · + Sn . In the cases where (Xn )n≥0 is a Markov chain, find its state space
and transition probabilities.
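For case (iii), a simulation exposes the failure of the Markov property: conditioning on the past in addition to the present changes the law of the next step. A sketch (p = 0.5 is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
p, trials = 0.5, 200_000                       # illustrative p

Z = (rng.random((trials, 4)) < p).astype(int)  # Z_1..Z_4, many independent runs
S = np.cumsum(Z, axis=1)                       # S_1..S_4  (S_0 = 0)
X = np.cumsum(np.hstack([np.zeros((trials, 1)), S]), axis=1)  # X_0..X_4

# Condition on X_3 = 3 and compare the two compatible pasts X_2 = 1 and X_2 = 2.
for b in (1, 2):
    sel = (X[:, 3] == 3) & (X[:, 2] == b)
    print(f"E[X_4 | X_3 = 3, X_2 = {b}] is about {X[sel, 4].mean():.2f}")
# The two estimates differ (about 5.5 vs 4.5), so X_n = S_0 + ... + S_n
# is not a Markov chain.
```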

6. A Markov chain {Xn | n ≥ 0} with states 0, 1, 2 has the transition probability matrix
1 1 1
2 3 6
0 1 2
3 3 .
1 1
2 0 2

If P {X = 0} = P {X = 1} = 41 , find E[X3 ].
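A quick numerical check, using the matrix above with the initial distribution (1/4, 1/4, 1/2):

```python
import numpy as np

P = np.array([[1/2, 1/3, 1/6],
              [0,   1/3, 2/3],
              [1/2, 0,   1/2]])
mu0 = np.array([1/4, 1/4, 1/2])            # P(X_0 = 2) = 1 - 1/4 - 1/4

mu3 = mu0 @ np.linalg.matrix_power(P, 3)   # distribution of X_3
print(mu3 @ np.array([0, 1, 2]))           # E[X_3]
```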

7. Consider the Markov chain with three states, S = {1, 2, 3}, that has the following transition
matrix
P = \begin{pmatrix}
  1/2 & 1/4 & 1/4 \\
  1/3 & 0   & 2/3 \\
  1/2 & 1/2 & 0
\end{pmatrix}.

If we know P (X1 = 1) = P (X1 = 2) = 1/4, find P (X1 = 3, X2 = 2, X3 = 1).
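By the Markov property the path probability factors as P (X1 = 3) p32 p21 ; a quick check in code:

```python
import numpy as np

P = np.array([[1/2, 1/4, 1/4],
              [1/3, 0,   2/3],
              [1/2, 1/2, 0  ]])      # rows and columns indexed by states 1, 2, 3

p_X1 = np.array([1/4, 1/4, 1/2])     # P(X_1 = 3) = 1 - 1/4 - 1/4

print(p_X1[2] * P[2, 1] * P[1, 0])   # (1/2)(1/2)(1/3) = 1/12
```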

8. In the Dark Ages, Harvard, Dartmouth, and Yale admitted only male students. Assume
that, at that time, 80 percent of the sons of Harvard men went to Harvard and the rest
went to Yale, 40 percent of the sons of Yale men went to Yale, and the rest split evenly
between Harvard and Dartmouth; and of the sons of Dartmouth men, 70 percent went to
Dartmouth, 20 percent to Harvard, and 10 percent to Yale. (i) Find the probability that
the grandson of a man from Harvard went to Harvard. (ii) Modify the above by assuming
that the son of a Harvard man always went to Harvard. Again, find the probability that
the grandson of a man from Harvard went to Harvard.
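Both parts reduce to an entry of P squared once the matrix is written down; a minimal sketch (states ordered Harvard, Yale, Dartmouth):

```python
import numpy as np

# Rows: Harvard, Yale, Dartmouth men; columns: where their sons go.
P = np.array([[0.8, 0.2, 0.0],
              [0.3, 0.4, 0.3],
              [0.2, 0.1, 0.7]])

print((P @ P)[0, 0])        # (i) P(grandson of a Harvard man goes to Harvard)

P[0] = [1.0, 0.0, 0.0]      # (ii) sons of Harvard men always go to Harvard
print((P @ P)[0, 0])
```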

9. Two gamblers, A and B, bet on successive independent tosses of a coin that lands heads
up with probability p. If the coin turns up heads, gambler A wins a rupee from
gambler B, and if the coin turns up tails, gambler B wins a rupee from gambler A. Thus
the total number of rupees among the two gamblers stays fixed, say N . The game stops
as soon as either gambler is ruined; i.e., is left with no money. Assume the initial fortune
of gambler A is i. Let Xn be the amount of money gambler A has after the nth toss. If
Xn = 0, then gambler A is ruined and the game stops. If Xn = N , then gambler B is
ruined and the game stops. Otherwise the game continues. Prove that {Xn | n = 0, 1, . . .}
is a discrete time Markov chain. Write the one-step transition probability matrix.
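A sketch that builds the one-step matrix on states 0, 1, . . . , N , with 0 and N absorbing:

```python
import numpy as np

def gamblers_ruin(N, p):
    """One-step transition matrix for gambler A's fortune on {0, ..., N}."""
    P = np.zeros((N + 1, N + 1))
    P[0, 0] = P[N, N] = 1.0       # the game stops once a gambler is ruined
    for i in range(1, N):
        P[i, i + 1] = p           # heads: A wins a rupee
        P[i, i - 1] = 1 - p       # tails: A loses a rupee
    return P

print(gamblers_ruin(4, 0.5))      # N = 4 shown as a small example
```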

10. Consider a two-state Markov chain defined by the transition probability matrix

P = \begin{pmatrix}
  1 − α & α \\
  β & 1 − β
\end{pmatrix},

where α ∈ [0, 1] and β ∈ [0, 1]. Derive the n-step transition probability matrix.
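For reference, the n-step matrix has a well-known spectral form when α + β > 0, which your derivation should reproduce; the sketch below checks it against a direct matrix power for illustrative parameter values.

```python
import numpy as np

alpha, beta, n = 0.3, 0.5, 7                  # illustrative, alpha + beta > 0
P = np.array([[1 - alpha, alpha],
              [beta, 1 - beta]])

# Spectral decomposition: the eigenvalues of P are 1 and 1 - alpha - beta.
limit = np.array([[beta, alpha], [beta, alpha]]) / (alpha + beta)
fluct = np.array([[alpha, -alpha], [-beta, beta]]) / (alpha + beta)
closed_form = limit + (1 - alpha - beta) ** n * fluct

assert np.allclose(np.linalg.matrix_power(P, n), closed_form)
print(closed_form)
```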

11. Consider an experiment of mating rabbits. We watch the evolution of a particular gene
that appears in two types, G or g. A rabbit has a pair of genes, either GG (dominant),
Gg (hybrid; the order is irrelevant, so gG is the same as Gg), or gg (recessive). In mating
two rabbits, the offspring inherits a gene from each of its parents with equal probability.
Thus, if we mate a dominant (GG) with a hybrid (Gg), the offspring is dominant with
probability 1/2 or hybrid with probability 1/2. Start with a rabbit of a given character
(GG, Gg, or gg) and mate it with a hybrid. The offspring produced is again mated with a
hybrid, and the process is repeated through a number of generations, always mating with
a hybrid.
(i) Write down the transition probabilities of the Markov chain thus defined.
(ii) Assume that we start with a hybrid rabbit. Let µn be the probability distribution
of the character of the rabbit of the n-th generation. In other words, µn (GG), µn (Gg),
µn (gg) are the probabilities that the n-th generation rabbit is GG, Gg, or gg, respectively.
Compute µ1 , µ2 , µ3 . Can you do the same for µn for general n?
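Part (ii) amounts to iterating µn+1 = µn P from µ0 = (0, 1, 0). A sketch using the matrix from part (i) (rows ordered GG, Gg, gg):

```python
import numpy as np

# Row s gives the offspring distribution when a type-s rabbit mates a hybrid.
P = np.array([[1/2, 1/2, 0  ],    # GG x Gg
              [1/4, 1/2, 1/4],    # Gg x Gg
              [0,   1/2, 1/2]])   # gg x Gg

mu = np.array([0.0, 1.0, 0.0])    # generation 0 is a hybrid
for n in range(1, 4):
    mu = mu @ P
    print(f"mu_{n} =", mu)
# For general n, diagonalise P (its eigenvalues are 1, 1/2, 0); here the
# hybrid start actually gives mu_n = (1/4, 1/2, 1/4) for every n >= 1.
```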

12. Assume that a man’s profession can be classified as professional, skilled labourer, or un-
skilled labourer. Assume that, of the sons of professional men, 80 percent are professional,
10 percent are skilled labourers, and 10 percent are unskilled labourers. In the case of
sons of skilled labourers, 60 percent are skilled labourers, 20 percent are professional, and
20 percent are unskilled. Finally, in the case of unskilled labourers, 50 percent of the sons
are unskilled labourers, and 25 percent each are in the other two categories. Assume that
every man has at least one son, and form a Markov chain by following the profession of
a randomly chosen son of a given family through several generations. Set up the matrix
of transition probabilities. Find the probability that a randomly chosen grandson of an
unskilled labourer is a professional man.
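Once the matrix is set up, the grandson question is a single entry of the two-step matrix; a quick check:

```python
import numpy as np

# Rows and columns ordered: professional, skilled, unskilled.
P = np.array([[0.80, 0.10, 0.10],
              [0.20, 0.60, 0.20],
              [0.25, 0.25, 0.50]])

# Grandson of an unskilled labourer is a professional: entry of P^2.
print((P @ P)[2, 0])
```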

13. Suppose that ξ0 , ξ1 , ξ2 , . . . are independent random variables with common probability
function f (k) = P (ξ0 = k), where k ranges over the integers. Let S = {1, 2, . . . , N }. Let
X0 be another random variable, independent of the sequence (ξn ), taking values in S, and let
g : S × Z → S be a given function. Define new random variables X1 , X2 , . . . by

Xn+1 = g(Xn , ξn ), n = 0, 1, 2, . . .

(i) Show that {Xn | n ≥ 0} forms a Markov chain. (ii) Find its transition probabilities.
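This random-mapping construction is exactly how Markov chains are simulated in practice. A sketch with an arbitrary illustrative rule g (any choice of g and noise law works); the induced transition probabilities p(x, y) = P (g(x, ξ0 ) = y) are estimated empirically:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 3

def g(x, xi):
    """An arbitrary illustrative update rule g : S x Z -> S."""
    return (x + xi - 1) % N + 1

# Estimate p(x, y) = P(g(x, xi_0) = y) with noise xi uniform on {0, 1}.
xi = rng.integers(0, 2, size=100_000)
for x in range(1, N + 1):
    ys = g(x, xi)                             # g is vectorised over the sample
    print(x, [(ys == y).mean() for y in range(1, N + 1)])
```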

14. Specify the classes of the following Markov chains, and determine whether they are tran-
sient or recurrent:

P1 = \begin{pmatrix}
  0 & 1/2 & 1/2 \\
  1/2 & 0 & 1/2 \\
  1/2 & 1/2 & 0
\end{pmatrix},
P2 = \begin{pmatrix}
  0 & 0 & 0 & 1 \\
  0 & 0 & 0 & 1 \\
  1/2 & 1/2 & 0 & 0 \\
  0 & 0 & 1 & 0
\end{pmatrix},

P3 = \begin{pmatrix}
  1/2 & 0 & 1/2 & 0 & 0 \\
  1/4 & 1/2 & 1/4 & 0 & 0 \\
  1/2 & 0 & 1/2 & 0 & 0 \\
  0 & 0 & 0 & 1/2 & 1/2 \\
  0 & 0 & 0 & 1/2 & 1/2
\end{pmatrix},
P4 = \begin{pmatrix}
  1/4 & 3/4 & 0 & 0 & 0 \\
  1/2 & 1/2 & 0 & 0 & 0 \\
  0 & 0 & 1 & 0 & 0 \\
  0 & 0 & 1/3 & 2/3 & 0 \\
  1 & 0 & 0 & 0 & 0
\end{pmatrix}.
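The class structure can also be checked mechanically: compute mutual reachability, group states into communicating classes, and use the fact that a class of a finite chain is recurrent exactly when it is closed. A sketch (states numbered from 0):

```python
import numpy as np

def classes(P):
    """Communicating classes of a finite chain, flagged recurrent/transient."""
    n = len(P)
    reach = (np.eye(n) + P) > 0                 # reachable in 0 or 1 steps
    for _ in range(n):                          # transitive closure by squaring
        reach = (reach.astype(int) @ reach.astype(int)) > 0
    comm = reach & reach.T                      # i <-> j
    seen, out = set(), []
    for i in range(n):
        if i in seen:
            continue
        cls = [j for j in range(n) if comm[i, j]]
        seen.update(cls)
        # A class of a finite chain is recurrent iff no mass leaves it.
        closed = all(P[a, b] == 0 for a in cls for b in range(n) if b not in cls)
        out.append((cls, "recurrent" if closed else "transient"))
    return out

P1 = np.array([[0, 1/2, 1/2], [1/2, 0, 1/2], [1/2, 1/2, 0]])
print(classes(P1))      # apply the same to P2, P3, P4
```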

15. Let (Xn )n≥0 be a Markov chain with state space {1, 2, 3} whose transition probability
matrix is given by  
P = \begin{pmatrix}
  0 & 1 & 0 \\
  0 & 2/3 & 1/3 \\
  p & 1 − p & 0
\end{pmatrix}.

Calculate P (Xn = 1 | X0 = 1) in each of the following cases: (a) p = 1/16, (b) p = 1/6, (c) p = 1/12.
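The three cases are chosen so that the eigenvalues of P come out simple (distinct real, complex, and repeated, respectively). A numerical cross-check for whatever closed form you derive:

```python
import numpy as np

def P_matrix(p):
    return np.array([[0, 1,     0  ],
                     [0, 2/3,   1/3],
                     [p, 1 - p, 0  ]])

for p in (1/16, 1/6, 1/12):
    P = P_matrix(p)
    probs = [np.linalg.matrix_power(P, n)[0, 0] for n in range(1, 7)]
    print(f"p = {p:.4f}: eigenvalues {np.round(np.linalg.eigvals(P), 4)}")
    print("  P(X_n = 1 | X_0 = 1), n = 1..6:", np.round(probs, 4))
```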
