Tutorial 1
1. Three white and three black balls are distributed in two urns in such a way that each
contains three balls. We say that the system is in state i, i = 0, 1, 2, 3, if the first urn
contains i white balls. At each step, we draw one ball from each urn and place the ball
drawn from the first urn into the second, and conversely with the ball from the second urn.
Let Xn denote the state of the system after the nth step. Explain why {Xn |n = 0, 1, 2, . . .}
is a Markov chain and calculate its transition probability matrix.
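One way to cross-check the matrix you derive: condition on the colours of the two balls drawn. If urn 1 holds i white balls, the ball drawn from urn 1 is white with probability i/3 and the ball drawn from urn 2 is white with probability (3 − i)/3, independently. A minimal sketch with exact fractions (the function name is illustrative):

```python
from fractions import Fraction

def urn_transition_matrix(m=3):
    # State i = number of white balls in urn 1; each urn holds m balls.
    P = [[Fraction(0)] * (m + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        w1 = Fraction(i, m)      # P(ball drawn from urn 1 is white)
        w2 = Fraction(m - i, m)  # P(ball drawn from urn 2 is white)
        moves = {-1: w1 * (1 - w2),                 # white out, black in
                 0: w1 * w2 + (1 - w1) * (1 - w2),  # same colours swapped
                 1: (1 - w1) * w2}                  # black out, white in
        for d, p in moves.items():
            if p:
                P[i][i + d] = p
    return P

P = urn_transition_matrix()
for row in P:
    print([str(x) for x in row])
```

Every row should sum to 1, which your hand computation must reproduce.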
3. Suppose that a machine can be in one of two states on any given day: 0 = working and 1 = out of order. The probability that the machine is working on a particular day depends on its state during the two previous days. Specifically, assume that P (Xn+1 = 0 | Xn−1 = j, Xn = k) = qjk , j, k = 0, 1, where Xn is the state of the machine on day n. Is {Xn | n ≥ 0} a Markov chain? If not, show how a Markov chain can be constructed by suitably enlarging the state space.
4. Suppose that (Xn)n≥0 is Markov(λ, P). If Yn = X_{kn} for some fixed k, show that (Yn)n≥0 is Markov(λ, P^k).
6. A Markov chain {Xn | n ≥ 0} with states 0, 1, 2 has the transition probability matrix
P = \begin{pmatrix} 1/2 & 1/3 & 1/6 \\ 0 & 1/3 & 2/3 \\ 1/2 & 0 & 1/2 \end{pmatrix}.
If P (X0 = 0) = P (X0 = 1) = 1/4, find E[X3].
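A machine check of E[X3], reading the matrix rows as (1/2, 1/3, 1/6), (0, 1/3, 2/3), (1/2, 0, 1/2) and using P(X0 = 2) = 1/2 (forced by complementarity):

```python
from fractions import Fraction as F

P = [[F(1, 2), F(1, 3), F(1, 6)],
     [F(0),    F(1, 3), F(2, 3)],
     [F(1, 2), F(0),    F(1, 2)]]
mu = [F(1, 4), F(1, 4), F(1, 2)]   # distribution of X0

for _ in range(3):                 # mu <- mu P, applied three times
    mu = [sum(mu[i] * P[i][j] for i in range(3)) for j in range(3)]

E_X3 = sum(j * mu[j] for j in range(3))
print(E_X3)                        # exact value of E[X3]
```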
7. Consider the Markov chain with three states, S = {1, 2, 3}, that has the following transition
matrix
P = \begin{pmatrix} 1/2 & 1/4 & 1/4 \\ 1/3 & 0 & 2/3 \\ 1/2 & 1/2 & 0 \end{pmatrix}.
If we know P (X1 = 1) = P (X1 = 2) = 1/4, find P (X1 = 3, X2 = 2, X3 = 1).
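Since the event is a single trajectory, the chain rule plus the Markov property gives P(X1 = 3) p(3, 2) p(2, 1), with P(X1 = 3) = 1/2 by complementarity. A sketch with exact fractions:

```python
from fractions import Fraction as F

# Transition probabilities p(i, j) on S = {1, 2, 3}
P = {1: {1: F(1, 2), 2: F(1, 4), 3: F(1, 4)},
     2: {1: F(1, 3), 2: F(0),    3: F(2, 3)},
     3: {1: F(1, 2), 2: F(1, 2), 3: F(0)}}

p_X1_3 = 1 - F(1, 4) - F(1, 4)     # P(X1 = 3)
prob = p_X1_3 * P[3][2] * P[2][1]  # chain rule + Markov property
print(prob)
```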
8. In the Dark Ages, Harvard, Dartmouth, and Yale admitted only male students. Assume
that, at that time, 80 percent of the sons of Harvard men went to Harvard and the rest
went to Yale, 40 percent of the sons of Yale men went to Yale, and the rest split evenly
between Harvard and Dartmouth; and of the sons of Dartmouth men, 70 percent went to
Dartmouth, 20 percent to Harvard, and 10 percent to Yale. (i) Find the probability that
the grandson of a man from Harvard went to Harvard. (ii) Modify the above by assuming
that the son of a Harvard man always went to Harvard. Again, find the probability that
the grandson of a man from Harvard went to Harvard.
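Both parts reduce to the (Harvard, Harvard) entry of the square of the transition matrix; a numerical sketch (ordering the states H, D, Y is our choice):

```python
import numpy as np

# Rows give the distribution of a son's school; states ordered H, D, Y.
P = np.array([[0.8, 0.0, 0.2],    # Harvard fathers
              [0.2, 0.7, 0.1],    # Dartmouth fathers
              [0.3, 0.3, 0.4]])   # Yale fathers

print((P @ P)[0, 0])              # (i) grandson at Harvard

P_mod = P.copy()
P_mod[0] = [1.0, 0.0, 0.0]        # (ii) Harvard sons always go to Harvard
print((P_mod @ P_mod)[0, 0])
```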
9. Two gamblers, A and B, bet on successive independent tosses of a coin that lands
heads up with probability p. If the coin turns up heads, gambler A wins a rupee from
gambler B, and if the coin turns up tails, gambler B wins a rupee from gambler A. Thus
the total number of rupees among the two gamblers stays fixed, say N . The game stops
as soon as either gambler is ruined; i.e., is left with no money. Assume the initial fortune
of gambler A is i. Let Xn be the amount of money gambler A has after the nth toss. If
Xn = 0, then gambler A is ruined and the game stops. If Xn = N , then gambler B is
ruined and the game stops. Otherwise the game continues. Prove that {Xn | n = 0, 1, . . .}
is a discrete time Markov chain. Write the one-step transition probability matrix.
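The matrix you are asked to write can be generated for any N and p: states 0 and N are absorbing, and every interior state moves one rupee up or down. A sketch (the function name is illustrative):

```python
import numpy as np

def gamblers_ruin_matrix(N, p):
    # States 0..N = fortune of gambler A; 0 and N are absorbing.
    P = np.zeros((N + 1, N + 1))
    P[0, 0] = P[N, N] = 1.0
    for i in range(1, N):
        P[i, i + 1] = p        # heads: A wins a rupee
        P[i, i - 1] = 1 - p    # tails: A loses a rupee
    return P

print(gamblers_ruin_matrix(4, 0.5))
```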
10. Consider a two-state Markov chain defined by the following transition probability matrix
of the form
P = \begin{pmatrix} 1 - \alpha & \alpha \\ \beta & 1 - \beta \end{pmatrix},
where α ∈ [0, 1] and β ∈ [0, 1]. Derive the n-step transition probability matrix.
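Whatever route you take (induction, a recurrence, or diagonalisation), the answer can be tested numerically. Assuming α + β > 0, the eigenvalues are 1 and 1 − α − β, and the spectral form below should agree with a direct matrix power:

```python
import numpy as np

alpha, beta = 0.3, 0.5             # arbitrary test values
P = np.array([[1 - alpha, alpha],
              [beta, 1 - beta]])

n, s = 7, alpha + beta
A = np.array([[beta, alpha], [beta, alpha]]) / s     # projection for eigenvalue 1
B = np.array([[alpha, -alpha], [-beta, beta]]) / s   # projection for 1 - s
Pn_closed = A + (1 - s) ** n * B

print(np.allclose(np.linalg.matrix_power(P, n), Pn_closed))  # → True
```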
11. Consider an experiment of mating rabbits. We watch the evolution of a particular gene
that appears in two types, G or g. A rabbit has a pair of genes, either GG (dominant),
Gg (hybrid–the order is irrelevant, so gG is the same as Gg) or gg (recessive). In mating
two rabbits, the offspring inherits a gene from each of its parents with equal probability.
Thus, if we mate a dominant (GG) with a hybrid (Gg), the offspring is dominant with
probability 1/2 or hybrid with probability 1/2. Start with a rabbit of given character
(GG, Gg, or gg) and mate it with a hybrid. The offspring produced is again mated with a
hybrid, and the process is repeated through a number of generations, always mating with
a hybrid.
(i) Write down the transition probabilities of the Markov chain thus defined.
(ii) Assume that we start with a hybrid rabbit. Let µn be the probability distribution
of the character of the rabbit of the n-th generation. In other words, µn (GG), µn (Gg),
µn (gg) are the probabilities that the n-th generation rabbit is GG, Gg, or gg, respectively.
Compute µ1 , µ2 , µ3 . Can you do the same for µn for general n?
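With states ordered GG, Gg, gg, the mating rules in (i) give the matrix coded below (our reading of the scheme), and µ1, µ2, µ3 in (ii) follow by iterating µ_{n+1} = µ_n P:

```python
from fractions import Fraction as F

# Row i = distribution of the offspring when a type-i rabbit mates a hybrid.
P = [[F(1, 2), F(1, 2), F(0)],     # GG x Gg
     [F(1, 4), F(1, 2), F(1, 4)],  # Gg x Gg
     [F(0),    F(1, 2), F(1, 2)]]  # gg x Gg

mu = [F(0), F(1), F(0)]            # generation 0: a hybrid
for n in range(1, 4):
    mu = [sum(mu[i] * P[i][j] for i in range(3)) for j in range(3)]
    print(n, [str(x) for x in mu])
```

The printout suggests what to conjecture for general n: the distribution does not change after the first generation.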
12. Assume that a man’s profession can be classified as professional, skilled labourer, or un-
skilled labourer. Assume that, of the sons of professional men, 80 percent are professional,
10 percent are skilled labourers, and 10 percent are unskilled labourers. In the case of
sons of skilled labourers, 60 percent are skilled labourers, 20 percent are professional, and
20 percent are unskilled. Finally, in the case of unskilled labourers, 50 percent of the sons
are unskilled labourers, and 25 percent each are in the other two categories. Assume that
every man has at least one son, and form a Markov chain by following the profession of
a randomly chosen son of a given family through several generations. Set up the matrix
of transition probabilities. Find the probability that a randomly chosen grandson of an
unskilled labourer is a professional man.
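The grandson probability is again an entry of P²; a numerical sketch (states ordered professional, skilled, unskilled):

```python
import numpy as np

P = np.array([[0.80, 0.10, 0.10],   # sons of professionals
              [0.20, 0.60, 0.20],   # sons of skilled labourers
              [0.25, 0.25, 0.50]])  # sons of unskilled labourers

print((P @ P)[2, 0])   # P(grandson of an unskilled labourer is professional)
```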
13. Suppose that ξ0 , ξ1 , ξ2 , . . . are independent random variables with common probability mass function q(k) = P (ξ0 = k), k ∈ Z. Let S = {1, 2, . . . , N }. Let X0 be another random variable, independent of the sequence (ξn ), taking values in S, and let f : S × Z → S be a given function. Define new random variables X1 , X2 , . . . by
Xn+1 = f (Xn , ξn ), n = 0, 1, 2, . . .
(i) Show that {Xn | n ≥ 0} forms a Markov chain. (ii) Find its transition probabilities.
14. Specify the classes of the following Markov chains, and determine whether they are tran-
sient or recurrent:
P_1 = \begin{pmatrix} 0 & 1/2 & 1/2 \\ 1/2 & 0 & 1/2 \\ 1/2 & 1/2 & 0 \end{pmatrix}, \quad
P_2 = \begin{pmatrix} 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 \\ 1/2 & 1/2 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{pmatrix},

P_3 = \begin{pmatrix} 1/2 & 0 & 1/2 & 0 & 0 \\ 1/4 & 1/2 & 1/4 & 0 & 0 \\ 1/2 & 0 & 1/2 & 0 & 0 \\ 0 & 0 & 0 & 1/2 & 1/2 \\ 0 & 0 & 0 & 1/2 & 1/2 \end{pmatrix}, \quad
P_4 = \begin{pmatrix} 1/4 & 3/4 & 0 & 0 & 0 \\ 1/2 & 1/2 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 1/3 & 2/3 & 0 \\ 1 & 0 & 0 & 0 & 0 \end{pmatrix}.
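The classes can be found mechanically: take the reachability closure of the transition graph and group states that reach each other; in a finite chain a class is recurrent exactly when it is closed. A pure-Python sketch, applied to P2 as we read it:

```python
def classes(P):
    n = len(P)
    # reach[i][j]: can state j be reached from state i in >= 0 steps?
    reach = [[i == j or P[i][j] > 0 for j in range(n)] for i in range(n)]
    for k in range(n):                       # transitive closure
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    seen, result = set(), []
    for i in range(n):
        if i not in seen:
            cls = [j for j in range(n) if reach[i][j] and reach[j][i]]
            seen.update(cls)
            result.append(cls)
    return result

P2 = [[0, 0, 0, 1],
      [0, 0, 0, 1],
      [0.5, 0.5, 0, 0],
      [0, 0, 1, 0]]
print(classes(P2))   # → [[0, 1, 2, 3]]: a single (hence recurrent) class
```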
15. Let (Xn )n≥0 be a Markov chain with state space {1, 2, 3} whose transition probability
matrix is given by
P = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 2/3 & 1/3 \\ p & 1 - p & 0 \end{pmatrix}.
Calculate P (Xn = 1 | X0 = 1) in each of the following cases: (a) p = 1/16, (b) p = 1/6, (c) p = 1/12.
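Before attempting the eigenvalue computation, the sequence P(Xn = 1 | X0 = 1) can be inspected numerically; the sketch below assumes the three cases are p = 1/16, 1/6, 1/12:

```python
import numpy as np

def p11(n, p):
    # n-step return probability to state 1 (index 0)
    P = np.array([[0, 1, 0],
                  [0, 2/3, 1/3],
                  [p, 1 - p, 0]])
    return np.linalg.matrix_power(P, n)[0, 0]

for p in (1/16, 1/6, 1/12):
    print([round(p11(n, p), 4) for n in range(1, 6)])
```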