MT Sol 04
Stopping
4.1 Let (Ω, F, (F_n)_{n≥0}, P) be a filtered probability space and S and T two stopping times.
Prove (without consulting the lecture notes) that S ∧ T := min{S, T}, S ∨ T := max{S, T},
and S + T are also stopping times.
SOLUTION: For every n ≥ 0,
{S ∧ T ≤ n} = {S ≤ n} ∪ {T ≤ n},
{S ∨ T ≤ n} = {S ≤ n} ∩ {T ≤ n},
{S + T ≤ n} = ⋃_{k=0}^{n} ({S = k} ∩ {T ≤ n − k}),
where {S = k} = {S ≤ k} \ {S ≤ k − 1} ∈ F_k ⊆ F_n.
Since the events on the right-hand side of these identities are all F_n-measurable, the
statement of the problem follows.
4.2 Martingales for simple symmetric random walk on Z.
Let n ↦ X_n be a simple symmetric random walk on the one-dimensional integer lattice Z
and (F_n)_{n≥0} its natural filtration.
(a) Prove that X_n and Y_n := X_n^2 − n are (F_n)-martingales.
(b) Find a deterministic sequence a_n ∈ R such that Z_n := X_n^3 + a_n X_n is an (F_n)-martingale.
(c) Find deterministic sequences b_n, c_n ∈ R such that V_n := X_n^4 + b_n X_n^2 + c_n is an (F_n)-martingale.
SOLUTION:
Let ξ_j, j = 1, 2, . . . be i.i.d. random variables with the common distribution P(ξ_j = ±1) =
1/2, F_n := σ(ξ_j : 1 ≤ j ≤ n), and write
X_0 = 0, X_n := Σ_{j=1}^{n} ξ_j.
(a)
E[X_{n+1} | F_n] = E[X_n + ξ_{n+1} | F_n] = X_n + E[ξ_{n+1} | F_n] = X_n.
For Y_n := X_n^2 − n:
E[Y_{n+1} | F_n] = E[X_{n+1}^2 − (n + 1) | F_n]
= E[X_n^2 − n + 2 X_n ξ_{n+1} + ξ_{n+1}^2 − 1 | F_n]
= X_n^2 − n + 2 X_n E[ξ_{n+1} | F_n] + E[ξ_{n+1}^2 | F_n] − 1
= Y_n.
(b) Proceeding as in (a),
E[Z_{n+1} | F_n] = E[(X_n + ξ_{n+1})^3 + a_{n+1}(X_n + ξ_{n+1}) | F_n] = Z_n + (a_{n+1} − a_n + 3) X_n,
since E[ξ_{n+1} | F_n] = E[ξ_{n+1}^3 | F_n] = 0 and E[ξ_{n+1}^2 | F_n] = 1. Thus Z_n is a
martingale iff a_{n+1} − a_n + 3 = 0, i.e. a_n = a_0 − 3n. That is, with a_0 = 0,
Z_n = X_n^3 − 3n X_n
is a martingale.
(c) Proceed similarly as in (b). Write
V_{n+1} = (X_n + ξ_{n+1})^4 + b_{n+1}(X_n + ξ_{n+1})^2 + c_{n+1}
= V_n + 4 X_n^3 ξ_{n+1} + 6 X_n^2 ξ_{n+1}^2 + 4 X_n ξ_{n+1}^3 + ξ_{n+1}^4
+ (b_{n+1} − b_n) X_n^2 + 2 b_{n+1} X_n ξ_{n+1} + b_{n+1} ξ_{n+1}^2 + (c_{n+1} − c_n).
Hence,
E[V_{n+1} | F_n] = · · · = V_n + (b_{n+1} − b_n + 6) X_n^2 + (c_{n+1} − c_n + b_{n+1} + 1).
Thus, in order that n ↦ V_n be a martingale we must choose
b_{n+1} = b_n − 6 and c_{n+1} = c_n − b_{n+1} − 1, that is,
b_n = b_0 − 6n,
c_n = c_0 − Σ_{k=1}^{n} b_k − n = c_0 − b_0 n + 3n(n + 1) − n = c_0 − b_0 n + 3n^2 + 2n.
Choosing b_0 = c_0 = 0,
V_n = X_n^4 − 6n X_n^2 + 3n^2 + 2n
is a martingale.
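Each of these one-step identities can be checked mechanically: a function f(X_n, n) of the walk is a martingale precisely when averaging it over the two equally likely increments ξ_{n+1} = ±1 returns its current value. A minimal sketch in Python (the helper names are ours, not part of the sheet):

```python
# average of f(X, n) over one step of the simple symmetric random walk
def step_avg(f, x, n):
    return (f(x + 1, n + 1) + f(x - 1, n + 1)) / 2

Z = lambda x, n: x**3 - 3*n*x                      # part (b)
V = lambda x, n: x**4 - 6*n*x**2 + 3*n**2 + 2*n    # part (c)

# martingale property: E[f(X_{n+1}, n+1) | X_n = x] = f(x, n) for all x, n
for x in range(-6, 7):
    for n in range(10):
        assert step_avg(Z, x, n) == Z(x, n)
        assert step_avg(V, x, n) == V(x, n)
```

The check runs over a grid of positions and times; since the conditional expectation depends only on (x, n), this verifies the algebra of the displays above.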
4.3 Gambler's Ruin, 1
A gambler wins or loses one pound in each round of betting, with equal chances and
independently of the past events. She starts betting with the firm determination that she
will stop gambling when either she has won a pounds or she has lost b pounds.
(a) What is the probability that she will be winning when she stops playing?
(b) What is the expected number of betting rounds before she stops playing?
SOLUTION: Model the experiment with a simple symmetric random walk. Let ξ_j,
j = 1, 2, . . . be i.i.d. random variables with common distribution
P(ξ_j = +1) = 1/2 = P(ξ_j = −1),
and F_n = σ(ξ_j : 1 ≤ j ≤ n), n ≥ 0, their natural filtration. Denote
S_0 = 0, S_n := Σ_{j=1}^{n} ξ_j, n ≥ 1,
and define the stopping times
T_R := inf{n ≥ 0 : S_n = a}, T_L := inf{n ≥ 0 : S_n = −b}, T := T_L ∧ T_R.
Note that
{the gambler wins a pounds} = {T = T_R},
{the gambler loses b pounds} = {T = T_L}.
(a) Applying the Optional Stopping Theorem to the martingale S_n gives E(S_T) = S_0 = 0, hence
−b P(T = T_L) + a P(T = T_R) = 0.
On the other hand,
P(T = T_L) + P(T = T_R) = 1.
Solving the two equations:
P(T = T_R) = b/(a + b), P(T = T_L) = a/(a + b).
(b) Applying the Optional Stopping Theorem to the martingale S_n^2 − n,
E(T) = E(S_T^2) = a^2 · b/(a + b) + b^2 · a/(a + b) = ab.
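The gambler's-ruin answers for the symmetric walk (win probability b/(a + b) and expected duration ab, started from 0 with absorbing barriers at a and −b) can be double-checked without simulation: the hitting probability and the expected absorption time are the unique solutions of the one-step equations with the given boundary values, so it suffices to verify that the closed forms satisfy them. A sketch, with helper names of our choosing:

```python
from fractions import Fraction as F

def check(a, b):
    # u(k): probability of reaching +a before -b, starting from position k
    u = lambda k: F(k + b, a + b)
    # e(k): expected number of steps until absorption, starting from k
    e = lambda k: F((a - k) * (k + b))
    assert u(a) == 1 and u(-b) == 0 and e(a) == 0 and e(-b) == 0
    for k in range(-b + 1, a):            # one-step (first-step) equations
        assert u(k) == (u(k + 1) + u(k - 1)) / 2
        assert e(k) == 1 + (e(k + 1) + e(k - 1)) / 2

for a in range(1, 8):
    for b in range(1, 8):
        check(a, b)      # in particular u(0) = b/(a+b) and e(0) = a*b
```

Exact rationals avoid any floating-point doubt; uniqueness of the absorbing-chain solution does the rest.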
4.4 Let the lazy random walk (X_n) be a Markov chain on Z with the following transition
probabilities:
P(X_{n+1} = i ± 1 | X_n = i) = 3/8, P(X_{n+1} = i | X_n = i) = 1/4.
Denote
T_k := inf{n ≥ 0 : X_n = k}.
4.5 Answer the same questions as in problem 4.3 when the probability of winning or losing one
pound in each round is p, respectively 1 − p, with p ∈ (0, 1), p ≠ 1/2.
Hint: Use the martingales constructed in problem 3.1.
SOLUTION: Model the experiment with a simple biased random walk. Let ξ_j, j =
1, 2, . . . be i.i.d. random variables with common distribution
P(ξ_j = +1) = p, P(ξ_j = −1) = 1 − p =: q,
and F_n = σ(ξ_j : 1 ≤ j ≤ n), n ≥ 0, their natural filtration. Denote
S_0 = 0, S_n := Σ_{j=1}^{n} ξ_j, n ≥ 1,
and, as in problem 4.3,
T_R := inf{n ≥ 0 : S_n = a}, T_L := inf{n ≥ 0 : S_n = −b}, T := T_L ∧ T_R.
Note that
{the gambler wins a pounds} = {T = T_R},
{the gambler loses b pounds} = {T = T_L}.
(a) Apply the Optional Stopping Theorem to the martingale (q/p)^{S_n}:
1 = E((q/p)^{S_T}) = (p/q)^b P(T = T_L) + (q/p)^a P(T = T_R).
Combining this with P(T = T_L) + P(T = T_R) = 1,
P(T = T_R) = (1 − (p/q)^b) / ((q/p)^a − (p/q)^b), P(T = T_L) = (1 − (q/p)^a) / ((p/q)^b − (q/p)^a).
(b) Now apply the Optional Stopping Theorem to the martingale S_n − (p − q)n. Hence
E(T) = (p − q)^{−1} E(S_T)
= (p − q)^{−1} ( a · (1 − (p/q)^b)/((q/p)^a − (p/q)^b) − b · (1 − (q/p)^a)/((p/q)^b − (q/p)^a) )
= (p − q)^{−1} · ( a(1 − (p/q)^b) + b(1 − (q/p)^a) ) / ((q/p)^a − (p/q)^b).
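As in the symmetric case, these biased-walk formulas can be verified exactly: with r := q/p, the hitting probability u(k) = (r^{k+b} − 1)/(r^{a+b} − 1) and the expected duration e(k) = ((k + b) − (a + b) u(k))/(q − p) are the unique solutions of the one-step equations, and at k = 0 they coincide with the closed forms obtained from the Optional Stopping Theorem. A sketch in exact rational arithmetic (helper names ours):

```python
from fractions import Fraction as F

def check(p, a, b):
    q = 1 - p
    r = q / p                                            # r = q/p
    u = lambda k: (r**(k + b) - 1) / (r**(a + b) - 1)    # P(hit +a before -b | start k)
    e = lambda k: (F(k + b) - (a + b) * u(k)) / (q - p)  # E(T | start k)
    assert u(-b) == 0 and u(a) == 1 and e(-b) == 0 and e(a) == 0
    for k in range(-b + 1, a):                           # one-step equations
        assert u(k) == p * u(k + 1) + q * u(k - 1)
        assert e(k) == 1 + p * e(k + 1) + q * e(k - 1)
    # closed forms from the Optional Stopping Theorem, evaluated at the start 0
    PR = (1 - (p / q)**b) / ((q / p)**a - (p / q)**b)
    ET = (a * (1 - (p / q)**b) + b * (1 - (q / p)**a)) / ((q / p)**a - (p / q)**b) / (p - q)
    assert u(0) == PR and e(0) == ET

for p in (F(1, 3), F(2, 5), F(3, 4)):
    for a in (1, 2, 3):
        for b in (1, 2, 3):
            check(p, a, b)
```
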
4.6 Let (Ω, F, (F_n)_{n≥0}, P) be a filtered probability space and S and T two stopping times such
that P(S ≤ T) = 1.
(a) Define the process n ↦ C_n := 1_{S < n ≤ T}. Prove that (C_n)_{n≥1} is a predictable process.
That is: for all n ≥ 1, C_n is F_{n−1}-measurable.
(b) Let the process (X_n)_{n≥0} be an (F_n)-supermartingale and define the process n ↦ Y_n as
follows:
Y_0 := 0, Y_n := Σ_{k=1}^{n} C_k (X_k − X_{k−1}).
Prove that (Y_n)_{n≥0} is also an (F_n)-supermartingale.
(c) Prove that if S and T are two stopping times such that P(S ≤ T) = 1 and (X_n)_{n≥0} is a
supermartingale, then for all n ≥ 0, E(X_{n∧T}) ≤ E(X_{n∧S}).
SOLUTION:
(a)
{ω : C_n(ω) = 1} = {ω : S(ω) < n, T(ω) ≥ n} = {ω : S(ω) ≤ n − 1} ∩ {ω : T(ω) ≤ n − 1}^c.
Since both events on the right-hand side are F_{n−1}-measurable, the process n ↦ C_n
is indeed predictable.
(b) Y_n is clearly F_n-measurable. We check the supermartingale condition:
E[Y_{n+1} | F_n] = Y_n + E[C_{n+1}(X_{n+1} − X_n) | F_n]
= Y_n + C_{n+1} E[X_{n+1} − X_n | F_n]
≤ Y_n.
In the second step we use the result from (a): C_{n+1} is F_n-measurable. In the last step
we use C_{n+1} ≥ 0 and the supermartingale property of (X_n).
(c) Note that
X_{n∧T} − X_{n∧S} = Y_n,
and use (b): E(X_{n∧T}) − E(X_{n∧S}) = E(Y_n) ≤ E(Y_0) = 0.
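The identity used in (c) is purely pathwise: the gambling sum Y_n telescopes to X_{n∧T} − X_{n∧S} along every trajectory. A quick brute-force check, with deterministic times 0 ≤ S ≤ T standing in for the stopping times (on each fixed ω a stopping time is just such a pair of numbers):

```python
import random

def Y(path, S, T, n):
    # Y_n = sum_{k=1}^{n} C_k (X_k - X_{k-1}) with C_k = 1{S < k <= T}
    return sum(path[k] - path[k - 1] for k in range(1, n + 1) if S < k <= T)

random.seed(1)
for _ in range(500):
    path = [random.randint(-5, 5) for _ in range(12)]   # arbitrary trajectory X_0..X_11
    S = random.randint(0, 11)
    T = random.randint(S, 11)                           # S <= T
    for n in range(12):
        # telescoping identity: Y_n = X_{n ∧ T} - X_{n ∧ S}
        assert Y(path, S, T, n) == path[min(n, T)] - path[min(n, S)]
```
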
4.7 A two-dimensional random walk.
HW
Let X_n be the following two-dimensional random walk: n ↦ X_n is a Markov chain on the
two-dimensional integer lattice Z^2 with the following transition probabilities:
P(X_{n+1} = (i ± 1, j) | X_n = (i, j)) = 1/8,
P(X_{n+1} = (i, j ± 1) | X_n = (i, j)) = 1/8,
P(X_{n+1} = (i ± 1, j ± 1) | X_n = (i, j)) = 1/8.
Denote
T_R := inf{n ≥ 0 : |X_n|^2 ≥ R^2}.
Give sharp lower and upper bounds for
E(T_R | X_0 = (0, 0)).
SOLUTION:
Let ξ_j, j = 1, 2, . . . be i.i.d. random two-dimensional vectors with the common
distribution
P(ξ_j = (±1, 0)) = P(ξ_j = (0, ±1)) = P(ξ_j = (±1, ±1)) = 1/8.
Note that
E(ξ_j) = 0, E(|ξ_j|^2) = 3/2.
(a) The process |X_n|^2 − (3/2)n is a martingale:
E[|X_{n+1}|^2 − (3/2)(n + 1) | F_n] = E[|X_n|^2 + 2 X_n · ξ_{n+1} + |ξ_{n+1}|^2 − (3/2)(n + 1) | F_n]
= |X_n|^2 − (3/2)n + 2 X_n · E(ξ_{n+1}) + E(|ξ_{n+1}|^2) − 3/2
= |X_n|^2 − (3/2)n.
(b) Note first that
R^2 ≤ |X_{T_R}|^2 ≤ (R + √2)^2.
Apply the Optional Stopping Theorem:
E(T_R) = (2/3) E(|X_{T_R}|^2).
From these two relations we get
(2/3) R^2 ≤ E(T_R) ≤ (2/3) (R + √2)^2.
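As a concrete sanity check of the bounds, E(T_R | X_0 = 0) can be computed exactly for a small radius by solving the linear system E[T | s] = 1 + (1/8) Σ E[T | s′] over the interior lattice points, in exact rational arithmetic. A sketch (R = 3 is our arbitrary choice, and the Gauss-Jordan solver is a generic one):

```python
from fractions import Fraction as F

R = 3
STEPS = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]
inside = [(i, j) for i in range(-R, R + 1) for j in range(-R, R + 1)
          if i * i + j * j < R * R]
idx = {s: n for n, s in enumerate(inside)}
N = len(inside)

# system: E[T|s] = 1 + (1/8) * sum over 8 moves of E[T|s'],  E[T|outside] = 0
A = [[F(0)] * N for _ in range(N)]
rhs = [F(1)] * N
for s, n in idx.items():
    A[n][n] = F(1)
    for dx, dy in STEPS:
        t = (s[0] + dx, s[1] + dy)
        if t in idx:
            A[n][idx[t]] -= F(1, 8)

# exact Gauss-Jordan elimination over the rationals
for c in range(N):
    piv = next(r for r in range(c, N) if A[r][c] != 0)
    A[c], A[piv] = A[piv], A[c]
    rhs[c], rhs[piv] = rhs[piv], rhs[c]
    inv = 1 / A[c][c]
    A[c] = [x * inv for x in A[c]]
    rhs[c] *= inv
    for r in range(N):
        if r != c and A[r][c] != 0:
            f = A[r][c]
            A[r] = [x - f * y for x, y in zip(A[r], A[c])]
            rhs[r] -= f * rhs[c]

ET = rhs[idx[(0, 0)]]        # exact E(T_R) started from the origin
# check against (2/3) R^2 <= E(T_R) <= (2/3)(R + sqrt(2))^2
assert F(2, 3) * R**2 <= ET and float(ET) <= (2 / 3) * (R + 2**0.5)**2
```
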
4.8 We toss a fair coin repeatedly and record the sequence of outcomes (H or T).
(a) What is the expected number of tosses until we have seen the pattern HTHT for the
first time?
(b) Give an example of a four-letter pattern of H-s and T-s that has the maximal expected
number of tosses, of all four-letter patterns, until it is seen.
SOLUTION:
(a) Apply the "Monkey Typing ABRACADABRA" method. You will find
E(T_{HTHT}) = 2^4 + 2^2 = 20.
(b) By the same argument, the expected waiting time is maximal for the fully self-overlapping
sequences HHHH and TTTT:
E(T_{HHHH}) = E(T_{TTTT}) = 2^4 + 2^3 + 2^2 + 2 = 30.
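The "Monkey Typing ABRACADABRA" computation can be automated: the expected waiting time is the sum, over every suffix of the pattern that is also a prefix, of the product of the fair-odds payouts 1/p along that prefix. A sketch (the function name is ours):

```python
from fractions import Fraction as F
from itertools import product

def expected_wait(pattern, prob):
    # sum over overlap lengths L such that the suffix of length L equals the prefix
    n = len(pattern)
    total = F(0)
    for L in range(1, n + 1):
        if pattern[:L] == pattern[n - L:]:
            w = F(1)
            for c in pattern[:L]:
                w *= 1 / prob[c]      # fair payout odds along the prefix
            total += w
    return total

coin = {'H': F(1, 2), 'T': F(1, 2)}
assert expected_wait('HTHT', coin) == 20                       # part (a)
waits = {p: expected_wait(''.join(p), coin) for p in product('HT', repeat=4)}
assert max(waits.values()) == 30 and waits[('H',) * 4] == 30   # part (b)
```

Enumerating all sixteen four-letter patterns confirms that HHHH and TTTT are the worst cases.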
4.9 HW
We throw two fair dice and record their sum at consecutive rounds. Compute the
expected number of rounds before the string 7, 2, 12, 7, 2 is recorded.
SOLUTION:
This is yet again a "Monkey Typing ABRACADABRA" type of problem. Now the
alphabet is 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, with non-uniform distribution:
P(2) = P(12) = 1/36; P(3) = P(11) = 2/36; P(4) = P(10) = 3/36;
P(5) = P(9) = 4/36; P(6) = P(8) = 5/36; P(7) = 6/36.
So the winnings should be changed accordingly: 2 and 12 pay 36-to-1; 3 and 11 pay
18-to-1; 4 and 10 pay 12-to-1; 5 and 9 pay 9-to-1; 6 and 8 pay 36/5-to-1; 7 pays 6-to-1.
Applying the method of the "Monkey Typing ABRACADABRA" problem (the overlaps of
the string 7, 2, 12, 7, 2 with itself are the full string and the prefix/suffix 7, 2) we get
E(T_{7,2,12,7,2}) = 6 · 36 · 36 · 6 · 36 + 6 · 36 = 1,679,832.
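The same overlap sum can be evaluated mechanically for the dice alphabet, confirming the figure above. A self-contained sketch in exact arithmetic:

```python
from fractions import Fraction as F

# distribution of the sum of two fair dice
prob = {s: F(6 - abs(7 - s), 36) for s in range(2, 13)}
assert sum(prob.values()) == 1 and prob[6] == F(5, 36)

pattern = (7, 2, 12, 7, 2)
n = len(pattern)
total = F(0)
for L in range(1, n + 1):                  # suffixes that are also prefixes
    if pattern[:L] == pattern[n - L:]:
        w = F(1)
        for s in pattern[:L]:
            w *= 1 / prob[s]               # fair-odds payout 1/p along the prefix
        total += w
assert total == 1679832                    # overlaps: (7, 2) and the full string
```
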
Bonus.
Let X_n, n ≥ 0, be a (discrete-time) birth-and-death process and (F_n)_{n≥0} its natural filtration.
That is: n ↦ X_n is a Markov chain on the state space S := {0, 1, 2, . . . } with transition
probabilities
P(X_{n+1} = k + 1 | X_n = k) = p_k = 1 − P(X_{n+1} = k − 1 | X_n = k),
where p_k ∈ (0, 1), k ≥ 1, are fixed and p_0 = 1 is assumed. Let F_n := σ(X_j : 0 ≤ j ≤ n),
n ≥ 0, be the natural σ-algebra generated by the process.
Denote q_k := 1 − p_k and define the function g : S → R,
g(k) := Σ_{j=0}^{k−1} Π_{i=1}^{j} (q_i/p_i) ( = 1 + Σ_{j=1}^{k−1} Π_{i=1}^{j} (q_i/p_i) for k ≥ 1, g(0) = 0).
(As always, an empty sum is equal to 0 and an empty product is equal to 1.)
(a) Prove that the stopped process n ↦ g(X_{n∧T_0}) is an (F_n)-martingale, where
T_k := inf{n ≥ 0 : X_n = k}, k ∈ S.
(b) Compute, for 0 < k < K, P(T_K < T_0 | X_0 = k).
That is: the probability that the process starting from k hits K before hitting 0.
SOLUTION:
(a) For k ≥ 1, using g(k + 1) − g(k) = Π_{i=1}^{k} (q_i/p_i),
E[g(X_{n+1}) | X_n = k] = p_k g(k + 1) + q_k g(k − 1)
= g(k) + p_k Π_{i=1}^{k} (q_i/p_i) − q_k Π_{i=1}^{k−1} (q_i/p_i) = g(k).
Since the process is stopped at 0, the martingale property holds for all n.
(b) Apply the Optional Stopping Theorem to g(X_{n∧T_0∧T_K}):
g(k) = g(0) P(T_0 < T_K) + g(K) P(T_K < T_0),
and since g(0) = 0,
P(T_K < T_0 | X_0 = k) = g(k)/g(K).
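The harmonicity of g away from 0 (the key point in (a)) can be checked numerically for any concrete choice of the birth probabilities; the rational values p_k = k/(k + 2) below are our own illustration, not part of the problem:

```python
from fractions import Fraction as F

# a hypothetical choice of birth probabilities p_k in (0,1), k >= 1 (p_0 = 1)
p = {k: F(k, k + 2) for k in range(1, 20)}
q = {k: 1 - p[k] for k in p}

def g(k):
    # g(k) = sum_{j=0}^{k-1} prod_{i=1}^{j} q_i/p_i  (empty sum 0, empty product 1)
    total, prod = F(0), F(1)
    for j in range(k):
        total += prod
        prod *= q[j + 1] / p[j + 1]
    return total

# harmonicity: one step of the chain away from 0 leaves g invariant
for k in range(1, 19):
    assert p[k] * g(k + 1) + q[k] * g(k - 1) == g(k)
```
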
Bonus.
N gentlemen throw their identical bowler hats in a heap and collect them in random order.
(That is: the hats get randomly permuted between them, with uniform distribution among
all N! possibilities.) Those gentlemen who by chance get back their own hats happily go
home. The remaining ones yet again throw their hats in a heap and collect them randomly.
Those who get back their own hats happily go home. The procedure continues till all
gentlemen go home with their own hats on. Compute the expected number of rounds before
the happy ending.
Hint: Compute first the expected number of fixed points in a random permutation of n
elements (uniformly distributed among all n! possibilities).
SOLUTION:
The expected number of fixed points of a uniformly random permutation of n ≥ 1 elements
is 1 (each of the n positions is fixed with probability 1/n). Let D_n denote the number of
gentlemen still present after n rounds, D_0 = N, and T := inf{n ≥ 0 : D_n = 0}. By the
above, E[D_{n+1} | F_n] = D_n − 1 on {D_n ≥ 1}, so M_n := D_{n∧T} + (n ∧ T) is a
martingale. Since 0 ≤ D_n ≤ N and T has finite expectation, the Optional Stopping Theorem
applies: N = M_0 = E(M_T) = E(T). Hence the expected number of rounds is exactly N.
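The answer E(T) = N can be cross-checked exactly for small N by conditioning on the number of fixed points of the first round: a uniform permutation of m elements has exactly k fixed points with probability C(m, k)·D_{m−k}/m!, where D_j are the derangement numbers, and the expected numbers of rounds t_m then satisfy a triangular recursion. A sketch:

```python
from fractions import Fraction as F
from math import comb, factorial

# derangement numbers D[0..N]
N = 8
D = [1, 0]
for m in range(2, N + 1):
    D.append((m - 1) * (D[m - 1] + D[m - 2]))

def pfix(m, k):
    # P(uniform permutation of m elements has exactly k fixed points)
    return F(comb(m, k) * D[m - k], factorial(m))

# t[m] = expected number of rounds when m gentlemen remain
t = [F(0)] * (N + 1)
for m in range(1, N + 1):
    s = 1 + sum(pfix(m, k) * t[m - k] for k in range(1, m + 1))
    t[m] = s / (1 - pfix(m, 0))     # solve t_m = 1 + sum_k pfix(m,k) t_{m-k}
assert all(t[m] == m for m in range(N + 1))
```

The exact rational recursion returns t_m = m for every m checked, matching the martingale argument.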
4.12 HW
Let m ∈ N and m ≥ 2. At time n = 0, an urn contains 2m balls, of which m are red and
m are blue. At each time n = 1, . . . , 2m we draw a randomly chosen ball from the urn and
record its colour. We do not replace it. Therefore, at time n the urn contains 2m − n balls.
For n = 0, . . . , 2m − 1 let N_n denote the number and
P_n = N_n / (2m − n)
the fraction of red balls remaining in the urn after time n. Let (G_n)_{0≤n≤2m} be the natural
σ-algebra generated by the process (N_n)_{0≤n≤2m}.
(a) Show that (P_n)_{0≤n≤2m−1} is a (G_n)-martingale.
(b) Let T be the first time at which the ball that we draw is red. (Note that T < 2m,
because the urn initially contains m ≥ 2 red balls.) Show that the probability that the
(T + 1)-st ball is red is 1/2.
SOLUTION:
(a) n ↦ N_n, 0 ≤ n < 2m, is a time-inhomogeneous Markov chain with transition
probabilities
P(N_{n+1} = l | N_n = k) = k/(2m − n) if l = k − 1,
P(N_{n+1} = l | N_n = k) = 1 − k/(2m − n) if l = k,
P(N_{n+1} = l | N_n = k) = 0 otherwise.
Hence we compute
E[P_{n+1} | G_n] = (1/(2m − (n + 1))) E[N_{n+1} | G_n]
= (1/(2m − (n + 1))) ((N_n − 1) P_n + N_n (1 − P_n))
= ((2m − n)/(2m − (n + 1))) P_n − (1/(2m − (n + 1))) P_n
= P_n.
(b)
P(ball drawn at T + 1 is red) = E[ P(ball drawn at T + 1 is red | G_T) ] = E(P_T) = P_0 = 1/2.
In the first step we condition on the (random) state at the stopping time. In the last
step we use the Optional Stopping Theorem (T is a bounded stopping time, since T < 2m).
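The claim in (b) can also be confirmed by brute force, summing over the number of blue balls drawn before the first red one, in exact rational arithmetic (the function name is ours):

```python
from fractions import Fraction as F

def prob_next_red_after_first_red(m):
    # exact P(the draw after the first red is red), urn with m red, m blue
    total = F(0)
    # state: (r red, b blue remaining, weight = prob no red drawn so far)
    stack = [(m, m, F(1))]
    while stack:
        r, b, w = stack.pop()
        p_red = F(r, r + b)
        # first red is drawn now; next ball red with prob (r-1)/(r+b-1)
        total += w * p_red * F(r - 1, r + b - 1)
        if b > 0:
            stack.append((r, b - 1, w * F(b, r + b)))   # drew blue, continue
    return total

for m in range(2, 7):
    assert prob_next_red_after_first_red(m) == F(1, 2)
```

Since only blue balls are removed before T, the enumeration is a single chain of states and terminates after at most m steps.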