
MS-C2111 Stochastic processes J Kohonen

Department of Mathematics and Systems Analysis Fall 2024


Aalto University Exercise 6

6 Generating functions
This exercise set develops your skills in working with generating functions.

Classroom problems
6.1 Passage time to the origin for a random walk. Consider a random walk (X_t)_{t ∈ Z_+} on the
set Z = {…, −2, −1, 0, 1, 2, …} which jumps to the left and to the right with equal
probabilities 1/2. Let T_0 = min{t ≥ 0 | X_t = 0} be the passage time to the origin for the
random walk, and denote by

    ϕ_x(z) = Σ_{j=0}^∞ z^j P[T_0 = j | X_0 = x]

the generating function of T_0 for a random walk started at x.

(a) Compute ϕx (z) for x = 0.


(b) Show that limx→∞ ϕx (z) = 0 for all 0 ≤ z < 1.
(c) Show that the generating functions satisfy

    ϕ_x(z) = (z/2) (ϕ_{x+1}(z) + ϕ_{x−1}(z)),    x = 1, 2, …

(d) Find all numbers α ∈ R for which the function f(x) = α^x defined on the integers
solves the difference equation f(x) = (z/2)(f(x + 1) + f(x − 1)).
(e) Apply the results of (a)–(d) to derive a formula for ϕ_x(z) for all x = 0, 1, 2, …
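Not part of the exercise, but once (e) is done a quick numerical sanity check is possible. The sketch below assumes the closed form obtained in (e) is ϕ_x(z) = ((1 − √(1 − z²))/z)^x for x ≥ 0 (this candidate formula is an assumption here, not given in the problem text), and compares it against a Monte Carlo estimate of E[z^{T_0} | X_0 = x]:

```python
import math
import random

def phi_empirical(x, z, n_samples=20_000, max_steps=2_000, seed=1):
    """Monte Carlo estimate of E[z^T0 | X0 = x] for the symmetric walk.
    Walks truncated at max_steps contribute essentially nothing anyway,
    since z**max_steps is negligible for 0 <= z < 1."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        pos, t = x, 0
        while pos != 0 and t < max_steps:
            pos += 1 if rng.random() < 0.5 else -1
            t += 1
        if pos == 0:
            total += z ** t
    return total / n_samples

def phi_closed_form(x, z):
    """Candidate answer for (e): phi_x(z) = ((1 - sqrt(1 - z^2)) / z)^x."""
    return ((1 - math.sqrt(1 - z * z)) / z) ** x

print(phi_empirical(2, 0.5))    # Monte Carlo estimate
print(phi_closed_form(2, 0.5))  # exact value ~0.0718; the two should agree
```

Note that the candidate formula gives ϕ_0(z) = 1, consistent with (a), and that of the two roots α found in (d) only the one with |α| < 1 is compatible with the boundary condition from (b).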


Homework problems
6.2 (2p) Binomial distribution. Recall that a Bin(n, p) distributed random variable N has
the distribution of a sum of n independent copies of a Ber(p) distributed random variable
X. That is, N = Σ_{k=1}^n X_k, where X_1, X_2, …, X_n are independent with distribution

    P[X_k = 1] = p,    P[X_k = 0] = 1 − p.

(a) Compute the probability generating functions ϕ_X(s) = E[s^X] and ϕ_N(s) = E[s^N].
(b) Using the probability generating function ϕ_N, reconstruct the distribution of N.
That is, compute the probabilities P[N = k] for every k ∈ N.
    Hint: Recall the binomial theorem: (a + b)^n = Σ_{k=0}^n (n choose k) a^{n−k} b^k.
    You don't need to compute derivatives of ϕ_N.

(c) Assume N_1 and N_2 are independent Bin(n_1, p_1) and Bin(n_2, p_2) distributed random
variables, respectively. Compute the probability generating function of their sum,
ϕ_{N_1+N_2}(s) = E[s^{N_1+N_2}]. When is the sum N_1 + N_2 distributed as Bin(m, q) for some
m and q? Justify your answer (but you don't need to present a mathematical proof
unless you want to).
(d) Suppose X and Y are independent random positive integers such that their sum
X + Y is Bin(n, q) distributed for some n ∈ N and q ∈ [0, 1]. What can we say
about the distributions of X and Y? Justify your answer (but you don't need to present
a mathematical proof unless you want to).
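As a self-check (not required by the exercise), the coefficient of s^k in ϕ_N(s) = (1 − p + ps)^n — assuming that is the answer one finds in (a) — should reproduce the binomial probabilities from (b). A minimal sketch:

```python
from math import comb

def pgf_coeffs(n, p):
    """Coefficients of (1 - p + p*s)^n by repeated polynomial multiplication;
    each factor (1 - p) + p*s is the PGF of one Ber(p) variable."""
    coeffs = [1.0]
    for _ in range(n):
        new = [0.0] * (len(coeffs) + 1)
        for k, c in enumerate(coeffs):
            new[k] += (1 - p) * c   # pick the constant term of this factor
            new[k + 1] += p * c     # pick the p*s term of this factor
        coeffs = new
    return coeffs

n, p = 6, 0.3
for k, c in enumerate(pgf_coeffs(n, p)):
    pmf = comb(n, k) * p**k * (1 - p)**(n - k)  # Bin(n, p) probability
    assert abs(c - pmf) < 1e-12
print("coefficient of s^k in phi_N matches P[N = k] for every k")
```

Multiplying PGFs corresponds to adding independent variables, which hints at (c): (1 − p_1 + p_1 s)^{n_1} (1 − p_2 + p_2 s)^{n_2} collapses to a single binomial PGF only when p_1 = p_2.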


6.3 (2p) Moment generating functions and central limit theorem. A moment generating
function of a real-valued random variable Z is defined as M_Z(s) = E[e^{sZ}]. It is closely
related to the generating function ϕ_Z(s) = E[s^Z] we have seen in the lectures.
(a) Show that M_Z(s) = ϕ_Z(e^s).
(b) Assuming Z is a random integer in Z, show that M_Z(0) = 1, M_Z′(0) = E[Z] and
M_Z′′(0) = E[Z^2]. (In general, for a real-valued random variable Z we have M_Z^{(n)}(0) =
E[Z^n]. This is why M_Z is called the moment generating function.)
(c) Either using (a) or by direct computation, show that if X and Y are independent
real-valued random variables, then M_{X+Y} = M_X M_Y, and that for a deterministic
constant λ ∈ R we have M_{λX}(s) = M_X(λs).
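The identities in (c) can be checked exactly for finite, integer-valued distributions. The sketch below (the pmfs are chosen arbitrarily for illustration) verifies M_{X+Y} = M_X M_Y and M_{λX}(s) = M_X(λs) with λ = 2:

```python
import math
from itertools import product

def mgf(pmf, s):
    """M_Z(s) = E[e^{sZ}] for a finite pmf given as {value: probability}."""
    return sum(p * math.exp(s * z) for z, p in pmf.items())

def convolve(pA, pB):
    """pmf of A + B for independent A and B."""
    out = {}
    for (a, pa), (b, pb) in product(pA.items(), pB.items()):
        out[a + b] = out.get(a + b, 0.0) + pa * pb
    return out

pX = {0: 0.2, 1: 0.5, 3: 0.3}            # arbitrary finite pmf
pY = {-1: 0.5, 1: 0.5}                   # another one
pXY = convolve(pX, pY)                   # pmf of X + Y under independence
p2X = {2 * z: p for z, p in pX.items()}  # pmf of 2X

for s in (-1.0, 0.3, 0.7):
    assert abs(mgf(pXY, s) - mgf(pX, s) * mgf(pY, s)) < 1e-10
    assert abs(mgf(p2X, s) - mgf(pX, 2 * s)) < 1e-10
print("M_{X+Y} = M_X * M_Y and M_{2X}(s) = M_X(2s) hold")
```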
One way to prove the central limit theorem is by using moment generating functions.¹
Here we sketch the proof in the case when the random variable is a centred integer and
has a moment generating function.
Let X be a random number in Z with mean E[X] = 0 and variance Var(X) = 1. Denote by
S_n = Σ_{j=1}^n X_j the sum of n independent copies of X.
Do at least one of the following steps.
(d) Show that the second order Taylor polynomial of M_X is given by

    M_X(s) ≈ 1 + s²/2.

    Hint: Recall: if a function f is twice differentiable, its second order Taylor
    expansion at x = 0 is f(x) = f(0) + f′(0)x + (f′′(0)/2)x² + g(x), where
    lim_{x→0} g(x)/x² = 0.

(e) Show that the moment generating function of N_n = (1/√n) S_n can be written as

    M_{N_n}(s) = M_X(s/√n)^n.

(f) Using the identity e^x = lim_{n→∞} (1 + x/n)^n, show that lim_{n→∞} M_{N_n}(s) = e^{s²/2}.

    This is the moment generating function of the standard normal random variable
    Z ∼ N(0, 1). Using Lévy's continuity theorem stated below, conclude that the
    sequence S_n/√n converges to a standard normal random variable in distribution as
    n → ∞.
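The convergence in (f) can be watched numerically. As an illustrative choice (not dictated by the problem), take X to be a fair ±1 coin flip, which has mean 0 and variance 1 and M_X(s) = cosh(s); the sketch evaluates M_{N_n}(s) = M_X(s/√n)^n for growing n and compares with e^{s²/2}:

```python
import math

def M_X(s):
    """MGF of a fair +/-1 coin flip: (e^s + e^-s)/2 = cosh(s)."""
    return math.cosh(s)

def M_Nn(s, n):
    """MGF of N_n = S_n / sqrt(n), via the identity in (e)."""
    return M_X(s / math.sqrt(n)) ** n

s = 1.2
target = math.exp(s * s / 2)  # MGF of N(0, 1) at s
for n in (10, 1_000, 100_000):
    print(n, M_Nn(s, n), target)  # the gap shrinks as n grows
```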

Theorem (Lévy's continuity theorem for moment generating functions). Let Z_1, Z_2, …
be a sequence of random variables such that their moment generating functions
M_{Z_1}, M_{Z_2}, … converge pointwise to a continuous function f, i.e. for every s ∈ R we
have lim_{n→∞} M_{Z_n}(s) = f(s). Then there exists a unique random variable Z such
that Z_1, Z_2, … converges to Z in distribution. Furthermore, f = M_Z.
¹ Or rather the characteristic function φ_Z(s) = E[e^{isZ}], where i is the imaginary unit. This is because,
unlike M_Z, φ_Z exists for every real-valued random variable Z. If M_Z exists, then M_Z(s) = φ_Z(−is).
