Topic 2: Generating Functions
2.0 Objectives
By the end of the topic, the learner should be able to:
i) understand the definition of a generating function and find the sum of Geometric, Binomial, and Exponential series;
ii) understand the definition of the PGF and use it to calculate the mean, variance, and probabilities;
iii) calculate the PGF for the Geometric, Binomial, and Poisson distributions;
iv) understand the concept of bivariate generating functions.
2.1 Introduction
A drunk person staggers to the left and right as he walks. In stochastic processes this is modelled as a random walk. Probability generating functions are particularly useful for processes such as the random walk, because the process is built from a single repeating step: a move of one unit, left or right, at random. The sum of the first t steps gives the position at time t.
2.2 Univariate Generating Functions
Definition:
Let $a_0, a_1, a_2, \dots$ be a sequence of real numbers. Using a variable $s$, we may define the function
$$A(s) = a_0 + a_1 s + a_2 s^2 + a_3 s^3 + \cdots = \sum_{k=0}^{\infty} a_k s^k \qquad \text{(i)}$$
If this power series converges in some interval $-s_0 < s < s_0$, then $A(s)$ is called the generating function of the sequence $a_0, a_1, a_2, \dots$ (i.e., for those values of the parameter $s$ for which the sum converges, $A(s)$ is a generating function). The coefficients can be recovered from $A(s)$ by repeated differentiation at $s = 0$:
$$a_k = \frac{1}{k!} \left.\frac{d^k A(s)}{ds^k}\right|_{s=0}$$
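The coefficient-recovery formula can be checked numerically; a sketch assuming Python with the sympy library (not part of the original notes), applied to $A(s) = 1/(1-s)$, the generating function of the sequence $1, 1, 1, \dots$:

```python
import sympy as sp
from math import factorial

s = sp.symbols('s')
A = 1 / (1 - s)  # generating function of the sequence 1, 1, 1, ...

# a_k = (1/k!) * d^k A/ds^k evaluated at s = 0
coeffs = [sp.diff(A, s, k).subs(s, 0) / factorial(k) for k in range(6)]
print(coeffs)  # every coefficient is 1
```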
Stochastic Processes
Example 1
Let $X$ be a discrete random variable taking values $0, 1, 2, \dots$ with probabilities $p_k = P(X=k)$. The probability generating function (PGF) of $X$ is
$$P_X(s) = p_0 + p_1 s + p_2 s^2 + p_3 s^3 + \cdots = \sum_{k=0}^{\infty} p_k s^k = \sum_{k=0}^{\infty} P(X=k)\, s^k = E(s^X)$$
where $E(s^X)$ is the expectation of the function $s^X$ (a random variable) of the random variable $X$. The series $P_X(s)$ converges at least for $-1 \le s \le 1$ (it converges absolutely for $|s| \le 1$).
Clearly $P_X(1) = 1$. Also $P_X(0) = p_0$.
N/B:
It is possible to have a sequence of probabilities which does not form a p.g.f., e.g. if $p_0 = 0$, $p_1 = \frac{1}{4}$, $p_2 = \frac{1}{2}$, $p_3 = \frac{1}{3}$, $p_4 = \frac{3}{4}$, then
$$A(s) = p_0 + p_1 s + p_2 s^2 + p_3 s^3 + p_4 s^4 = \frac{1}{4}s + \frac{1}{2}s^2 + \frac{1}{3}s^3 + \frac{3}{4}s^4.$$
Here $A(1) = \frac{1}{4} + \frac{1}{2} + \frac{1}{3} + \frac{3}{4} = \frac{11}{6} \ne 1$, so the $p_k$ do not sum to one and cannot form a probability distribution.
By definition,
$$P_X(s) = E(s^X) = \sum_{k=0}^{\infty} p_k s^k$$
so
$$P'(s) = \sum_{k=0}^{\infty} k\,p_k s^{k-1}$$
Mean
The expectation $E(X)$ is given by
$$E(X) = \left.P'(s)\right|_{s=1} = \left.\sum_{k=0}^{\infty} k\,p_k s^{k-1}\right|_{s=1}$$
Thus
$$E(X) = P'(1) = \sum_{k=0}^{\infty} k\,p_k$$
Variance
$$E[X(X-1)] = P''(1) = \sum_{k=0}^{\infty} k(k-1)\,p_k$$
and
$$\operatorname{var}(X) = E(X^2) - [E(X)]^2$$
But
$$E(X^2) = E[X(X-1)] + E(X) = \sum_{k=0}^{\infty} k(k-1)\,p_k + \sum_{k=0}^{\infty} k\,p_k = P''(1) + P'(1)$$
Thus
$$\operatorname{var}(X) = P''(1) + P'(1) - [P'(1)]^2$$
The $k$th factorial moment of $X$ is given by
$$E[X(X-1)\cdots(X-k+1)] = \left.\frac{d^k P(s)}{ds^k}\right|_{s=1} \qquad \text{for } k = 1, 2, \dots
$$
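These moment formulas apply mechanically to any pgf; a sketch assuming Python with sympy (the helper name `pgf_moments` is a hypothetical choice, not from the notes), applied to a fair die on $\{0, 1, \dots, 5\}$:

```python
import sympy as sp

s = sp.symbols('s')

def pgf_moments(P):
    """Mean and variance from a pgf:
    E(X) = P'(1), var(X) = P''(1) + P'(1) - P'(1)**2."""
    P1 = sp.diff(P, s).subs(s, 1)
    P2 = sp.diff(P, s, 2).subs(s, 1)
    return sp.simplify(P1), sp.simplify(P2 + P1 - P1**2)

# fair die on {0, 1, ..., 5}: pgf = (1 + s + ... + s^5)/6
P_die = sum(s**k for k in range(6)) / 6
mean, var = pgf_moments(P_die)
print(mean, var)  # 5/2 and 35/12
```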
a) Bernoulli Distribution
Let $X$ have a Bernoulli distribution with
$$P(X=k) = p_k = p^k q^{1-k}, \qquad q = 1-p, \quad k = 0, 1$$
The pgf of $X$ is
$$P(s) = \sum_{k=0}^{1} p^k q^{1-k} s^k = q + ps$$
Mean of X
$$P'(1) = \left.\sum_{k=0}^{1} k\,p^k q^{1-k} s^{k-1}\right|_{s=1} = p \;\Rightarrow\; E(X) = P'(1) = p$$
Variance of X
$$P''(s) = 0 \;\Rightarrow\; P''(1) = 0, \qquad \operatorname{var}(X) = P''(1) + P'(1) - [P'(1)]^2 = p - p^2 = pq$$
b) Binomial Distribution
Let $X$ have a Binomial distribution with parameters $n$ and $p$, so that $P(s) = (q + ps)^n$.
Mean of X
$$P'(s) = n(q+ps)^{n-1}p \;\Rightarrow\; E(X) = P'(1) = np(q+p)^{n-1} = np$$
Variance of X
$$P''(s) = n(n-1)(q+ps)^{n-2}p^2 \;\Rightarrow\; P''(1) = n(n-1)p^2(q+p)^{n-2} = n(n-1)p^2$$
$$\therefore \operatorname{var}(X) = P''(1) + P'(1) - [P'(1)]^2 = n(n-1)p^2 + np - n^2p^2 = npq$$
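The binomial results above can be verified symbolically; a sketch assuming Python with sympy:

```python
import sympy as sp

s, n, p = sp.symbols('s n p', positive=True)
q = 1 - p
P = (q + p*s)**n  # binomial pgf

mean = sp.simplify(sp.diff(P, s).subs(s, 1))
P2 = sp.simplify(sp.diff(P, s, 2).subs(s, 1))
var = sp.simplify(P2 + mean - mean**2)
print(mean, var)  # mean n*p, variance n*p*(1 - p)
```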
c) Geometric Distribution
Let $X$ have a Geometric distribution, so that
$$P(X=k) = p_k = q^k p, \qquad k = 0, 1, 2, \dots, \quad q = 1-p$$
The pgf of X is given by
$$P(s) = \sum_{k=0}^{\infty} p_k s^k = \sum_{k=0}^{\infty} q^k p\, s^k = p\sum_{k=0}^{\infty} (qs)^k = p\left(1 + qs + (qs)^2 + \cdots\right) = \frac{p}{1-qs}$$
Mean of X
$$P'(s) = \frac{pq}{(1-qs)^2} \;\Rightarrow\; E(X) = P'(1) = \frac{pq}{(1-q)^2} = \frac{q}{p}$$
Variance of X
$$P''(s) = \frac{2pq^2}{(1-qs)^3} \;\Rightarrow\; P''(1) = \frac{2pq^2}{(1-q)^3} = \frac{2q^2}{p^2}$$
$$\therefore \operatorname{var}(X) = P''(1) + P'(1) - [P'(1)]^2 = \frac{2q^2}{p^2} + \frac{q}{p} - \frac{q^2}{p^2} = \frac{q^2 + qp}{p^2} = \frac{q}{p^2}$$
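The same symbolic check works for the geometric distribution; a sketch assuming Python with sympy:

```python
import sympy as sp

s, p = sp.symbols('s p', positive=True)
q = 1 - p
P = p / (1 - q*s)  # geometric pgf

mean = sp.simplify(sp.diff(P, s).subs(s, 1))
P2 = sp.simplify(sp.diff(P, s, 2).subs(s, 1))
var = sp.simplify(P2 + mean - mean**2)
print(mean, var)  # mean q/p, variance q/p**2 (with q = 1 - p)
```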
d) Poisson Distribution
Let $X$ have a Poisson distribution, so that
$$P(X=k) = p_k = \frac{e^{-\lambda}\lambda^k}{k!}, \qquad k = 0, 1, 2, \dots$$
$$P(s) = \sum_{k=0}^{\infty} p_k s^k = \sum_{k=0}^{\infty} \frac{e^{-\lambda}\lambda^k}{k!} s^k = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(\lambda s)^k}{k!} = e^{-\lambda}\left[1 + \lambda s + \frac{(\lambda s)^2}{2!} + \cdots\right] = e^{-\lambda} e^{\lambda s} = e^{-\lambda(1-s)}$$
Mean of X
$$P'(s) = \lambda e^{-\lambda(1-s)} \;\Rightarrow\; E(X) = P'(1) = \lambda e^{0} = \lambda$$
Variance of X
$$P''(s) = \lambda^2 e^{-\lambda(1-s)} \;\Rightarrow\; P''(1) = \lambda^2$$
$$\therefore \operatorname{var}(X) = P''(1) + P'(1) - [P'(1)]^2 = \lambda^2 + \lambda - \lambda^2 = \lambda$$
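And likewise for the Poisson pgf; a sketch assuming Python with sympy:

```python
import sympy as sp

s, lam = sp.symbols('s lambda', positive=True)
P = sp.exp(-lam*(1 - s))  # Poisson pgf

mean = sp.diff(P, s).subs(s, 1)
P2 = sp.diff(P, s, 2).subs(s, 1)
var = sp.simplify(P2 + mean - mean**2)
print(mean, var)  # lambda and lambda
```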
Theorem
If $X$ and $Y$ are independent count random variables, then
$$P_{X+Y}(s) = E(s^{X+Y}) = E(s^X s^Y) = E(s^X)\,E(s^Y) = P_X(s)\,P_Y(s)$$
This means that the p.g.f. of the sum of two independent random variables $X$ and $Y$ is the product of the p.g.f.s of $X$ and $Y$.
Corollary
If $X_1, \dots, X_n$ are independent count r.v.s with PGFs $P_{X_1}(s), \dots, P_{X_n}(s)$ respectively (and $n$ is a known integer), then
$$P_{X_1 + \cdots + X_n}(s) = P_{X_1}(s) \cdots P_{X_n}(s)$$
If the random variables $X_1, \dots, X_n$ are also identically distributed, each with pgf $P(s)$, then the p.g.f. of $S_n = X_1 + \cdots + X_n$ is $[P(s)]^n$.
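Multiplying pgfs is the same as convolving their coefficient sequences, so the corollary can be illustrated numerically; a sketch assuming Python with numpy, for the sum of two fair dice:

```python
import numpy as np

# pgf coefficients of one fair die on {1, ..., 6}: p_0 = 0, p_1..p_6 = 1/6
die = np.array([0.0] + [1/6] * 6)

# multiplying two pgfs = convolving their coefficient sequences
two_dice = np.convolve(die, die)

print(two_dice[7])  # P(sum = 7) = 6/36
```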
Example 2
a) Poisson Distribution
Let X and Y be independent Poisson variates with parameters $\lambda$ and $\mu$ respectively, so that their pgf's are
$$P_X(s) = e^{-\lambda(1-s)} \qquad\text{and}\qquad P_Y(s) = e^{-\mu(1-s)}$$
Then
$$P_{X+Y}(s) = P_X(s)\,P_Y(s) = e^{-\lambda(1-s)}\, e^{-\mu(1-s)} = e^{-(\lambda+\mu)(1-s)}$$
Therefore the sum of two Poisson variables with parameters $\lambda$ and $\mu$ is a Poisson variable with parameter $(\lambda + \mu)$.
b) Bernoulli Distribution
Let each trial be a Bernoulli variable with success parameter $p$ ($0 < p < 1$). Let the trials be $X_1, X_2, \dots, X_n$ and let $Y = X_1 + X_2 + \cdots + X_n$. The probability of obtaining exactly $k$ successes (and $n-k$ failures) in all is
$$P(Y=k) = p_k = \binom{n}{k} p^k (1-p)^{n-k}$$
$$\therefore G_Y(s) = \sum_{k=0}^{n} p_k s^k = \sum_{k=0}^{n} \binom{n}{k} (ps)^k (1-p)^{n-k} = \left[(1-p) + ps\right]^n = (q + ps)^n$$
Thus the sum of $n$ independent Bernoulli variables is a binomial variable with parameters $n$ and $p$.
OR, more generally, if $X_1, \dots, X_n$ are independent with $X_i \sim \text{Poisson}(\lambda_i)$, then
$$P_{X_1 + \cdots + X_n}(s) = \prod_{i=1}^{n} e^{-\lambda_i(1-s)} = e^{-\left(\sum_{i=1}^{n}\lambda_i\right)(1-s)}$$
This is the pgf of a Poisson r.v., i.e. $\sum_{i=1}^{n} X_i \sim \text{Poisson}\left(\sum_{i=1}^{n} \lambda_i\right)$.
For a random sum $S_N = X_1 + \cdots + X_N$, where $N \sim \text{Poisson}(\lambda)$ is independent of the $X_i$ and the $X_i$ have common pgf $P(s)$, the pgf of $S_N$ is $H(s) = G(P(s)) = e^{-\lambda(1-P(s))}$, where $G$ is the pgf of $N$, and
$$E(S_N) = E(X_i)\,E(N) = \lambda E(X_i)$$
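The identity $E(S_N) = E(N)E(X_i)$ can be checked by simulation; a sketch assuming Python with numpy (the choice of $X_i$ uniform on $\{1, 2, 3\}$ is an arbitrary illustration, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, trials = 4.0, 50_000

# X_i uniform on {1, 2, 3}: E(X_i) = 2, so E(S_N) = lam * E(X_i) = 8
counts = rng.poisson(lam, size=trials)
totals = np.array([rng.integers(1, 4, size=k).sum() for k in counts])

print(totals.mean())  # should be close to 8
```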
Next we relate the pgf to tail probabilities. Let $X$ have pgf
$$P(s) = \sum_{k=0}^{\infty} p_k s^k$$
and let
$$q_k = P(X > k) = p_{k+1} + p_{k+2} + \cdots, \qquad k = 0, 1, 2, \dots$$
with generating function of the sequence $\{q_k\}$
$$\phi(s) = \sum_{k=0}^{\infty} q_k s^k$$
Then
$$\phi(s) = q_0 + q_1 s + q_2 s^2 + q_3 s^3 + \cdots$$
$$= (p_1 + p_2 + \cdots) + (p_2 + p_3 + \cdots)s + (p_3 + p_4 + \cdots)s^2 + \cdots$$
$$= p_1 + p_2(1+s) + p_3(1+s+s^2) + p_4(1+s+s^2+s^3) + \cdots$$
$$= \frac{p_1(1-s)}{1-s} + \frac{p_2(1-s^2)}{1-s} + \frac{p_3(1-s^3)}{1-s} + \frac{p_4(1-s^4)}{1-s} + \cdots$$
$$= \frac{(p_1 + p_2 + p_3 + p_4 + \cdots) - (p_1 s + p_2 s^2 + p_3 s^3 + p_4 s^4 + \cdots)}{1-s}$$
$$= \frac{(p_0 + p_1 + p_2 + p_3 + \cdots) - (p_0 + p_1 s + p_2 s^2 + p_3 s^3 + \cdots)}{1-s}$$
$$= \frac{1 - P(s)}{1-s}, \qquad -1 < s < 1$$
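The identity $\phi(s) = (1 - P(s))/(1-s)$ can be verified for the geometric distribution, where $q_k = P(X > k) = q^{k+1}$; a sketch assuming Python with sympy, taking $p = 1/2$:

```python
import sympy as sp

s = sp.symbols('s')
p = sp.Rational(1, 2)
q = 1 - p
P = p / (1 - q*s)  # geometric pgf, p_k = q**k * p

phi = sp.cancel((1 - P) / (1 - s))  # tail generating function (1 - P(s))/(1 - s)
expansion = sp.series(phi, s, 0, 5).removeO()

# for the geometric distribution, q_k = P(X > k) = q**(k + 1)
tails = [expansion.coeff(s, k) for k in range(5)]
print(tails)  # [1/2, 1/4, 1/8, 1/16, 1/32]
```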
Recall that
$$G_X(s) = E(s^X) = \sum_x p_x s^x = p_0 + p_1 s + p_2 s^2 + p_3 s^3 + p_4 s^4 + \cdots$$
In general:
$$p_n = P(X=n) = \frac{1}{n!} G_X^{(n)}(0) = \frac{1}{n!} \left.\frac{d^n}{ds^n} G_X(s)\right|_{s=0}$$
Example 3
Let X be a discrete random variable with PGF
$$G_X(s) = \frac{s}{5}(2 + 3s^2)$$
Find the distribution of $X$. Expanding, $G_X(s) = \frac{2}{5}s + \frac{3}{5}s^3$, so $P(X=1) = \frac{2}{5}$, $P(X=3) = \frac{3}{5}$, and $P(X=k) = 0$ otherwise.
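This coefficient extraction can be checked with a short sketch (Python with sympy assumed):

```python
import sympy as sp

s = sp.symbols('s')
G = s/5 * (2 + 3*s**2)  # pgf from Example 3

expanded = sp.expand(G)  # 2*s/5 + 3*s**3/5
probs = [expanded.coeff(s, n) for n in range(4)]
print(probs)  # [0, 2/5, 0, 3/5]
```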
In summary, for a probability generating function:
a) Definition:
$$G_X(s) = E(s^X)$$
b) Used for:
Discrete random variables with values $0, 1, 2, \dots$
c) Moments:
$$E(X) = G_X'(1), \qquad E[X(X-1)\cdots(X-k+1)] = G_X^{(k)}(1)$$
d) Probabilities:
$$P(X=n) = \frac{1}{n!} G_X^{(n)}(0)$$
e) Sums:
If $X$ and $Y$ are independent, then $G_{X+Y}(s) = G_X(s)\,G_Y(s)$.
Bivariate Generating Functions
Let $X$ and $Y$ be jointly distributed count random variables with joint probabilities
$$p_{jk} = P(X=j, Y=k)$$
and
$$\sum_j \sum_k p_{jk} = 1$$
The bivariate (joint) probability generating function is
$$P(s_1, s_2) = \sum_j \sum_k p_{jk}\, s_1^{j} s_2^{k}$$
Note that
$$P(s, s) = E(s^X s^Y) = E(s^{X+Y})$$
If X and Y are identically, independently distributed (iid) with common pgf $g(s)$, then we have
$$P(s_1, s_2) = g(s_1)\, g(s_2)$$
The marginal moments are obtained from partial derivatives:
$$E(X) = \left.\frac{\partial P(s_1, 1)}{\partial s_1}\right|_{s_1=1} \qquad\text{and}\qquad E(Y) = \left.\frac{\partial P(1, s_2)}{\partial s_2}\right|_{s_2=1}$$
$$E[X(X-1)] = \left.\frac{\partial^2 P(s_1, 1)}{\partial s_1^2}\right|_{s_1=1} \qquad\text{and}\qquad E[Y(Y-1)] = \left.\frac{\partial^2 P(1, s_2)}{\partial s_2^2}\right|_{s_2=1}$$
Thus
$$E(XY) = \left.\frac{\partial^2 P(s_1, s_2)}{\partial s_1\, \partial s_2}\right|_{s_1=s_2=1}$$
$$\sigma_X^2 = E[X(X-1)] + E(X) - [E(X)]^2$$
$$\sigma_Y^2 = E[Y(Y-1)] + E(Y) - [E(Y)]^2$$
$$\rho_{XY} = \frac{\operatorname{cov}(X,Y)}{\sigma_X \sigma_Y}$$
Example 4
Given that the joint distribution of X and Y is
$$p_{jk} = P(X=j, Y=k) = q^{j+k} p^2, \qquad j = 0, 1, 2, \dots, \; k = 0, 1, 2, \dots, \; p + q = 1$$
$$P(s_1, s_2) = \sum_j \sum_k q^{j+k} p^2 s_1^j s_2^k = p^2 \sum_j (qs_1)^j \sum_k (qs_2)^k = p^2 \cdot \frac{1}{1-qs_1} \cdot \frac{1}{1-qs_2} = \frac{p^2}{(1-qs_1)(1-qs_2)}$$
The pgf of X is
$$P(s_1) = P(s_1, 1) = \frac{p^2}{(1-qs_1)(1-q)} = \frac{p^2}{(1-qs_1)p} = \frac{p}{1-qs_1}$$
The pgf of Y is
$$P(s_2) = P(1, s_2) = \frac{p^2}{(1-q)(1-qs_2)} = \frac{p^2}{(1-qs_2)p} = \frac{p}{1-qs_2}$$
$$\therefore P'(s_1) = \frac{pq}{(1-qs_1)^2} \;\Rightarrow\; E(X) = P'(1) = \frac{pq}{(1-q)^2} = \frac{q}{p}$$
Similarly
$$P'(s_2) = \frac{pq}{(1-qs_2)^2} \;\Rightarrow\; E(Y) = P'(1) = \frac{pq}{(1-q)^2} = \frac{q}{p}$$
Second moment
$$P''(s_1) = \frac{2pq^2}{(1-qs_1)^3} \;\Rightarrow\; E[X(X-1)] = P''(1) = \frac{2pq^2}{(1-q)^3} = \frac{2q^2}{p^2}$$
Similarly
$$P''(s_2) = \frac{2pq^2}{(1-qs_2)^3} \;\Rightarrow\; E[Y(Y-1)] = P''(1) = \frac{2pq^2}{(1-q)^3} = \frac{2q^2}{p^2}$$
Thus
$$\sigma_X^2 = \frac{2q^2}{p^2} + \frac{q}{p} - \frac{q^2}{p^2} = \frac{q^2 + qp}{p^2} = \frac{q}{p^2}$$
Similarly $\sigma_Y^2 = \dfrac{q}{p^2}$.
Further differentiating
$$P(s_1, s_2) = \frac{p^2}{(1-qs_1)(1-qs_2)}$$
we get
$$\frac{\partial P(s_1, s_2)}{\partial s_1} = \frac{p^2 q}{(1-qs_1)^2 (1-qs_2)}$$
$$\frac{\partial^2 P(s_1, s_2)}{\partial s_1\, \partial s_2} = \frac{p^2 q^2}{(1-qs_1)^2 (1-qs_2)^2}$$
Thus
$$E(XY) = \left.\frac{\partial^2 P(s_1, s_2)}{\partial s_1\, \partial s_2}\right|_{s_1=s_2=1} = \frac{p^2 q^2}{(1-q)^2 (1-q)^2} = \frac{q^2}{p^2}$$
$$\operatorname{cov}(X,Y) = \sigma_{XY} = E(XY) - E(X)E(Y) = \frac{q^2}{p^2} - \frac{q}{p} \times \frac{q}{p} = 0$$
Hence $\rho_{XY} = \dfrac{\operatorname{cov}(X,Y)}{\sigma_X \sigma_Y} = 0$; $X$ and $Y$ are uncorrelated (indeed independent, since $P(s_1, s_2)$ factorizes into a function of $s_1$ times a function of $s_2$).
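Example 4's moments can be reproduced symbolically; a sketch assuming Python with sympy:

```python
import sympy as sp

s1, s2, p = sp.symbols('s1 s2 p', positive=True)
q = 1 - p
P = p**2 / ((1 - q*s1) * (1 - q*s2))  # joint pgf from Example 4

at1 = {s1: 1, s2: 1}
EX = sp.simplify(sp.diff(P, s1).subs(at1))
EY = sp.simplify(sp.diff(P, s2).subs(at1))
EXY = sp.simplify(sp.diff(P, s1, s2).subs(at1))

cov = sp.simplify(EXY - EX*EY)
print(EX, EY, cov)  # marginal means (1 - p)/p each, covariance 0
```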
Exercise
1. Let $X$ have a Bernoulli distribution with parameter $p$ such that
$$p_k = P(X=k) = p^k q^{1-k}, \qquad q = 1-p, \quad k = 0, 1$$
Obtain the probability generating function of $X$. Hence determine the mean and variance of $X$ using the probability generating function.
2. Define the term probability generating function.
3. Find the generating function for the sequence $\{0, 0, 0, 3, 3, 3, \dots\}$.
4. The probability generating function of a discrete random variable $X$ is given by $G_X(t) = k(1 + t^2)$. Find the value of $k$ and hence write down the probability distribution of $X$.
5. Given that $a_n = n$ for $n = 1, 2, 3, \dots$, show that the generating function for the sequence $\{a_n\}$ is $\dfrac{s}{(1-s)^2}$.
Solution
If $a_k = k$, then $a_1 = 1, a_2 = 2, a_3 = 3, \dots$; therefore the generating function of $\{a_k\}$ is
$$A(s) = s + 2s^2 + 3s^3 + 4s^4 + \cdots = s(1 + 2s + 3s^2 + 4s^3 + \cdots)$$
Define $f(s) = 1 + s + s^2 + s^3 + s^4 + \cdots = \dfrac{1}{1-s}$.
Then $f'(s) = 1 + 2s + 3s^2 + 4s^3 + \cdots = \dfrac{1}{(1-s)^2}$.
Thus $A(s) = s(1 + 2s + 3s^2 + 4s^3 + \cdots) = \dfrac{s}{(1-s)^2}$.
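The expansion can be confirmed with a series check (Python with sympy assumed):

```python
import sympy as sp

s = sp.symbols('s')
A = s / (1 - s)**2

expansion = sp.series(A, s, 0, 7).removeO()
coeffs = [expansion.coeff(s, n) for n in range(7)]
print(coeffs)  # [0, 1, 2, 3, 4, 5, 6]
```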