
Stochastic Processes

TOPIC TWO: GENERATING FUNCTIONS

2.0 Objectives
By the end of the topic, the learner should be able to:

i) understand the definition of the generating function and find the sum of Geometric,
Binomial, and Exponential series;
ii) understand the definition of the PGF, and use it to calculate the mean, variance,
and probabilities;
iii) calculate the PGF for Geometric, Binomial, and Poisson distributions;
iv) understand the concept of Bivariate Generating Functions.

2.1 Introduction
A drunk person staggers to the left and right as he walks; in stochastic processes this is modelled as the Random Walk. Probability generating functions are particularly useful for processes such as the random walk, because the process is built up as the sum of repetitions of a single step: a move of one unit, left or right at random. The sum of the first t steps gives the position at time t.
2.2 Univariate Generating Functions
Definition:
Let $a_0, a_1, a_2, \ldots$ be a sequence of real numbers. Using a variable $s$, we may define a function:

$$A(s) = a_0 + a_1 s + a_2 s^2 + a_3 s^3 + \cdots = \sum_{k=0}^{\infty} a_k s^k \qquad \text{(i)}$$

If this power series converges in some interval $-s_0 < s < s_0$, then $A(s)$ is called the generating function of the sequence $a_0, a_1, a_2, \ldots$ (i.e., for those values of the parameter $s$ for which the sum converges, $A(s)$ is a generating function).

Remark (see section 1.2.5 for illustration)

Differentiating (i) $k$ times and setting $s = 0$ recovers the coefficient $a_k$, i.e.,

$$a_k = \frac{1}{k!} \left. \frac{d^k A(s)}{ds^k} \right|_{s=0}$$


Example 1

a) If $a_k = 1$ for all $k$, i.e., $\{a_k\} = \{1, 1, 1, \ldots\}$, we have $a_0 = 1, a_1 = 1, a_2 = 1, \ldots$ Then the generating function is given by

$$A(s) = 1 + s + s^2 + s^3 + \cdots = \frac{1}{1-s}, \quad |s| < 1 \quad \text{(Geometric series)}$$

b) Let $a_k = \frac{1}{k!}$; then $a_0 = 1, a_1 = 1, a_2 = \frac{1}{2!}, a_3 = \frac{1}{3!}, \ldots$, therefore the generating function of $\{a_k\}$ is

$$A(s) = 1 + s + \frac{s^2}{2!} + \frac{s^3}{3!} + \cdots = e^s \quad \left(\text{remember } e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots\right)$$

c) Let $a_k = \binom{n}{k}$ for fixed $n$; then

$$A(s) = \sum_{k=0}^{\infty} a_k s^k = \sum_{k=0}^{n} \binom{n}{k} s^k = (1+s)^n \quad \text{(Binomial theorem)}$$
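These expansions, together with the coefficient-extraction formula in the Remark, are easy to verify symbolically. The following is a minimal sketch (assuming Python with the sympy library; it is not part of the original notes):

import sympy as sp

s = sp.symbols('s')
n = 5  # a fixed n for part (c); any non-negative integer works

# Generating functions from Example 1
examples = {'a) geometric': 1 / (1 - s),     # a_k = 1
            'b) exponential': sp.exp(s),     # a_k = 1/k!
            'c) binomial': (1 + s)**n}       # a_k = C(n, k)

# Recover a_k = (1/k!) * d^k A/ds^k evaluated at s = 0
for name, A in examples.items():
    coeffs = [sp.diff(A, s, k).subs(s, 0) / sp.factorial(k) for k in range(4)]
    print(name, coeffs)
# a) [1, 1, 1, 1]   b) [1, 1, 1/2, 1/6]   c) [1, 5, 10, 10]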

2.2.1 Probability Generating Functions


Definition
Suppose that $X$ is a random variable which assumes non-negative integer values $0, 1, 2, \ldots$ and that the probabilities $P(X = k) = p_k$ are such that $\sum_k p_k = 1$. Then the probability generating function $P_X(s)$ of the sequence of probabilities $\{p_k\}$ is given by

$$P_X(s) = p_0 + p_1 s + p_2 s^2 + p_3 s^3 + \cdots = \sum_{k=0}^{\infty} p_k s^k = \sum_{k=0}^{\infty} P(X = k)\, s^k = E(s^X)$$

where $E(s^X)$ is the expectation of the function $s^X$ (itself a random variable) of the random variable $X$. The series $P(s)$ converges at least for $-1 \le s \le 1$ (it converges absolutely for $|s| \le 1$).
Clearly $P_X(1) = 1$. Also $P_X(0) = p_0$.

N/B:
It is possible to have a sequence of probabilities which does not form a p.g.f. E.g., if $p_0 = 0, p_1 = \frac{1}{4}, p_2 = \frac{1}{2}, p_3 = \frac{1}{3}, p_4 = \frac{3}{4}$, then

$$A(s) = p_0 + p_1 s + p_2 s^2 + p_3 s^3 + p_4 s^4 = 0 + \tfrac{1}{4}s + \tfrac{1}{2}s^2 + \tfrac{1}{3}s^3 + \tfrac{3}{4}s^4.$$

$\therefore A(s)$ is a generating function of $\{p_k\}$ but not a pgf, because $p_0 + p_1 + p_2 + p_3 + p_4 \ne 1$.

Determining mean and variance using the p.g.f.

By definition

$$P_X(s) = E(s^X) = \sum_{k=0}^{\infty} p_k s^k$$

$$P'(s) = \sum_{k=0}^{\infty} k p_k s^{k-1}$$

$$P''(s) = \sum_{k=0}^{\infty} k(k-1) p_k s^{k-2}$$

Mean
The expectation $E(X)$ is given by

$$E(X) = P'(s)\Big|_{s=1} = \sum_{k=0}^{\infty} k p_k s^{k-1} \Big|_{s=1}$$

Thus $E(X) = P'(1) = \sum_{k=0}^{\infty} k p_k$.

Variance

$$E[X(X-1)] = P''(1) = \sum_{k=0}^{\infty} k(k-1) p_k$$

$$\operatorname{var}(X) = E(X^2) - [E(X)]^2$$

But

$$E(X^2) = E[X(X-1)] + E(X) = \sum_{k=0}^{\infty} k(k-1) p_k + \sum_{k=0}^{\infty} k p_k = P''(1) + P'(1)$$

Thus $\operatorname{var}(X) = E(X^2) - [E(X)]^2 = P''(1) + P'(1) - [P'(1)]^2$.

The $k$th factorial moment of $X$ is given by

$$E[X(X-1)\cdots(X-k+1)] = \left. \frac{d^k P(s)}{ds^k} \right|_{s=1} \quad \text{for } k = 1, 2, \ldots$$
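To see these identities in action, here is a minimal sketch (assuming Python with sympy; the pmf values are hypothetical, not from the notes) that computes the mean and variance of a small distribution both directly and through its PGF:

import sympy as sp

s = sp.symbols('s')
p = [sp.Rational(1, 5), sp.Rational(1, 2), sp.Rational(3, 10)]  # hypothetical pmf on {0, 1, 2}
P = sum(pk * s**k for k, pk in enumerate(p))                    # P(s) = E(s^X)

mean = sp.diff(P, s).subs(s, 1)                      # E(X) = P'(1)
var = sp.diff(P, s, 2).subs(s, 1) + mean - mean**2   # var = P''(1) + P'(1) - [P'(1)]^2

# Direct computation for comparison
mean_direct = sum(k * pk for k, pk in enumerate(p))
var_direct = sum(k**2 * pk for k, pk in enumerate(p)) - mean_direct**2
print(mean == mean_direct, var == var_direct)        # True True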

2.2.2 The PGF and Moments of Standard Distributions

a) Bernoulli Distribution
Let $X$ have a Bernoulli distribution with parameter $p$; then

$$P(X = k) = p_k = p^k q^{1-k}, \quad q = 1 - p, \quad k = 0, 1$$

Therefore the pgf of $X$ is given by

$$P(s) = \sum_{k=0}^{1} p_k s^k = \sum_{k=0}^{1} p^k q^{1-k} s^k = q + ps$$

Mean of X

$$P'(s) = \sum_{k=0}^{1} k\, p^k q^{1-k} s^{k-1} = p \;\Rightarrow\; E(X) = P'(1) = p$$

Variance of X

$$P''(s) = \sum_{k=0}^{1} k(k-1)\, p^k q^{1-k} s^{k-2} = 0 + 0 = 0 \;\Rightarrow\; P''(1) = 0$$

$$\therefore \operatorname{var}(X) = P''(1) + P'(1) - [P'(1)]^2 = 0 + p - p^2 = p - p^2 = pq$$
b) Binomial Distribution
Let $X$ have a Binomial distribution with parameters $n$ and $p$; then

$$P(X = k) = p_k = \binom{n}{k} p^k q^{n-k}, \quad q = 1 - p, \quad k = 0, 1, 2, \ldots, n$$

The pgf of X is given by

$$P(s) = \sum_{k=0}^{n} p_k s^k = \sum_{k=0}^{n} \binom{n}{k} (ps)^k q^{n-k} = (q + ps)^n$$

Mean of X

$$P'(s) = np(q + ps)^{n-1} \;\Rightarrow\; E(X) = P'(1) = np(q + p)^{n-1} = np$$

Variance of X

$$P''(s) = n(n-1)p^2(q + ps)^{n-2} \;\Rightarrow\; P''(1) = n(n-1)p^2(q + p)^{n-2} = n(n-1)p^2$$

$$\therefore \operatorname{var}(X) = P''(1) + P'(1) - [P'(1)]^2 = n(n-1)p^2 + np - n^2 p^2 = npq$$
c) Geometric Distribution
Let $X$ have a Geometric distribution (the number of failures before the first success); then

$$P(X = k) = p_k = q^k p, \quad q = 1 - p, \quad k = 0, 1, 2, \ldots$$

The pgf of X is given by

$$P(s) = \sum_{k=0}^{\infty} p_k s^k = \sum_{k=0}^{\infty} q^k p\, s^k = p \sum_{k=0}^{\infty} (qs)^k = p\left(1 + qs + (qs)^2 + \cdots\right) = \frac{p}{1 - qs}$$

Mean of X

$$P'(s) = \frac{pq}{(1 - qs)^2} \;\Rightarrow\; E(X) = P'(1) = \frac{pq}{(1 - q)^2} = \frac{q}{p}$$

Variance of X

$$P''(s) = \frac{2pq^2}{(1 - qs)^3} \;\Rightarrow\; P''(1) = \frac{2pq^2}{(1 - q)^3} = \frac{2q^2}{p^2}$$

$$\therefore \operatorname{var}(X) = P''(1) + P'(1) - [P'(1)]^2 = \frac{2q^2}{p^2} + \frac{q}{p} - \frac{q^2}{p^2} = \frac{q^2 + pq}{p^2} = \frac{q}{p^2}$$
d) Poisson Distribution
Let $X$ have a Poisson distribution with parameter $\lambda$; then

$$P(X = k) = p_k = \frac{e^{-\lambda} \lambda^k}{k!}, \quad k = 0, 1, 2, \ldots$$

The pgf of X is given by

$$P(s) = \sum_{k=0}^{\infty} p_k s^k = \sum_{k=0}^{\infty} \frac{e^{-\lambda} \lambda^k}{k!} s^k = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(\lambda s)^k}{k!} = e^{-\lambda}\left[1 + \lambda s + \frac{(\lambda s)^2}{2!} + \cdots\right] = e^{-\lambda} e^{\lambda s} = e^{-\lambda(1-s)}$$

Mean of X

$$P'(s) = \lambda e^{-\lambda(1-s)} \;\Rightarrow\; E(X) = P'(1) = \lambda e^0 = \lambda$$

Variance of X

$$P''(s) = \lambda^2 e^{-\lambda(1-s)} \;\Rightarrow\; P''(1) = \lambda^2 e^0 = \lambda^2$$

$$\therefore \operatorname{var}(X) = P''(1) + P'(1) - [P'(1)]^2 = \lambda^2 + \lambda - \lambda^2 = \lambda$$
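The Geometric and Poisson results above can be confirmed symbolically. A brief sketch (assuming Python with sympy; not part of the original notes):

import sympy as sp

s, p, lam = sp.symbols('s p lambda', positive=True)
q = 1 - p

for P in [p / (1 - q*s), sp.exp(-lam*(1 - s))]:      # Geometric and Poisson PGFs
    mean = sp.simplify(sp.diff(P, s).subs(s, 1))
    var = sp.simplify(sp.diff(P, s, 2).subs(s, 1) + mean - mean**2)
    print(mean, var)
# Geometric: (1 - p)/p and (1 - p)/p**2, i.e. q/p and q/p^2
# Poisson:   lambda and lambda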


2.2.3 P.G.F's of Sums of Independent Random Variables

Case 1
We can get the generating function of the sum of independent random variables $S_n = X_1 + X_2 + \cdots + X_n$, where $n$ is a fixed number, using two methods:
Method 1: Expectation method:
Let $X$ and $Y$ be two independent random variables, and let $Z = X + Y$. If $A(s)$, $B(s)$ and $C(s)$ are the pgf's of $X$, $Y$ and $Z$ respectively, then by definition

$$C(s) = E(s^Z) = E(s^{X+Y}) = E(s^X) E(s^Y) \quad \text{(by independence)} = A(s) B(s)$$

Suppose $Z = X - Y$; then

$$C(s) = E(s^Z) = E(s^{X-Y}) = E(s^X) E(s^{-Y}) = E(s^X)\, E\!\left[\left(\frac{1}{s}\right)^{Y}\right] = A(s)\, B\!\left(\frac{1}{s}\right)$$

This means that the p.g.f. of the sum of two independent random variables $X$ and $Y$ is the product of the p.g.f.'s of $X$ and $Y$.

Corollary

If $X_1, \ldots, X_n$ are independent count r.v.'s with PGFs $P_{X_1}(s), \ldots, P_{X_n}(s)$ respectively (and $n$ is a known integer), then $P_{X_1 + \cdots + X_n}(s) = P_{X_1}(s) \cdots P_{X_n}(s)$.

If the random variables $X_1, \ldots, X_n$ are also identically distributed, each with pgf $P(s)$, then the p.g.f. of $S_n = X_1 + \cdots + X_n$ is $[P(s)]^n$.
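Numerically, multiplying PGFs amounts to convolving pmf's, since the coefficients of a product of polynomials are the convolution of the factors' coefficients. A small illustration (assuming Python with numpy; the pmf values are hypothetical, not from the notes):

import numpy as np

# pmf's of two independent count r.v.'s X and Y
px = np.array([0.2, 0.5, 0.3])   # P(X = 0), P(X = 1), P(X = 2)
py = np.array([0.6, 0.4])        # P(Y = 0), P(Y = 1)

# Multiplying the PGF polynomials = convolving the coefficient sequences,
# so this yields the pmf of Z = X + Y
pz = np.convolve(px, py)
print(pz)         # [0.12 0.38 0.38 0.12]
print(pz.sum())   # 1.0, a valid pmf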

Example 2
a) Poisson Distribution
Let X and Y be independent Poisson variates with parameters $\lambda$ and $\mu$ respectively, so that their pgf's are

$$e^{-\lambda(1-s)} \quad \text{and} \quad e^{-\mu(1-s)}$$

Then the pgf of $Z = X + Y$ is given by

$$G_Z(s) = G_X(s) G_Y(s) = e^{-\lambda(1-s)} e^{-\mu(1-s)} = e^{-(\lambda + \mu)(1-s)}$$

Therefore the sum of two Poisson variables with parameters $\lambda$ and $\mu$ is a Poisson variable with parameter $\lambda + \mu$.

b) Bernoulli Distribution

Let $X_1, X_2, \ldots, X_n$ be independent Bernoulli trials, each with success parameter $p$ ($0 < p < 1$), and let $Y = X_1 + X_2 + \cdots + X_n$. The probability of obtaining exactly $k$ successes (and $n - k$ failures) in all is:

$$P(Y = k) = p_k = \binom{n}{k} p^k (1-p)^{n-k}$$

$$\therefore G_Y(s) = \sum_{k=0}^{n} p_k s^k = \sum_{k=0}^{n} \binom{n}{k} p^k (1-p)^{n-k} s^k = \left[(1-p) + ps\right]^n = (q + ps)^n$$

Thus the sum of $n$ independent Bernoulli variables is a binomial variable with parameters $n$ and $p$.

c) Find the distribution of the sum of $n$ independent r.v.'s $X_i$, $i = 1, \ldots, n$, where $X_i \sim \text{Poisson}(\lambda_i)$.

Solution

$$G_{X_i}(s) = e^{-\lambda_i(1-s)}$$

So

$$G_{X_1 + X_2 + \cdots + X_n}(s) = \prod_{i=1}^{n} e^{-\lambda_i(1-s)} = e^{-(\lambda_1 + \cdots + \lambda_n)(1-s)}$$

This is the pgf of a Poisson r.v., i.e. $\sum_{i=1}^{n} X_i \sim \text{Poisson}\left(\sum_{i=1}^{n} \lambda_i\right)$.

Remark (random number of summands)
If instead the number of summands $N$ is itself a random variable with pgf $G(s)$, independent of the iid summands $X_i$ with common pgf $P(s)$, then the pgf of the random sum $S_N = X_1 + \cdots + X_N$ is the compound function $H(s) = G(P(s))$. For example, if $N \sim \text{Poisson}(\lambda)$ then $G(s) = e^{-\lambda(1-s)}$, and $H(s) = G(P(s)) = e^{-\lambda(1 - P(s))}$.

Note that the mean of such a random sum is

$$E(S_N) = E(X_i)\, E(N) = E(X_i)\, \lambda = \lambda\, E(X_i)$$
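A sketch of the compound (random-sum) pgf and its mean (assuming Python with sympy; the parameter names are hypothetical, not from the notes):

import sympy as sp

s, lam, mu = sp.symbols('s lambda mu', positive=True)

P = sp.exp(-mu * (1 - s))    # common pgf of each summand X_i ~ Poisson(mu)
G = sp.exp(-lam * (1 - s))   # pgf of the random count N ~ Poisson(lam)
H = G.subs(s, P)             # compound pgf H(s) = G(P(s))

# Mean of the random sum: H'(1) = E(N) * E(X_i)
print(sp.simplify(sp.diff(H, s).subs(s, 1)))   # lambda*mu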

2.2.4 Cumulative (or Tail) Probabilities

Upper Tail
Let $X$ have pmf $P(X = k) = p_k$, $k = 0, 1, 2, \ldots$, with pgf $P(s) = \sum_{k=0}^{\infty} p_k s^k$, and let

$$q_k = P(X > k) = p_{k+1} + p_{k+2} + \cdots, \quad k = 0, 1, 2, \ldots$$

The generating function of the sequence $\{q_k\}$ is

$$\phi(s) = \sum_{k=0}^{\infty} q_k s^k$$

Expanding,

$$\phi(s) = q_0 + q_1 s + q_2 s^2 + q_3 s^3 + \cdots$$
$$= (p_1 + p_2 + \cdots) + (p_2 + p_3 + \cdots)s + (p_3 + p_4 + \cdots)s^2 + \cdots$$
$$= p_1 + p_2(1 + s) + p_3(1 + s + s^2) + p_4(1 + s + s^2 + s^3) + \cdots$$
$$= \frac{p_1(1 - s)}{1 - s} + \frac{p_2(1 - s^2)}{1 - s} + \frac{p_3(1 - s^3)}{1 - s} + \frac{p_4(1 - s^4)}{1 - s} + \cdots$$
$$= \frac{(p_1 + p_2 + p_3 + p_4 + \cdots) - (p_1 s + p_2 s^2 + p_3 s^3 + p_4 s^4 + \cdots)}{1 - s}$$
$$= \frac{(p_0 + p_1 + p_2 + p_3 + \cdots) - (p_0 + p_1 s + p_2 s^2 + p_3 s^3 + \cdots)}{1 - s}$$
$$= \frac{1 - P(s)}{1 - s}, \quad -1 < s < 1$$
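As a check (a sketch assuming Python with sympy, not from the original notes), expanding $(1 - P(s))/(1 - s)$ for the Geometric pgf $P(s) = p/(1 - qs)$ should reproduce its known tail probabilities $P(X > k) = q^{k+1}$:

import sympy as sp

s = sp.symbols('s')
p = sp.Rational(1, 3); q = 1 - p   # a hypothetical parameter value

P = p / (1 - q*s)                  # Geometric pgf
phi = (1 - P) / (1 - s)            # tail generating function

taylor = sp.series(phi, s, 0, 4).removeO()
print([taylor.coeff(s, k) for k in range(4)])
# [2/3, 4/9, 8/27, 16/81], i.e. q**(k+1) as expected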

2.2.5 Using the Probability Generating Function to Calculate Probabilities

The probability generating function gets its name because the power series can be expanded and differentiated to reveal the individual probabilities. Thus, given only the PGF $G_X(s) = E(s^X)$, we can find all probabilities $P(X = x)$, which may be written as $p_x = P(X = x)$.
Then

$$G_X(s) = E(s^X) = \sum_x p_x s^x = p_0 + p_1 s + p_2 s^2 + p_3 s^3 + p_4 s^4 + \cdots$$

Thus $p_0 = P(X = 0) = G_X(0)$.

First derivative: $G_X'(s) = p_1 + 2p_2 s + 3p_3 s^2 + 4p_4 s^3 + \cdots$
Thus $p_1 = P(X = 1) = G_X'(0)$.
Second derivative: $G_X''(s) = 2p_2 + (3 \times 2)p_3 s + (4 \times 3)p_4 s^2 + \cdots$
Thus $p_2 = P(X = 2) = \frac{1}{2} G_X''(0)$.
Third derivative: $G_X'''(s) = (3 \times 2 \times 1)p_3 + (4 \times 3 \times 2)p_4 s + \cdots$
Thus $p_3 = P(X = 3) = \frac{1}{3!} G_X'''(0)$.

In general:

$$p_n = P(X = n) = \frac{1}{n!} G_X^{(n)}(0) = \frac{1}{n!} \left. \frac{d^n}{ds^n} G_X(s) \right|_{s=0}$$

Example 3
Let X be a discrete random variable with PGF

$$G_X(s) = \frac{s}{5}(2 + 3s^2).$$

Find the distribution of $X$.

Solution
Expanding, $G_X(s) = \frac{2}{5}s + \frac{3}{5}s^3$, so

$$X = \begin{cases} 1 & \text{with probability } 2/5, \\ 3 & \text{with probability } 3/5. \end{cases}$$
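The general formula can also be applied mechanically. A sketch (assuming Python with sympy; not part of the original notes) recovering the distribution in Example 3:

import sympy as sp

s = sp.symbols('s')
G = s / 5 * (2 + 3*s**2)   # PGF from Example 3

# p_n = (1/n!) * G^(n)(0)
probs = [sp.diff(G, s, n).subs(s, 0) / sp.factorial(n) for n in range(4)]
print(probs)               # [0, 2/5, 0, 3/5]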

2.2.6 Special Sequence


2.2.7 Summary: Properties of the PGF

a) Definition:
$G_X(s) = E(s^X)$
b) Used for:
Discrete random variables with values $0, 1, 2, \ldots$
c) Moments:
$E(X) = G_X'(1)$, $\quad E[X(X-1)\cdots(X-k+1)] = G_X^{(k)}(1)$
d) Probabilities:
$P(X = n) = \frac{1}{n!} G_X^{(n)}(0)$
e) Sums:
$G_{X+Y}(s) = G_X(s)\, G_Y(s)$ for independent $X$, $Y$

2.3 Bivariate Generating Functions

Suppose X and Y are integer-valued random variables with joint probability distribution (j.p.d.)

$$P(X = j, Y = k) = p_{jk}, \quad j = 0, 1, 2, \ldots, \quad k = 0, 1, 2, \ldots$$

and

$$\sum_j \sum_k p_{jk} = 1$$

Then the joint pgf of X and Y is given by

$$P(s_1, s_2) = \sum_j \sum_k p_{jk}\, s_1^j s_2^k$$

assuming convergence for some values of $s_1$ and $s_2$, i.e., $|s_1| \le 1$, $|s_2| \le 1$. Equivalently,

$$P(s_1, s_2) = E\left(s_1^X s_2^Y\right) \qquad \text{(1)}$$

If $s_1 = s_2 = s$, then (1) becomes

$$P(s, s) = E\left(s^X s^Y\right) = E\left(s^{X+Y}\right)$$

If X and Y are mutually independent random variables, then $P(s_1, s_2)$ factorizes:

$$P(s_1, s_2) = E\left(s_1^X\right) E\left(s_2^Y\right)$$

If X and Y are independently and identically distributed (iid) with common pgf $g(s)$, then we have:

$$P(s_1, s_2) = E\left(s_1^X\right) E\left(s_2^Y\right) = g(s_1)\, g(s_2), \quad \text{so in particular } P(s, s) = [g(s)]^2$$

The pgf of X is given by

$$P_X(s_1) = E\left(s_1^X\right) = P(s_1, 1)$$

Similarly the pgf of Y is given by

$$P_Y(s_2) = E\left(s_2^Y\right) = P(1, s_2)$$

Now $P_X(s_1) = P(s_1, 1)$, thus

$$P_X'(s_1) = \frac{\partial P(s_1, 1)}{\partial s_1} \;\Rightarrow\; E(X) = P_X'(1) = \left. \frac{\partial P(s_1, 1)}{\partial s_1} \right|_{s_1 = 1}$$

And

$$P_Y'(s_2) = \frac{\partial P(1, s_2)}{\partial s_2} \;\Rightarrow\; E(Y) = P_Y'(1) = \left. \frac{\partial P(1, s_2)}{\partial s_2} \right|_{s_2 = 1}$$

The second factorial moments are:

$$E[X(X-1)] = \left. \frac{\partial^2 P(s_1, 1)}{\partial s_1^2} \right|_{s_1 = 1} \quad \text{and} \quad E[Y(Y-1)] = \left. \frac{\partial^2 P(1, s_2)}{\partial s_2^2} \right|_{s_2 = 1}$$


Thus

$$E(XY) = \left. \frac{\partial^2 P(s_1, s_2)}{\partial s_1 \partial s_2} \right|_{s_1 = s_2 = 1}$$

$$\sigma_X^2 = E[X(X-1)] + E(X) - [E(X)]^2$$

$$\sigma_Y^2 = E[Y(Y-1)] + E(Y) - [E(Y)]^2$$

$$\operatorname{cov}(X, Y) = \sigma_{XY} = E(XY) - E(X)E(Y)$$

$$\rho_{XY} = \frac{\operatorname{cov}(X, Y)}{\sigma_X \sigma_Y}$$

Example 4
Given that the joint distribution of X and Y is

$$p_{jk} = P(X = j, Y = k) = q^{j+k} p^2, \quad j = 0, 1, 2, \ldots, \quad k = 0, 1, 2, \ldots, \quad p + q = 1$$

The bivariate pgf is

$$P(s_1, s_2) = \sum_j \sum_k q^{j+k} p^2 s_1^j s_2^k = p^2 \sum_j (q s_1)^j \sum_k (q s_2)^k = p^2 \cdot \frac{1}{1 - qs_1} \cdot \frac{1}{1 - qs_2} = \frac{p^2}{(1 - qs_1)(1 - qs_2)}$$

The pgf of X is

$$P_X(s_1) = P(s_1, 1) = \frac{p^2}{(1 - qs_1)(1 - q)} = \frac{p^2}{(1 - qs_1)p} = \frac{p}{1 - qs_1}$$

The pgf of Y is

$$P_Y(s_2) = P(1, s_2) = \frac{p^2}{(1 - q)(1 - qs_2)} = \frac{p^2}{(1 - qs_2)p} = \frac{p}{1 - qs_2}$$

$$\therefore P_X'(s_1) = \frac{pq}{(1 - qs_1)^2} \;\Rightarrow\; E(X) = P_X'(1) = \frac{pq}{(1 - q)^2} = \frac{q}{p}$$

Similarly

$$P_Y'(s_2) = \frac{pq}{(1 - qs_2)^2} \;\Rightarrow\; E(Y) = P_Y'(1) = \frac{pq}{(1 - q)^2} = \frac{q}{p}$$

Second moments

$$P_X''(s_1) = \frac{2pq^2}{(1 - qs_1)^3} \;\Rightarrow\; E[X(X-1)] = P_X''(1) = \frac{2pq^2}{(1 - q)^3} = \frac{2q^2}{p^2}$$

Similarly

$$P_Y''(s_2) = \frac{2pq^2}{(1 - qs_2)^3} \;\Rightarrow\; E[Y(Y-1)] = P_Y''(1) = \frac{2pq^2}{(1 - q)^3} = \frac{2q^2}{p^2}$$

Thus

$$\sigma_X^2 = \frac{2q^2}{p^2} + \frac{q}{p} - \frac{q^2}{p^2} = \frac{q^2 + qp}{p^2} = \frac{q(q + p)}{p^2} = \frac{q}{p^2}$$

Similarly $\sigma_Y^2 = \frac{q}{p^2}$.

Further, differentiating $P(s_1, s_2) = \frac{p^2}{(1 - qs_1)(1 - qs_2)}$ we get

$$\frac{\partial P(s_1, s_2)}{\partial s_1} = \frac{p^2 q}{(1 - qs_1)^2 (1 - qs_2)}$$

$$\frac{\partial^2 P(s_1, s_2)}{\partial s_1 \partial s_2} = \frac{p^2 q^2}{(1 - qs_1)^2 (1 - qs_2)^2}$$

Thus

$$E(XY) = \left. \frac{\partial^2 P(s_1, s_2)}{\partial s_1 \partial s_2} \right|_{s_1 = s_2 = 1} = \frac{p^2 q^2}{(1 - q)^2 (1 - q)^2} = \frac{q^2}{p^2}$$

$$\operatorname{cov}(X, Y) = \sigma_{XY} = E(XY) - E(X)E(Y) = \frac{q^2}{p^2} - \frac{q}{p} \times \frac{q}{p} = 0$$

Hence $\rho_{XY} = \frac{\operatorname{cov}(X, Y)}{\sigma_X \sigma_Y} = 0$; indeed, since $p_{jk} = (q^j p)(q^k p)$ factorizes, $X$ and $Y$ are independent and therefore necessarily uncorrelated.
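A symbolic check of Example 4 (a sketch assuming Python with sympy; not part of the original notes):

import sympy as sp

s1, s2, p = sp.symbols('s1 s2 p', positive=True)
q = 1 - p

P = p**2 / ((1 - q*s1) * (1 - q*s2))   # bivariate pgf from Example 4

EX = sp.diff(P, s1).subs({s1: 1, s2: 1})
EY = sp.diff(P, s2).subs({s1: 1, s2: 1})
EXY = sp.diff(P, s1, s2).subs({s1: 1, s2: 1})

cov = sp.simplify(EXY - EX * EY)
print(cov)   # 0, confirming that X and Y are uncorrelated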

2.4 Competence Statements


Exercise
1. Let $X$ have a Bernoulli distribution with parameter $p$ such that

$$p_k = P(X = k) = p^k q^{1-k}, \quad q = 1 - p, \quad k = 0, 1$$

Obtain the probability generating function of $X$. Hence determine the mean and variance of $X$ using the probability generating function.
2. Define the term probability generating function.
3. Find the generating function for the sequence $\{0, 0, 0, 3, 3, 3, \ldots\}$.
4. The probability generating function of a discrete random variable $X$ is given by $G_X(t) = k(1 + t^2)$. Find the value of $k$ and hence write down the probability distribution of $X$.
5. Given that $a_n = n$ for $n = 1, 2, 3, \ldots$, show that the generating function for the sequence $\{a_n\}$ is given by $\frac{s}{(1 - s)^2}$.

Solution
If $a_k = k$, then $a_1 = 1, a_2 = 2, a_3 = 3, \ldots$; therefore the generating function of $\{a_k\}$ is

$$A(s) = s + 2s^2 + 3s^3 + 4s^4 + \cdots = s(1 + 2s + 3s^2 + 4s^3 + \cdots)$$

Define $f(s) = 1 + s + s^2 + s^3 + s^4 + \cdots = \frac{1}{1 - s}$

Then $f'(s) = 1 + 2s + 3s^2 + 4s^3 + \cdots = \frac{1}{(1 - s)^2}$

Thus $A(s) = s(1 + 2s + 3s^2 + 4s^3 + \cdots) = s\, f'(s) = \frac{s}{(1 - s)^2}$
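A quick verification of this result (a sketch assuming Python with sympy; not part of the original notes):

import sympy as sp

s = sp.symbols('s')
A = s / (1 - s)**2

taylor = sp.series(A, s, 0, 6).removeO()
print([taylor.coeff(s, n) for n in range(6)])   # [0, 1, 2, 3, 4, 5], i.e. a_n = n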

