Math5846 Chapter9
UNSW Sydney
OPEN LEARNING
Chapter 9
Branching Processes
Outline:
9.1 Introduction
9.2 What is the Probability of Extinction?
9.3 Calculation of Probability of Extinction
9.4 Probability Generating Functions
9.5 Supplementary Material
9.1 Introduction
Branching processes are used in the biological, sociological, and engineering
sciences.
Suppose that each individual in the population produces a random number Z of
offspring according to
P(Z = j) = Pj ,  j = 0, 1, 2, . . . ,   (1)
where Pj ≥ 0 and Σj Pj = 1.
Suppose that all offspring act independently of each other and produce their
offspring according to the probability law given by Equation (1).
Notation: In this chapter and the next, we will use ℙ instead of P for the
probability measure, to avoid confusion with the notation Pj = ℙ(Z = j).
The notation ℙ is used in advanced courses.
Definition
Consider a population consisting of the organisms described above. The
number of individuals initially present, denoted by X0, is the size of the
zeroth generation.
All offspring of the zeroth generation constitute the first generation, and
their number is denoted by X1. In general, the size of the nth generation is
denoted by Xn, and
Xn = Σ_{i=1}^{Xn−1} Zi ,   (2)
where Zi is the number of offspring of the ith individual of the (n − 1)st
generation. The Markov chain {Xn , n = 0, 1, 2, . . . } is called a
branching process.
State 0 is an absorbing and thus recurrent state, since once the population
size reaches 0 it stays at 0 forever.
9.2 What is the Probability of Extinction?
Population extinctions were first raised in 1889 by Galton in connection
with the extinction of family surnames.
Let µ = Σj j Pj be the mean number of offspring of a single individual,
and let σ^2 = Σj (j − µ)^2 Pj be the variance of the offspring distribution.
Result
From Equation (2), we see that
E(Xn) = µ^n ,
and
Var(Xn) = σ^2 µ^(n−1) (µ^n − 1)/(µ − 1)   if µ ≠ 1,
Var(Xn) = n σ^2                           if µ = 1.
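The result E(Xn) = µ^n can be checked by simulation. Below is a minimal Python sketch (the function name and the illustrative offspring distribution P0 = 1/4, P1 = 1/4, P2 = 1/2, with µ = 5/4, are our own choices): it simulates many branching processes and compares the sample mean of X6 with µ^6.

```python
import random

def simulate_branching(pmf, n, x0=1, rng=None):
    """Return [X_0, X_1, ..., X_n] for a branching process whose offspring
    distribution is given as a dict mapping j to P_j."""
    rng = rng or random.Random()
    js, ps = zip(*sorted(pmf.items()))
    sizes = [x0]
    for _ in range(n):
        # each individual of the current generation reproduces independently
        sizes.append(sum(rng.choices(js, weights=ps, k=sizes[-1])))
    return sizes

# Offspring pmf P0 = 1/4, P1 = 1/4, P2 = 1/2, so mu = 0.25 + 2 * 0.5 = 1.25.
pmf = {0: 0.25, 1: 0.25, 2: 0.5}
mu, n, reps = 1.25, 6, 20000
rng = random.Random(0)
avg = sum(simulate_branching(pmf, n, rng=rng)[-1] for _ in range(reps)) / reps
print(avg, mu ** n)  # sample mean of X_6 vs mu^6; they should be close
```

With 20 000 replicates the Monte Carlo error is a few percent at most, so the sample mean lands close to µ^6 ≈ 3.81.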
Proof
By conditioning on Xn−1, we have
E(Xn) = E[ E(Xn | Xn−1) ]
      = E[ E( Σ_{i=1}^{Xn−1} Zi | Xn−1 ) ]
      = E(Xn−1 µ)
      = µ E(Xn−1).
Since X0 = 1, iterating gives
E(X1) = µ,
E(X2) = µ E(X1) = µ^2,
E(X3) = µ E(X2) = µ^3,
⋮
E(Xn) = µ E(Xn−1) = µ^n.
Proof - continued
Similarly, we can find Var(Xn) by the conditional variance formula:
Var(Xn) = E[ Var(Xn | Xn−1) ] + Var( E(Xn | Xn−1) )
        = σ^2 E(Xn−1) + µ^2 Var(Xn−1)
        = σ^2 µ^(n−1) + µ^2 Var(Xn−1).
Since Var(X0) = Var(1) = 0, solving this recursion by mathematical
induction gives the stated formula.
Therefore, the variance of the population size increases geometrically if µ > 1,
increases linearly if µ = 1, and decreases geometrically if µ < 1.
9.3 Calculation of Probability of Extinction
The population becomes extinct if the population size is reduced to zero.
Let π0 denote the probability that the population will eventually die out,
under the assumption that X0 = 1. Then
π0 = lim_{n→∞} P(Xn = 0 | X0 = 1).
When µ > 1, it turns out that π0 < 1.
Given that X1 = j, the population will eventually die out if and only if each
of the j families started by members of the first generation eventually dies
out.
Since each family is assumed to act independently, and since the probability
that any particular family dies out is just π0, this yields
P(population eventually dies out | X1 = j) = π0^j.
Thus, conditioning on the size of the first generation, π0 satisfies
π0 = Σ_{j=0}^∞ π0^j Pj .   (3)
In fact, when µ > 1, it can be shown that π0 is the smallest positive number
satisfying Equation (3).
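Equation (3) says that π0 is a fixed point of the map s ↦ Σj s^j Pj. Iterating this map from s = 0 produces P(Xn = 0), which increases to π0, so the smallest positive root can be computed by fixed-point iteration. A minimal Python sketch (the function name is ours):

```python
def extinction_probability(pmf, tol=1e-12, max_iter=100000):
    """Smallest positive root of s = sum_j P_j s^j, found by iterating the
    map from s = 0; the iterates are P(X_n = 0), which increase to pi_0."""
    def G(s):
        return sum(p * s ** j for j, p in pmf.items())
    s = 0.0
    for _ in range(max_iter):
        s_next = G(s)
        if abs(s_next - s) < tol:
            return s_next
        s = s_next
    return s

# mu = 5/4 > 1, so pi_0 < 1:
print(extinction_probability({0: 0.25, 1: 0.25, 2: 0.5}))   # close to 0.5
# mu = 3/4 <= 1, so pi_0 = 1:
print(extinction_probability({0: 0.5, 1: 0.25, 2: 0.25}))   # close to 1.0
```

The iteration converges monotonically from below, which is exactly why it picks out the smallest positive root rather than the root at 1 in the supercritical case.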
Example
If P0 = 1/2, P1 = 1/4 and P2 = 1/4, then determine π0.
Solution:
Using Equation (3), we have
π0 = 1/2 + (1/4) π0 + (1/4) π0^2,
which rearranges to π0^2 − 3 π0 + 2 = 0, i.e. (π0 − 1)(π0 − 2) = 0, with
roots 1 and 2. The smallest positive root is π0 = 1; indeed
µ = 1/4 + 2(1/4) = 3/4 ≤ 1, so extinction is certain.
Example
If P0 = 1/4, P1 = 1/4 and P2 = 1/2, then determine π0.
Solution:
Using Equation (3), we have
π0 = 1/4 + (1/4) π0 + (1/2) π0^2,
which rearranges to 2 π0^2 − 3 π0 + 1 = 0, i.e. (2 π0 − 1)(π0 − 1) = 0, with
roots 1/2 and 1. Since µ = 1/4 + 2(1/2) = 5/4 > 1, π0 is the smallest
positive root: π0 = 1/2.
Example
In the previous two examples, what is the probability that the population
will die out if it initially consists of k individuals?
Solution:
Recall that the population dies out if and only if each of the k families
started by the initial individuals dies out. Since the families act
independently, the desired probability is π0^k.
For the example with P0 = 1/2, P1 = 1/4 and P2 = 1/4, it is π0^k = 1^k = 1.
For the example with P0 = 1/4, P1 = 1/4 and P2 = 1/2, it is π0^k = (1/2)^k.
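We can sanity-check π0^k by Monte Carlo for the second example (π0 = 1/2). The sketch below truncates each run at 40 generations and treats populations above a size cap as surviving; both shortcuts introduce only negligible error here. Helper names are ours.

```python
import random

def goes_extinct(pmf, k, rng, max_gen=40, cap=300):
    """Simulate one population started from k individuals; return True if it
    dies out within max_gen generations (size > cap is treated as survival,
    since extinction from such a large population is vanishingly unlikely)."""
    js, ps = zip(*sorted(pmf.items()))
    size = k
    for _ in range(max_gen):
        if size == 0:
            return True
        if size > cap:
            return False
        size = sum(rng.choices(js, weights=ps, k=size))
    return size == 0

pmf = {0: 0.25, 1: 0.25, 2: 0.5}   # the example with pi_0 = 1/2
rng = random.Random(1)
k = 3
est = sum(goes_extinct(pmf, k, rng) for _ in range(2000)) / 2000
print(est, 0.5 ** k)  # estimated extinction probability vs pi_0^k = 0.125
```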
9.4 Probability Generating Functions
Probability generating functions are very useful tools when studying
branching processes.
Definition
The probability generating function (pgf) of a non-negative
integer-valued random variable X is defined by
GX(s) = E(s^X) = Σ_{j=0}^∞ s^j P(X = j),   0 ≤ s ≤ 1.
Properties
➊ GX(1) = 1,
➋ GX^(j)(0)/j! = P(X = j), and
➌ GX^(j)(1) = E( X(X − 1) · · · (X − j + 1) ), the jth factorial moment of X.
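For a finite pmf the pgf is a polynomial, so Properties ➊-➌ can be verified directly. A small Python sketch (function names ours), using the pmf P(X = 0) = 1/4, P(X = 1) = 1/4, P(X = 2) = 1/2:

```python
def pgf_eval(coeffs, s):
    """Evaluate G_X(s) = sum_j s^j P(X = j) for coefficients [P_0, P_1, ...]."""
    return sum(p * s ** j for j, p in enumerate(coeffs))

def pgf_derivative(coeffs):
    """Coefficient list of the derivative of the polynomial pgf."""
    return [j * p for j, p in enumerate(coeffs)][1:]

pmf = [0.25, 0.25, 0.5]            # P(X=0), P(X=1), P(X=2)
d1 = pgf_derivative(pmf)           # G_X'
d2 = pgf_derivative(d1)            # G_X''
print(pgf_eval(pmf, 1.0))          # property 1: G_X(1) = 1
print(pgf_eval(d1, 1.0))           # property 3, j=1: G_X'(1) = E(X) = 1.25
print(pgf_eval(d2, 1.0))           # property 3, j=2: E(X(X-1)) = 1.0
print(pgf_eval(d2, 0.0) / 2)       # property 2, j=2: G_X''(0)/2! = P(X=2)
```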
Properties - continued
➍ Suppose that X = Σ_{i=1}^N Xi, a compound random variable, or a random
sum, with N a positive integer-valued random variable and
{Xi , i = 1, 2, . . . } independent identically distributed random
variables that are also independent of N. Then,
GX(s) = GN( GX1(s) ).
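Property ➍ can be illustrated numerically. In the sketch below we choose, purely for illustration, N uniform on {1, 2, 3} and Xi ∼ Bernoulli(0.4), and compare a Monte Carlo estimate of E(s^X) with the composition GN(GX1(s)):

```python
import random

def G_X1(s, p=0.4):
    """pgf of X_i ~ Bernoulli(p)."""
    return 1 - p + p * s

def G_N(s):
    """pgf of N, here uniform on {1, 2, 3}."""
    return (s + s ** 2 + s ** 3) / 3

s = 0.7
composed = G_N(G_X1(s))            # claimed value of E(s^X)
rng = random.Random(2)
reps = 100000
acc = 0.0
for _ in range(reps):
    n = rng.choice([1, 2, 3])                      # draw N
    x = sum(rng.random() < 0.4 for _ in range(n))  # draw X = X_1 + ... + X_N
    acc += s ** x
est = acc / reps
print(est, composed)  # Monte Carlo estimate of E(s^X) vs G_N(G_X1(s))
```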
The probability generating function of the branching process Xn is
Gn(s) ≡ E(s^Xn) = Σ_{j=0}^∞ s^j P(Xn = j),   n = 1, 2, 3, . . . ,  0 ≤ s ≤ 1.
Since Xn is a random sum of Xn−1 independent copies of Z, the pgfs satisfy
the composition rule Gn(s) = Gn−1( G1(s) ).
Write
π0 = lim_{n→∞} P(Xn = 0 | X0 = 1)
as the probability that the population will eventually die out; we also call
it the ultimate probability that the population will die out.
Denote π0(k) = lim_{n→∞} P(Xn = 0 | X0 = k); then it can easily be shown that
π0(k) = π0^k.
The following theorem shows the properties of π0. It also provides a method
for finding π0.
The proof of this theorem can be found in Ross, S.M. (1970), Applied Probability
Models with Optimization Applications, Dover Publications, New York, page 77.
Theorem
Suppose P0 > 0 and P0 + P1 < 1. Then
➊ π0 is the smallest positive number p satisfying G1(p) = p, and
➋ π0 = 1 if and only if µ ≤ 1.
Example
Let G1(s) = 1 − p + p s, where 0 < p < 1 (this is the pgf of a Bernoulli(p)
offspring distribution). Find the distribution of Xn, and the distribution
of the time of extinction T.
Example
Solution:
➊ Given G1(s) = 1 − p + p s, we obtain by induction
Gn(s) = G1( Gn−1(s) ) = 1 − p^n + p^n s.
Hence, Xn ∼ Bernoulli(p^n).
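The closed form Gn(s) = 1 − p^n + p^n s can be double-checked by composing G1 numerically; the values of p, n and s below are arbitrary:

```python
def compose_pgf(G, n, s):
    """Evaluate G_n(s) = G(G(...G(s)...)), the n-fold composition."""
    for _ in range(n):
        s = G(s)
    return s

p, n, s = 0.6, 5, 0.3   # arbitrary test values

def G(t):
    """Bernoulli(p) offspring pgf."""
    return 1 - p + p * t

print(compose_pgf(G, n, s), 1 - p ** n + p ** n * s)  # the two should agree
```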
Example
Solution - continued:
➋ Let T be the random time of extinction. It is the first time n at which
Xn = 0, and clearly Xn = 0 for all n ≥ T. That is,
T = min{n ≥ 0 : Xn = 0}.
We have
P(Xn = 0) = P(T ≤ n).
Example
Solution - continued:
➋ Let the initial population size X0 = 1. Then, since Xn ∼ Bernoulli(p^n),
P(T ≤ n) = P(Xn = 0) = 1 − p^n,
so for n ≥ 1,
P(T = n) = P(T ≤ n) − P(T ≤ n − 1) = p^(n−1) − p^n = p^(n−1)(1 − p).
Example
Solution - continued:
➋ Suppose now the initial population size X0 = k. Since the k families die
out independently, we can determine the probability distribution of the time
to extinction:
P(T ≤ n) = P(Xn = 0 | X0 = k) = (1 − p^n)^k,
so for n ≥ 1,
P(T = n) = (1 − p^n)^k − (1 − p^(n−1))^k.
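As a sanity check, these probabilities should sum to 1, since µ = p < 1 makes extinction certain. A short sketch (parameter values arbitrary):

```python
p, k = 0.6, 3   # arbitrary offspring parameter and initial population size

def prob_T_equals(n):
    """P(T = n) when X_0 = k and the offspring distribution is Bernoulli(p)."""
    return (1 - p ** n) ** k - (1 - p ** (n - 1)) ** k

# The sum telescopes to (1 - p^N)^k, which tends to 1 as N grows.
total = sum(prob_T_equals(n) for n in range(1, 200))
print(total)  # should be very close to 1
```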
Example
Consider a branching process with Z following the geometric distribution
P(Z = j) = (1 − p)^j p,  j = 0, 1, 2, . . . .
Find Gn(s) and the distribution of the time of extinction T.
Example
Solution:
Let q = 1 − p. First, we need to find the generating function of Z, i.e.,
G(s) = E(s^Z) = Σ_{j=0}^∞ s^j P(Z = j)
     = Σ_{j=0}^∞ s^j q^j p = p Σ_{j=0}^∞ (sq)^j
     = p · 1/(1 − sq)   for all s such that |sq| < 1
     = p/(1 − sq)   for |s| < 1/q.
Example
Solution - continued:
We will only consider the case when p = q, so G1(s) = 1/(2 − s). Then
G2(s) = G1(G1(s)) = 1 / (2 − 1/(2 − s))
      = (2 − s)/(4 − 2s − 1) = (2 − s)/(3 − 2s),
G3(s) = G2(G1(s)) = (2 − 1/(2 − s)) / (3 − 2/(2 − s))
      = (4 − 2s − 1)/(6 − 3s − 2) = (3 − 2s)/(4 − 3s).
Now, we see a pattern.
Example
Solution - continued:
We have
Gn(s) = (n − (n − 1)s) / ((n + 1) − ns),  for n = 1, 2, . . . .
We prove this by induction. Assume the formula holds for n, and consider
Gn+1(s) = Gn(G1(s)) = (n − (n − 1) · 1/(2 − s)) / ((n + 1) − n · 1/(2 − s))
        = (2n − ns − (n − 1)) / (2n + 2 − ns − s − n)
        = ((n + 1) − ns) / ((n + 2) − (n + 1)s).
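The induction result can be double-checked by composing G1(s) = 1/(2 − s) numerically (the evaluation point s = 0.3 is arbitrary):

```python
def G1(s):
    """Offspring pgf for p = q = 1/2: G1(s) = 1/(2 - s)."""
    return 1.0 / (2.0 - s)

def compose(n, s):
    """n-fold composition of G1 evaluated at s."""
    for _ in range(n):
        s = G1(s)
    return s

def Gn_closed(n, s):
    """Closed form Gn(s) = (n - (n-1)s) / ((n+1) - ns)."""
    return (n - (n - 1) * s) / ((n + 1) - n * s)

for n in (1, 2, 5, 10):
    print(n, compose(n, 0.3), Gn_closed(n, 0.3))  # the two columns should agree
```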
Example
Solution - continued:
Gn(s) = (n − (n − 1)s) / ((n + 1) − ns),  for n = 1, 2, . . . ,
so P(Xn = 0) = Gn(0) = n/(n + 1). For n ≥ 1 and p = q,
P(T = n) = P(Xn = 0) − P(Xn−1 = 0) = n/(n + 1) − (n − 1)/n = 1/(n(n + 1)).
Solution - continued:
For p ≠ q, the size Xn of the nth generation satisfies
P(Xn = 0) = Gn(0) = p (q^n − p^n) / (q^(n+1) − p^(n+1)).
(Note the roles of p and q here: with P(Z = j) = q^j p we have G1(0) = p,
which is what this formula gives at n = 1.)
For n ≥ 1,
P(T = n) = P(Xn = 0) − P(Xn−1 = 0)
         = p^n q^(n−1) (p − q)^2 / ((q^n − p^n)(q^(n+1) − p^(n+1))).
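A numerical check for the p ≠ q case: compose G(s) = p/(1 − qs) n times at s = 0 and compare with the closed form p(q^n − p^n)/(q^(n+1) − p^(n+1)). With P(Z = j) = q^j p the formula should give G1(0) = p at n = 1:

```python
p = 0.7
q = 1 - p   # offspring pmf P(Z = j) = q^j p, so G(s) = p / (1 - q s)

def G(s):
    return p / (1 - q * s)

def Gn_at_zero(n):
    """P(X_n = 0) computed as the n-fold composition of G evaluated at 0."""
    s = 0.0
    for _ in range(n):
        s = G(s)
    return s

def closed_form(n):
    """Closed form p (q^n - p^n) / (q^(n+1) - p^(n+1))."""
    return p * (q ** n - p ** n) / (q ** (n + 1) - p ** (n + 1))

for n in (1, 2, 3, 6):
    print(n, Gn_at_zero(n), closed_form(n))  # the two columns should agree
```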
9.5 Supplementary Material
Supplementary Material - Quadratic Formula
The quadratic formula solves any quadratic equation of the form
ax^2 + bx + c = 0, where a, b, and c are coefficients with a ≠ 0.
The solutions are
x = ( −b ± √(b^2 − 4ac) ) / (2a).
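Applied to the examples of Section 9.3, Equation (3) reduces to a quadratic in π0, which this formula solves directly. A minimal Python sketch (the function name is ours):

```python
import math

def solve_quadratic(a, b, c):
    """Real roots of a x^2 + b x + c = 0, via the quadratic formula."""
    disc = b * b - 4 * a * c
    if disc < 0:
        return []          # no real roots
    r = math.sqrt(disc)
    return sorted([(-b - r) / (2 * a), (-b + r) / (2 * a)])

# Equation (3) with P0 = 1/4, P1 = 1/4, P2 = 1/2 rearranges to 2x^2 - 3x + 1 = 0.
roots = solve_quadratic(2, -3, 1)
print(roots)  # [0.5, 1.0]; pi_0 is the smallest positive root, 1/2
```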