3 - Poisson Process


Poisson processes

1 / 49
Outline

I Textbook: Section 5.2 Dimitri


I Content
I Definition of Poisson process as a counting process
I Distribution of inter-arrival time
I Conditional distribution of arrival time
I Compound Poisson process

2 / 49
A motivation - Insurance risk model

I In an insurance risk portfolio, such as a portfolio of motor
insurance policies, the quantities of interest are the number of
claims arriving in a fixed period of time and the sizes of those
claims
I The number of claims can be modelled by a counting process,
such as a Poisson process
I The financial losses suffered by individuals and insurance
companies as a result of insurable events (storm or fire damage
to property, theft of personal property, vehicle accidents) can be
modelled by a compound Poisson process

3 / 49
Table of Contents

Poisson processes
Poisson processes
Arrival, inter-arrival time of a Poisson process

Compound Poisson processes

Simulation

4 / 49
Arrival, inter-arrival time

I Inter-arrival time X1 , X2 , X3 . . .
I Arrival time

S0 = 0
S1 = X1
S2 = X1 + X2
...

I Number of arrivals up to time t:

  Nt = ∑_{n=1}^∞ 1{Sn ≤ t} = max{n : Sn ≤ t}

5 / 49
Poisson processes

A Poisson process with intensity (or rate) λ is a random
(counting) process (Nt)t≥0 with the following properties:
1. N0 = 0.
2. For all t > 0, Nt has a Poisson distribution with
parameter λt.
3. (Stationary increments) For all s, t > 0, Ns+t − Ns has the
same distribution as Nt. That is,

   P(Ns+t − Ns = k) = P(Nt = k) = e^{−λt}(λt)^k / k!,   for k = 0, 1, . . .

4. (Independent increments) For 0 ≤ q < r ≤ s < t, Nt − Ns and
Nr − Nq are independent random variables.

6 / 49
Some properties of Poisson processes

I The distribution of the number of arrivals in an interval
depends only on the length of the interval
I The number of arrivals on disjoint intervals are
independent random variables
I E(Nt ) = λt
I λ is the average number of arrivals per unit of time.
I when h → 0 (h very small), then

P (Nh = 1) = λh + o(h)

and
P (Nh ≥ 2) = o(h)

7 / 49
Construct by tossing a low-probability coin very fast

I Pick n large
I Take a coin with low Head probability λ/n
I Toss this coin at times which are positive integer multiples
of 1/n
I Let Nt be the number of Heads on [0, t]. Then Nt is binomially
distributed and converges to Poiss(λt) as n → ∞
I Moreover, Nt+s − Ns is independent of the past and Poisson
distributed Poiss(λt)
How can we simulate a Poisson process?
8 / 49
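The slides contain no code, but the coin-tossing construction is easy to check numerically. The sketch below (my own Python, using NumPy/SciPy; the seed and parameter values are arbitrary illustrative choices) counts the Heads among n·t tosses of a coin with Head probability λ/n and compares the empirical distribution with the Poiss(λt) pmf.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)    # seed chosen arbitrarily
lam, t, n = 2.0, 3.0, 10_000      # rate, horizon, tosses per unit time (illustrative)

# Number of Heads in [0, t] when a coin with Head probability lam/n is tossed
# at times 1/n, 2/n, ...: Binomial(n*t, lam/n), close to Poisson(lam*t) for large n
heads = rng.binomial(int(n * t), lam / n, size=100_000)

print("empirical P(N_t = 5):  ", np.mean(heads == 5))
print("Poisson(lam*t) pmf at 5:", poisson.pmf(5, lam * t))
```

For large n the two printed numbers should agree to a few decimal places.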
Example

Starting at 6 a.m., customers arrive at Martha’s bakery
according to a Poisson process at the rate of 3 customers per
hour. Find the probability that more than 2 customers arrive
between 9 a.m. and 11 a.m.

9 / 49
Solution
I Initial time t = 0 (corresponding to 6 a.m.)
I Number of customers: Poisson process (Nt)t≥0 at rate λ = 3
customers per hour
I Number of customers up to 9 a.m. (t = 3): N3
I Number of customers up to 11 a.m. (t = 5): N5
I Number of customers between 9 a.m. and 11 a.m.:
N5 − N3 ∼ Poiss((5 − 3)λ) = Poiss(6)
I
  P(N5 − N3 > 2) = 1 − P(N5 − N3 ≤ 2)
                 = 1 − [P(N5 − N3 = 0) + P(N5 − N3 = 1) + P(N5 − N3 = 2)]
                 = 1 − e^{−6}(6^0/0! + 6^1/1! + 6^2/2!)

10 / 49
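As a quick cross-check (not part of the original slides), the same probability can be evaluated with SciPy's Poisson cdf:

```python
from scipy.stats import poisson

# P(N5 - N3 > 2), where N5 - N3 ~ Poisson(2 * 3) = Poisson(6)
p = 1 - poisson.cdf(2, 6)
print(p)  # approximately 0.938
```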
Example
Joe receives text messages starting at 10 a.m. at the rate of 10
texts per hour according to a Poisson process. Find the
probability that he will receive exactly 18 texts by noon (12 p.m.)
and 70 texts by 5 p.m.
Solution
I Initial time t = 0 (10 a.m.)
I Text message arrivals: Poisson process (Nt)t≥0 with rate
λ = 10
I Number of messages up to noon (corresponding to time
t = 2) is N2
I Number of messages up to 5 p.m. (corresponding to time
t = 7) is N7
I Need to find P(N2 = 18, N7 = 70)

11 / 49
P(N2 = 18, N7 = 70) = P(N7 = 70 | N2 = 18) P(N2 = 18)   (multiplication rule)

= P(N7 − N2 + N2 = 70 | N2 = 18) P(N2 = 18)

= P(N7 − N2 + 18 = 70 | N2 = 18) P(N2 = 18)   (substitute N2 by 18; N7 − N2 is independent of N2)

= P(N7 − N2 = 52) P(N2 = 18),   where N7 − N2 ∼ Poiss((7 − 2)λ) = Poiss(50) and N2 ∼ Poiss(2λ) = Poiss(20)

= (e^{−50} 50^52 / 52!) · (e^{−20} 20^18 / 18!)
12 / 49
Another approach

P(N2 = 18, N7 = 70) = P(N2 = 18, N7 − N2 = 52)   (the two events involve disjoint increments)

= P(N2 = 18) P(N7 − N2 = 52)   (independent increments; N7 − N2 has the same distribution as N5)

= P(N2 = 18) P(N5 = 52)

= (e^{−20} 20^18 / 18!) · (e^{−50} 50^52 / 52!) = 0.0045 = 0.45%

13 / 49
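A quick numerical check of this example (my addition, not in the original slides) using SciPy's Poisson pmf:

```python
from scipy.stats import poisson

# P(N2 = 18, N7 = 70) = P(N2 = 18) * P(N7 - N2 = 52),
# with N2 ~ Poisson(20) and N7 - N2 ~ Poisson(50)
p = poisson.pmf(18, 20) * poisson.pmf(52, 50)
print(p)  # approximately 0.0045
```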
Example

On election day, people arrive at a voting center according to a
Poisson process. On average, 100 voters arrive every hour. If
150 people arrive during the first hour, what is the probability
that at most 350 people arrive before the third hour?
Solution
I Number of arrivals (Nt )t≥0 is a Poisson process at rate of
λ = 100 per hour.
I 150 people arrive during the first hour: N1 = 150
I at most 350 people arrive before the third hour: N3 ≤ 350
I Need to find P (N3 ≤ 350|N1 = 150)

14 / 49
P(N3 ≤ 350 | N1 = 150) = P(N3 − N1 + N1 ≤ 350 | N1 = 150)
= P(N3 − N1 ≤ 200 | N1 = 150)
= P(N3 − N1 ≤ 200) = P(N2 ≤ 200)
= ∑_{k=0}^{200} P(N2 = k)
= ∑_{k=0}^{200} e^{−100·2} (100·2)^k / k! = 0.519.

These terms are painful to compute directly by hand. One
practical solution is to use the normal approximation to the
Poisson distribution.

15 / 49
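To illustrate the normal-approximation remark, here is a short sketch (my addition, not from the slides) comparing the exact Poisson cdf with the normal approximation with continuity correction:

```python
from scipy.stats import poisson, norm

mu = 100 * 2                        # N3 - N1 ~ Poisson(200)

exact = poisson.cdf(200, mu)        # P(N3 - N1 <= 200)
# Poisson(mu) is approximately Normal(mu, mu) for large mu (continuity correction):
approx = norm.cdf((200 + 0.5 - mu) / mu ** 0.5)

print(exact, approx)                # approximately 0.519 and 0.514
```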
Example
You get email according to a Poisson process at a rate of λ = 5
messages per hour. You check your email every thirty minutes.
Find
1. P(no message)
2. P(one message)

16 / 49
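A brief solution sketch (my addition): over a thirty-minute window, i.e. t = 0.5 hours, the number of messages is Poiss(λt) = Poiss(2.5), so P(no message) = e^{−2.5} ≈ 0.082 and P(one message) = 2.5 e^{−2.5} ≈ 0.205.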
Inter-arrival times are i.i.d. exponential RVs

I P(X1 > t) = P(Nt = 0) = e^{−λt}. Hence X1 ∼ Exp(λ)
I
  P(X2 > t | X1 = s) = P(no event in (s, s + t) | X1 = s)
                     = P(no event in (s, s + t))   (by independent increments)
                     = P(no event in (0, t))       (by stationary increments)
                     = P(Nt = 0) = e^{−λt}

  Hence X2 ∼ Exp(λ) and X2 is independent of X1

17 / 49
Construction by exponential interarrival times

I X1 , X2 , . . . , Xn are i.i.d Exp(λ)


I Arrival time

S0 = 0
S1 = X1
S2 = X1 + X2
...
Sn = X1 + X2 + · · · + Xn

I Stop when Sn ≤ t < Sn+1 and set Nt = n (a simulation sketch follows below)

18 / 49
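A minimal simulation sketch for this construction (my own Python code, not part of the slides; the function name and parameter values are illustrative):

```python
import numpy as np

def poisson_path_interarrival(lam, t_max, rng):
    """Arrival times of a Poisson process on [0, t_max], built by
    summing i.i.d. Exp(lam) inter-arrival times."""
    arrivals = []
    s = rng.exponential(1 / lam)        # S1 = X1
    while s <= t_max:                   # stop once S_{n+1} > t_max
        arrivals.append(s)
        s += rng.exponential(1 / lam)   # S_{n+1} = S_n + X_{n+1}
    return np.array(arrivals)

rng = np.random.default_rng(1)
S = poisson_path_interarrival(lam=2.0, t_max=10.0, rng=rng)
print(len(S), S[:5])                    # N_10 and the first few arrival times
```

Plotting the step function t ↦ number of arrivals ≤ t gives a path of the process; this is essentially item 1 of the simulation practice at the end.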
Arrival time or waiting time Sn

The density function of Sn is given by

  fSn(t) = λ e^{−λt} (λt)^{n−1} / (n − 1)!

i.e. Sn is Gamma distributed with parameters (n, λ) (also called
the Erlang distribution)

19 / 49
Proof
I the nth event will occur prior to or at time t if and only if the
number of events occurring by time t is at least n:

  Sn ≤ t ⇔ Nt ≥ n

I cdf of Sn :

  FSn(t) = P(Sn ≤ t) = P(Nt ≥ n) = ∑_{k=n}^∞ P(Nt = k) = ∑_{k=n}^∞ e^{−λt} (λt)^k / k!

I pdf of Sn :

  fSn(t) = dFSn(t)/dt = ∑_{k=n}^∞ [ −λ e^{−λt} (λt)^k / k! + λ e^{−λt} (λt)^{k−1} / (k − 1)! ]
         = −λ e^{−λt} ∑_{k=n}^∞ (λt)^k / k! + λ e^{−λt} ∑_{k=n}^∞ (λt)^{k−1} / (k − 1)!
20 / 49
We have

  ∑_{k=n}^∞ (λt)^{k−1} / (k − 1)! = ∑_{k=n−1}^∞ (λt)^k / k! = (λt)^{n−1} / (n − 1)! + ∑_{k=n}^∞ (λt)^k / k!

So

  fSn(t) = λ e^{−λt} (λt)^{n−1} / (n − 1)!
21 / 49
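A numerical check of the Erlang density (my addition): sum n i.i.d. Exp(λ) variables and compare with SciPy's Gamma(n, scale = 1/λ) cdf.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(2)
lam, n = 2.0, 5                     # illustrative choices

# S_n as the sum of n i.i.d. Exp(lam) inter-arrival times, many replications
S_n = rng.exponential(1 / lam, size=(100_000, n)).sum(axis=1)

print("empirical P(S_n <= 3):   ", np.mean(S_n <= 3.0))
print("Gamma(n, 1/lam) cdf at 3:", gamma.cdf(3.0, a=n, scale=1 / lam))
```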
Example

Consider a Poisson process with rate λ = 1. Compute


1. E(time of the 10th event),
2. P (the 10th event occurs 2 or more time units after the 9th
event),
3. P (the 10th event occurs later than time 20)

22 / 49
Solution
1. S10 = X1 + · · · + X10 where Xi ∼ Exp(λ).
   We have

     E(Xi) = ∫_0^∞ x λ e^{−λx} dx = 1/λ

   So

     E(S10) = E(X1) + · · · + E(X10) = 10/λ = 10

2.
     P(S10 − S9 ≥ 2) = P(X10 > 2) = ∫_2^∞ λ e^{−λx} dx = e^{−2λ} = e^{−2}

3.
     P(S10 > 20) = ∫_20^∞ fS10(t) dt = ∫_20^∞ λ e^{−λt} (λt)^9 / 9! dt
23 / 49
Example
Let Nt be a Poisson process with intensity λ = 2, and let
X1 , X2 , . . . be the corresponding inter-arrival times.
1. Find the probability that the first arrival occurs after t = 0.5,
i.e., P(X1 > 0.5).
2. Given that we have had no arrivals before t = 1, find the
probability that there is no arrival up to time t = 3.
3. Given that the third arrival occurred at time t = 2, find the
probability that the fourth arrival occurs after t = 5.
4. I start watching the process at time t = 10. Let T be the
time of the first arrival that I see. In other words, T is the
first arrival after t = 10. Find E(T ) and V ar(T ).
5. I start watching the process at time t = 10. Let T be the
time of the first arrival that I see. Find the conditional
expectation and the conditional variance of T given that I
am informed that the last arrival occurred at time t = 9.

24 / 49
Order statistics

Let X1 , X2 , ..., Xn be random variables. Then X(1) , X(2) , ..., X(n)
are the order statistics corresponding to X1 , X2 , ..., Xn if X(k) is
the k-th smallest value among X1 , X2 , ..., Xn .

Property
If U1 , U2 , ..., Un are i.i.d uniformly distributed U([0, t]), then the
joint pdf of U(1) , U(2) , ..., U(n) is

  f(x1 , ..., xn) = n! / t^n   for 0 ≤ x1 ≤ x2 ≤ ... ≤ xn ≤ t

25 / 49
Proof

I The joint pdf of U1 , U2 , ..., Un is 1/t^n
I (U(1) , U(2) , ..., U(n)) equals (x1 , x2 , ..., xn) when
(U1 , U2 , ..., Un) equals any of the n! permutations of
(x1 , x2 , ..., xn)
I Hence the joint pdf of U(1) , U(2) , ..., U(n) is

  f(x1 , ..., xn) = n! / t^n

26 / 49
Conditional distribution of arrival times

Given that Nt = n, the n arrival times S1 , ..., Sn have the same
distribution as the order statistics corresponding to n independent
random variables uniformly distributed on [0, t].

27 / 49
Proof

Let 0 < t1 < t2 < ... < tn < tn+1 = t and let hi be small enough
such that ti + hi < ti+1 . Then

P(ti ≤ Si ≤ ti + hi , i = 1, ..., n | Nt = n)

= P(ti ≤ Si ≤ ti + hi , i = 1, ..., n, Nt = n) / P(Nt = n)

= P(exactly 1 event in [ti , ti + hi ] for i = 1, ..., n, no event elsewhere in [0, t]) / (e^{−λt} (λt)^n / n!)

= λh1 e^{−λh1} · · · λhn e^{−λhn} · e^{−λ(t − h1 − ... − hn)} / (e^{−λt} (λt)^n / n!)

= (n! / t^n) h1 h2 ... hn
28 / 49
f(S1 ,...,Sn )|Nt =n (t1 , . . . , tn ) = ∂^n F(S1 ,...,Sn )|Nt =n (t1 , . . . , tn ) / (∂t1 . . . ∂tn )

= lim_{h1 ,...,hn → 0} [F(S1 ,...,Sn )|Nt =n (t1 + h1 , . . . , tn + hn ) − F(S1 ,...,Sn )|Nt =n (t1 , . . . , tn )] / (h1 . . . hn )

= lim_{h1 ,...,hn → 0} P(ti ≤ Si ≤ ti + hi , i = 1, ..., n | Nt = n) / (h1 ... hn ) = n! / t^n

So, given Nt = n, (S1 , . . . , Sn ) has the same distribution as
(U(1) , . . . , U(n) ).

29 / 49
Construction by conditional distribution of arrival times

I Determine the number of arrivals Nt , which is Poisson
distributed Poiss(λt)
I Generate Nt i.i.d. Uni([0, t]) random variables U1 , U2 , . . . , UNt
I Sort the Ui to obtain the arrival times S1 , S2 , . . . (see the
sketch below)

30 / 49
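A sketch of this second construction in Python (my addition; the function name and parameters are illustrative):

```python
import numpy as np

def poisson_path_conditional(lam, t_max, rng):
    """Arrival times on [0, t_max]: draw N ~ Poisson(lam * t_max),
    then sort N i.i.d. Uniform(0, t_max) points."""
    n = rng.poisson(lam * t_max)
    return np.sort(rng.uniform(0.0, t_max, size=n))

rng = np.random.default_rng(3)
S = poisson_path_conditional(lam=2.0, t_max=10.0, rng=rng)
print(len(S), S[:5])
```

Both constructions produce paths with the same distribution; this one corresponds to item 2 of the simulation practice at the end.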
Example
Let Nt be a Poisson process with rate λ = 2 and with arrival times
S1 , S2 , . . . . Find

E(S1 + S2 + · · · + S10 | N4 = 10)

Solution
I Given N4 = 10, (S1 , . . . , S10 ) has the same joint
distribution as (U(1) , U(2) , . . . , U(10) ), where the Ui are i.i.d
Uni(0, 4)
I E(S1 + S2 + · · · + S10 | N4 = 10) = E(U(1) + · · · + U(10) )
I U(1) + · · · + U(10) = U1 + · · · + U10
I E(S1 + S2 + · · · + S10 | N4 = 10) = E(U1 + · · · + U10 ) =
E(U1 ) + · · · + E(U10 )
I Ui ∼ Uni([0, 4]) ⇒ E(Ui ) = (0 + 4)/2 = 2
I E(S1 + S2 + · · · + S10 | N4 = 10) = 10 × 2 = 20
31 / 49
Example

Suppose that travelers arrive at a train depot in accordance with a
Poisson process with rate λ. If the train departs at time t,
compute the expected sum of the waiting times of the travelers
arriving in (0, t):

  E( ∑_{i=1}^{Nt} (t − Si ) )
32 / 49
Solution
Using the tower property, E( ∑_{i=1}^{Nt} (t − Si ) ) = E[ E( ∑_{i=1}^{Nt} (t − Si ) | Nt ) ].

First find E( ∑_{i=1}^{Nt} (t − Si ) | Nt ):
I
  E( ∑_{i=1}^{Nt} (t − Si ) | Nt = n ) = E( ∑_{i=1}^{n} (t − Si ) | Nt = n )
  = E( ∑_{i=1}^{n} (t − U(i) ) )   where Ui ∼ U([0, t])
  = E( ∑_{i=1}^{n} (t − Ui ) ) = n E(t − U1 ) = n(t − t/2) = nt/2

I Hence E( ∑_{i=1}^{Nt} (t − Si ) | Nt ) = t Nt / 2
33 / 49
Hence

  E( ∑_{i=1}^{Nt} (t − Si ) ) = E( t Nt / 2 ) = (t/2) E(Nt ) = (t/2)(λt) = λt^2 / 2
34 / 49
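A Monte Carlo sanity check of the formula λt^2 / 2 (my own sketch; seed and parameters chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(4)
lam, t, n_rep = 2.0, 5.0, 100_000

total = 0.0
for _ in range(n_rep):
    n = rng.poisson(lam * t)
    arrivals = rng.uniform(0.0, t, size=n)  # given N_t = n, arrivals are (unordered) uniforms on [0, t]
    total += np.sum(t - arrivals)           # the sum does not depend on the ordering

print(total / n_rep, lam * t**2 / 2)        # both approximately 25
```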
Compound Poisson Processes

Let W1 , W2 , ... be a sequence of i.i.d random variables with cdf F,
independent of a Poisson process (Nt)t≥0 with rate λ. Then the
process (Rt)t≥0 with

  Rt = ∑_{i=1}^{Nt} Wi

is called a compound Poisson process.

36 / 49
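A minimal sketch for simulating one value of Rt (my addition; the exponential claim distribution here is just an illustrative choice):

```python
import numpy as np

def compound_poisson(lam, t, claim_sampler, rng):
    """One draw of R_t = sum_{i=1}^{N_t} W_i."""
    n = rng.poisson(lam * t)               # N_t ~ Poisson(lam * t)
    return claim_sampler(n, rng).sum()     # sum of N_t i.i.d. claims (0 if n = 0)

rng = np.random.default_rng(5)
draw = compound_poisson(lam=2.0, t=10.0,
                        claim_sampler=lambda n, r: r.exponential(1.0, size=n),
                        rng=rng)
print(draw)
```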
Example

Suppose that health claims are filed with a health insurer at a
Poisson rate per day, and that the independent severities W of
the claims are exponential random variables. Then the aggregate
amount R of claims is a compound Poisson process.

37 / 49
Properties of compound Poisson processes

1. E(Rt ) = λt E(W )   (Tower property)
2. Var(Rt ) = λt E(W^2 )

38 / 49
Proof
Using the property

  E(X) = E(E(X | Y ))

with Y = Nt .
1. Compute E(Rt | Nt ):

  E(Rt | Nt = n) = E( ∑_{i=1}^{Nt} Wi | Nt = n )
                 = E( ∑_{i=1}^{n} Wi | Nt = n )   (substitute Nt by n)
                 = E( ∑_{i=1}^{n} Wi )            (the Wi are independent of Nt )
39 / 49
  E(Rt | Nt = n) = E( ∑_{i=1}^{n} Wi ) = ∑_{i=1}^{n} E(Wi ) = n E(W )

So

  E(Rt | Nt ) = Nt E(W )

Hence

  E(Rt ) = E(E(Rt | Nt )) = E(Nt E(W )) = E(Nt ) E(W ) = λt E(W )

(E(W ) is a constant, so it can be pulled out of the expectation.)
40 / 49
2.

  Var(Rt ) = E(Rt^2 ) − (E(Rt ))^2 = E(Rt^2 ) − λ^2 t^2 (E(W ))^2

Compute E(Rt^2 | Nt ):

  E(Rt^2 | Nt = n) = E( ( ∑_{i=1}^{Nt} Wi )^2 | Nt = n )
                   = E( ( ∑_{i=1}^{n} Wi )^2 | Nt = n )
                   = E( ( ∑_{i=1}^{n} Wi )^2 )
41 / 49
We have

  ( ∑_{i=1}^{n} Wi )^2 = ∑_{i=1}^{n} Wi^2 + ∑_{i≠j} Wi Wj

So

  E( ( ∑_{i=1}^{n} Wi )^2 ) = ∑_{i=1}^{n} E(Wi^2 ) + ∑_{i≠j} E(Wi Wj )
                            = ∑_{i=1}^{n} E(Wi^2 ) + ∑_{i≠j} E(Wi ) E(Wj )
                            = n E(W^2 ) + n(n − 1)(E(W ))^2

Hence E(Rt^2 | Nt = n) = n E(W^2 ) + n(n − 1)(E(W ))^2 , and then

  E(Rt^2 | Nt ) = Nt E(W^2 ) + Nt (Nt − 1)(E(W ))^2
42 / 49
E(Rt^2 ) = E(E(Rt^2 | Nt )) = E(Nt ) E(W^2 ) + E(Nt (Nt − 1))(E(W ))^2

Because Nt ∼ Poiss(λt),

  E(Nt ) = λt,   Var(Nt ) = E(Nt^2 ) − (E(Nt ))^2 = λt,

we have E(Nt^2 ) = Var(Nt ) + (E(Nt ))^2 = λt + (λt)^2 and then

  E(Nt (Nt − 1)) = E(Nt^2 ) − E(Nt ) = λt + (λt)^2 − λt = (λt)^2

⇒ E(Rt^2 ) = λt E(W^2 ) + (λt)^2 (E(W ))^2
⇒ Var(Rt ) = E(Rt^2 ) − λ^2 t^2 (E(W ))^2 = λt E(W^2 )

43 / 49
Example

Consider the compound Poisson process modeling aggregate
health claims: the frequency N is a Poisson process with rate
λ = 20 per day and the severity W is an exponential random
variable with mean θ = 500. Suppose that you are interested in
the aggregate claims R10 during the first 10 days.
1. Find E(R10 )
2. Find Var(R10 )

44 / 49
Solution

1. E(R10 ) = E(N10 ) E(W ) = (20 × 10)(500) = 100,000,
   because N10 ∼ Poiss(λ × 10) = Poiss(200)
2. Var(R10 ) = E(N10 ) E(W^2 ) = 200 × (2 × 500^2 ) = 100,000,000,
   since E(W^2 ) = 2θ^2 for an exponential W with mean θ

45 / 49
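A simulation cross-check of these two numbers (my own sketch; it can also serve as a starting point for the practice problem at the end):

```python
import numpy as np

rng = np.random.default_rng(6)
lam, t, theta = 20.0, 10.0, 500.0          # claims per day, horizon in days, mean claim size

# 10,000 simulated values of R_10 = sum of N_10 i.i.d. Exp(mean = theta) claims
samples = np.array([
    rng.exponential(theta, size=rng.poisson(lam * t)).sum()
    for _ in range(10_000)
])

print(samples.mean())   # approximately 100,000  = lam * t * E(W)
print(samples.var())    # approximately 1e8      = lam * t * E(W^2) = lam * t * 2 * theta^2
```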
An application of compound Poisson process in
insurance: Cramer-Lundberg model

In insurance, a compound Poisson process is used to model the
total claim amount on [0, t]. If premiums arrive at rate c, the
insurer’s surplus level with initial surplus x is

  Ut = x + ct − ∑_{i=1}^{Nt} Wi

A central problem is to find the ruin probability, i.e. the probability
that the insurer’s surplus falls below 0 (the firm goes bankrupt).

46 / 49
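The finite-horizon ruin probability rarely has a simple closed form, so it is usually estimated by simulation. Below is a sketch (my own code, using the setup of the simulation practice that follows: λ = 2, Exp(1) claims, x = 10, c = 1); since Ut only jumps down at claim times, it is enough to check the surplus immediately after each claim.

```python
import numpy as np

def ruin_probability(x, c, lam, claim_mean, t_max, n_paths, rng):
    """Monte Carlo estimate of P(U_t < 0 for some t <= t_max), where
    U_t = x + c*t - sum_{i <= N_t} W_i and W_i ~ Exp(mean = claim_mean)."""
    ruined = 0
    for _ in range(n_paths):
        s, total_claims = 0.0, 0.0
        while True:
            s += rng.exponential(1 / lam)          # next claim arrival time
            if s > t_max:
                break
            total_claims += rng.exponential(claim_mean)
            if x + c * s - total_claims < 0:       # surplus just after this claim
                ruined += 1
                break
    return ruined / n_paths

rng = np.random.default_rng(7)
print(ruin_probability(x=10.0, c=1.0, lam=2.0, claim_mean=1.0,
                       t_max=10.0, n_paths=100_000, rng=rng))
```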
Simulation practice
1. Simulate a path of a Poisson process with rate λ = 2 on the
time interval [0, 10] by simulating the inter-arrival times
2. Simulate a path of a Poisson process with rate λ = 2 on the
time interval [0, 10] by first simulating the number of events Nt
and then the arrival times (using the conditional distribution of
arrival times)
3. Simulate a path of the insurance surplus on [0, 10] with
I (Nt )t a Poisson process with rate λ = 2
I Claim sizes Wk ∼ Exp(1)
4. Estimate the ruin probability of the previous problem on the
finite time horizon [0, 10] with c = 1, x = 10, (Nt )t a Poisson
process with rate λ = 2, and claim sizes Wk ∼ Exp(1)
5. Which value of c guarantees that the ruin probability over the
time horizon [0, 10] is less than or equal to 10^{−3}?
Use the same setup as in the previous problem (except the value of c)
48 / 49
Practice

Consider the compound Poisson process modeling aggregate
health claims: the frequency N is a Poisson process with rate
λ = 20 per day and the severity W is an exponential random
variable with mean θ = 500. Simulate 10000 scenarios for the
aggregate claims R10 during the first 10 days.
1. Estimate E(R10 ) and Var(R10 ) from the simulated sample.
2. Plot a histogram of the simulated sample of R10 . What can you
say about the distribution of R10 ?
3. Propose an approximation or estimation for
P(R10 > 120,000).

49 / 49
