The Exponential Distribution and The Poisson Process (Part 2)

UW STAT333

4.2 The Poisson Process


Definition: A counting process {N(t), t ≥ 0} is a stochastic process in which N(t) represents
the number of events that happen (or occur) by time t, where the index t measures time
over a continuous range.

Some examples of counting processes {N(t), t ≥ 0} might include:

(1) N(t) represents the number of automobile accidents at a specified intersection by week t,

(2) N(t) represents the number of births by year t in Canada,

(3) N(t) represents the number of visits to a particular webpage by time t,

(4) N(t) represents the number of customers who enter a store by time t,

(5) N(t) represents the number of accident claims reported to an insurance company by
time t.

Basic Properties of Counting Processes:

(1) N(0) = 0.

(2) N(t) is a non-negative integer ∀ t ≥ 0 (i.e., N(t) ∈ ℕ ∀ t ≥ 0).

(3) If s < t, then N(s) ≤ N(t).

(4) N(t) − N(s) counts the number of events to occur in the time interval (s, t] for s < t.

We now introduce two important properties associated with counting processes.

Definition: A counting process {N(t), t ≥ 0} has independent increments if N(t_1) − N(s_1)
is independent of N(t_2) − N(s_2) whenever (s_1, t_1] ∩ (s_2, t_2] = ∅ for all choices of s_1, t_1, s_2, t_2
(i.e., the numbers of events in non-overlapping time intervals are assumed to be independent
of each other).

Definition: A counting process {N(t), t ≥ 0} has stationary increments if the distribution
of the number of events in (s, s + t] (i.e., N(s + t) − N(s)) depends only on t, the length
of the time interval. In this case, N(s + t) − N(s) has the same probability distribution as
N(0 + t) − N(0) = N(t), the number of events occurring in the interval [0, t].

Remark: As the diagram below indicates, the assumption of stationary and independent in-
crements is essentially equivalent to stating that, at any point in time, the process {N(t), t ≥
0} probabilistically restarts itself.

[Diagram: a timeline with some arbitrary point in time marked as "now"; the history before
that point is irrelevant, and the future evolution of the process proceeds as if from a "new"
time 0.]

Before introducing the formal definition of a Poisson process, we first introduce a few
mathematical tools which are needed.

Definition: A function y = f(x) is said to be "o(h)" (i.e., of order h) if

lim_{h→0} f(h)/h = 0.

Remark: An o(h) function y = f(x) is one in which f(h) approaches 0 faster than h does.

Examples:

(1) y = f(x) = x. Note that

lim_{h→0} f(h)/h = lim_{h→0} h/h = lim_{h→0} 1 = 1 ≠ 0.

Thus, y = x is not of order h.


(2) y = f(x) = x^2. Note that

lim_{h→0} f(h)/h = lim_{h→0} h^2/h = lim_{h→0} h = 0.

Thus, y = x^2 is of order h. In fact, a function of the form y = x^r is clearly of order h
provided that r > 1.
(3) Suppose that {f_i(x)}_{i=1}^n is a sequence of o(h) functions. Consider the linear combina-
tion of o(h) functions, namely y = Σ_{i=1}^n c_i f_i(x), and note that

lim_{h→0} (Σ_{i=1}^n c_i f_i(h))/h = Σ_{i=1}^n c_i lim_{h→0} f_i(h)/h = 0,

since each limit in the sum equals 0. Thus, a linear combination of o(h) functions is still
of order h.
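As a quick numerical check (not part of the notes), one can evaluate f(h)/h for shrinking h and watch whether the ratio tends to 0. A minimal sketch in Python, where `ratio` is an illustrative helper name:

```python
# Numerically illustrate the o(h) definition: f is o(h) iff f(h)/h -> 0 as h -> 0.

def ratio(f, h):
    """Return f(h)/h, the quantity whose limit defines o(h)."""
    return f(h) / h

for h in [1e-1, 1e-3, 1e-6]:
    # f(x) = x is NOT o(h): the ratio stays at 1.
    # f(x) = x**2 IS o(h): the ratio shrinks with h.
    print(h, ratio(lambda x: x, h), ratio(lambda x: x**2, h))
```

Running this shows the first ratio fixed at 1 while the second decays like h, matching Examples (1) and (2).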


Remark: In most cases, this result is even true when n = ∞.

Definition: A counting process {N(t), t ≥ 0} is said to be a Poisson process at rate λ if
the following three conditions hold true:

(1) The process has both independent and stationary increments.

(2) For h > 0, P(N(h) = 1) = λh + o(h).

(3) For h > 0, P(N(h) ≥ 2) = o(h).

Remarks:

(1) Condition (2) in the above definition implies that in a “small” interval of time, the
probability of a single event occurring is essentially proportional to the length of the
interval.

(2) Condition (3) in the above definition implies that two or more events occurring in a
“small” interval of time is rare.

(3) Conditions (2) and (3) yield

P(N(h) = 0) = 1 − P(N(h) > 0)
            = 1 − P(N(h) = 1) − P(N(h) ≥ 2)
            = 1 − (λh + o(h)) − o(h)
            = 1 − λh − o(h) − o(h)
            = 1 − λh + o(h).

Ultimately, for a Poisson process {N(t), t ≥ 0} at rate λ, we would like to know the distri-
bution of the rv N(s + t) − N(s), representing the number of events occurring in the interval
(s, s + t], s, t ≥ 0. Since a Poisson process has stationary increments, this rv has the same
probability distribution as N(t). The following theorem specifies the distribution of N(t).

Theorem 4.3. If {N(t), t ≥ 0} is a Poisson process at rate λ, then N(t) ~ POI(λt).

Proof:

Remark: As a direct consequence of Theorem 4.3, for all s, t ≥ 0, we have

P(N(s + t) − N(s) = k) = P(N(t) = k) = e^{−λt}(λt)^k / k!,  k = 0, 1, 2, . . . .
Interarrival Times: Define T_1 to be the elapsed time (from time 0) until the first event
occurs. In general, for i ≥ 2, let T_i be the elapsed time between the occurrences of the
(i − 1)th event and the ith event. The sequence {T_i}_{i=1}^∞ is called the interarrival or interevent
time sequence. The diagram below depicts the relationship between N(t) and {T_i}_{i=1}^∞.

[Diagram: a sample path of N(t) as a step function of time, jumping from k − 1 to k when
the kth event occurs; the interarrival times T_1, T_2, T_3, T_4, T_5 are the horizontal gaps
between successive event occurrences.]

A very important result linking a Poisson process to its interarrival time sequence now
follows.

Theorem 4.4. If {N(t), t ≥ 0} is a Poisson process at rate λ > 0, then {T_i}_{i=1}^∞ is a sequence
of iid EXP(λ) rvs.
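Theorem 4.4 also suggests a way to simulate a Poisson process: generate iid EXP(λ) interarrival times and count how many arrivals land in [0, t]. A sketch (the helper `simulate_counts` and its parameters are illustrative); the sample mean of N(t) should be close to λt:

```python
import random

def simulate_counts(lam, t, n_paths, seed=1):
    """Simulate N(t) for n_paths independent Poisson processes at rate lam
    by summing iid EXP(lam) interarrival times (Theorem 4.4)."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_paths):
        elapsed, n = 0.0, 0
        while True:
            elapsed += rng.expovariate(lam)  # next interarrival time
            if elapsed > t:
                break
            n += 1
        counts.append(n)
    return counts

counts = simulate_counts(lam=2.0, t=3.0, n_paths=20000)
print(sum(counts) / len(counts))  # should be near lam * t = 6
```

This is the standard way to generate Poisson-process sample paths when the event times themselves are needed, not just the count.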

Proof:

For n ∈ ℤ⁺, define S_n to be the total elapsed time until the nth event occurs. In other words,
S_n denotes the arrival time of the nth event, or the waiting time until the nth event occurs.
Clearly, S_n = Σ_{i=1}^n T_i. If {N(t), t ≥ 0} is a Poisson process at rate λ, then {T_i}_{i=1}^∞ is a
sequence of iid EXP(λ) rvs by Theorem 4.4, implying that

S_n = Σ_{i=1}^n T_i ~ Erlang(n, λ).

From our earlier results on the Erlang distribution, we have E[S_n] = n/λ, Var(S_n) = n/λ²,
and

P(S_n > t) = Σ_{j=0}^{n−1} e^{−λt}(λt)^j / j!,  t ≥ 0.

Remarks:

(1) The above formula for the tpf of S_n could have been obtained without reference to the
Erlang distribution. In particular, note that

P(S_n > t) = P(arrival time of the nth event occurs after time t)
           = P(at most n − 1 events occur by time t)
           = P(N(t) ≤ n − 1)
           = Σ_{j=0}^{n−1} e^{−λt}(λt)^j / j!  since N(t) ~ POI(λt).
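The identity P(S_n > t) = P(N(t) ≤ n − 1) derived above can be checked numerically. A sketch (helper names are illustrative), computing both sides independently:

```python
from math import exp, factorial

def erlang_tail(n, lam, t):
    """P(S_n > t) for S_n ~ Erlang(n, lam), using the tpf formula above."""
    return sum(exp(-lam * t) * (lam * t)**j / factorial(j) for j in range(n))

def poisson_cdf(k, mu):
    """P(N <= k) for N ~ POI(mu)."""
    return sum(exp(-mu) * mu**j / factorial(j) for j in range(k + 1))

# P(S_4 > 1.5) should equal P(N(1.5) <= 3) when lam = 3.8:
print(erlang_tail(4, 3.8, 1.5), poisson_cdf(3, 3.8 * 1.5))
```

The two printed values agree to machine precision, mirroring the argument in Remark (1).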

(2) If {X_i}_{i=1}^∞ represents an iid sequence of EXP(λ) rvs and one constructs a counting pro-
cess {N(t), t ≥ 0} defined by N(t) = max{n ∈ ℕ : Σ_{i=1}^n X_i ≤ t}, then {N(t), t ≥ 0} is
actually a Poisson process at rate λ. In other words, {N(t), t ≥ 0} has both indepen-
dent and stationary increments (due to the memoryless property that the sequence of
rvs {X_i}_{i=1}^∞ possesses), in addition to the fact that

P(N(t) ≤ k) = P(Σ_{i=1}^{k+1} X_i > t) = Σ_{j=0}^{k} e^{−λt}(λt)^j / j!  since Σ_{i=1}^{k+1} X_i ~ Erlang(k + 1, λ),

which subsequently leads to

P(N(t) = k) = P(N(t) ≤ k) − P(N(t) ≤ k − 1)
            = Σ_{j=0}^{k} e^{−λt}(λt)^j / j! − Σ_{j=0}^{k−1} e^{−λt}(λt)^j / j!
            = e^{−λt}(λt)^k / k!,  k = 0, 1, 2, . . . .
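The construction in this remark can be simulated directly: build N(t) = max{n : X_1 + · · · + X_n ≤ t} from iid EXP(λ) draws and compare the empirical pmf with the Poisson pmf just derived. A sketch under assumed parameters λ = 1.5 and t = 2 (so λt = 3):

```python
import random
from math import exp, factorial

def count_via_construction(lam, t, rng):
    """N(t) = max{n : X_1 + ... + X_n <= t}, with iid X_i ~ EXP(lam)."""
    total, n = 0.0, 0
    while True:
        total += rng.expovariate(lam)
        if total > t:
            return n
        n += 1

rng = random.Random(7)
samples = [count_via_construction(1.5, 2.0, rng) for _ in range(30000)]

# Empirical P(N(2) = 2) versus the theoretical value e^{-3} 3^2 / 2!:
emp = samples.count(2) / len(samples)
print(emp, exp(-3.0) * 3.0**2 / factorial(2))
```

The empirical frequency should land close to the theoretical pmf value, illustrating that the construction really does produce POI(λt) counts.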
Example 4.4. At a local insurance company, suppose that fire damage claims come into
the company according to a Poisson process at rate 3.8 expected claims per year.

(a) What is the probability that exactly 5 claims occur in the time interval (3.2, 5] (measured
in years)?

Solution:
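The written solution is left blank in the notes. As a sketch of the computation (not the official solution): by stationary increments, N(5) − N(3.2) has the same distribution as N(1.8) ~ POI(3.8 × 1.8), so the answer is the Poisson pmf evaluated at k = 5.

```python
from math import exp, factorial

lam = 3.8             # expected claims per year
mu = lam * (5 - 3.2)  # Poisson mean for the interval (3.2, 5], which has length 1.8

# P(exactly 5 claims in (3.2, 5]) = e^{-mu} mu^5 / 5!
p = exp(-mu) * mu**5 / factorial(5)
print(round(p, 4))  # roughly 0.13
```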

(b) What is the probability that the time between the 2nd and 4th claims is between 2 and
5 months?

Solution:
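The written solution is left blank in the notes. As a sketch (not the official solution): by Theorem 4.4 the time between the 2nd and 4th claims is T_3 + T_4 ~ Erlang(2, 3.8), so the answer is a difference of Erlang(2, λ) cdf values, with months converted to years. The helper name `erlang2_cdf` is illustrative:

```python
from math import exp

lam = 3.8  # claims per year

def erlang2_cdf(t):
    """CDF of an Erlang(2, lam) rv: 1 - e^{-lam t}(1 + lam t)."""
    return 1 - exp(-lam * t) * (1 + lam * t)

# Convert months to years: 2 months = 2/12 yr, 5 months = 5/12 yr.
p = erlang2_cdf(5 / 12) - erlang2_cdf(2 / 12)
print(round(p, 4))  # roughly 0.34
```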
