
Connexions module: m11255

The Poisson Process


Version 1.4: 2003/08/08 15:36:39 GMT-5

Don Johnson
This work is produced by The Connexions Project and licensed under the

Creative Commons Attribution License

Abstract

The Poisson Process


Some signals have no waveform. Consider the measurement of when lightning strikes occur
within some region; the random process is the sequence of event times, which has no intrinsic
waveform. Such processes are termed point processes, and have been shown (see Snyder [2])
to have simple mathematical structure. Define some quantities first. Let N_t be the number
of events that have occurred up to time t (observations are by convention assumed to start
at t = 0). This quantity is termed the counting process, and has the shape of a staircase
function: the counting function consists of a series of plateaus, each at an integer value,
with jumps between plateaus occurring when events occur. The increment N_{t1,t2} = N_{t2} − N_{t1}
corresponds to the number of events in the interval [t1, t2). Consequently, N_t = N_{0,t}. The
event times comprise the random vector W; the dimension of this vector is N_t, the number
of events that have occurred. The occurrence of events is governed by a quantity known as
the intensity λ(t; N_t; W) of the point process through the probability law

Pr[N_{t,t+Δt} = 1 | N_t; W] = λ(t; N_t; W) Δt

for sufficiently small Δt. Note that this probability is a conditional probability; it can
depend on how many events occurred previously and when they occurred. The intensity
can also vary with time to describe nonstationary point processes. The intensity has units
of events per unit time, and it can be viewed as the instantaneous rate at which events occur.

The simplest point process from a structural viewpoint, the Poisson process, has no
dependence on process history. A stationary Poisson process results when the intensity
equals a constant: λ(t; N_t; W) = λ0. Thus, in a Poisson process, a coin is flipped every Δt
seconds, with a constant probability of heads (an event) occurring that equals λ0 Δt and is
independent of the occurrence of past (and future) events. When this probability varies with
time, the intensity equals λ(t), a non-negative signal, and a nonstationary Poisson process
results.¹
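This coin-flip picture can be checked numerically. A minimal sketch (the values of λ0, Δt, and the observation length T below are assumed example values, not from the module) flips an independent Bernoulli(λ0 Δt) coin in each slot and verifies that the resulting count over [0, T) averages near λ0 T:

```python
import numpy as np

rng = np.random.default_rng(0)

lam0 = 5.0   # assumed constant intensity, events per second
dt = 1e-3    # coin-flip interval; lam0 * dt must be << 1
T = 2.0      # assumed observation interval length, seconds

# Flip a coin every dt seconds with heads probability lam0 * dt,
# independently of all other flips; heads marks an event.
flips = rng.random((1000, int(T / dt))) < lam0 * dt
counts = flips.sum(axis=1)   # N_T for each of 1000 independent trials

print(counts.mean())   # ≈ lam0 * T = 10
print(counts.var())    # ≈ lam0 * T as well (mean equals variance)
```

As Δt shrinks, the binomial count distribution produced by these coin flips approaches the Poisson distribution derived in the next section.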

From the Poisson process's definition, we can derive the probability laws that govern
event occurrence. These fall into two categories: the count statistics Pr[N_{t1,t2} = n], the
probability of obtaining n events in an interval [t1, t2), and the time of occurrence statistics
p_{W(n)}(w), the joint distribution of the first n event times in the observation interval.
These times form the vector W(n), the occurrence time vector of dimension n. From these
two probability distributions, we can derive the sample function density.

∗ http://creativecommons.org/licenses/by/1.0
¹ In the literature, stationary Poisson processes are sometimes termed homogeneous, nonstationary ones inhomogeneous.

http://cnx.rice.edu/content/m11255/latest/

1 Count Statistics
We derive a differentio-difference equation that Pr[N_{t1,t2} = n], t1 < t2, must satisfy for
event occurrence in an interval to be regular and independent of event occurrences in disjoint
intervals. Let t1 be fixed and consider event occurrence in the intervals [t1, t2) and [t2, t2 + δ),
and how these contribute to the occurrence of n events in the union of the two intervals. If
k events occur in [t1, t2), then n − k must occur in [t2, t2 + δ). Furthermore, the scenarios
for different values of k are mutually exclusive. Consequently,

Pr[N_{t1,t2+δ} = n] = Σ_{k=0}^{n} Pr[N_{t1,t2} = k, N_{t2,t2+δ} = n − k]
                    = Pr[N_{t2,t2+δ} = 0 | N_{t1,t2} = n] Pr[N_{t1,t2} = n]
                      + Pr[N_{t2,t2+δ} = 1 | N_{t1,t2} = n − 1] Pr[N_{t1,t2} = n − 1] + ⋯   (1)

Because of the independence of event occurrence in disjoint intervals, the conditional prob-
abilities in this expression equal the unconditional ones. When δ is small, only the first two
terms will be significant to first order in δ. Rearranging and taking the obvious limit, we have
the equation defining the count statistics.

d/dt2 Pr[N_{t1,t2} = n] = −λ(t2) Pr[N_{t1,t2} = n] + λ(t2) Pr[N_{t1,t2} = n − 1]

To solve this equation, we apply a z-transform to both sides. Defining the transform of
Pr[N_{t1,t2} = n] to be P(t2, z), we have²


∂P(t2, z)/∂t2 = −λ(t2) (1 − z⁻¹) P(t2, z)

Applying the boundary condition that P(t1, z) = 1, this simple first-order differential
equation has the solution

P(t2, z) = e^{−(1 − z⁻¹) ∫_{t1}^{t2} λ(α) dα}

To evaluate the inverse z-transform, we simply exploit the Taylor series expression for the
exponential, and we find that a Poisson probability mass function governs the count statistics
for a Poisson process.

Pr[N_{t1,t2} = n] = (∫_{t1}^{t2} λ(α) dα)^n / n! · e^{−∫_{t1}^{t2} λ(α) dα}   (2)

The integral of the intensity occurs frequently, and we succinctly denote it by
Λ_{t1}^{t2} = ∫_{t1}^{t2} λ(α) dα. When the Poisson process is stationary, the intensity
equals a constant, and the count statistics depend only on the difference t2 − t1.
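Equation (2) can be checked by simulation even in the nonstationary case. The sketch below assumes an example intensity λ(t) = 2 + 2t on [0, 1) (so Λ = 3, a value chosen for illustration) and generates counts by thinning a stationary process, comparing empirical count frequencies with the Poisson probability mass function:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(1)

lam = lambda t: 2.0 + 2.0 * t   # assumed example intensity on [0, 1)
lam_max = 4.0                   # upper bound on lam(t) over the interval
Lambda = 3.0                    # integral of lam over [0, 1)

def count_events():
    # Thinning: generate candidate events at the constant rate lam_max,
    # then keep each candidate at time t with probability lam(t)/lam_max.
    n = rng.poisson(lam_max)
    t = rng.random(n)           # candidate times, uniform on [0, 1)
    return int((rng.random(n) < lam(t) / lam_max).sum())

counts = np.array([count_events() for _ in range(20000)])
for n in range(5):
    empirical = (counts == n).mean()
    predicted = Lambda**n / factorial(n) * exp(-Lambda)
    print(n, round(empirical, 3), round(predicted, 3))
```

The empirical frequencies track the Poisson probabilities with Λ equal to the integrated intensity, as Equation (2) requires.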

2 Time of occurrence statistics


To derive the multivariate distribution of W, we use the count statistics and the indepen-
dence properties of the Poisson process. The density we seek satisfies

∫_{w1}^{w1+δ1} ⋯ ∫_{wn}^{wn+δn} p_{W(n)}(v) dv = Pr[W1 ∈ [w1, w1 + δ1), …, Wn ∈ [wn, wn + δn)]

² Remember, t1 is fixed and can be suppressed notationally.


The expression on the right equals the probability that no events occur in [t1, w1), one
event in [w1, w1 + δ1), no event in [w1 + δ1, w2), etc. Because of the independence of event
occurrence in these disjoint intervals, we can multiply together the probabilities of these event
occurrences, each of which is given by the count statistics.

Pr[W1 ∈ [w1, w1 + δ1), …, Wn ∈ [wn, wn + δn)]
  = e^{−Λ_{t1}^{w1}} Λ_{w1}^{w1+δ1} e^{−Λ_{w1}^{w1+δ1}} e^{−Λ_{w1+δ1}^{w2}} Λ_{w2}^{w2+δ2} e^{−Λ_{w2}^{w2+δ2}} ⋯ Λ_{wn}^{wn+δn} e^{−Λ_{wn}^{wn+δn}}

for small δk, where each factor Λ_{wk}^{wk+δk} ≈ λ(wk) δk. From this approximation, we find
that the joint distribution of the first n event times equals

p_{W(n)}(w) = (∏_{k=1}^{n} λ(wk)) e^{−∫_{t1}^{wn} λ(α) dα}   if t1 ≤ w1 ≤ w2 ≤ ⋯ ≤ wn,
              and 0 otherwise   (3)
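Equation (3) specializes nicely: for n = 1 and a constant intensity λ0, the first event time has density λ0 e^{−λ0 w}. A small sketch (λ0 and the coin-flip interval are assumed example values) draws first event times under the coin-flip model and checks the exponential mean and distribution function:

```python
import numpy as np
from math import exp

rng = np.random.default_rng(2)

lam0 = 3.0   # assumed constant intensity
dt = 1e-4    # coin-flip interval

# Under the coin-flip picture the number of flips up to and including the
# first head is geometric, so the first event time W1 is (flips - 1) * dt.
w1 = (rng.geometric(lam0 * dt, size=100000) - 1) * dt

# Equation (3) with n = 1 and lam(t) = lam0 predicts an exponential density
# lam0 * e^(-lam0 * w), with mean 1/lam0 and CDF 1 - e^(-lam0 * w).
print(w1.mean())          # ≈ 1/3
print((w1 < 0.5).mean())  # ≈ 1 - e^(-1.5)
```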

3 Sample function density


For Poisson processes, the sample function density describes the joint distribution of counts
and event times within a specified time interval. Thus, it can be written as

Pr[N_t | t1 ≤ t < t2] = Pr[N_{t1,t2} = n | W1 = w1, …, Wn = wn] p_{W(n)}(w)

The second term in the product equals the distribution derived previously for the time
of occurrence statistics. The conditional probability equals the probability that no events
occur between wn and t2; from the Poisson process's count statistics, this probability equals
e^{−Λ_{wn}^{t2}}. Consequently, the sample function density for the Poisson process, be it stationary
or not, equals

Pr[N_t | t1 ≤ t < t2] = (∏_{k=1}^{n} λ(wk)) e^{−∫_{t1}^{t2} λ(α) dα}   (4)

4 Properties
From the probability distributions derived on the previous pages, we can discern many
structural properties of the Poisson process. These properties set the stage for delineating
other point processes from the Poisson. Those other processes, as described subsequently,
have much more structure and are much more difficult to handle analytically.

4.1 The Counting Process

The counting process N_t is an independent increment process: for a Poisson process, the
numbers of events in disjoint intervals are statistically independent of each other. When the
Poisson process is stationary, increments taken over equi-duration intervals are identically
distributed as well as being statistically independent. Two important results follow from
this property. First, the counting process's covariance function K_N(t, u) equals σ² min{t, u}
(for a stationary process, σ² = λ0). This close relation to the Wiener waveform process
indicates the fundamental nature of the Poisson process in the world of point processes.
Note, however, that the Poisson counting process is not continuous almost surely. Second,
the sequence of counts forms an ergodic process, meaning we can estimate the intensity
parameter from observations.
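The independent-increment structure makes the covariance easy to verify numerically. This sketch (λ0, t, and u are assumed example values) builds N_t and N_u from independent Poisson increments and compares the sample covariance with λ0 min{t, u}:

```python
import numpy as np

rng = np.random.default_rng(3)

lam0 = 4.0       # assumed constant intensity
t, u = 0.7, 1.5  # assumed example times, t < u

# Independent increments: N_u = N_t + N_{t,u}, with the two terms independent.
n_t = rng.poisson(lam0 * t, size=200000)
n_u = n_t + rng.poisson(lam0 * (u - t), size=200000)

# K_N(t, u) = lam0 * min(t, u) = 4 * 0.7 = 2.8 for this stationary process
print(np.cov(n_t, n_u)[0, 1])   # ≈ 2.8
```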


The mean and variance of the number of events in an interval can be easily calculated
from the Poisson distribution. Alternatively, we can calculate the characteristic function
and evaluate its derivatives. The characteristic function of an increment equals
Φ_{N_{t1,t2}}(ν) = e^{(e^{iν} − 1) Λ_{t1}^{t2}}

The first two moments and variance of an increment of the Poisson process, be it stationary
or not, equal

E[N_{t1,t2}] = Λ_{t1}^{t2}   (5)

E[N_{t1,t2}²] = Λ_{t1}^{t2} + (Λ_{t1}^{t2})²

σ²(N_{t1,t2}) = Λ_{t1}^{t2}
Note that the mean equals the variance here, a trademark of the Poisson process.
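The moment formulas in (5) can be recovered from the characteristic function by numerical differentiation at ν = 0, as a quick sanity check (the value of the integrated intensity below is an assumed example):

```python
import numpy as np

Lambda = 3.0   # assumed value of the integrated intensity over [t1, t2)

def Phi(v):
    # Characteristic function of the increment N_{t1,t2}
    return np.exp((np.exp(1j * v) - 1.0) * Lambda)

h = 1e-5
m1 = ((Phi(h) - Phi(-h)) / (2j * h)).real              # E[N] = Phi'(0) / i
m2 = (-(Phi(h) - 2 * Phi(0) + Phi(-h)) / h**2).real    # E[N^2] = -Phi''(0)

print(m1)           # ≈ Lambda = 3
print(m2)           # ≈ Lambda + Lambda^2 = 12
print(m2 - m1**2)   # variance ≈ Lambda = 3: mean equals variance
```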

4.2 Poisson process event times form a Markov process

Consider the conditional density p_{Wn|Wn−1,…,W1}(wn | wn−1, …, w1). This density equals the
ratio of the event time densities for the n- and (n − 1)-dimensional event time vectors.
Simple substitution yields

∀wn ≥ wn−1:  p_{Wn|Wn−1,…,W1}(wn | wn−1, …, w1) = λ(wn) e^{−∫_{wn−1}^{wn} λ(α) dα}   (6)

Thus the nth event time depends only on when the (n − 1)th event occurs, meaning that we
have a Markov process. Note that event times are ordered: the nth event must occur after
the (n − 1)th, etc. Thus, the values of this Markov process keep increasing, meaning that
from this viewpoint, the event times form a nonstationary Markovian sequence. When
the process is stationary, the evolutionary density is exponential. It is this special form of
event occurrence time density that defines a Poisson process.
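Equation (6) suggests a direct way to simulate event times: march forward, drawing each event time from the evolutionary density given the previous one. For a stationary process that step is exponential, and the counts it produces should be Poisson with mean λ0 T (λ0 and the horizon T below are assumed example values):

```python
import numpy as np

rng = np.random.default_rng(5)

lam0 = 2.0   # assumed constant intensity
T = 10.0     # assumed observation horizon

def event_times():
    # Equation (6) with lam(t) = lam0: each event time equals the previous
    # one plus an independent exponential "evolutionary" step.
    w, times = 0.0, []
    while True:
        w += rng.exponential(1.0 / lam0)
        if w >= T:
            return times
        times.append(w)

counts = np.array([len(event_times()) for _ in range(5000)])
print(counts.mean())   # ≈ lam0 * T = 20
print(counts.var())    # ≈ lam0 * T = 20
```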

4.3 Interevent intervals in a Poisson process form a white sequence.

Exploiting the previous property, the duration of the nth interval τn = wn − wn−1 does
not depend on the lengths of previous (or future) intervals. Consequently, the sequence
of interevent intervals forms a "white" sequence. The sequence may not be identically
distributed unless the process is stationary. In the stationary case, interevent intervals are
truly white (they form an IID sequence) and have an exponential distribution.

∀τ ≥ 0:  p_{τn}(τ) = λ0 e^{−λ0 τ}   (7)

To show that the exponential density for a white sequence corresponds to the most "random"
distribution, Parzen [1] proved that the ordered times of n events sprinkled independently
and uniformly over a given interval form a stationary Poisson process. If the density of event
sprinkling is not uniform, the resulting ordered times constitute a nonstationary Poisson
process with an intensity proportional to the sprinkling density.
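The sprinkling construction is easy to test. The sketch below (λ0, the interval length, and the trial count are assumed example values) sprinkles a Poisson number of uniform points on an interval, sorts them, and checks that the interevent intervals have the exponential mean and are essentially uncorrelated:

```python
import numpy as np

rng = np.random.default_rng(6)

lam0, T, trials = 50.0, 1.0, 2000   # assumed example values

# Sprinkle a Poisson number of points uniformly over [0, T) and sort them;
# the ordered times then form a stationary Poisson process.
taus = []
for _ in range(trials):
    w = np.sort(rng.random(rng.poisson(lam0 * T)) * T)
    taus.append(np.diff(w))
tau = np.concatenate(taus)

print(tau.mean())                             # ≈ 1/lam0 = 0.02
print(np.corrcoef(tau[:-1], tau[1:])[0, 1])   # ≈ 0: a "white" sequence
```

The near-zero correlation between adjacent intervals is the whiteness property; a full independence check would examine higher-order statistics as well.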

5 Doubly stochastic Poisson processes


Here, the intensity λ(t) equals a sample function drawn from some waveform process. In
waveform processes, the analogous concept does not have nearly the impact it does here.
Because intensity waveforms must be non-negative, the intensity process must be nonzero-
mean and non-Gaussian. We shall assume throughout that the intensity process is stationary
for simplicity. This model arises in those situations in which the event occurrence rate
clearly varies unpredictably with time. Such processes have the property that the variance-
to-mean ratio of the number of events in any interval exceeds one. In the process of
deriving this last property, we illustrate the typical way of analyzing doubly stochastic
processes: condition on the intensity equaling a particular sample function, use the statis-
tical characteristics of nonstationary Poisson processes, then "average" with respect to the
intensity process. To calculate the expected number N_{t1,t2} of events in an interval, we use
conditional expected values:

E[N_{t1,t2}] = E[E[N_{t1,t2} | λ(t), t1 ≤ t < t2]]
             = E[∫_{t1}^{t2} λ(α) dα]
             = (t2 − t1) E[λ(t)]   (8)

This result can also be written as the expected value of the integrated intensity:
E[N_{t1,t2}] = E[Λ_{t1}^{t2}]. Similar calculations yield the increment's second moment and variance.


E[N_{t1,t2}²] = E[Λ_{t1}^{t2}] + E[(Λ_{t1}^{t2})²]

σ²(N_{t1,t2}) = E[Λ_{t1}^{t2}] + σ²(Λ_{t1}^{t2})

Using the last result, we find that the variance-to-mean ratio in a doubly stochastic process
always exceeds unity, equaling one plus the variance-to-mean ratio of the integrated intensity.
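A minimal check of this variance-to-mean property, assuming (for illustration only) an intensity that is constant over the interval but drawn fresh on each trial from a uniform distribution:

```python
import numpy as np

rng = np.random.default_rng(7)

trials = 200000
T = 1.0   # assumed interval length

# Assumed intensity process: on each trial the intensity is a constant
# drawn uniformly from [2, 6] events/s, so Lambda = lam * T.
lam = rng.uniform(2.0, 6.0, size=trials)
Lam = lam * T
n = rng.poisson(Lam)   # conditionally Poisson counts, averaged over Lam

ratio = n.var() / n.mean()
# Predicted: 1 + var(Lambda)/mean(Lambda) = 1 + (16/12)/4 = 4/3
print(ratio)   # ≈ 1.33, strictly greater than one
```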
The approach of sample-function conditioning can also be used to derive the density
of the number of events occurring in an interval for a doubly stochastic Poisson process.
Conditioned on the occurrence of a sample function, the probability of n events occurring
in the interval [t1, t2) equals (Equation 2)

Pr[N_{t1,t2} = n | λ(t), t1 ≤ t < t2] = (Λ_{t1}^{t2})^n / n! · e^{−Λ_{t1}^{t2}}

Because Λ_{t1}^{t2} is a random variable, the unconditional distribution equals this conditional
probability averaged with respect to this random variable's density. This average is known
as the Poisson transform of the random variable's density.

Pr[N_{t1,t2} = n] = ∫_0^∞ (α^n / n!) e^{−α} p_{Λ_{t1}^{t2}}(α) dα   (9)
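The Poisson transform can be evaluated in closed form for some densities. A classical case (assumed here for illustration; it is not taken from the module) is a gamma density for Λ, whose transform yields a negative binomial distribution. The sketch checks a numeric evaluation of Equation (9) against that closed form:

```python
import numpy as np
from math import gamma, factorial

a, b = 3.0, 1.5   # assumed gamma shape and rate for Lambda

def p_Lambda(alpha):
    # Gamma density for the integrated intensity Lambda
    return b**a * alpha**(a - 1.0) * np.exp(-b * alpha) / gamma(a)

alpha = np.linspace(1e-9, 60.0, 400001)   # numeric grid for Equation (9)
dalpha = alpha[1] - alpha[0]

def poisson_transform(n):
    f = alpha**n / factorial(n) * np.exp(-alpha) * p_Lambda(alpha)
    return float((0.5 * (f[:-1] + f[1:])).sum() * dalpha)   # trapezoid rule

def negative_binomial(n):
    # Closed form of the transform for a gamma mixing density
    return gamma(n + a) / (factorial(n) * gamma(a)) * (b / (1.0 + b))**a * (1.0 + b)**(-n)

for n in range(4):
    print(n, round(poisson_transform(n), 6), round(negative_binomial(n), 6))
```

The heavier-than-Poisson tail of the negative binomial reflects the variance-to-mean ratio exceeding one.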

References
[1] E. Parzen. Stochastic Processes. Holden-Day, San Francisco, 1962.
[2] D. L. Snyder. Random Point Processes. Wiley, New York, 1975.
