
The Exponential Distribution and the Poisson Process 297

Coxian random variables often arise in the following manner. Suppose that an item
must go through m stages of treatment to be cured. However, suppose that after each
stage there is a probability that the item will quit the program. If we suppose that
the amounts of time that it takes the item to pass through the successive stages are
independent exponential random variables, and that the probability that an item that
has just completed stage n quits the program is (independent of how long it took to
go through the n stages) equal to r (n), then the total time that an item spends in the
program is a Coxian random variable. ■

5.3 The Poisson Process


5.3.1 Counting Processes
A stochastic process {N (t), t ≥ 0} is said to be a counting process if N (t) represents
the total number of “events” that occur by time t. Some examples of counting processes
are the following:
(a) If we let N (t) equal the number of persons who enter a particular store at or prior
to time t, then {N (t), t ≥ 0} is a counting process in which an event corresponds
to a person entering the store. Note that if we had let N (t) equal the number of
persons in the store at time t, then {N (t), t ≥ 0} would not be a counting process
(why not?).
(b) If we say that an event occurs whenever a child is born, then {N (t), t ≥ 0} is a
counting process when N (t) equals the total number of people who were born by
time t. (Does N (t) include persons who have died by time t? Explain why it must.)
(c) If N (t) equals the number of goals that a given soccer player scores by time t, then
{N (t), t ≥ 0} is a counting process. An event of this process will occur whenever
the soccer player scores a goal.
From its definition we see that for a counting process N (t) must satisfy:
(i) N (t) ≥ 0.
(ii) N (t) is integer valued.
(iii) If s < t, then N (s) ≤ N (t).
(iv) For s < t, N (t) − N (s) equals the number of events that occur in the interval
(s, t].
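Properties (i)-(iv) can be checked mechanically for any finite list of event times. The sketch below (the helper name `make_counting_process` is ours, not the text's) builds N (t) from a list of event times and verifies each property:

```python
import bisect

def make_counting_process(event_times):
    """Return N, where N(t) is the number of events occurring in [0, t]."""
    times = sorted(event_times)
    def N(t):
        # bisect_right counts how many event times are <= t
        return bisect.bisect_right(times, t)
    return N

N = make_counting_process([0.5, 1.2, 1.2, 3.7])
assert N(0.0) >= 0                  # property (i): nonnegative
assert isinstance(N(2.0), int)      # property (ii): integer valued
assert N(1.0) <= N(2.0)             # property (iii): nondecreasing
assert N(4.0) - N(1.0) == 3         # property (iv): events in (1.0, 4.0]
```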
A counting process is said to possess independent increments if the numbers of
events that occur in disjoint time intervals are independent. For example, this means
that the number of events that occur by time 10 (that is, N (10)) must be independent
of the number of events that occur between times 10 and 15 (that is, N (15) − N (10)).
The assumption of independent increments might be reasonable for example (a),
but it probably would be unreasonable for example (b). The reason for this is that if in
example (b) N (t) is very large, then it is probable that there are many people alive at
time t; this would lead us to believe that the number of new births between time t and
time t + s would also tend to be large (that is, it does not seem reasonable that N (t)

is independent of N (t + s) − N (t), and so {N (t), t ≥ 0} would not have independent


increments in example (b)). The assumption of independent increments in example (c)
would be justified if we believed that the soccer player’s chances of scoring a goal
today do not depend on “how he’s been going.” It would not be justified if we believed
in “hot streaks” or “slumps.”
A counting process is said to possess stationary increments if the distribution of the
number of events that occur in any interval of time depends only on the length of the
time interval. In other words, the process has stationary increments if the number of
events in the interval (s, s + t) has the same distribution for all s.
The assumption of stationary increments would only be reasonable in example (a)
if there were no times of day at which people were more likely to enter the store.
Thus, for instance, if there was a rush hour (say, between 12 P.M. and 1 P.M.) each day,
then the stationarity assumption would not be justified. If we believed that the earth’s
population is basically constant (a belief not held at present by most scientists), then
the assumption of stationary increments might be reasonable in example (b). Stationary
increments do not seem to be a reasonable assumption in example (c) since, for one
thing, most people would agree that the soccer player would probably score more goals
while in the age bracket 25–30 than he would while in the age bracket 35–40. It may,
however, be reasonable over a smaller time horizon, such as one year.

5.3.2 Definition of the Poisson Process


One of the most important types of counting processes is the Poisson process. As a prelude
to giving its definition, we define the concept of a function f (·) being o(h).
Definition 5.1 The function f (·) is said to be o(h) if

lim_{h→0} f (h)/h = 0
Example 5.12
(a) The function f (x) = x² is o(h) since

lim_{h→0} f (h)/h = lim_{h→0} h²/h = lim_{h→0} h = 0

(b) The function f (x) = x is not o(h) since

lim_{h→0} f (h)/h = lim_{h→0} h/h = lim_{h→0} 1 = 1 ≠ 0

(c) If f (·) is o(h) and g(·) is o(h), then so is f (·) + g(·). This follows since

lim_{h→0} ( f (h) + g(h))/h = lim_{h→0} f (h)/h + lim_{h→0} g(h)/h = 0 + 0 = 0

(d) If f (·) is o(h), then so is g(·) = c f (·). This follows since

lim_{h→0} c f (h)/h = c lim_{h→0} f (h)/h = c · 0 = 0

(e) From (c) and (d) it follows that any finite linear combination of functions, each of
which is o(h), is o(h). ■
In order for the function f (·) to be o(h) it is necessary that f (h)/h go to zero as h
goes to zero. But if h goes to zero, the only way for f (h)/h to go to zero is for f (h) to
go to zero faster than h does. That is, for h small, f (h) must be small compared with h.
The o(h) notation can be used to make statements more precise. For instance, if
X is continuous with density f and failure rate function λ(t), then the approximate
statements

P(t < X < t + h) ≈ f (t) h


P(t < X < t + h|X > t) ≈ λ(t) h

can be precisely expressed as

P(t < X < t + h) = f (t) h + o(h)


P(t < X < t + h|X > t) = λ(t) h + o(h)
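These o(h) statements can be checked numerically. Take X exponential with rate λ, where f (t) = λe^{−λt} and λ(t) ≡ λ; the error P(t < X < t + h) − f (t)h should vanish faster than h. A minimal sketch (the parameter values are arbitrary):

```python
import math

# X exponential with rate lam: f(t) = lam*e^(-lam*t), failure rate lambda(t) = lam
lam, t = 2.0, 0.7
f_t = lam * math.exp(-lam * t)

def interval_prob(h):
    # exact P(t < X < t + h) from the exponential distribution function
    return math.exp(-lam * t) - math.exp(-lam * (t + h))

# P(t < X < t + h) = f(t)h + o(h): the error over h should vanish as h -> 0
ratios = [abs(interval_prob(h) - f_t * h) / h for h in (0.1, 0.01, 0.001)]
assert ratios[2] < ratios[1] < ratios[0]
assert ratios[2] < 1e-3
```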

We are now in position to define the Poisson process.


Definition 5.2 The counting process {N (t), t ≥ 0} is said to be a Poisson process
with rate λ > 0 if the following axioms hold:
(i) N (0) = 0
(ii) {N (t), t ≥ 0} has independent increments
(iii) P(N (t + h) − N (t) = 1) = λh + o(h)
(iv) P(N (t + h) − N (t) ≥ 2) = o(h)
The preceding is called a Poisson process because the number of events in any interval
of length t is Poisson distributed with mean λt, as is shown by the following important
theorem.
Theorem 5.1 If {N (t), t ≥ 0} is a Poisson process with rate λ > 0, then for all
s > 0, t > 0, N (s + t) − N (s) is a Poisson random variable with mean λt. That is, the
number of events in any interval of length t is a Poisson random variable with mean λt.
Proof. We begin by deriving E[e−u N (t) ], the Laplace transform of N (t). To do so,
fix u > 0 and define

g(t) = E[e−u N (t) ]

We will obtain g(t) by deriving a differential equation as follows.

g(t + h) = E[e−u N (t+h) ]
= E[e−u(N (t)+N (t+h)−N (t)) ]
= E[e−u N (t) e−u(N (t+h)−N (t)) ]
= E[e−u N (t) ] E[e−u(N (t+h)−N (t)) ]   (by independent increments)
= g(t) E[e−u(N (t+h)−N (t)) ]   (5.10)

Now, from Axioms (iii) and (iv)


P{N (t + h) − N (t) = 0} = 1 − λh + o(h)
P{N (t + h) − N (t) = 1} = λh + o(h)
P{N (t + h) − N (t) ≥ 2} = o(h)
Conditioning on which of these three possibilities occurs gives that
E[e−u[N (t+h)−N (t)] ] = 1 − λh + o(h) + e−u (λh + o(h)) + o(h)
= 1 − λh + e−u λh + o(h) (5.11)
Therefore, from Equations (5.10) and (5.11) we obtain
g(t + h) = g(t)(1 + λh(e−u − 1) + o(h))
which can be written as
(g(t + h) − g(t))/h = g(t)λ(e−u − 1) + o(h)/h
Letting h → 0 yields the differential equation
g′(t) = g(t) λ(e−u − 1)
or
g′(t)/g(t) = λ(e−u − 1)
Noting that the left side is the derivative of log(g(t)) yields, upon integration, that
log(g(t)) = λ(e−u − 1)t + C
Because g(0) = E[e−u N (0) ] = 1 it follows that C = 0, and so the Laplace transform
of N (t) is
E[e−u N (t) ] = g(t) = exp{λt (e−u − 1)}
However, if X is a Poisson random variable with mean λt, then its Laplace transform is

E[e−u X ] = Σ_i e−ui e−λt (λt)i /i!
= e−λt Σ_i (λt e−u )i /i! = e−λt exp{λt e−u } = exp{λt (e−u − 1)}

Because the Laplace transform uniquely determines the distribution, we can thus con-
clude that N (t) is Poisson with mean λt.
To show that N (s + t) − N (s) is also Poisson with mean λt, fix s and let Ns (t) =
N (s + t) − N (s) equal the number of events in the first t time units when we start
our count at time s. It is now straightforward to verify that the counting process
{Ns (t), t ≥ 0} satisfies all the axioms for being a Poisson process with rate λ. Conse-
quently, by our preceding result, we can conclude that Ns (t) is Poisson distributed with
mean λt. ■

Remarks
(i) The result that N (t), or more generally N (t + s) − N (s), has a Poisson distribution
is a consequence of the Poisson approximation to the binomial distribution (see
Section 2.2.4). To see this, subdivide the interval [0, t] into k equal parts where k
is very large (Figure 5.1). Now it can be shown using axiom (iv) of Definition 5.2
that as k increases to ∞ the probability of having two or more events in any of
the k subintervals goes to 0. Hence, N (t) will (with a probability going to 1) just
equal the number of subintervals in which an event occurs. However, by stationary
and independent increments this number will have a binomial distribution with
parameters k and p = λt/k + o(t/k). Hence, by the Poisson approximation to the
binomial we see by letting k approach ∞ that N (t) will have a Poisson distribution
with mean equal to

lim_{k→∞} k [λt/k + o(t/k)] = λt + lim_{k→∞} t · o(t/k)/(t/k) = λt
by using the definition of o(h) and the fact that t/k → 0 as k → ∞.
(ii) Because the distribution of N (t + s) − N (s) is the same for all s, it follows that
the Poisson process has stationary increments.
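Remark (i)'s binomial approximation can be illustrated numerically: for a fixed j, the Binomial(k, λt/k) probability of j occupied subintervals approaches the Poisson(λt) probability as k grows. A sketch (the values chosen are arbitrary):

```python
import math

def binom_pmf(j, k, p):
    return math.comb(k, j) * p ** j * (1 - p) ** (k - j)

lam, t, j = 2.0, 1.0, 3            # compare P(N(t) = 3)
poisson = math.exp(-lam * t) * (lam * t) ** j / math.factorial(j)

# k subintervals, each occupied with probability ~ lam*t/k
errors = [abs(binom_pmf(j, k, lam * t / k) - poisson) for k in (10, 100, 1000)]
assert errors[2] < errors[1] < errors[0]   # approximation improves with k
```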

5.3.3 Interarrival and Waiting Time Distributions


Consider a Poisson process, and let us denote the time of the first event by T1 . Further,
for n > 1, let Tn denote the elapsed time between the (n − 1)st and the nth event. The
sequence {Tn , n = 1, 2, . . .} is called the sequence of interarrival times. For instance,
if T1 = 5 and T2 = 10, then the first event of the Poisson process would have occurred
at time 5 and the second at time 15.
We shall now determine the distribution of the Tn . To do so, we first note that the
event {T1 > t} takes place if and only if no events of the Poisson process occur in the
interval [0, t] and thus,
P{T1 > t} = P{N (t) = 0} = e−λt
Hence, T1 has an exponential distribution with mean 1/λ. Now,
P{T2 > t} = E[P{T2 > t|T1 }]
However,
P{T2 > t | T1 = s} = P{0 events in (s, s + t] | T1 = s}
= P{0 events in (s, s + t]}
= e−λt (5.12)

Figure 5.1

where the last two equations followed from independent and stationary increments.
Therefore, from Equation (5.12) we conclude that T2 is also an exponential random
variable with mean 1/λ and, furthermore, that T2 is independent of T1 . Repeating the
same argument yields the following.
Proposition 5.1 Tn , n = 1, 2, . . . , are independent identically distributed exponential
random variables having mean 1/λ.
Remark The proposition should not surprise us. The assumption of stationary and
independent increments is basically equivalent to asserting that, at any point in time,
the process probabilistically restarts itself. That is, the process from any point on is
independent of all that has previously occurred (by independent increments), and also
has the same distribution as the original process (by stationary increments). In other
words, the process has no memory, and hence exponential interarrival times are to be
expected.
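Proposition 5.1 can be checked by simulating the process directly from axioms (iii) and (iv): flip a coin with success probability λh in each small time slot. Under this discretization the interarrival times should look approximately exponential with mean 1/λ. A rough sketch (the step size and horizon are arbitrary choices, and the tolerances are loose because the check is stochastic):

```python
import math
import random

random.seed(42)
lam, h, horizon = 1.0, 0.001, 2000.0

# In each slot of length h an event occurs with probability lam*h (axiom (iii));
# slots are independent (independent increments); two events per slot are ignored.
arrivals, t = [], 0.0
while t < horizon:
    t += h
    if random.random() < lam * h:
        arrivals.append(t)

inter = [b - a for a, b in zip([0.0] + arrivals, arrivals)]
mean_T = sum(inter) / len(inter)
tail = sum(1 for x in inter if x > 1.0) / len(inter)

assert abs(mean_T - 1 / lam) < 0.1              # exponential mean 1/lam
assert abs(tail - math.exp(-lam * 1.0)) < 0.06  # P{T > 1} = e^{-lam}
```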
Another quantity of interest is Sn , the arrival time of the nth event, also called the
waiting time until the nth event. It is easily seen that

Sn = Σ_{i=1}^{n} Ti ,   n ≥ 1

and hence from Proposition 5.1 and the results of Section 2.2 it follows that Sn has a
gamma distribution with parameters n and λ. That is, the probability density of Sn is
given by

f_Sn (t) = λe−λt (λt)n−1 /(n − 1)! ,   t ≥ 0   (5.13)
Equation (5.13) may also be derived by noting that the nth event will occur prior to or
at time t if and only if the number of events occurring by time t is at least n. That is,

N (t) ≥ n ⇔ Sn ≤ t

Hence,

F_Sn (t) = P{Sn ≤ t} = P{N (t) ≥ n} = Σ_{j=n}^{∞} e−λt (λt)j /j!

which, upon differentiation, yields



f_Sn (t) = −Σ_{j=n}^{∞} λe−λt (λt)j /j! + Σ_{j=n}^{∞} λe−λt (λt)j−1 /( j − 1)!
= λe−λt (λt)n−1 /(n − 1)! + Σ_{j=n+1}^{∞} λe−λt (λt)j−1 /( j − 1)! − Σ_{j=n}^{∞} λe−λt (λt)j /j!
= λe−λt (λt)n−1 /(n − 1)!

Example 5.13 Suppose that people immigrate into a territory at a Poisson rate λ = 1
per day.
(a) What is the expected time until the tenth immigrant arrives?
(b) What is the probability that the elapsed time between the tenth and the eleventh
arrival exceeds two days?
Solution:
(a) E[S10 ] = 10/λ = 10 days.
(b) P{T11 > 2} = e−2λ = e−2 ≈ 0.135. ■
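The two answers can be reproduced in a couple of lines:

```python
import math

lam = 1.0   # one immigrant per day

expected_S10 = 10 / lam          # (a) E[S_10] = 10/lam
prob = math.exp(-2 * lam)        # (b) P{T_11 > 2} = e^{-2*lam}

assert expected_S10 == 10.0
assert abs(prob - 0.1353) < 5e-4   # e^{-2} is approximately 0.135
```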
Proposition 5.1 also gives us another way of defining a Poisson process. Suppose
we start with a sequence {Tn , n  1} of independent identically distributed exponential
random variables each having mean 1/λ. Now let us define a counting process by saying
that the nth event of this process occurs at time
Sn ≡ T1 + T2 + · · · + Tn
The resultant counting process {N (t), t ≥ 0}∗ will be Poisson with rate λ.
Remark Another way of obtaining the density function of Sn is to note that because
Sn is the time of the nth event,
P{t < Sn < t + h} = P{N (t) = n − 1, one event in (t, t + h)} + o(h)
= P{N (t) = n − 1} P{one event in (t, t + h)} + o(h)
= e−λt ((λt)n−1 /(n − 1)!) [λh + o(h)] + o(h)
= λe−λt ((λt)n−1 /(n − 1)!) h + o(h)

where the first equality uses the fact that the probability of 2 or more events in (t, t + h)
is o(h). If we now divide both sides of the preceding equation by h and then let h → 0,
we obtain
f_Sn (t) = λe−λt (λt)n−1 /(n − 1)!

5.3.4 Further Properties of Poisson Processes


Consider a Poisson process {N (t), t ≥ 0} having rate λ, and suppose that each time an
event occurs it is classified as either a type I or a type II event. Suppose further that each
event is classified as a type I event with probability p or a type II event with probability
1 − p, independently of all other events. For example, suppose that customers arrive
at a store in accordance with a Poisson process having rate λ; and suppose that each
arrival is male with probability 1/2 and female with probability 1/2. Then a type I event
would correspond to a male arrival and a type II event to a female arrival.
∗ A formal definition of N (t) is given by N (t) ≡ max{n: Sn ≤ t} where S0 ≡ 0.

Let N1 (t) and N2 (t) denote respectively the number of type I and type II events
occurring in [0, t]. Note that N (t) = N1 (t) + N2 (t).
Proposition 5.2 {N1 (t), t ≥ 0} and {N2 (t), t ≥ 0} are both Poisson processes having
respective rates λ p and λ(1 − p). Furthermore, the two processes are independent.
Proof. It is easy to verify that {N1 (t), t ≥ 0} is a Poisson process with rate λ p by
verifying that it satisfies Definition 5.2.
• N1 (0) = 0 follows from the fact that N (0) = 0.
• It is easy to see that {N1 (t), t ≥ 0} inherits the stationary and independent increment
properties of the process {N (t), t  0}. This is true because the distribution of
the number of type I events in an interval can be obtained by conditioning on the
number of events in that interval, and the distribution of this latter quantity depends
only on the length of the interval and is independent of what has occurred in any
nonoverlapping interval.
• P{N1 (h) = 1} = P{N1 (h) = 1 | N (h) = 1} P{N (h) = 1}
+ P{N1 (h) = 1 | N (h) ≥ 2} P{N (h) ≥ 2}
= p(λh + o(h)) + o(h)
= λ ph + o(h)

• P{N1 (h) ≥ 2} ≤ P{N (h) ≥ 2} = o(h)


Thus we see that {N1 (t), t ≥ 0} is a Poisson process with rate λ p and, by a similar
argument, that {N2 (t), t ≥ 0} is a Poisson process with rate λ(1 − p). Because the
probability of a type I event in the interval from t to t + h is independent of all that
occurs in intervals that do not overlap (t, t + h), it is independent of knowledge of
when type II events occur, showing that the two Poisson processes are independent.
(For another way of proving independence, see Example 3.23.) 
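Proposition 5.2 can be illustrated by simulation: generate N (t) from exponential interarrival times, classify each event as type I with probability p, and compare the sample means (and the sample covariance, which should be near zero) with λ pt and λ(1 − p)t. A sketch with arbitrary parameters and loose stochastic tolerances:

```python
import random

random.seed(7)
lam, p, t_end, trials = 3.0, 0.4, 1.0, 20000

n1s, n2s = [], []
for _ in range(trials):
    # draw N(t_end) by summing exponential(lam) interarrival times
    n, t = 0, random.expovariate(lam)
    while t <= t_end:
        n += 1
        t += random.expovariate(lam)
    n1 = sum(1 for _ in range(n) if random.random() < p)   # type I events
    n1s.append(n1)
    n2s.append(n - n1)

m1 = sum(n1s) / trials
m2 = sum(n2s) / trials
cov = sum(a * b for a, b in zip(n1s, n2s)) / trials - m1 * m2

assert abs(m1 - lam * p * t_end) < 0.05          # mean ~ lam*p = 1.2
assert abs(m2 - lam * (1 - p) * t_end) < 0.05    # mean ~ lam*(1-p) = 1.8
assert abs(cov) < 0.1                            # independence: covariance ~ 0
```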
Example 5.14 If immigrants to area A arrive at a Poisson rate of ten per week, and if
each immigrant is of English descent with probability 1/12, then what is the probability
that no people of English descent will emigrate to area A during the month of February?
Solution: By the previous proposition it follows that the number of Englishmen
emigrating to area A during the month of February is Poisson distributed with mean
4 · 10 · (1/12) = 10/3. Hence, the desired probability is e−10/3 . ■
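The arithmetic of Example 5.14 (treating February as 4 weeks, as the text does):

```python
import math

rate_per_week = 10
p_english = 1 / 12
weeks = 4

mean = weeks * rate_per_week * p_english   # Poisson mean = 10/3
prob_none = math.exp(-mean)                # P(no English immigrants)

assert abs(mean - 10 / 3) < 1e-12
assert abs(prob_none - math.exp(-10 / 3)) < 1e-12
```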

Example 5.15 Suppose nonnegative offers to buy an item that you want to sell arrive
according to a Poisson process with rate λ. Assume that each offer is the value of a
continuous random variable having density function f (x). Once the offer is presented
to you, you must either accept it or reject it and wait for the next offer. We suppose that
you incur costs at a rate c per unit time until the item is sold, and that your objective is
to maximize your expected total return, where the total return is equal to the amount
received minus the total cost incurred. Suppose you employ the policy of accepting the
first offer that is greater than some specified value y. (Such a type of policy, which we
call a y-policy, can be shown to be optimal.) What is the best value of y? What is the
maximal expected net return?

Solution: Let us compute the expected total return when you use the y-policy, and
then choose the value of y that maximizes this quantity. Let X denote the value of
a random offer, and let F̄(x) = P{X > x} = ∫_x^∞ f (u) du be its tail distribution
function. Because each offer will be greater than y with probability F̄(y), it follows
that such offers occur according to a Poisson process with rate λ F̄(y). Hence, the
time until an offer is accepted is an exponential random variable with rate λ F̄(y).
Letting R(y) denote the total return from the policy that accepts the first offer that
is greater than y, we have

E[R(y)] = E[accepted offer] − c E[time to accept]
= E[X | X > y] − c/(λ F̄(y))
= ∫_0^∞ x f_{X |X >y}(x) d x − c/(λ F̄(y))
= ∫_y^∞ x f (x)/F̄(y) d x − c/(λ F̄(y))
= (∫_y^∞ x f (x) d x − c/λ) / F̄(y)   (5.14)

Differentiation yields
d E[R(y)]/dy = 0 ⇔ −F̄(y) y f (y) + (∫_y^∞ x f (x) d x − c/λ) f (y) = 0

Therefore, the optimal value of y satisfies


y F̄(y) = ∫_y^∞ x f (x) d x − c/λ

or
y ∫_y^∞ f (x) d x = ∫_y^∞ x f (x) d x − c/λ

or
∫_y^∞ (x − y) f (x) d x = c/λ

It is not difficult to show that there is a unique value of y that satisfies the preceding.
Hence, the optimal policy is the one that accepts the first offer that is greater than
y ∗ , where y ∗ is such that
∫_{y∗}^∞ (x − y∗) f (x) d x = c/λ

Putting y = y∗ in Equation (5.14) shows that the maximal expected net return is

E[R(y∗)] = (1/F̄(y∗)) (∫_{y∗}^∞ (x − y∗ + y∗) f (x) d x − c/λ)
= (1/F̄(y∗)) (∫_{y∗}^∞ (x − y∗) f (x) d x + y∗ ∫_{y∗}^∞ f (x) d x − c/λ)
= (1/F̄(y∗)) (c/λ + y∗ F̄(y∗) − c/λ)
= y∗

Thus, the optimal critical value is also the maximal expected net return. To understand
why this is so, let m be the maximal expected net return, and note that when an offer
is rejected the problem basically starts anew and so the maximal expected additional
net return from then on is m. But this implies that it is optimal to accept an offer
if and only if it is at least as large as m, showing that m is the optimal critical
value. ■
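For a concrete family of offer distributions the optimality equation can be solved numerically. The sketch below assumes, purely for illustration, that offers are exponential with rate μ; it solves ∫_y^∞ (x − y) f (x) d x = c/λ by bisection, checks the result against the closed form available in this special case, and confirms the identity E[R(y∗)] = y∗:

```python
import math

mu, lam, c = 1.0, 2.0, 0.5    # illustrative parameters

def excess(y):
    # for Exp(mu) offers: integral_y^inf (x - y) mu e^{-mu x} dx = e^{-mu y}/mu
    return math.exp(-mu * y) / mu

# solve excess(y) = c/lam by bisection (excess is decreasing in y)
lo, hi = 0.0, 50.0
for _ in range(100):
    mid = (lo + hi) / 2
    if excess(mid) > c / lam:
        lo = mid
    else:
        hi = mid
y_star = (lo + hi) / 2
assert abs(y_star - (-1 / mu) * math.log(mu * c / lam)) < 1e-9

def expected_return(y):
    # Eq. (5.14) with integral_y^inf x mu e^{-mu x} dx = e^{-mu y}(y + 1/mu)
    integral = math.exp(-mu * y) * (y + 1 / mu)
    return (integral - c / lam) / math.exp(-mu * y)

assert abs(expected_return(y_star) - y_star) < 1e-9
```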
It follows from Proposition 5.2 that if each of a Poisson number of individuals is
independently classified into one of two possible groups with respective probabilities p
and 1 − p, then the number of individuals in each of the two groups will be independent
Poisson random variables. Because this result easily generalizes to the case where the
classification is into any one of r possible groups, we have the following application
to a model of employees moving about in an organization.
Example 5.16 Consider a system in which individuals at any time are classified as
being in one of r possible states, and assume that an individual changes states in accor-
dance with a Markov chain having transition probabilities P_{ij}, i, j = 1, . . . , r. That
is, if an individual is in state i during a time period then, independently of its previous
states, it will be in state j during the next time period with probability P_{ij}. The individ-
uals are assumed to move through the system independently of each other. Suppose that
the numbers of people initially in states 1, 2, . . . , r are independent Poisson random
variables with respective means λ1 , λ2 , . . . , λr . We are interested in determining the
joint distribution of the numbers of individuals in states 1, 2, . . . , r at some time n.
Solution: For fixed i, let N_j(i), j = 1, . . . , r denote the number of those individuals,
initially in state i, that are in state j at time n. Now each of the (Poisson distributed)
number of people initially in state i will, independently of each other, be in state j at
time n with probability P^n_{ij}, where P^n_{ij} is the n-stage transition probability
for the Markov chain having transition probabilities P_{ij}. Hence, the N_j(i), j =
1, . . . , r will be independent Poisson random variables with respective means λ_i P^n_{ij},
j = 1, . . . , r. Because the sum of independent Poisson random variables is itself a
Poisson random variable, it follows that the number of individuals in state j at time n,
namely Σ_{i=1}^r N_j(i), will be independent Poisson random variables with respective
means Σ_i λ_i P^n_{ij}, for j = 1, . . . , r. ■
Example 5.17 (The Coupon Collecting Problem) There are m different types of
coupons. Each time a person collects a coupon it is, independently of ones previously
