Lecture 5 20240325

The Exponential Distribution and the Poisson Process
Introduction
In making a mathematical model for a real-world
phenomenon it is always necessary to make certain
simplifying assumptions so as to render the mathematics
tractable.

One simplifying assumption that is often made is that certain random variables are exponentially distributed. The reason is that the exponential distribution is both relatively easy to work with and often a good approximation to the actual distribution.

- Properties of the exponential distribution: Sect. 5.2
- The Poisson process: Sect. 5.3


Definition of the Exponential Distribution
A continuous random variable X is said to have an exponential distribution with parameter $\lambda$, $\lambda > 0$, if its density is
$$f(x) = \begin{cases} \lambda e^{-\lambda x}, & \text{if } x \ge 0 \\ 0, & \text{otherwise} \end{cases}$$
with cumulative distribution function
$$F(x) = \int_0^x \lambda e^{-\lambda y}\, dy = \begin{cases} 1 - e^{-\lambda x}, & x \ge 0 \\ 0, & x < 0 \end{cases}$$

Statistics
- Mean: $E[X] = \frac{1}{\lambda}$
- Moment generating function: $\phi(t) = E[e^{tX}] = \frac{\lambda}{\lambda - t}$ (for $t < \lambda$)
- Second moment: $E[X^2] = \frac{2}{\lambda^2}$
- Variance: $\mathrm{Var}(X) = \frac{1}{\lambda^2}$
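As a quick numerical check of these formulas, here is a minimal sketch (assuming NumPy; the rate value 0.5 is arbitrary) that estimates the mean, second moment, and variance from simulated draws:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5                          # arbitrary rate chosen for the check
x = rng.exponential(scale=1 / lam, size=1_000_000)

print(x.mean(), 1 / lam)           # sample mean vs E[X] = 1/lambda
print((x**2).mean(), 2 / lam**2)   # sample second moment vs E[X^2] = 2/lambda^2
print(x.var(), 1 / lam**2)         # sample variance vs Var(X) = 1/lambda^2
```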
Properties of the Exponential Distribution
A random variable X is said to be without memory, or
memoryless, if
$$P\{X > s + t \mid X > t\} = P\{X > s\} \quad \text{for all } s, t > 0 \qquad (5.2)$$
In other words, if the instrument is alive at time t, then the distribution of the remaining amount of time that it survives is the same as the original lifetime distribution; that is, the instrument does not remember that it has already been in use for a time t.

Equation (5.2) is equivalent to
$$\frac{P\{X > s + t,\, X > t\}}{P\{X > t\}} = P\{X > s\} \quad\text{or}\quad P\{X > s + t\} = P\{X > s\}\, P\{X > t\} \qquad (5.3)$$

Since Equation (5.3) is satisfied when X is exponentially distributed (for $e^{-\lambda(s+t)} = e^{-\lambda s} e^{-\lambda t}$), it follows that exponentially distributed random variables are memoryless.
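A minimal empirical illustration of Eq. (5.2), assuming NumPy (the values λ = 1, s = 1, t = 2 are arbitrary): among simulated lifetimes that survive past t, the fraction surviving an additional s should match the unconditional P{X > s}.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, s, t = 1.0, 1.0, 2.0                 # arbitrary parameters for the demo
x = rng.exponential(scale=1 / lam, size=1_000_000)

survivors = x[x > t]                      # condition on the event {X > t}
cond = np.mean(survivors - t > s)         # estimate of P{X > s + t | X > t}
uncond = np.mean(x > s)                   # estimate of P{X > s}
print(cond, uncond, np.exp(-lam * s))     # all three should nearly agree
```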
Examples
Example 5.2 Suppose that the amount of time one spends in a
bank is exponentially distributed with mean ten minutes, that is, λ =
1/10 . What is the probability that a customer will spend more than
fifteen minutes in the bank? What is the probability that a customer
will spend more than fifteen minutes in the bank given that she is
still in the bank after ten minutes?
$$P\{X > 15\} = e^{-15\lambda} = e^{-3/2} \approx 0.22$$
$$P\{X > 15 \mid X > 10\} = P\{X > 5\} = e^{-5\lambda} = e^{-1/2} \approx 0.61 \quad \text{(by memorylessness)}$$

Example 5.3 Consider a post office that is run by two clerks.
Suppose that when Mr. Smith enters the system he discovers that
Mr. Jones is being served by one of the clerks and Mr. Brown by the
other. Suppose also that Mr. Smith is told that his service will begin
as soon as either Jones or Brown leaves. If the amount of time that
a clerk spends with a customer is exponentially distributed with
mean 1/λ, what is the probability that, of the three customers, Mr.
Smith is the last to leave the post office?
By the lack of memory of the exponential, the answer is 1/2: when Mr. Smith begins service, the remaining service time of the customer still being served is again exponential with mean 1/λ, exactly as if that service were just starting, so by symmetry each of the two is equally likely to be the last to leave.
Examples
Example 5.4 The dollar amount of damage involved in an automobile
accident is an exponential random variable with mean 1000. Of this, the
insurance company only pays that amount exceeding (the deductible
amount of) 400. Find the expected value and the standard deviation of
the amount the insurance company pays per accident.
Let $Y = (X - 400)^+$ be the amount paid, and let $I = 1$ if $X \ge 400$ and $I = 0$ if $X < 400$ indicate whether the damage exceeds the deductible.
By the lack of memory property of the exponential, if a damage amount exceeds 400, then the amount by which it exceeds 400 is exponential with mean 1000. Therefore,
$$E[Y \mid I = 1] = 1000, \quad E[Y \mid I = 0] = 0 \;\Rightarrow\; E[Y \mid I] = 1000\, I$$
$$\mathrm{Var}(Y \mid I = 1) = 1000^2, \quad \mathrm{Var}(Y \mid I = 0) = 0 \;\Rightarrow\; \mathrm{Var}(Y \mid I) = 10^6\, I$$
$$E[Y] = E[E[Y \mid I]] = 10^3\, E[I] = 10^3 e^{-0.4} \approx 670.32$$
$$\mathrm{Var}(Y) = E[\mathrm{Var}(Y \mid I)] + \mathrm{Var}(E[Y \mid I]) = 10^6 e^{-0.4} + 10^6 e^{-0.4}(1 - e^{-0.4})$$
so the standard deviation is $\sqrt{\mathrm{Var}(Y)} \approx 944.09$.
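A simulation sketch of Example 5.4 (assuming NumPy): draw damage amounts with mean 1000, apply the 400 deductible, and compare with the values computed above.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=1000, size=1_000_000)  # damage amounts, mean 1000
y = np.maximum(x - 400, 0)                       # (X - 400)^+, the amount paid

print(y.mean(), 1000 * np.exp(-0.4))             # both ~670.32
print(y.std(), 1000 * np.sqrt(np.exp(-0.4) * (2 - np.exp(-0.4))))  # both ~944.09
```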

It turns out that not only is the exponential distribution “memoryless,”
but it is the unique distribution possessing this property.
Examples
Example 5.5 A store must decide how much of a certain
commodity to order so as to meet next month’s demand, where
that demand is assumed to have an exponential distribution with
rate λ. If the commodity costs the store c per pound, and can be
sold at a price of s > c per pound, how much should be ordered so
as to maximize the store’s expected profit? Assume that any
inventory left over at the end of the month is worthless and that
there is no penalty if the store cannot meet all the demand.
Let X equal the demand. If the store orders the amount t, then the
profit, call it P, is given by
$$P = s \min(X, t) - ct, \quad \text{where } \min(X, t) = X - (X - t)^+$$
$$E[(X - t)^+] = E[(X - t)^+ \mid X > t]\, P\{X > t\} + E[(X - t)^+ \mid X \le t]\, P\{X \le t\} = E[(X - t)^+ \mid X > t]\, e^{-\lambda t} = \frac{1}{\lambda} e^{-\lambda t}$$
where the last step uses memorylessness: given X > t, the excess X − t is exponential with mean 1/λ. Hence
$$E[\min(X, t)] = \frac{1}{\lambda} - \frac{1}{\lambda} e^{-\lambda t}$$
$$E[P] = \frac{s}{\lambda} - \frac{s}{\lambda} e^{-\lambda t} - ct$$
Differentiating with respect to t and setting the derivative to zero gives $s e^{-\lambda t} = c$, so the expected profit is maximal at $t = \frac{1}{\lambda} \log(s/c)$.
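A quick numeric check of the optimizer, assuming NumPy (the values λ = 0.5, s = 3, c = 1 are arbitrary): evaluate E[P] on a grid and compare the best grid point with the closed form.

```python
import numpy as np

lam, s, c = 0.5, 3.0, 1.0                 # arbitrary demo values with s > c
t = np.linspace(0.01, 10, 100_000)        # grid of candidate order amounts
expected_profit = s / lam - (s / lam) * np.exp(-lam * t) - c * t

print(t[np.argmax(expected_profit)])      # grid maximizer
print(np.log(s / c) / lam)                # closed form (1/lambda) log(s/c) ~ 2.197
```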
Another Interpretation of Memorylessness
The memoryless property is further illustrated by the failure rate
function (also called the hazard rate function) of the exponential
distribution. Consider a continuous positive random variable X
having distribution function F and density f . The failure (or hazard)
rate function
r(t) is defined by
$$r(t) = \frac{f(t)}{1 - F(t)} \qquad (5.4)$$

To interpret r(t), suppose that an item, having lifetime X, has survived for t hours, and we desire the probability that it does not survive for an additional time dt.
$$P\{X \in (t, t + dt) \mid X > t\} = \frac{P\{X \in (t, t + dt),\, X > t\}}{P\{X > t\}} = \frac{P\{X \in (t, t + dt)\}}{P\{X > t\}} \approx \frac{f(t)\, dt}{1 - F(t)}$$
That is, r(t) represents the conditional probability density that a t-year-old item will fail.

Suppose now that the lifetime distribution is exponential. Then, by the memoryless property, the distribution of remaining life for a t-year-old item is the same as for a new item, and the failure rate is constant:
$$r(t) = \frac{f(t)}{1 - F(t)} = \frac{\lambda e^{-\lambda t}}{e^{-\lambda t}} = \lambda$$
Examples
Example 5.6 Let X1, . . . ,Xn be independent exponential
random variables with respective rates λ1, . . . , λn, where λi ≠
λj when i ≠ j. Let T be independent of these random variables
and suppose that
$$\sum_{j=1}^{n} P_j = 1$$
where $P_j = P\{T = j\}$. The random variable $X_T$ is said to be a hyperexponential random variable. What is its failure rate?
$$1 - F(t) = P\{X_T > t\} = \sum_{i=1}^{n} P\{X_T > t \mid T = i\}\, P\{T = i\} = \sum_{i=1}^{n} P_i e^{-\lambda_i t}$$
$$f(t) = \sum_{i=1}^{n} \lambda_i P_i e^{-\lambda_i t}$$
From (5.4),
$$r(t) = \frac{\sum_{i=1}^{n} \lambda_i P_i e^{-\lambda_i t}}{\sum_{i=1}^{n} P_i e^{-\lambda_i t}}$$
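A small sketch of this failure rate (assuming NumPy; the two rates and the mixing probabilities are arbitrary). Note that r(t) decreases toward the smallest rate as t grows, since surviving a long time increasingly suggests the small-rate component was drawn.

```python
import numpy as np

lam = np.array([0.5, 2.0])   # arbitrary component rates lambda_i
p = np.array([0.3, 0.7])     # mixing probabilities P_i, summing to 1

def r(t):
    """Hyperexponential failure rate from Eq. (5.4)."""
    surv = p * np.exp(-lam * t)             # terms P_i * exp(-lambda_i * t)
    return (lam * surv).sum() / surv.sum()

for t in [0.0, 1.0, 5.0, 20.0]:
    print(t, r(t))   # decreases from sum(p * lam) = 1.55 toward min(lam) = 0.5
```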
Further Properties of the Exponential Distribution
Let $X_1, \dots, X_n$ be independent and identically distributed exponential random variables having mean $1/\lambda$. Then $S_n = X_1 + \cdots + X_n$ has a gamma distribution with parameters n and $\lambda$:
$$f_{S_n}(t) = \lambda e^{-\lambda t} \frac{(\lambda t)^{n-1}}{(n-1)!}$$

[Example] Suppose that $X_1$ and $X_2$ are independent exponential random variables with respective means $1/\lambda_1$ and $1/\lambda_2$; what is $P\{X_1 < X_2\}$?
$$P\{X_1 < X_2\} = \int_0^\infty P\{X_1 < X_2 \mid X_1 = x\}\, \lambda_1 e^{-\lambda_1 x}\, dx = \int_0^\infty P\{x < X_2\}\, \lambda_1 e^{-\lambda_1 x}\, dx$$
$$= \int_0^\infty e^{-\lambda_2 x}\, \lambda_1 e^{-\lambda_1 x}\, dx = \lambda_1 \int_0^\infty e^{-(\lambda_1 + \lambda_2) x}\, dx = \frac{\lambda_1}{\lambda_1 + \lambda_2}$$

Suppose that $X_1, \dots, X_n$ are independent exponential random variables, with $X_i$ having rate $\lambda_i$, $i = 1, \dots, n$. Then
$$P\{\min(X_1, \dots, X_n) > x\} = P\{X_i > x \text{ for each } i = 1, \dots, n\} = \prod_{i=1}^{n} P\{X_i > x\} = \prod_{i=1}^{n} e^{-\lambda_i x} = e^{-\left(\sum_{i=1}^{n} \lambda_i\right) x}$$
where the second equality uses independence. That is, the minimum of independent exponentials is itself exponential, with rate $\sum_{i=1}^{n} \lambda_i$.
Convolutions of Exponential Random Variables
Let $X_1, \dots, X_n$ be independent exponential random variables with respective rates $\lambda_1, \dots, \lambda_n$, where $\lambda_i \ne \lambda_j$ when $i \ne j$. The random variable $\sum_{i=1}^{n} X_i$ is said to be a hypoexponential random variable. To compute its probability density function, let us start with the case n = 2. Now,
$$f_{X_1 + X_2}(t) = f_{X_1}(t) * f_{X_2}(t) = \int_{-\infty}^{\infty} f_{X_1}(s)\, f_{X_2}(t - s)\, ds = \frac{\lambda_1}{\lambda_1 - \lambda_2}\, \lambda_2 e^{-\lambda_2 t} + \frac{\lambda_2}{\lambda_2 - \lambda_1}\, \lambda_1 e^{-\lambda_1 t}$$
In the general case,
$$f_{X_1 + \cdots + X_n}(t) = f_{X_1}(t) * \cdots * f_{X_n}(t) = \sum_{i=1}^{n} C_{i,n}\, \lambda_i e^{-\lambda_i t}, \qquad C_{i,n} = \prod_{j \ne i} \frac{\lambda_j}{\lambda_j - \lambda_i}$$
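A sketch of the general density in code (assuming NumPy; the three rates are arbitrary but must be distinct), checked against a Monte Carlo estimate of the density of the simulated sum:

```python
import numpy as np

rng = np.random.default_rng(3)
lam = np.array([1.0, 2.0, 4.0])   # arbitrary distinct rates

def hypo_pdf(t, lam):
    """f(t) = sum_i C_{i,n} lambda_i exp(-lambda_i t) with the coefficients C_{i,n}."""
    total = 0.0
    for i, li in enumerate(lam):
        others = np.delete(lam, i)
        c = np.prod(others / (others - li))   # C_{i,n} = prod_{j != i} lam_j/(lam_j - lam_i)
        total += c * li * np.exp(-li * t)
    return total

sums = rng.exponential(1 / lam, size=(1_000_000, 3)).sum(axis=1)  # simulated sums
est = np.mean(np.abs(sums - 1.0) < 0.05) / 0.1   # density estimate near t = 1
print(hypo_pdf(1.0, lam), est)                   # should nearly agree
```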
Stochastic Processes
A stochastic process {X(t), t ∈ T} is a collection of random
variables. That is, for each t ∈ T, X(t) is a random variable.
The index t is often interpreted as time and, as a result, we
refer to X(t) as the state of the process at time t.
(e.g., X(t) = the total number of customers that have entered a supermarket by time t)
The set T is called the index set of the process.
T is a countable set: a discrete-time process.
T is an interval of the real line: a continuous-time process.
The state space of a stochastic process is defined as the set
of all possible values that the random variables X(t) can
assume. Thus, a stochastic process is a family of random
variables that describes the evolution through time of some
(physical) process. We shall see much of stochastic processes
in the following chapters of this text.
Counting Process
A stochastic process {N(t), t≥0} is said to be a counting
process if N(t) represents the total number of “events” that
occur by time t.
Some examples of counting processes are the following:
 If we let N(t) equal the number of persons who enter a particular store at or
prior to time t, then {N(t), t≥0} is a counting process in which an event
corresponds to a person entering the store.
 If we say that an event occurs whenever a child is born, then {N(t), t ≥ 0} is a
counting process when N(t) equals the total number of people who were born
by time t.
 If N(t) equals the number of goals that a given soccer player scores by time t,
then {N(t), t≥0} is a counting process.

The properties of the counting process


1. N(t) ≥ 0.
2. N(t) is integer valued.
3. If s < t, then N(s) ≤ N(t).
4. For s < t, N(t) − N(s) equals the number of events that occur in
the interval (s, t].
Poisson Process (1/2)
One of the most important counting processes is the Poisson process.

Definition 5.1 The counting process {N(t), t ≥ 0} is said to be a Poisson process having rate $\lambda$, $\lambda > 0$, if
1. N(0) = 0.
2. The process has independent increments.
3. The number of events in any interval of length t is Poisson distributed with mean $\lambda t$. That is, for all $s, t \ge 0$,
$$P\{N(t + s) - N(s) = n\} = e^{-\lambda t} \frac{(\lambda t)^n}{n!}, \qquad n = 0, 1, \dots$$
- Note that it follows from condition (iii) that a Poisson process has stationary increments and also that $E[N(t)] = \lambda t$, which explains why $\lambda$ is called the rate of the process.


Poisson Process (2/2)
To determine if an arbitrary counting process is actually a Poisson process, we must show that conditions (i), (ii), and (iii) are satisfied. We therefore have an alternate definition of a Poisson process that is often easier to verify.

Definition 5.3 The counting process {N(t), t ≥ 0} is said to be a Poisson process having rate $\lambda$, $\lambda > 0$, if
1. N(0) = 0.
2. The process has stationary and independent increments:
   - stationary: for any $s, t \ge 0$, the distribution of $N(t + s) - N(t)$ does not depend on t;
   - independent: for any $s, t \ge 0$, $N(t + s) - N(t)$ is independent of $\{N(u) : u \le t\}$.
3. $P\{N(h) = 1\} = \lambda h + o(h)$.
4. $P\{N(h) \ge 2\} = o(h)$.

Theorem 5.1 Definitions 5.1 and 5.3 are equivalent.

Definition 5.2 The function $f(\cdot)$ is said to be $o(h)$ if $\lim_{h \to 0} \frac{f(h)}{h} = 0$; for example, $f(x) = x^2$ is $o(h)$ while $f(x) = x$ is not.
Independent Increments Property
$N(t + s) - N(t)$ is independent of $\{N(u) : u \le t\}$: arrivals in the future (beyond time t) are independent of the entire past history up to time t (a very strong property).
Stationary Increments Property
A strong property (though less so than independent increments): for any $s, t \ge 0$, the distribution of $N(t + s) - N(t)$ is independent of t:
$$P\{N(t + s) - N(t) = n\} = P\{N(t_1 + s) - N(t_1) = n\} = P\{N(s) = n\}$$
Note: $N(t + s) - N(t)$ has the same distribution as $N(s)$, but it is not the same random variable as $N(s)$.
Numerical Example
N(t) is a Poisson process with rate $\lambda = 8$. We would like to compute $P\{N(2.5) = 17,\, N(3.7) = 22,\, N(4.3) = 36\}$.
$$P = P\{N(2.5) = 17,\, N(3.7) = 22,\, N(4.3) = 36\}$$
$$= P\{N(2.5) = 17,\, N(3.7) - N(2.5) = 5,\, N(4.3) - N(3.7) = 14\}$$
$$= P\{N(2.5) = 17\}\, P\{N(3.7) - N(2.5) = 5\}\, P\{N(4.3) - N(3.7) = 14\} \quad \text{(from independent increments)}$$
$$= P\{N(2.5) = 17\}\, P\{N(1.2) = 5\}\, P\{N(0.6) = 14\} \quad \text{(from stationary increments)}$$
$$= e^{-2.5 \times 8} \frac{(2.5 \times 8)^{17}}{17!} \cdot e^{-1.2 \times 8} \frac{(1.2 \times 8)^5}{5!} \cdot e^{-0.6 \times 8} \frac{(0.6 \times 8)^{14}}{14!}$$
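The same computation in code, assuming SciPy is available:

```python
from scipy.stats import poisson

lam = 8
p = (poisson.pmf(17, lam * 2.5)     # P{N(2.5) = 17}, Poisson mean 20
     * poisson.pmf(5, lam * 1.2)    # P{N(1.2) = 5},  Poisson mean 9.6
     * poisson.pmf(14, lam * 0.6))  # P{N(0.6) = 14}, Poisson mean 4.8
print(p)
```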
Interarrival Time Distributions
Consider a Poisson process, and let us denote the time of the first event by $T_1$. Further, for $n > 1$, let $T_n$ denote the elapsed time between the $(n-1)$st and the $n$th event. The sequence $\{T_n, n = 1, 2, \dots\}$ is called the sequence of interarrival times.

[Figure: interarrival times $T_1, T_2, T_3$ marked along the time axis from t = 0.]


Interarrival Time Distributions
We shall now determine the distribution of the $T_n$. To do so, we first note that the event $\{T_1 > t\}$ takes place if and only if no events of the Poisson process occur in the interval $[0, t]$, and thus
$$P\{T_1 > t\} = P\{N(t) = 0\} = e^{-\lambda t}$$
Now,
$$P\{T_2 > t\} = E[P\{T_2 > t \mid T_1\}]$$
$$P\{T_2 > t \mid T_1 = s\} = P\{0 \text{ events in } (s, s + t] \mid T_1 = s\} = P\{0 \text{ events in } (s, s + t]\} = e^{-\lambda t}$$
where the last two equalities use the independent and stationary increments of the Poisson process. Repeating the argument gives:

Proposition 5.1 $T_n$, $n = 1, 2, \dots$, are independent identically distributed exponential random variables having mean $\frac{1}{\lambda}$.
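Proposition 5.1 yields the standard way to simulate a Poisson process: generate i.i.d. exponential interarrival times and accumulate them. A minimal sketch, assuming NumPy (rate and horizon arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
lam, t_max = 2.0, 10.0   # arbitrary rate and time horizon

def poisson_arrival_times(lam, t_max, rng):
    """Arrival times on [0, t_max]: cumulative sums of exponential interarrivals."""
    times = []
    t = rng.exponential(1 / lam)
    while t <= t_max:
        times.append(t)
        t += rng.exponential(1 / lam)
    return np.array(times)

counts = [len(poisson_arrival_times(lam, t_max, rng)) for _ in range(10_000)]
print(np.mean(counts), lam * t_max)   # sample mean of N(t_max) vs lambda * t_max
```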
Waiting Time Distributions (1/2)
The arrival time of the nth event, $S_n$, is also called the waiting time until the nth event. It is easily seen that $S_n = \sum_{i=1}^{n} T_i$, $n \ge 1$.

Hence from Proposition 5.1 and the results of Section 2.2 it follows that $S_n$ has a gamma distribution with parameters n and $\lambda$:
$$f_{S_n}(t) = \lambda e^{-\lambda t} \frac{(\lambda t)^{n-1}}{(n-1)!}, \qquad t \ge 0 \qquad (5.13)$$

There is another way to find $f_{S_n}(t)$, using the Poisson process property directly; see the next slide.
Waiting Time Distributions (2/2)
Equation (5.13) may also be derived by noting that the nth
event will occur prior to or at time t if and only if the number
of events occurring by time t is at least n. That is,
$$N(t) \ge n \iff S_n \le t$$
Hence,
$$F_{S_n}(t) = P\{S_n \le t\} = P\{N(t) \ge n\} = \sum_{j=n}^{\infty} e^{-\lambda t} \frac{(\lambda t)^j}{j!}$$
By differentiating both sides,
$$f_{S_n}(t) = F'_{S_n}(t) = -\sum_{j=n}^{\infty} \lambda e^{-\lambda t} \frac{(\lambda t)^j}{j!} + \sum_{j=n}^{\infty} \lambda e^{-\lambda t} \frac{(\lambda t)^{j-1}}{(j-1)!}$$
$$= \lambda e^{-\lambda t} \frac{(\lambda t)^{n-1}}{(n-1)!} + \sum_{j=n+1}^{\infty} \lambda e^{-\lambda t} \frac{(\lambda t)^{j-1}}{(j-1)!} - \sum_{j=n}^{\infty} \lambda e^{-\lambda t} \frac{(\lambda t)^j}{j!}$$
$$= \lambda e^{-\lambda t} \frac{(\lambda t)^{n-1}}{(n-1)!}$$
since the last two sums cancel term by term.
Paradox of Residual Life
Suppose that buses arrive at a stop according to a Poisson process with rate $\lambda$.

Assume that you arrive at that bus stop at some arbitrary point in time.

Question: How long would you expect to wait?

Two "logical" answers:

Answer 1: Average wait = $\frac{1}{2\lambda}$
Note that the average time between bus arrivals is $\frac{1}{\lambda}$.
[Figure: an arrival time falling between consecutive buses, with interarrival times $X_n, X_{n+1}, X_{n+2}$ along the t-axis.]
On average you will arrive exactly in the middle of an interarrival time, so your average wait for the next bus is $\frac{1}{2} E[X_i] = \frac{1}{2\lambda}$.
Answer 2: Average wait = $\frac{1}{\lambda}$
Since buses arrive according to a Poisson process with rate $\lambda$, the time you have to wait is independent of how long it has been since the last bus; i.e., the average wait is $\frac{1}{\lambda}$.

In fact, by the same argument, if buses have been operational for a long time, then by the memoryless property the expected time since the last bus arrival is also $\frac{1}{\lambda}$.

This implies that the expected length of the interval between the last bus and the next bus, the interval containing your arrival, is $\frac{2}{\lambda}$ (?!)

So which answer is correct?
Paradox of Residual Life
Answer 2 is in fact correct!

This is because the interval that you arrive in is not a typical interval: you are more likely to arrive during a longer interarrival interval than a shorter one.

In the case of a Poisson process, if you arrive sufficiently far from the origin, the interval that you arrive in is on average twice as long as the typical interval.
Paradox of Residual Life
Formally:
- Let {N(t), t ≥ 0} be a Poisson process with rate $\lambda$.
- Let $S_1, S_2, \dots$ be the successive arrival times, and $S_0 \triangleq 0$.
- The number of arrivals in [0, t] is N(t).
- The time of the last arrival before t is $S_{N(t)}$.
- The time of the next arrival after t is $S_{N(t)+1}$.
Paradox of Residual Life
Define the age $A(t) \triangleq t - S_{N(t)}$ and the residual lifetime $Y(t) \triangleq S_{N(t)+1} - t$.
[Figure: $S_{N(t)} \le t < S_{N(t)+1}$, with A(t) the elapsed portion of the interval and Y(t) the remaining portion.]

To prove: $P\{Y(t) \le u\} = 1 - e^{-\lambda u}$, $u \ge 0$. In other words, we want to prove that the time until the next arrival is exponentially distributed with mean $\frac{1}{\lambda}$.

Proof: For $u \ge 0$,
$$P\{Y(t) > u\} = P\{S_{N(t)+1} - t > u\} = P\{S_{N(t)+1} > u + t\} = P\{N(t + u) - N(t) = 0\} = P\{N(u) = 0\} = e^{-\lambda u}$$
Hence, $P\{Y(t) \le u\} = 1 - e^{-\lambda u}$, $u \ge 0$.
Paradox of Residual Life (2/2)
To summarize, $P\{Y(t) > u\} = P\{N(t + u) - N(t) = 0\} = P\{N(u) = 0\} = e^{-\lambda u}$, so
$$P\{Y(t) \le u\} = 1 - e^{-\lambda u}, \qquad u \ge 0$$

The paradox of residual life is important in calculating the expected waiting time $E[W]$ in queueing systems.

Homework: Find the distribution of the age of the process; in other words, find $P\{A(t) \ge u\}$.
$$\Rightarrow\ E[S_{N(t)+1} - S_{N(t)}] = \frac{2}{\lambda} \text{ as } t \to \infty$$
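A simulation sketch of the paradox, assuming NumPy: fix an inspection time t far from the origin and measure the length of the interarrival interval containing it; the mean approaches 2/λ, twice the mean interval, while the mean residual life is 1/λ.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, t = 1.0, 100.0   # inspect at a time far from the origin
lengths, residuals = [], []

for _ in range(20_000):
    arrivals = np.cumsum(rng.exponential(1 / lam, size=300))  # comfortably past t
    i = np.searchsorted(arrivals, t)                # first arrival after time t
    lengths.append(arrivals[i] - arrivals[i - 1])   # S_{N(t)+1} - S_{N(t)}
    residuals.append(arrivals[i] - t)               # residual life Y(t)

print(np.mean(lengths), 2 / lam)      # inspected interval: ~2/lambda
print(np.mean(residuals), 1 / lam)    # residual life: ~1/lambda
```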
Further Properties of Poisson Processes
Consider a Poisson process $\{N(t), t \ge 0\}$ having rate $\lambda$, and suppose that each time an event occurs it is classified as either a type I or a type II event. Suppose further that each event is classified as a type I event with probability p or a type II event with probability 1 − p, independently of all other events.
Let $N_1(t)$ and $N_2(t)$ denote respectively the number of type I and type II events occurring in [0, t]. Note that $N(t) = N_1(t) + N_2(t)$.
[Figure: each arrival of N(t) is routed to $N_1(t)$ with probability p or to $N_2(t)$ with probability 1 − p.]

Proposition 5.2 $\{N_1(t), t \ge 0\}$ and $\{N_2(t), t \ge 0\}$ are both Poisson processes having respective rates $\lambda p$ and $\lambda(1 - p)$. Furthermore, the two processes are independent.
Proof of Proposition 5.2
(Decomposition of a Poisson Process)
We have a Poisson process $\{N(t), t \ge 0\}$ with rate $\lambda$. For each arrival/event, we toss a coin with probability p of success, and construct two processes from it:
- U(t) counts all "successful" events
- V(t) counts all "failed" events
[Figure: one arrival stream split into a stream of successes (S) and a stream of failures (F).]

Next page: proof of
$$P\{U(t) = m,\, V(t) = n\} = e^{-\lambda p t} \frac{(\lambda p t)^m}{m!} \cdot e^{-\lambda(1-p)t} \frac{(\lambda(1-p)t)^n}{n!}$$
Decomposition of a Poisson Process
$$P\{U(t) = m,\, V(t) = n\} = \sum_{k=0}^{\infty} P\{U(t) = m,\, V(t) = n \mid N(t) = k\}\, P\{N(t) = k\}$$
$$= P\{U(t) = m,\, V(t) = n \mid N(t) = m + n\}\, P\{N(t) = m + n\}$$
$$= \underbrace{\binom{m+n}{m} p^m (1 - p)^n}_{\text{binomial probability}}\ \underbrace{e^{-\lambda t} \frac{(\lambda t)^{m+n}}{(m+n)!}}_{\text{Poisson probability}}$$
$$= \frac{(m+n)!}{n!\, m!}\, p^m (1 - p)^n\, e^{-\lambda t} \frac{(\lambda t)^{m+n}}{(m+n)!}$$
$$= \frac{e^{-\lambda p t} (\lambda p t)^m}{m!} \cdot \frac{e^{-\lambda(1-p)t} [\lambda(1-p)t]^n}{n!}$$

It follows from Proposition 5.2 that if each of a Poisson number of individuals is independently classified into one of two possible groups with respective probabilities p and 1 − p, then the numbers of individuals in the two groups are independent Poisson random variables. This result easily generalizes to the case where the classification is into any one of r possible groups.
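A thinning sketch, assuming NumPy (rate, success probability, and horizon arbitrary): simulate N(t), classify each event independently with probability p, and check that the two counts have means λpt and λ(1 − p)t and are (empirically) uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(6)
lam, p, t = 5.0, 0.3, 2.0   # arbitrary rate, success probability, horizon

n = rng.poisson(lam * t, size=100_000)  # N(t) across many replications
u = rng.binomial(n, p)                  # type I counts U(t)
v = n - u                               # type II counts V(t)

print(u.mean(), lam * p * t)            # ~ lambda * p * t = 3.0
print(v.mean(), lam * (1 - p) * t)      # ~ lambda * (1 - p) * t = 7.0
print(np.corrcoef(u, v)[0, 1])          # ~ 0, consistent with independence
```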
Superposition of Poisson Processes
Let $\{L(t), t \ge 0\}$ and $\{M(t), t \ge 0\}$ be two independent Poisson processes with rates $\lambda$ and $\mu$, respectively. For $t \ge 0$, let $N(t) = L(t) + M(t)$. Then the resulting process $\{N(t), t \ge 0\}$ is called the superposition of the two processes, and it is a Poisson process with rate $\lambda + \mu$.

Proof: We show $P\{N(t) = n\} = \frac{e^{-(\lambda+\mu)t}[(\lambda+\mu)t]^n}{n!}$.
$$P\{N(t) = n\} = \sum_{k=0}^{n} P\{L(t) = k,\, M(t) = n - k\} \qquad \left(\text{note: } \{N(t) = n\} = \bigcup_{k=0}^{n} \{L(t) = k,\, M(t) = n - k\}\right)$$
$$= \sum_{k=0}^{n} P\{L(t) = k\}\, P\{M(t) = n - k\}$$
$$= \sum_{k=0}^{n} \frac{e^{-\lambda t} (\lambda t)^k}{k!} \cdot \frac{e^{-\mu t} (\mu t)^{n-k}}{(n-k)!}$$
$$= \frac{e^{-(\lambda+\mu)t} [(\lambda+\mu)t]^n}{n!} \sum_{k=0}^{n} \frac{n!}{k!\,(n-k)!} \left(\frac{\lambda}{\lambda+\mu}\right)^{k} \left(\frac{\mu}{\lambda+\mu}\right)^{n-k}$$
$$= \frac{e^{-(\lambda+\mu)t} [(\lambda+\mu)t]^n}{n!} \left(\frac{\lambda}{\lambda+\mu} + \frac{\mu}{\lambda+\mu}\right)^{n}$$
Proof (Cont.)
Finally, since $\frac{\lambda}{\lambda + \mu} + \frac{\mu}{\lambda + \mu} = 1$,
$$P\{N(t) = n\} = \frac{e^{-(\lambda+\mu)t}\, [(\lambda + \mu)t]^n}{n!}$$

[Figure: n independent streams with rates $\lambda_1, \lambda_2, \dots, \lambda_n$ merged into a single stream.]
More generally, the sum of n independent Poisson processes with rates $\lambda_1, \dots, \lambda_n$ is a Poisson process with rate $\sum_{i=1}^{n} \lambda_i$, provided they are independent!
Further Properties of Poisson Processes
We shall determine the probability that n events occur in one Poisson process before m events have occurred in a second, independent Poisson process. More formally, let $\{N_1(t), t \ge 0\}$ and $\{N_2(t), t \ge 0\}$ be two independent Poisson processes having respective rates $\lambda_1$ and $\lambda_2$. Also, let $S^1_n$ denote the time of the nth event of the first process, and $S^2_m$ the time of the mth event of the second process. That is, we want $P\{S^1_n < S^2_m\}$.
- The special case n = m = 1:
$$P\{S^1_1 < S^2_1\} = \frac{\lambda_1}{\lambda_1 + \lambda_2}$$
- The special case n = 2, m = 1:
$$P\{S^1_2 < S^2_1\} = \left(\frac{\lambda_1}{\lambda_1 + \lambda_2}\right)^{2}$$
- Each event that occurs is an event of the $N_1(t)$ process with probability $\lambda_1/(\lambda_1 + \lambda_2)$ or an event of the $N_2(t)$ process with probability $\lambda_2/(\lambda_1 + \lambda_2)$, independently of all that has previously occurred.
Since $S^1_n < S^2_m$ occurs if and only if the first n + m − 1 events include n or more $N_1(t)$ events, the desired probability is
$$P\{S^1_n < S^2_m\} = \sum_{k=n}^{n+m-1} \binom{n+m-1}{k} \left(\frac{\lambda_1}{\lambda_1 + \lambda_2}\right)^{k} \left(\frac{\lambda_2}{\lambda_1 + \lambda_2}\right)^{n+m-1-k}$$
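A sketch of this race probability in code, assuming NumPy and SciPy (the rates and the values of n and m are arbitrary), checked against simulation:

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(7)
lam1, lam2, n, m = 2.0, 3.0, 2, 3   # arbitrary rates and event counts

q = lam1 / (lam1 + lam2)                  # P{an event belongs to process 1}
formula = binom.sf(n - 1, n + m - 1, q)   # P{at least n of first n+m-1 are type 1}

s1 = rng.exponential(1 / lam1, size=(200_000, n)).sum(axis=1)  # samples of S^1_n
s2 = rng.exponential(1 / lam2, size=(200_000, m)).sum(axis=1)  # samples of S^2_m
print(formula, np.mean(s1 < s2))          # should nearly agree
```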
Conditional Distribution of the Arrival Times
Suppose we are told that exactly one event of a Poisson process has taken place by time t, and we are asked to determine the distribution of the time at which the event occurred. Since a Poisson process possesses stationary and independent increments, it seems reasonable that each interval in [0, t] of equal length should have the same probability of containing the event. In other words, the time of the event should be uniformly distributed over [0, t]. This is easily checked since, for $s \le t$,
$$P\{T_1 < s \mid N(t) = 1\} = \frac{P\{T_1 < s,\, N(t) = 1\}}{P\{N(t) = 1\}} = \frac{P\{1 \text{ event in } [0, s),\, 0 \text{ events in } [s, t]\}}{P\{N(t) = 1\}}$$
$$= \frac{P\{1 \text{ event in } [0, s)\}\, P\{0 \text{ events in } [s, t]\}}{P\{N(t) = 1\}} = \frac{\lambda s\, e^{-\lambda s} \cdot e^{-\lambda(t-s)}}{\lambda t\, e^{-\lambda t}} = \frac{s}{t}$$
Generalizations of the Poisson Process
Definition 5.4 The counting process {N(t), t ≥ 0} is said to be a nonhomogeneous Poisson process with intensity function $\lambda(t)$, $t \ge 0$, if
1. N(0) = 0.
2. {N(t), t ≥ 0} has independent increments.
3. $P\{N(t + h) - N(t) = 1\} = \lambda(t)h + o(h)$.
4. $P\{N(t + h) - N(t) \ge 2\} = o(h)$.
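Nonhomogeneous Poisson processes are commonly simulated by thinning a homogeneous process: generate candidate events at a constant rate λ_max ≥ λ(t) and accept a candidate at time s with probability λ(s)/λ_max. A sketch assuming NumPy, with an arbitrary intensity λ(t) = 1 + sin t:

```python
import numpy as np

rng = np.random.default_rng(8)
t_max = 20.0
intensity = lambda t: 1.0 + np.sin(t)   # arbitrary intensity, bounded by 2
lam_max = 2.0                           # upper bound on intensity(t)

def nhpp_thinning(intensity, lam_max, t_max, rng):
    """Arrival times of a nonhomogeneous Poisson process on [0, t_max] by thinning."""
    events, t = [], 0.0
    while True:
        t += rng.exponential(1 / lam_max)          # candidate from rate-lam_max process
        if t > t_max:
            return np.array(events)
        if rng.random() < intensity(t) / lam_max:  # keep with prob intensity(t)/lam_max
            events.append(t)

counts = [len(nhpp_thinning(intensity, lam_max, t_max, rng)) for _ in range(5_000)]
print(np.mean(counts), t_max + 1 - np.cos(t_max))  # E[N(t_max)] = integral of intensity
```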
