Lecture 5 20240325
Statistics
Mean: $E[X] = \dfrac{1}{\lambda}$
Moment generating function: $\phi(t) = E[e^{tX}] = \dfrac{\lambda}{\lambda - t}$, for $t < \lambda$
The second moment: $E[X^2] = \dfrac{2}{\lambda^2}$
Variance: $\mathrm{Var}(X) = \dfrac{1}{\lambda^2}$
Properties of the Exponential Distribution
A random variable X is said to be without memory, or
memoryless, if
$$P\{X > s + t \mid X > t\} = P\{X > s\} \quad \text{for all } s, t > 0 \qquad (5.2)$$
In other words, if the instrument is alive at time t, then the
distribution of the remaining amount of time that it survives is the
same as the original lifetime distribution; that is, the instrument
does not remember that it has already been in use for a time t.
It turns out that not only is the exponential distribution “memoryless,”
but it is the unique distribution possessing this property.
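The memoryless property (5.2) is easy to check numerically. Below is a minimal Monte Carlo sketch (NumPy assumed; the values $\lambda = 2$, $s = 0.5$, $t = 1$ are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, s, t = 2.0, 0.5, 1.0   # arbitrary illustrative values

x = rng.exponential(scale=1/lam, size=1_000_000)

# Empirical P{X > s+t | X > t} vs. empirical P{X > s}.
lhs = np.mean(x[x > t] > s + t)
rhs = np.mean(x > s)
print(lhs, rhs)  # both ≈ exp(-lam*s) ≈ 0.3679
```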
Examples
Example 5.5 A store must decide how much of a certain
commodity to order so as to meet next month’s demand, where
that demand is assumed to have an exponential distribution with
rate λ. If the commodity costs the store c per pound, and can be
sold at a price of s > c per pound, how much should be ordered so
as to maximize the store’s expected profit? Assume that any
inventory left over at the end of the month is worthless and that
there is no penalty if the store cannot meet all the demand.
Let X equal the demand. If the store orders the amount t, then the
profit, call it P, is given by
$$P = s \min(X, t) - ct, \quad \text{where } \min(X, t) = X - (X - t)^+.$$
$$E[(X-t)^+] = E[(X-t)^+ \mid X > t]\,P\{X > t\} + \underbrace{E[(X-t)^+ \mid X \le t]}_{=\,0}\,P\{X \le t\}$$
$$= E[X - t \mid X > t]\,e^{-\lambda t} = \frac{1}{\lambda}\,e^{-\lambda t}$$
by the memoryless property. Hence
$$E[\min(X, t)] = E[X] - E[(X-t)^+] = \frac{1}{\lambda} - \frac{1}{\lambda}\,e^{-\lambda t}$$
and
$$E[P] = \frac{s}{\lambda} - \frac{s}{\lambda}\,e^{-\lambda t} - ct.$$
Setting $\frac{d}{dt}E[P] = s\,e^{-\lambda t} - c = 0$ shows that the expected profit is maximal at $t = \frac{1}{\lambda}\log(s/c)$.
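As a quick sanity check, we can evaluate $E[P]$ on a grid of $t$ values and confirm the maximizer matches $\frac{1}{\lambda}\log(s/c)$; the rate, price, and cost below are arbitrary illustrative numbers.

```python
import numpy as np

lam, s, c = 0.01, 3.0, 1.0   # illustrative demand rate, sale price, unit cost
t = np.linspace(0.01, 1000, 100_000)
expected_profit = s/lam - (s/lam)*np.exp(-lam*t) - c*t

t_grid = t[np.argmax(expected_profit)]     # numerical maximizer
t_star = np.log(s/c)/lam                   # closed-form maximizer
print(t_grid, t_star)                      # both ≈ 109.86
```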
Another Interpretation of Memorylessness
The memoryless property is further illustrated by the failure rate
function (also called the hazard rate function) of the exponential
distribution. Consider a continuous positive random variable X
having distribution function $F$ and density $f$. The failure (or hazard) rate function $r(t)$ is defined by
$$r(t) = \frac{f(t)}{1 - F(t)} \qquad (5.4)$$
Now let $X_1, \dots, X_n$ be independent exponential random variables with respective rates $\lambda_1, \dots, \lambda_n$, and let $T$ be an independent random variable with
$$\sum_{j=1}^{n} P_j = 1,$$
where $P_j = P\{T = j\}$. The random variable $X_T$ is said to be a hyperexponential random variable. What is its failure rate?
$$1 - F(t) = P\{X_T > t\} = \sum_{i=1}^{n} P\{X_T > t \mid T = i\}\,P\{T = i\} = \sum_{i=1}^{n} P_i\,e^{-\lambda_i t}$$
$$f(t) = \sum_{i=1}^{n} \lambda_i P_i\,e^{-\lambda_i t}$$
From (5.4),
$$r(t) = \frac{\sum_{i=1}^{n} \lambda_i P_i\,e^{-\lambda_i t}}{\sum_{i=1}^{n} P_i\,e^{-\lambda_i t}}$$
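Note that this $r(t)$ decreases in $t$: the longer the unit survives, the more likely it is to be one of the low-rate (reliable) components, so $r(t) \to \min_i \lambda_i$. A small numerical sketch with two illustrative components:

```python
import numpy as np

lam = np.array([1.0, 0.1])   # component rates (illustrative)
p = np.array([0.5, 0.5])     # mixing probabilities P_i

def hazard(t):
    # Weighted survival terms P_i * exp(-lam_i * t) for each t.
    terms = p * np.exp(-np.outer(np.atleast_1d(t), lam))
    return (terms @ lam) / terms.sum(axis=1)   # r(t) as in (5.4)

print(hazard([0.0, 1.0, 10.0, 50.0]))  # decreases from 0.55 toward 0.1
```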
Further Properties of the Exponential Distribution
Let $X_1, \dots, X_n$ be independent and identically distributed exponential random variables having mean $1/\lambda$. Then $S_n = X_1 + \cdots + X_n$ has a gamma distribution with parameters $n$ and $\lambda$:
$$f_{S_n}(t) = \lambda e^{-\lambda t}\,\frac{(\lambda t)^{n-1}}{(n-1)!}$$
Now let $X_1, \dots, X_n$ be independent exponential random variables with respective rates $\lambda_1, \dots, \lambda_n$. Then, by independence,
$$P\{\min(X_1, \dots, X_n) > x\} = P\{X_i > x \text{ for each } i = 1, \dots, n\} = \prod_{i=1}^{n} P\{X_i > x\} = \prod_{i=1}^{n} e^{-\lambda_i x} = e^{-\left(\sum_{i=1}^{n} \lambda_i\right) x}$$
so the minimum is itself exponential with rate $\sum_{i=1}^{n} \lambda_i$.
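Both facts lend themselves to a quick simulation check; the sketch below uses arbitrary illustrative rates.

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimum of independent exponentials with different rates is
# exponential with rate sum(rates).
rates = np.array([1.0, 2.0, 3.0])
m = rng.exponential(1/rates, size=(1_000_000, 3)).min(axis=1)
print(m.mean(), 1/rates.sum())   # both ≈ 1/6

# Sum of n i.i.d. exponentials is gamma(n, lam) with mean n/lam.
n, lam = 5, 2.0
s = rng.exponential(1/lam, size=(1_000_000, n)).sum(axis=1)
print(s.mean(), n/lam)           # both ≈ 2.5
```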
Convolutions of Exponential Random Variables
Let X1, . . . ,Xn be independent exponential random variables
with respective rates λ1, . . . , λn, where λi ≠ λj when i ≠ j. The
random variable $\sum_{i=1}^{n} X_i$ is said to be a hypoexponential random variable. To compute its probability density function, let us start with the case $n = 2$. Now,
$$f_{X_1+X_2}(t) = (f_{X_1} * f_{X_2})(t) = \int_{-\infty}^{\infty} f_{X_1}(s)\,f_{X_2}(t - s)\,ds = \frac{\lambda_1}{\lambda_1 - \lambda_2}\,\lambda_2 e^{-\lambda_2 t} + \frac{\lambda_2}{\lambda_2 - \lambda_1}\,\lambda_1 e^{-\lambda_1 t}$$
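A sketch comparing this closed form with the empirical density of simulated sums $X_1 + X_2$ (rates chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(2)
l1, l2 = 1.0, 3.0   # distinct rates (illustrative)

x = rng.exponential(1/l1, 500_000) + rng.exponential(1/l2, 500_000)

def f(t):
    # Closed-form hypoexponential density for n = 2.
    return (l1/(l1 - l2))*l2*np.exp(-l2*t) + (l2/(l2 - l1))*l1*np.exp(-l1*t)

# Empirical density near t = 1 vs. the closed form.
emp = np.mean((x > 0.9) & (x < 1.1)) / 0.2
print(emp, f(1.0))   # both ≈ 0.48
```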
Poisson Process (2/2)
To determine if an arbitrary counting process is actually a Poisson
process, we must show that conditions (i), (ii), and (iii) are satisfied.
We have an alternate definition of a Poisson process.
Stationary Increments Property
A strong property (though weaker than independent increments):
For any $s, t \ge 0$, the distribution of $N(t + s) - N(t)$ does not depend on $t$:
$$P\{N(t + s) - N(t) = n\} = P\{N(t_1 + s) - N(t_1) = n\} = P\{N(s) = n\}$$
Note that $N(t + s) - N(t)$ is not the same random variable as $N(s)$; only their distributions agree.
Numerical Example
N(t) is a Poisson Process with rate λ = 8.
We would like to compute:
$$P\{N(2.5) = 17,\; N(3.7) = 22,\; N(4.3) = 36\}$$
$$= e^{-2.5 \times 8}\,\frac{(2.5 \times 8)^{17}}{17!}\;\; e^{-1.2 \times 8}\,\frac{(1.2 \times 8)^{5}}{5!}\;\; e^{-0.6 \times 8}\,\frac{(0.6 \times 8)^{14}}{14!}$$
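By stationary and independent increments this factors into three Poisson probabilities, with increment means $2.5 \times 8 = 20$, $1.2 \times 8 = 9.6$, and $0.6 \times 8 = 4.8$, and increment counts $17$, $22 - 17 = 5$, and $36 - 22 = 14$. A one-line check using SciPy's Poisson pmf:

```python
from scipy.stats import poisson

lam = 8
p = (poisson.pmf(17, 2.5*lam)      # 17 events in [0, 2.5]
     * poisson.pmf(5, 1.2*lam)     # 5 more in (2.5, 3.7]
     * poisson.pmf(14, 0.6*lam))   # 14 more in (3.7, 4.3]
print(p)
```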
Interarrival Time Distributions
Consider a Poisson process, and let us denote the time of
the first event by 𝑇𝑇1 . Further, for 𝑛𝑛 > 1, let 𝑇𝑇𝑛𝑛 denote the
elapsed time between the (𝑛𝑛 − 1)st and the 𝑛𝑛th event. The
sequence {𝑇𝑇𝑛𝑛 , 𝑛𝑛 = 1, 2, … } is called the sequence of interarrival
times.
(Timeline: successive interarrival times $T_1, T_2, T_3$ between events.)
Interarrival Time Distributions
We shall now determine the distribution of the 𝑇𝑇𝑛𝑛 .
To do so, we first note that the event {𝑇𝑇1 > 𝑡𝑡} takes place if
and only if no events of the Poisson process occur in the
interval [0, 𝑡𝑡] and thus,
$$P\{T_1 > t\} = P\{N(t) = 0\} = e^{-\lambda t}.$$
Waiting Time Distributions (2/2)
Equation (5.13) may also be derived by noting that the nth
event will occur prior to or at time t if and only if the number
of events occurring by time t is at least n. That is,
N(t) ≥ n ⇔ Sn ≤ t
Hence,
$$F_{S_n}(t) = P\{S_n \le t\} = P\{N(t) \ge n\} = \sum_{j=n}^{\infty} e^{-\lambda t}\,\frac{(\lambda t)^j}{j!}.$$
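This identity between the gamma CDF and the Poisson tail can be verified directly (the values of $n$, $\lambda$, and $t$ below are arbitrary):

```python
from scipy.stats import gamma, poisson

n, lam, t = 4, 2.0, 3.0                  # arbitrary illustrative values
lhs = gamma.cdf(t, a=n, scale=1/lam)     # P{S_n <= t}
rhs = 1 - poisson.cdf(n - 1, lam*t)      # P{N(t) >= n}
print(lhs, rhs)                          # identical
```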
(Timeline: interarrival times $X_n, X_{n+1}, X_{n+2}$ between successive buses.)
Suppose buses arrive according to a Poisson process with rate $\lambda$, and you arrive at the stop at some time. How long, on average, do you wait for the next bus?
Answer 1: You arrive, on average, in the middle of an interarrival interval, so your average wait for the next bus $= \dfrac{E[X_i]}{2} = \dfrac{1}{2\lambda}$.
Answer 2: Average wait = 1/λ
Since buses arrive according to a Poisson process with rate $\lambda$, the time you have to wait is independent of how long it has been since the last bus, i.e., the average wait $= \dfrac{1}{\lambda}$.
This implies that the expected time between the last bus and the next bus arrival in your interval $= \dfrac{2}{\lambda}$ ????
In the case of a Poisson process, if you are sufficiently far from the origin, the interval that you arrive in is in fact, on average, twice as long as the average interarrival interval.
Paradox of Residual Life
Formally: let $Y(t) = S_{N(t)+1} - t$ denote the time from $t$ until the next arrival (the residual life), where $S_{N(t)} \le t < S_{N(t)+1}$.
In other words, what we want to prove is that the time until the next arrival is exponentially distributed with mean $\dfrac{1}{\lambda}$.
Proof: For $u \ge 0$,
$$P\{Y(t) > u\} = P\{S_{N(t)+1} - t > u\} = P\{S_{N(t)+1} > u + t\} = P\{N(t + u) - N(t) = 0\} = P\{N(u) = 0\} = e^{-\lambda u}$$
Hence, $P\{Y(t) \le u\} = 1 - e^{-\lambda u}$, $u \ge 0$.
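A simulation of the paradox ($\lambda = 1$ and inspection time $t = 100$, both arbitrary): the residual life $Y(t)$ averages $1/\lambda$, while the full interarrival interval straddling $t$ averages nearly $2/\lambda$.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, t = 1.0, 100.0
residual, straddle = [], []

for _ in range(20_000):
    arrivals = np.cumsum(rng.exponential(1/lam, size=300))  # S_1, S_2, ...
    i = np.searchsorted(arrivals, t)     # first arrival after time t
    residual.append(arrivals[i] - t)                                 # Y(t)
    straddle.append(arrivals[i] - (arrivals[i-1] if i > 0 else 0.0)) # whole interval

print(np.mean(residual), np.mean(straddle))  # ≈ 1/lam and ≈ 2/lam
```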
(Diagram: each event of a rate-$\lambda$ Poisson process is independently classified as a success (S) with probability $p$ or a failure (F) with probability $1 - p$; $U(t)$ counts the successes and $V(t)$ the failures by time $t$.)
Next page: proof of
$$P\{U(t) = m, V(t) = n\} = e^{-\lambda p t}\,\frac{(\lambda p t)^m}{m!}\;\;e^{-\lambda(1-p)t}\,\frac{(\lambda(1-p)t)^n}{n!}$$
Decomposition of a Poisson Process
$$P\{U(t) = m, V(t) = n\} = \sum_{k=0}^{\infty} P\{U(t) = m, V(t) = n \mid N(t) = k\}\,P\{N(t) = k\}$$
$$= \underbrace{P\{U(t) = m, V(t) = n \mid N(t) = m + n\}}_{\text{binomial probability}}\;\underbrace{P\{N(t) = m + n\}}_{\text{Poisson pmf}}$$
$$= \binom{m+n}{m}\, p^m (1-p)^n\; e^{-\lambda t}\,\frac{(\lambda t)^{m+n}}{(m+n)!}$$
$$= \frac{(m+n)!}{m!\,n!}\, p^m (1-p)^n\; e^{-\lambda t}\,\frac{(\lambda t)^{m+n}}{(m+n)!}$$
$$= \frac{e^{-\lambda p t}\,(\lambda p t)^m}{m!}\;\;\frac{e^{-\lambda(1-p)t}\,[\lambda(1-p)t]^n}{n!}$$
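Since the joint pmf factors, $U(t)$ and $V(t)$ are independent Poisson random variables with means $\lambda p t$ and $\lambda(1-p)t$. A quick empirical check of the marginal means and the (zero) covariance, with arbitrary illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(4)
lam, p, t, reps = 5.0, 0.3, 2.0, 200_000   # arbitrary illustrative values

n = rng.poisson(lam*t, size=reps)   # N(t)
u = rng.binomial(n, p)              # successes among the N(t) events
v = n - u                           # failures

print(u.mean(), lam*p*t)            # ≈ 3.0
print(v.mean(), lam*(1-p)*t)        # ≈ 7.0
print(np.cov(u, v)[0, 1])           # ≈ 0: U(t) and V(t) are uncorrelated
```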
(Diagram: $n$ independent Poisson processes with rates $\lambda_1, \lambda_2, \dots, \lambda_n$ merged into a single stream.)
The sum is a Poisson process with rate $\sum_{i=1}^{n} \lambda_i$, provided the processes are independent!
Further Properties of Poisson Processes
We shall determine the probability that $n$ events occur in one Poisson process before $m$ events have occurred in a second, independent Poisson process. More formally, let $\{N_1(t), t \ge 0\}$ and $\{N_2(t), t \ge 0\}$ be two independent Poisson processes having respective rates $\lambda_1$ and $\lambda_2$. Also, let $S^1_n$ denote the time of the $n$th event of the first process, and $S^2_m$ the time of the $m$th event of the second process. That is, we want $P\{S^1_n < S^2_m\}$.
Let us consider the special case $n = m = 1$:
$$P\{S^1_1 < S^2_1\} = \frac{\lambda_1}{\lambda_1 + \lambda_2}$$
Let us consider the special case $n = 2$, $m = 1$:
$$P\{S^1_2 < S^2_1\} = \left(\frac{\lambda_1}{\lambda_1 + \lambda_2}\right)^2$$
Each event that occurs is going to be an event of the N1(t) process with probability
λ1/(λ1 + λ2) or an event of the N2(t) process with probability λ2/(λ1 + λ2), independent
of all that has previously occurred.
Since this event occurs if and only if the first $n + m - 1$ events result in $n$ or more $N_1(t)$ events, our desired probability is given by
$$P\{S^1_n < S^2_m\} = \sum_{k=n}^{n+m-1} \binom{n+m-1}{k} \left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{k} \left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n+m-1-k}$$
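A sketch evaluating this binomial sum and checking it against direct simulation of the two event times (rates and $n$, $m$ chosen arbitrarily):

```python
import numpy as np
from scipy.stats import binom

lam1, lam2, n, m = 2.0, 3.0, 2, 3      # arbitrary illustrative values
q = lam1/(lam1 + lam2)

# Closed form: at least n of the first n+m-1 events belong to process 1.
exact = 1 - binom.cdf(n - 1, n + m - 1, q)

rng = np.random.default_rng(5)
s1 = rng.exponential(1/lam1, (200_000, n)).sum(axis=1)   # S^1_n
s2 = rng.exponential(1/lam2, (200_000, m)).sum(axis=1)   # S^2_m
print(exact, np.mean(s1 < s2))   # both ≈ 0.525
```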
Conditional Distribution of the Arrival Times
Suppose we are told that exactly one event of a Poisson
process has taken place by time 𝑡𝑡, and we are asked to
determine the distribution of the time at which the event
occurred. Now, since a Poisson process possesses stationary and
independent increments it seems reasonable that each interval
in [0, 𝑡𝑡] of equal length should have the same probability of
containing the event. In other words, the time of the event
should be uniformly distributed over [0, 𝑡𝑡]. This is easily checked
since, for 𝑠𝑠 ≤ 𝑡𝑡,
$$P\{T_1 < s \mid N(t) = 1\} = \frac{P\{T_1 < s,\; N(t) = 1\}}{P\{N(t) = 1\}}$$
$$= \frac{P\{1 \text{ event in } [0, s),\; 0 \text{ events in } [s, t)\}}{P\{N(t) = 1\}}$$
$$= \frac{P\{1 \text{ event in } [0, s)\}\; P\{0 \text{ events in } [s, t)\}}{P\{N(t) = 1\}}$$
$$= \frac{\lambda s\, e^{-\lambda s}\; e^{-\lambda(t-s)}}{\lambda t\, e^{-\lambda t}}$$
$$= \frac{s}{t}$$
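A simulation check: generate many Poisson paths, keep those with exactly one event in $[0, t]$, and verify the event time behaves uniformly on $[0, t]$ (the parameters below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)
lam, t = 1.5, 2.0   # arbitrary illustrative values
first_times = []

for _ in range(200_000):
    arrivals = np.cumsum(rng.exponential(1/lam, size=20))
    if np.sum(arrivals <= t) == 1:       # exactly one event in [0, t]
        first_times.append(arrivals[0])

first_times = np.array(first_times)
print(first_times.mean(), t/2)            # ≈ 1.0: uniform on [0, t] has mean t/2
print(np.mean(first_times < 0.5), 0.5/t)  # ≈ 0.25
```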
Generalizations of the Poisson Process
Definition 5.3 The counting process $\{N(t), t \ge 0\}$ is said to be a nonhomogeneous Poisson process with intensity function $\lambda(t)$, $t \ge 0$, if
1. $N(0) = 0$.
2. $\{N(t), t \ge 0\}$ has independent increments.
3. $P\{N(t+h) - N(t) = 1\} = \lambda(t)h + o(h)$.
4. $P\{N(t+h) - N(t) \ge 2\} = o(h)$.
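One standard way to simulate such a process is thinning (Lewis–Shedler): generate candidate events from a homogeneous Poisson process with rate $\lambda_{\max} \ge \lambda(t)$, and accept each candidate at time $s$ independently with probability $\lambda(s)/\lambda_{\max}$. A minimal sketch, assuming an illustrative intensity $\lambda(t) = 1 + \sin^2(t)$ (bounded by $\lambda_{\max} = 2$):

```python
import numpy as np

rng = np.random.default_rng(7)

def intensity(t):
    return 1.0 + np.sin(t)**2             # assumed illustrative lambda(t) <= 2

def thinned_nhpp(T, lam_max=2.0):
    """Simulate a nonhomogeneous Poisson process on [0, T] by thinning."""
    times, s = [], 0.0
    while True:
        s += rng.exponential(1/lam_max)            # candidate event (rate lam_max)
        if s > T:
            return np.array(times)
        if rng.random() < intensity(s)/lam_max:    # keep with prob lambda(s)/lam_max
            times.append(s)

events = thinned_nhpp(10.0)
print(len(events))   # expected count = integral of lambda over [0, 10] ≈ 14.8
```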