Exponential Distribution

1. The exponential distribution describes the time between independent events in a Poisson process and is memoryless. An exponential distribution with rate $\lambda$ has mean $1/\lambda$ and variance $1/\lambda^2$. 2. A Poisson process is a counting process in which the numbers of events in non-overlapping time intervals are independent. The interarrival times of a Poisson process are exponentially distributed, and the waiting time for the $n$th event has a gamma distribution. 3. In a compound Poisson process, the total value accumulated over time is a sum of iid random variables, where the number of summands is given by a Poisson process. Its mean is $\lambda t$ times the mean of an individual value, and its variance is $\lambda t$ times the sum of the individual variance and the square of the individual mean.

Uploaded by Arabi Ali A
© All Rights Reserved

Exponential Distribution

Definition: Exponential distribution with parameter $\lambda > 0$:
$$f(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0 \\ 0 & x < 0 \end{cases}$$
The cdf:
$$F(x) = \int_{-\infty}^{x} f(u)\,du = \begin{cases} 1 - e^{-\lambda x} & x \ge 0 \\ 0 & x < 0 \end{cases}$$

Mean: $E(X) = 1/\lambda$.


Moment generating function:
$$\phi(t) = E[e^{tX}] = \frac{\lambda}{\lambda - t}, \quad t < \lambda.$$
$$E(X^2) = \frac{d^2}{dt^2}\phi(t)\Big|_{t=0} = 2/\lambda^2.$$
$$Var(X) = E(X^2) - (E(X))^2 = 1/\lambda^2.$$

Properties
1. Memoryless: $P(X > s + t \mid X > t) = P(X > s)$.
Proof:
$$P(X > s+t \mid X > t) = \frac{P(X > s+t,\, X > t)}{P(X > t)} = \frac{P(X > s+t)}{P(X > t)} = \frac{e^{-\lambda(s+t)}}{e^{-\lambda t}} = e^{-\lambda s} = P(X > s).$$

Example: Suppose that the amount of time one spends in a bank is exponentially distributed with mean 10 minutes, i.e., $\lambda = 1/10$. What is the probability that a customer will spend more than 15 minutes in the bank? What is the probability that a customer will spend more than 15 minutes in the bank, given that he is still in the bank after 10 minutes?
Solution:
$$P(X > 15) = e^{-15\lambda} = e^{-3/2} \approx 0.223.$$
By the memoryless property,
$$P(X > 15 \mid X > 10) = P(X > 5) = e^{-1/2} \approx 0.607.$$
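As a quick numerical check of this example, here is a short Python sketch (the rate `lam = 1/10` per minute comes from the example; the function name `surv` is just illustrative):

```python
import math

lam = 1 / 10  # rate: mean service time is 10 minutes

def surv(x, lam):
    """Survival function P(X > x) of an Exponential(lam) RV."""
    return math.exp(-lam * x)

p_more_15 = surv(15, lam)               # P(X > 15) = e^{-3/2}
p_cond = surv(15, lam) / surv(10, lam)  # P(X > 15 | X > 10)

print(round(p_more_15, 3))  # ~0.223
print(round(p_cond, 3))     # ~0.607
# Memorylessness: the conditional probability equals P(X > 5)
assert abs(p_cond - surv(5, lam)) < 1e-12
```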

Failure rate (hazard rate) function $r(t)$:
$$r(t) = \frac{f(t)}{1 - F(t)}.$$
Interpretation: $P(X \in (t, t+dt) \mid X > t) \approx r(t)\,dt$.
For the exponential distribution, $r(t) = \lambda$ for all $t > 0$, i.e., the failure rate is constant.
The failure rate function uniquely determines $F(t)$:
$$F(t) = 1 - e^{-\int_0^t r(s)\,ds}.$$
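A minimal sketch confirming numerically that the exponential hazard rate is the constant $\lambda$ (the rate `lam = 2.0` is an arbitrary illustrative choice):

```python
import math

lam = 2.0  # illustrative rate

def pdf(t, lam):
    return lam * math.exp(-lam * t)

def cdf(t, lam):
    return 1 - math.exp(-lam * t)

def hazard(t, lam):
    # r(t) = f(t) / (1 - F(t))
    return pdf(t, lam) / (1 - cdf(t, lam))

# The hazard should equal lam at every t > 0
for t in [0.1, 1.0, 5.0]:
    assert abs(hazard(t, lam) - lam) < 1e-9
```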

2. If $X_i$, $i = 1, 2, \ldots, n$, are iid exponential RVs with mean $1/\lambda$, the pdf of $\sum_{i=1}^{n} X_i$ is
$$f_{X_1 + X_2 + \cdots + X_n}(t) = \lambda e^{-\lambda t}\,\frac{(\lambda t)^{n-1}}{(n-1)!},$$
the gamma distribution with parameters $n$ and $\lambda$.
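This sum-of-exponentials result can be checked by simulation. The sketch below (parameter values are illustrative) compares the sample mean and variance of $\sum_i X_i$ against the gamma values $n/\lambda$ and $n/\lambda^2$:

```python
import random

random.seed(0)
lam, n, trials = 0.5, 5, 200_000

# Each sample is a sum of n iid Exponential(lam) draws,
# which should follow a Gamma(n, lam) distribution.
sums = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]

mean = sum(sums) / trials
var = sum((s - mean) ** 2 for s in sums) / trials

assert abs(mean - n / lam) < 0.1     # E = n/lam = 10
assert abs(var - n / lam**2) < 0.5   # Var = n/lam^2 = 20
```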


3. If $X_1$ and $X_2$ are independent exponential RVs with means $1/\lambda_1$ and $1/\lambda_2$, then
$$P(X_1 < X_2) = \frac{\lambda_1}{\lambda_1 + \lambda_2}.$$
4. If $X_i$, $i = 1, 2, \ldots, n$, are independent exponential RVs with rates $\lambda_i$, let $Z = \min(X_1, \ldots, X_n)$ and $Y = \max(X_1, \ldots, X_n)$. Find the distributions of $Z$ and $Y$.
$Z$ is an exponential RV with rate $\sum_{i=1}^{n} \lambda_i$:
$$P(Z > x) = P(\min(X_1, \ldots, X_n) > x) = P(X_1 > x, X_2 > x, \ldots, X_n > x) = \prod_{i=1}^{n} P(X_i > x) = \prod_{i=1}^{n} e^{-\lambda_i x} = e^{-\left(\sum_{i=1}^{n} \lambda_i\right) x}.$$
For the maximum,
$$F_Y(x) = P(Y \le x) = \prod_{i=1}^{n} \left(1 - e^{-\lambda_i x}\right).$$
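A small simulation sketch of the minimum result (the three rates are illustrative): the minimum of independent exponentials should itself be exponential with rate $\sum_i \lambda_i$.

```python
import random, math

random.seed(1)
rates = [1.0, 2.0, 3.0]  # illustrative rates lambda_i
total = sum(rates)       # the minimum should be Exponential(total)
trials = 100_000

mins = [min(random.expovariate(r) for r in rates) for _ in range(trials)]

# Empirical survival P(Z > x) vs. the theoretical e^{-(sum lambda_i) x}
x = 0.2
emp = sum(m > x for m in mins) / trials
assert abs(emp - math.exp(-total * x)) < 0.01

# Sample mean of Z should be near 1 / (lambda_1 + lambda_2 + lambda_3)
assert abs(sum(mins) / trials - 1 / total) < 0.01
```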

Poisson Process
Counting process: A stochastic process $\{N(t), t \ge 0\}$ is a counting process if $N(t)$ represents the total number of events that have occurred up to time $t$. Then:
$N(t) \ge 0$ and $N(t)$ is integer valued.
$N(t)$ is nondecreasing in $t$.
Independent increments: the numbers of events occurring in disjoint time intervals are independent.
Stationary increments: the distribution of the number of events occurring in a time interval depends only on the length of the interval, not on its position.

A counting process $\{N(t), t \ge 0\}$ is a Poisson process with rate $\lambda > 0$ if
1. $N(0) = 0$.
2. The process has independent increments.
3. The process has stationary increments, and $N(t+s) - N(s)$ follows a Poisson distribution with parameter $\lambda t$:
$$P(N(t+s) - N(s) = n) = e^{-\lambda t}\,\frac{(\lambda t)^n}{n!}, \quad n = 0, 1, \ldots$$
Note: $E[N(t+s) - N(s)] = \lambda t$; in particular, $E[N(t)] = E[N(t+0) - N(0)] = \lambda t$.

Interarrival and Waiting Time

Define $T_n$ as the time elapsed between the $(n-1)$st and the $n$th event. $\{T_n, n = 1, 2, \ldots\}$ is the sequence of interarrival times.
Proposition 5.1: $T_n$, $n = 1, 2, \ldots$, are independent identically distributed exponential random variables with mean $1/\lambda$.
Define $S_n$ as the waiting time for the $n$th event, i.e., the arrival time of the $n$th event:
$$S_n = \sum_{i=1}^{n} T_i.$$
Distribution of $S_n$:
$$f_{S_n}(t) = \lambda e^{-\lambda t}\,\frac{(\lambda t)^{n-1}}{(n-1)!},$$
the gamma distribution with parameters $n$ and $\lambda$. Hence
$$E(S_n) = \sum_{i=1}^{n} E(T_i) = n/\lambda.$$

Example: Suppose that people immigrate into a territory at a Poisson rate $\lambda = 1$ per day. (a) What is the expected time until the tenth immigrant arrives? (b) What is the probability that the elapsed time between the tenth and the eleventh arrival exceeds two days?
Solution:
The time until the 10th immigrant arrives is $S_{10}$, so $E(S_{10}) = 10/\lambda = 10$ days.
$$P(T_{11} > 2) = e^{-2\lambda} = e^{-2} \approx 0.135.$$
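Both answers can be reproduced directly, and the expected waiting time checked against a simulation of the interarrival times (trial counts are illustrative):

```python
import random, math

random.seed(2)
lam = 1.0  # immigrants per day

# (a) Expected time until the 10th immigrant: E(S_10) = 10 / lam
e_s10 = 10 / lam

# (b) The interarrival time T_11 is Exponential(lam)
p_gap = math.exp(-2 * lam)  # P(T_11 > 2) ~ 0.135

# Simulation: S_10 is a sum of 10 iid Exponential(lam) interarrival times
trials = 50_000
s10 = [sum(random.expovariate(lam) for _ in range(10)) for _ in range(trials)]
assert abs(sum(s10) / trials - e_s10) < 0.1

gaps = [random.expovariate(lam) for _ in range(trials)]
assert abs(sum(g > 2 for g in gaps) / trials - p_gap) < 0.01
```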

Further Properties
Consider a Poisson process $\{N(t), t \ge 0\}$ with rate $\lambda$. Suppose each event belongs to one of two types, I and II, and the type of an event is independent of everything else. The probability that an event is of type I is $p$.
Examples: female vs. male customers, good emails vs. spam.
Let $N_1(t)$ be the number of type I events up to time $t$, and $N_2(t)$ the number of type II events up to time $t$, so that
$$N(t) = N_1(t) + N_2(t).$$

Proposition 5.2: $\{N_1(t), t \ge 0\}$ and $\{N_2(t), t \ge 0\}$ are both Poisson processes, with respective rates $\lambda p$ and $\lambda(1-p)$. Furthermore, the two processes are independent.
Example: If immigrants to area A arrive at a Poisson rate of 10 per week, and if each immigrant is of English descent with probability 1/12, then what is the probability that no people of English descent will immigrate to area A during the month of February?
Solution:
The number of immigrants of English descent up to time $t$ is $N_1(t)$, a Poisson process with rate $\lambda/12 = 10/12$ per week. Taking February to be 4 weeks,
$$P(N_1(4) = 0) = e^{-(10/12) \cdot 4} = e^{-10/3} \approx 0.036.$$
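Proposition 5.2 can be sanity-checked by simulation: generate a rate-10 process, independently mark each arrival as type I with probability 1/12, and compare the fraction of runs with no type-I events in 4 weeks against $e^{-10/3}$. A sketch (trial count is illustrative):

```python
import random, math

random.seed(3)
lam, p, t, trials = 10.0, 1 / 12, 4.0, 20_000  # rate/week, type-I prob, 4 weeks

zero_count = 0
for _ in range(trials):
    n = 0  # number of type-I arrivals in [0, t]
    clock = random.expovariate(lam)
    while clock <= t:
        if random.random() < p:  # independent thinning of each arrival
            n += 1
        clock += random.expovariate(lam)
    zero_count += (n == 0)

# Theory: N_1(4) ~ Poisson(lam * p * 4) = Poisson(10/3)
assert abs(zero_count / trials - math.exp(-10 / 3)) < 0.01
```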

Conversely: Suppose $\{N_1(t), t \ge 0\}$ and $\{N_2(t), t \ge 0\}$ are independent Poisson processes with respective rates $\lambda_1$ and $\lambda_2$. Then $N(t) = N_1(t) + N_2(t)$ is a Poisson process with rate $\lambda = \lambda_1 + \lambda_2$. For any event of unknown type, independently of everything else, the probability that it is of type I is
$$p = \frac{\lambda_1}{\lambda_1 + \lambda_2},$$
and of type II is $1 - p$.
Example: On a road, cars pass according to a Poisson process with rate 5 per minute, and trucks pass according to a Poisson process with rate 1 per minute. The two processes are independent. If 10 vehicles passed by in 3 minutes, what is the probability that 2 of them are trucks?
Solution:
Each vehicle is independently a car with probability $\frac{5}{5+1} = \frac{5}{6}$ and a truck with probability $\frac{1}{6}$. The probability that 2 out of 10 vehicles are trucks is given by the binomial distribution:
$$\binom{10}{2} \left(\frac{1}{6}\right)^2 \left(\frac{5}{6}\right)^8 \approx 0.29.$$
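The binomial answer in this example is a one-liner to evaluate:

```python
import math

n, k = 10, 2
p_truck = 1 / (5 + 1)  # rate_truck / (rate_car + rate_truck)

# Binomial probability that exactly k of n vehicles are trucks
prob = math.comb(n, k) * p_truck**k * (1 - p_truck) ** (n - k)
print(round(prob, 3))  # ~0.291
```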


Conditional Distribution of Arrival Times

Consider a Poisson process $\{N(t), t \ge 0\}$ with rate $\lambda$. Suppose exactly one event has occurred by time $t$. What is the conditional distribution of $T_1$?
Given this condition, $T_1$ is uniformly distributed on $[0, t]$.
Proof: for $0 \le s \le t$,
$$P(T_1 < s \mid N(t) = 1) = \frac{P(T_1 < s,\, N(t) = 1)}{P(N(t) = 1)} = \frac{P(N(s) = 1,\, N(t) - N(s) = 0)}{P(N(t) = 1)} = \frac{P(N(s) = 1)\,P(N(t) - N(s) = 0)}{P(N(t) = 1)} = \frac{(\lambda s\, e^{-\lambda s})\, e^{-\lambda(t-s)}}{\lambda t\, e^{-\lambda t}} = \frac{s}{t},$$
which is the cdf of a uniform distribution on $[0, t]$.
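A simulation sketch of this conditional-uniform fact: generate the process on $[0, t]$, keep only runs with exactly one event, and check that the retained $T_1$ values look uniform (the rate and trial count are illustrative):

```python
import random

random.seed(4)
lam, t, trials = 2.0, 1.0, 200_000

kept = []
for _ in range(trials):
    # Generate arrivals on [0, t]; record T_1 when exactly one event occurs
    arrivals = []
    clock = random.expovariate(lam)
    while clock <= t:
        arrivals.append(clock)
        clock += random.expovariate(lam)
    if len(arrivals) == 1:
        kept.append(arrivals[0])

# Given N(t) = 1, T_1 should be Uniform(0, t): mean t/2,
# and P(T_1 < s | N(t) = 1) = s / t
mean = sum(kept) / len(kept)
assert abs(mean - t / 2) < 0.01
s = 0.3
assert abs(sum(x < s for x in kept) / len(kept) - s / t) < 0.02
```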

If $N(t) = n$, what is the joint conditional distribution of the arrival times $S_1, S_2, \ldots, S_n$?
Answer: $S_1, S_2, \ldots, S_n$ are distributed as the order statistics of $n$ independent random variables uniformly distributed on $[0, t]$.
Let $Y_1, Y_2, \ldots, Y_n$ be $n$ RVs. $Y_{(1)}, Y_{(2)}, \ldots, Y_{(n)}$ are the order statistics of $Y_1, Y_2, \ldots, Y_n$ if $Y_{(k)}$ is the $k$th smallest value among them.
If $Y_i$, $i = 1, \ldots, n$, are iid continuous RVs with pdf $f$, then the joint density of the order statistics $Y_{(1)}, Y_{(2)}, \ldots, Y_{(n)}$ is
$$f_{Y_{(1)}, Y_{(2)}, \ldots, Y_{(n)}}(y_1, y_2, \ldots, y_n) = \begin{cases} n! \prod_{i=1}^{n} f(y_i) & y_1 < y_2 < \cdots < y_n \\ 0 & \text{otherwise.} \end{cases}$$

We can show that
$$f(s_1, s_2, \ldots, s_n \mid N(t) = n) = \frac{n!}{t^n}, \quad 0 < s_1 < s_2 < \cdots < s_n < t.$$
Proof:
$$f(s_1, s_2, \ldots, s_n \mid N(t) = n) = \frac{f(s_1, s_2, \ldots, s_n, n)}{P(N(t) = n)} = \frac{\lambda e^{-\lambda s_1}\, \lambda e^{-\lambda(s_2 - s_1)} \cdots \lambda e^{-\lambda(s_n - s_{n-1})}\, e^{-\lambda(t - s_n)}}{e^{-\lambda t}(\lambda t)^n / n!} = \frac{n!}{t^n}, \quad 0 < s_1 < \cdots < s_n < t.$$
For comparison, $n$ independent uniform RVs $Y_1, \ldots, Y_n$ on $[0, t]$ have joint density
$$f(y_1, y_2, \ldots, y_n) = \frac{1}{t^n},$$
so the conditional density above is exactly that of their order statistics.
Proposition 5.4: Given $S_n = t$, the arrival times $S_1, S_2, \ldots, S_{n-1}$ have the distribution of the order statistics of a set of $n-1$ independent uniform $(0, t)$ random variables.

Generalization of Poisson Process

Nonhomogeneous Poisson process: The counting process $\{N(t), t \ge 0\}$ is said to be a nonhomogeneous Poisson process with intensity function $\lambda(t)$, $t \ge 0$, if
1. $N(0) = 0$.
2. The process has independent increments.
3. The distribution of $N(t+s) - N(t)$ is Poisson with mean $m(t+s) - m(t)$, where
$$m(t) = \int_0^t \lambda(\tau)\,d\tau.$$
We call $m(t)$ the mean value function. The Poisson process is the special case where $\lambda(t) = \lambda$, a constant.
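One standard way to simulate a nonhomogeneous Poisson process is by thinning a homogeneous one: generate candidate arrivals at a rate $\lambda^* \ge \lambda(t)$ and accept each candidate at time $t$ with probability $\lambda(t)/\lambda^*$. A minimal sketch, with an illustrative intensity $\lambda(t) = 2t$ on $[0, 1]$ (so $m(1) = 1$):

```python
import random

random.seed(5)

def simulate_nhpp(intensity, lam_max, t_end, rng):
    """Thinning: candidates from a rate-lam_max homogeneous process,
    each accepted with probability intensity(t) / lam_max."""
    times, clock = [], rng.expovariate(lam_max)
    while clock <= t_end:
        if rng.random() < intensity(clock) / lam_max:
            times.append(clock)
        clock += rng.expovariate(lam_max)
    return times

intensity = lambda t: 2 * t  # illustrative intensity, bounded by 2 on [0, 1]
# m(1) = integral of 2t over [0, 1] = 1, so E[N(1)] = 1
counts = [len(simulate_nhpp(intensity, 2.0, 1.0, random)) for _ in range(50_000)]
assert abs(sum(counts) / len(counts) - 1.0) < 0.02
```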

Compound Poisson process: A stochastic process $\{X(t), t \ge 0\}$ is said to be a compound Poisson process if it can be represented as
$$X(t) = \sum_{i=1}^{N(t)} Y_i, \quad t \ge 0,$$
where $\{N(t), t \ge 0\}$ is a Poisson process and $\{Y_i, i \ge 1\}$ is a family of independent and identically distributed random variables that are also independent of $\{N(t), t \ge 0\}$.
The random variable $X(t)$ is said to be a compound Poisson random variable.
Example: Suppose customers leave a supermarket in accordance with a Poisson process. If $Y_i$, the amount spent by the $i$th customer, $i = 1, 2, \ldots$, are independent and identically distributed, then $X(t) = \sum_{i=1}^{N(t)} Y_i$, the total amount of money spent by customers by time $t$, is a compound Poisson process.

Find $E[X(t)]$ and $Var[X(t)]$.
$$E[X(t)] = \lambda t\, E(Y_1), \qquad Var[X(t)] = \lambda t\,\big(Var(Y_1) + E^2(Y_1)\big).$$
Proof: Conditioning on $N(t)$,
$$E(X(t) \mid N(t) = n) = E\Big(\sum_{i=1}^{N(t)} Y_i \,\Big|\, N(t) = n\Big) = E\Big(\sum_{i=1}^{n} Y_i \,\Big|\, N(t) = n\Big) = E\Big(\sum_{i=1}^{n} Y_i\Big) = n\,E(Y_1),$$
where the conditioning drops because the $Y_i$ are independent of $N(t)$. Then
$$E(X(t)) = E\big[E(X(t) \mid N(t))\big] = \sum_{n=1}^{\infty} P(N(t) = n)\, E(X(t) \mid N(t) = n) = \sum_{n=1}^{\infty} P(N(t) = n)\, n\,E(Y_1) = E(Y_1) \sum_{n=1}^{\infty} n\, P(N(t) = n) = E(Y_1)\,E(N(t)) = \lambda t\, E(Y_1).$$

Similarly,
$$Var(X(t) \mid N(t) = n) = Var\Big(\sum_{i=1}^{N(t)} Y_i \,\Big|\, N(t) = n\Big) = Var\Big(\sum_{i=1}^{n} Y_i \,\Big|\, N(t) = n\Big) = Var\Big(\sum_{i=1}^{n} Y_i\Big) = n\, Var(Y_1).$$
Since
$$Var(X(t) \mid N(t) = n) = E(X^2(t) \mid N(t) = n) - \big(E(X(t) \mid N(t) = n)\big)^2,$$
we get
$$E(X^2(t) \mid N(t) = n) = Var(X(t) \mid N(t) = n) + \big(E(X(t) \mid N(t) = n)\big)^2 = n\, Var(Y_1) + n^2 E^2(Y_1).$$
Therefore
$$Var(X(t)) = E(X^2(t)) - (E(X(t)))^2 = \sum_{n=1}^{\infty} P(N(t) = n)\,\big(n\, Var(Y_1) + n^2 E^2(Y_1)\big) - (\lambda t\, E(Y_1))^2$$
$$= Var(Y_1)\,E(N(t)) + E^2(Y_1)\,E(N^2(t)) - (\lambda t\, E(Y_1))^2 = \lambda t\, Var(Y_1) + \lambda t\, E^2(Y_1) = \lambda t\,\big(Var(Y_1) + E^2(Y_1)\big) = \lambda t\, E(Y_1^2),$$
where we used $E(N^2(t)) = \lambda t + (\lambda t)^2$.
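Both moment formulas can be checked by simulation. This sketch uses an illustrative setup (rate $\lambda = 3$, horizon $t = 2$, and spends $Y_i$ uniform on $[0, 10]$, for which $E(Y) = 5$ and $E(Y^2) = 100/3$):

```python
import random

random.seed(6)
lam, t, trials = 3.0, 2.0, 100_000

def compound_poisson(lam, t, rng):
    # Count Poisson arrivals on [0, t], then sum that many iid spends
    n, clock = 0, rng.expovariate(lam)
    while clock <= t:
        n += 1
        clock += rng.expovariate(lam)
    return sum(rng.uniform(0, 10) for _ in range(n))

xs = [compound_poisson(lam, t, random) for _ in range(trials)]
mean = sum(xs) / trials
var = sum((x - mean) ** 2 for x in xs) / trials

ey, ey2 = 5.0, 100 / 3  # E(Y) and E(Y^2) for Uniform(0, 10)
assert abs(mean - lam * t * ey) < 0.3  # E[X(t)] = lam*t*E(Y) = 30
assert abs(var - lam * t * ey2) < 5    # Var[X(t)] = lam*t*E(Y^2) = 200
```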
