Queuing Theory: Little's Theorem
Little's Theorem: $N = \lambda T$
[Diagram: customers arrive at the system at rate $\lambda$ and depart from it at rate $\lambda$.]
• Holds for any (ergodic) system with a steady state
• Def.
$\alpha(t)$ = the number of arrivals at the system in the interval from time 0 to time $t$ = number of arrivals in $[0, t]$
$\beta(t)$ = the number of customer departures in the interval from time 0 to time $t$ = number of departures in $[0, t]$
$N(t)$ = number of customers in the system at time $t$ = $\alpha(t) - \beta(t)$
$N$ = average (steady-state, long-run, expected) number of customers in the system (waiting for service or receiving service) in equilibrium
$$N = \lim_{t\to\infty}\frac{1}{t}\int_0^t N(t')\,dt' \quad \text{[customers]}$$
$\lambda$ = average (long-run) arrival rate of customers
$$\lambda = \lim_{t\to\infty}\frac{\text{number of arrivals in }[0,t]}{t} = \lim_{t\to\infty}\frac{\alpha(t)}{t}$$
[Figure: sample paths of $\alpha(s)$ and $\beta(s)$ versus time $s$; the vertical gap between them is $N(s)$, and the horizontal gap at level $i$ is $T_i$, the time in system of the $i$th customer, measured from the arrival of the $i$th customer.]
• Let $t$ = a time instant where $\alpha(t) = \beta(t)$, which implies $N(t) = 0$.
The area between $\alpha(t)$ and $\beta(t)$ from 0 to $t$ can be computed two ways:
1) horizontally, area $= \sum_{j=1}^{\alpha(t)} T_j$
2) vertically, area $= \int_0^t N(s)\,ds = \int_0^t \big(\alpha(s) - \beta(s)\big)\,ds$
Hence, we have $\int_0^t N(s)\,ds = \sum_{j=1}^{\alpha(t)} T_j$.
Thus,
$$N = \lim_{t\to\infty}\frac{1}{t}\int_0^t N(s)\,ds
= \lim_{t\to\infty}\frac{1}{t}\sum_{j=1}^{\alpha(t)} T_j
= \lim_{t\to\infty}\frac{\alpha(t)}{t}\sum_{j=1}^{\alpha(t)}\frac{T_j}{\alpha(t)}
= \left(\lim_{t\to\infty}\frac{\alpha(t)}{t}\right)\left(\lim_{t\to\infty}\sum_{j=1}^{\alpha(t)}\frac{T_j}{\alpha(t)}\right)
= \lambda T.$$
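A minimal simulation sketch (not from the notes; the rates lam = 0.8, mu = 1.0 and the FIFO single-server discipline are illustrative choices) that checks $N = \lambda T$ numerically: it generates Poisson arrivals and exponential services, computes the time-average occupancy as the area between $\alpha(t)$ and $\beta(t)$ divided by the horizon, and compares it with the product of the empirical arrival rate and the mean sojourn time.

```python
import random

def simulate_fifo(lam=0.8, mu=1.0, n_customers=200_000, seed=1):
    """Simulate a FIFO single-server queue; return (N, lambda_hat, T)."""
    random.seed(seed)
    t_arr, t_dep_prev = 0.0, 0.0
    arrivals, departures = [], []
    for _ in range(n_customers):
        t_arr += random.expovariate(lam)             # Poisson arrival epochs
        start = max(t_arr, t_dep_prev)               # FIFO: wait until the server is free
        t_dep_prev = start + random.expovariate(mu)  # exponential service
        arrivals.append(t_arr)
        departures.append(t_dep_prev)

    sojourns = [d - a for a, d in zip(arrivals, departures)]
    horizon = departures[-1]
    N_avg = sum(sojourns) / horizon        # area between alpha(t) and beta(t), divided by t
    lam_hat = n_customers / arrivals[-1]   # empirical arrival rate
    T_avg = sum(sojourns) / n_customers    # mean time in system
    return N_avg, lam_hat, T_avg

N_avg, lam_hat, T_avg = simulate_fifo()
print(f"N          = {N_avg:.3f}")
print(f"lambda * T = {lam_hat * T_avg:.3f}")   # agrees with N, per Little's theorem
```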
Queuing Theory
• Standard queuing theory nomenclature: Arrival process / Service time / Servers / Max occupancy
Arrival process (interarrival time $\tau$): M = exponential, D = deterministic, G = general
Service time ($X$): M = exponential, D = deterministic, G = general
Servers: 1 server, $c$ servers, or $\infty$
Max occupancy: $K$ customers (unspecified if unlimited)
Arrival rate: $\lambda = \dfrac{1}{E\tau}$.  Service rate: $\mu = \dfrac{1}{EX}$.
Let $p_n(t) = \Pr[N(t) = n]$ and $p_n = \lim_{t\to\infty} p_n(t)$.
Then define
$$a_n(t) = P\{N(t) = n \mid \text{an arrival occurred just after time } t\} = \lim_{\delta\to 0} P\{N(t) = n \mid A(t, t+\delta)\},$$
where $A(t, t+\delta)$ denotes the event that an arrival occurs in $(t, t+\delta]$.
If the event $A(t, t+\delta)$ is independent of $N(t)$, then
$$a_n(t) = \lim_{\delta\to 0} P\{N(t) = n\} = P\{N(t) = n\} = p_n(t).$$
Taking the limit as $t \to \infty$, from the definitions of $a_n$ and $p_n$, we obtain $a_n = p_n$.
• Ex. non-Poisson arrival process.
Suppose interarrival times are independent and uniformly distributed on $[a, b]$, $a < b$. Customer service times are all equal to $c < a$ sec.
• Then, an arriving customer always finds an empty system ($N = 0$).
• On the other hand, the average number in the system as seen by an outside observer looking at the system at a random time is $N = \lambda T$, where
$$\lambda = \frac{1}{E\tau} = \frac{1}{\frac{a+b}{2}} = \frac{2}{a+b} \quad\text{and}\quad T = c.$$
Thus, $N = \lambda T = \dfrac{2c}{a+b}$.
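A quick numerical check of this example (the values a = 2, b = 4, c = 1 are illustrative and satisfy c < a): since services never overlap, the time-average occupancy is the long-run fraction of time one customer is present, which should match $2c/(a+b)$.

```python
import random

def time_average_N(a=2.0, b=4.0, c=1.0, n=500_000, seed=1):
    """Uniform[a,b] interarrivals, constant service c < a: return time-average N."""
    random.seed(seed)
    busy_time, clock = 0.0, 0.0
    for _ in range(n):
        clock += random.uniform(a, b)   # next arrival instant
        busy_time += c                  # that customer occupies the system for c seconds
    return busy_time / clock            # fraction of time N(t) = 1

a, b, c = 2.0, 4.0, 1.0
print("simulated N:", round(time_average_N(a, b, c), 4))
print("2c/(a+b)   :", round(2 * c / (a + b), 4))
```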
• Ex. service times and future arrival times are correlated.
Packet arrivals form a Poisson process. The transmission time of the $n$th packet equals one half the interarrival time between packets $n$ and $n+1$.
• Upon arrival, a packet finds the system empty.
• On the other hand, the average number in the system as seen by an outside observer looking at the system at a random time is
$$N = \lambda T = \frac{1}{\overline{\tau}}\left(\frac{\overline{\tau}}{2}\right) = \frac{1}{2}.$$
Occupancy Distribution upon Departure
• The distribution of customers in the system just after a departure has occurred.
• $d_n(t) = P\{N(t) = n \mid \text{a departure occurred just before time } t\}$
• steady-state values $d_n = \lim_{t\to\infty} d_n(t)$, $n$ = 0, 1, …
• $d_n = a_n$, $n$ = 0, 1, …, if
• the system reaches a steady state with all $n$ having positive steady-state probabilities, and
• $N(t)$ changes in unit increments.
For any sample path of the system and for every $n$, the number in the system will be $n$ infinitely often (with probability 1).
⇒ For each time the number in the system increases from $n$ to $n+1$ due to an arrival, there will be a corresponding future decrease from $n+1$ to $n$ due to a departure.
⇒ In the long run, the frequency of transitions from $n$ to $n+1$ (out of transitions from any $k$ to $k+1$) equals the frequency of transitions from $n+1$ to $n$ (out of transitions from any $k+1$ to $k$), which implies that $d_n = a_n$.
M/G/1
• “G” ≡ general (really, GI ≡ general independent)
Service times are i.i.d.
$\Pr[\text{service time} \le t] = H(t)$ ≡ cdf of the service time; it does not have to be continuous.
Mean service time $h = \displaystyle\int_0^\infty t\,dH(t)$.
• $R(z) = h^*(\lambda(1-z))$
$$N(z) = \frac{(1-\rho)(z-1)R(z)}{z - R(z)}$$
• $N = \displaystyle\sum_{m=0}^{\infty} m P(n=m) = N'(1) = \rho + \frac{\lambda^2\overline{h^2}}{2(1-\rho)}$
$$T = h + \frac{\lambda\overline{h^2}}{2(1-\rho)}, \qquad W_q = T - h = \frac{\lambda\overline{h^2}}{2(1-\rho)}, \qquad N_q = \frac{\lambda^2\overline{h^2}}{2(1-\rho)}$$
• Distribution of $N$:
$$P(n=m) = \begin{cases} 1-\rho & m = 0 \\[4pt] \dfrac{1}{m!}\left.\dfrac{d^m N(z)}{dz^m}\right|_{z=0} & m > 0 \end{cases}$$
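A small helper (not part of the notes; the argument names lam, h, h2 are just labels for $\lambda$, $h$, $\overline{h^2}$) that evaluates the mean-value formulas above for any M/G/1 queue, shown here on the exponential-service case where the answer must reduce to the M/M/1 value $\rho/(1-\rho)$.

```python
def mg1_means(lam: float, h: float, h2: float) -> dict:
    """Pollaczek-Khinchine mean values for an M/G/1 queue."""
    rho = lam * h
    assert rho < 1, "queue is unstable for rho >= 1"
    Wq = lam * h2 / (2 * (1 - rho))      # mean wait in queue
    return {"rho": rho,
            "N": rho + lam**2 * h2 / (2 * (1 - rho)),
            "T": h + Wq,                 # mean time in system
            "Wq": Wq,
            "Nq": lam * Wq}              # Little's theorem applied to the queue alone

# Exponential service with mean 1 has h2 = 2, so at lam = 0.8 we expect N = 4.
print(mg1_means(lam=0.8, h=1.0, h2=2.0))
```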
• $\{N(t)\}$ is no longer Markov in the non-exponential service time case. However, we can embed a discrete-time Markov chain at the departure instants.
• Define
$n_k$ = number of customers in the system right after (upon) the departure of customer $k$ (so, not including customer $k$ itself),
$s_k$ = service time of customer $k$; assume the $s_k$ are i.i.d. with common cdf $H(t) = F_s(t) = P(s_k \le t)$,
$r_k$ = number of (Poisson) arrivals during the service time of customer $k$.
Let $h = \displaystyle\int_0^\infty t\,dH(t)$ be the mean service time.
So,
$$R(z) = E z^{r_k} = \sum_{n=0}^{\infty}\int_0^\infty \frac{(\lambda t)^n}{n!}e^{-\lambda t}\,dH(t)\,z^n = \int_0^\infty e^{-\lambda t(1-z)}\,dH(t) = h^*(\lambda(1-z)).$$
Note: the Poisson arrival assumption is required to prove $R(z) = h^*(\lambda(1-z))$.
• The quantity of principal interest is $\lim_{k\to\infty} N_k(z) = N(z)$, where $N_k(z) = E z^{n_k}$. (We will show later that $N'(1) = N$.)
Recall that
$$n_k = \begin{cases} n_{k-1} + r_k - 1 & ; n_{k-1} > 0 \\ r_k & ; n_{k-1} = 0 \end{cases}$$
hence,
$$\begin{aligned}
N_k(z) = E z^{n_k} &= P(n_{k-1}=0)\,E\!\left(z^{n_k}\mid n_{k-1}=0\right) + P(n_{k-1}>0)\,E\!\left(z^{n_k}\mid n_{k-1}>0\right) \\
&= P(n_{k-1}=0)\,E z^{r_k} + P(n_{k-1}>0)\,E\!\left(z^{n_{k-1}+r_k-1}\mid n_{k-1}>0\right).
\end{aligned}$$
For the second term,
$$\begin{aligned}
P(n_{k-1}>0)\,E\!\left(z^{n_{k-1}+r_k-1}\mid n_{k-1}>0\right)
&= \frac{1}{z}\,P(n_{k-1}>0)\,E\!\left(z^{n_{k-1}}\mid n_{k-1}>0\right)R(z) \\
&= \frac{1}{z}\,P(n_{k-1}>0)\left(\sum_{n=1}^{\infty}\frac{P(n_{k-1}=n)}{P(n_{k-1}>0)}\,z^n\right)R(z) \\
&= \frac{1}{z}\left(\sum_{n=1}^{\infty}P(n_{k-1}=n)\,z^n\right)R(z) \\
&= \frac{1}{z}\left(\sum_{n=0}^{\infty}P(n_{k-1}=n)\,z^n - P(n_{k-1}=0)\,z^0\right)R(z) \\
&= \frac{1}{z}\left(N_{k-1}(z) - P(n_{k-1}=0)\right)R(z).
\end{aligned}$$
Thus,
$$N_k(z) = P(n_{k-1}=0)\,R(z) + \frac{1}{z}\left(N_{k-1}(z) - P(n_{k-1}=0)\right)R(z).$$
• As $k \to \infty$ with $\rho < 1$, we get
• $N_k(z)$ and $N_{k-1}(z)$ → $N(z)$, and
• $P(n_{k-1}=0) \to p_0$.
We already know that, for a single server, $\rho = 1 - p_0$.
Thus, we have
$$N(z) = (1-\rho)R(z) + \frac{1}{z}\left(N(z) - (1-\rho)\right)R(z)$$
$$z\,N(z) = z(1-\rho)R(z) + N(z)R(z) - (1-\rho)R(z)$$
$$N(z) = \frac{(1-\rho)(z-1)R(z)}{z - R(z)}.$$
• To find $N$:
$$N = \sum_{m=0}^{\infty} m P(n=m), \qquad N(z) = E z^{n} = \sum_{m=0}^{\infty} P(n=m)\,z^m,$$
$$\frac{d}{dz}N(z) = N'(z) = \sum_{m=0}^{\infty} m P(n=m)\,z^{m-1}, \qquad N'(1) = \sum_{m=0}^{\infty} m P(n=m) = N.$$
Let $b(u) = R(u+1) = h^*(-\lambda u) = \displaystyle\int_0^\infty e^{\lambda u t}\,dH(t)$, and define
$$G(u) \equiv N(z)\big|_{z=u+1} = \frac{(1-\rho)\,u\,b(u)}{u + 1 - b(u)}.$$
• By Taylor's Theorem:
$$b(u) = b(0) + b'(0)\,u + \frac{b''(0)}{2}u^2 + o(u^2) \quad \text{as } u \to 0.$$
Note that $b_0 = b(0) = \displaystyle\int_0^\infty e^{\lambda\cdot 0\cdot t}\,dH(t) = \int_0^\infty dH(t) = 1$. Also, $b'(u) = \lambda\displaystyle\int_0^\infty t\,e^{\lambda u t}\,dH(t)$. Hence,
$$b_1 = b'(0) = \lambda\int_0^\infty t\,e^{\lambda\cdot 0\cdot t}\,dH(t) = \lambda\int_0^\infty t\,dH(t) = \lambda h = \rho.$$
The second derivative is $b''(u) = \lambda^2\displaystyle\int_0^\infty t^2 e^{\lambda u t}\,dH(t)$. Therefore,
$$b''(0) = \lambda^2\int_0^\infty t^2\,dH(t) = \lambda^2\overline{h^2}, \qquad b_2 = \frac{b''(0)}{2} = \frac{\lambda^2\overline{h^2}}{2}.$$
• $N = \rho + \dfrac{\lambda^2\overline{h^2}}{2(1-\rho)}$
Proof.
$$G(u) = \frac{(1-\rho)\,u\,b(u)}{u+1-b(u)}
= \frac{(1-\rho)\,u\,b(u)}{u+1-b_0-b_1 u-b_2 u^2+o(u^2)}
= \frac{(1-\rho)\,u\,b(u)}{u+1-1-b_1 u-b_2 u^2+o(u^2)}
= \frac{(1-\rho)\,b(u)}{(1-b_1)-b_2 u+o(u)}
= \frac{1-\rho}{1-b_1}\;b(u)\;\frac{1}{1-\frac{b_2}{1-b_1}u+o(u)}.$$
Now, note that as $x \to 0$, $\dfrac{1}{1-x+o(x)} = 1+x+o(x)$.
Pf. First, note that $\dfrac{1}{1-x} = 1+x+o(x)$. We will show that if $\dfrac{1}{f(x)} = g(x)+o(x)$, then $\dfrac{1}{f(x)+o(x)} = g(x)+o(x)$.
Starting from $\dfrac{1}{f(x)} = g(x)+o(x)$, we have $\displaystyle\lim_{x\to 0}\left(\frac{1}{x f(x)} - \frac{g(x)}{x}\right) = 0$.
Now,
$$\lim_{x\to 0}\frac{1}{x}\left(\frac{1}{f(x)+o(x)} - g(x)\right) = \lim_{x\to 0}\left(\underbrace{\frac{1}{x f(x)+o(x^2)} - \frac{g(x)}{x}}_{\to\,0}\right) = 0.$$
Hence,
$$G(u) = \frac{1-\rho}{1-b_1}\left(1+b_1 u+o(u)\right)\left(1+\frac{b_2}{1-b_1}u+o(u)\right)
= \frac{1-\rho}{1-b_1}\left(1+\left(b_1+\frac{b_2}{1-b_1}\right)u+o(u)\right).$$
From $r'(0) = 0$ for continuous $r(x) = o(x)$ as $x \to 0$, we then have
$$G'(0) = \left.\frac{d}{du}G(u)\right|_{u=0} = \frac{1-\rho}{1-b_1}\left(b_1+\frac{b_2}{1-b_1}\right).$$
Thus, $N'(1) = G'(0) = \dfrac{1-\rho}{1-b_1}\left(b_1+\dfrac{b_2}{1-b_1}\right)$.
We finally have
$$N = N'(1) = \frac{1-\rho}{1-b_1}\left(b_1+\frac{b_2}{1-b_1}\right) = \frac{1-\rho}{1-\rho}\left(\rho+\frac{\lambda^2\overline{h^2}/2}{1-\rho}\right) = \rho+\frac{\lambda^2\overline{h^2}}{2(1-\rho)}.$$
• $N_q = N - \rho = \dfrac{\lambda^2\overline{h^2}}{2(1-\rho)}$
• $T = \dfrac{N}{\lambda} = \dfrac{\rho + \dfrac{\lambda^2\overline{h^2}}{2(1-\rho)}}{\lambda} = h + \dfrac{\lambda\overline{h^2}}{2(1-\rho)}$
• $W_q = T - h = \dfrac{\lambda\overline{h^2}}{2(1-\rho)}$
• $P(n=m) = \begin{cases} 1-\rho & m = 0 \\[4pt] \dfrac{1}{m!}\left.\dfrac{d^m N(z)}{dz^m}\right|_{z=0} & m > 0 \end{cases}$
Proof. $N(z) = \displaystyle\sum_{m=0}^{\infty} P(n=m)\,z^m = P(n=0) + P(n=1)\,z + P(n=2)\,z^2 + \dots$
$$P(n=0) = N(0) = \frac{(1-\rho)(0-1)R(0)}{0 - R(0)} = 1-\rho$$
$$N'(z) = \sum_{m=1}^{\infty} m P(n=m)\,z^{m-1} = P(n=1) + P(n=2)\,z + \dots, \qquad N'(0) = P(n=1).$$
More generally, $\left.\dfrac{d^m N(z)}{dz^m}\right|_{z=0} = m!\,P(n=m)$. Hence,
$$P(n=m) = \begin{cases} 1-\rho & m = 0 \\[4pt] \dfrac{1}{m!}\left.\dfrac{d^m N(z)}{dz^m}\right|_{z=0} & m > 0. \end{cases}$$
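A sketch (assuming numpy is available; all names are illustrative) of how the occupancy probabilities can be read off $N(z)$ numerically instead of differentiating $m$ times by hand: sample $N(z)$ at roots of unity and take an inverse FFT to recover the power-series coefficients. It is illustrated with the M/M/1 transform $N(z) = (1-\rho)/(1-\rho z)$, whose coefficients should come out as $(1-\rho)\rho^m$.

```python
import numpy as np

def pgf_coefficients(N_of_z, n_terms=8, n_samples=1024):
    """Approximate the first power-series coefficients of a generating function."""
    z = np.exp(2j * np.pi * np.arange(n_samples) / n_samples)  # points on the unit circle
    coeffs = np.fft.ifft(N_of_z(z)).real                       # inverse DFT of the samples
    return coeffs[:n_terms]

rho = 0.5
N_of_z = lambda z: (1 - rho) / (1 - rho * z)          # M/M/1 occupancy transform
print(np.round(pgf_coefficients(N_of_z), 6))          # numerical coefficients
print(np.round((1 - rho) * rho ** np.arange(8), 6))   # exact (1-rho)*rho**m
```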
• Distribution of waiting time.
$$w^*(\lambda(1-z)) = \frac{(1-\rho)(z-1)}{z - h^*(\lambda(1-z))}, \qquad
w^*(s) = \frac{1-\rho}{1 - \dfrac{\lambda}{s}\left(1 - h^*(s)\right)}.$$
Observe that, in the steady state, the random variable $n$ that represents the system population at the point of departure of a customer may also be thought of as the number of arrivals during the total system time (sojourn time) of that customer.
Said sojourn time is the sum of the waiting random variable, $w$, and the service random variable, $s$.
The same sort of reasoning that gave us $R(z) = h^*(\lambda(1-z))$ can be applied to give us the moment generating function of the number of arrivals during $w+s$ as
$$N(z) = f^*(\lambda(1-z)),$$
where $f^*$ is the Laplace transform of the distribution of $w+s$:
during $s$ → $r$, $R$;  during $w+s$ → $n$, $N$.
Since $w$ and $s$ are independent, the pdf of $w+s$ is the convolution of the pdf of $w$ and the pdf of $s$. This implies that the Laplace transform is the product
$$f^*(s) = w^*(s)\,h^*(s), \qquad N(z) = f^*(\lambda(1-z)) = w^*(\lambda(1-z))\,h^*(\lambda(1-z)),$$
where $w^*$ is the L-S transform of the distribution of $w$. Therefore,
$$w^*(\lambda(1-z)) = \frac{N(z)}{h^*(\lambda(1-z))}
= \frac{\dfrac{(1-\rho)(z-1)R(z)}{z-R(z)}}{h^*(\lambda(1-z))}
= \frac{\dfrac{(1-\rho)(z-1)\,h^*(\lambda(1-z))}{z-h^*(\lambda(1-z))}}{h^*(\lambda(1-z))}
= \frac{(1-\rho)(z-1)}{z-h^*(\lambda(1-z))}.$$
Let $s = \lambda(1-z)$, i.e., $z = 1 - s/\lambda$:
$$w^*(s) = \frac{(1-\rho)\left(-\dfrac{s}{\lambda}\right)}{\left(1-\dfrac{s}{\lambda}\right)-h^*(s)}
= \frac{1-\rho}{-\dfrac{\lambda}{s}+1+\dfrac{\lambda}{s}h^*(s)}
= \frac{1-\rho}{1-\dfrac{\lambda}{s}\left(1-h^*(s)\right)}.$$
• We did explicitly use the fact that the number of arrivals during a service of length $s$ is Poisson with parameter $\lambda s$.
Our justification for equating the statistics just after a departure instant in equilibrium to those at a randomly chosen instant in equilibrium also depended on the Poisson nature of the arrivals (we need $d_n = a_n = p_n$).
• Average length of an idle period $= \dfrac{1}{\lambda}$.
Proof. An idle period occurs when the system is waiting for a customer to arrive after the queue becomes empty. At the moment the server becomes empty, by the memoryless property of the exponential interarrival times, the wait for the next customer is an $\mathrm{Exp}(\lambda)$ random variable with mean $\dfrac{1}{\lambda}$, independent of how long it has already been since the last customer arrived.
• Average length of a busy period $= \dfrac{1}{\mu-\lambda}$.
Proof. Let $B$ = average length of a busy period. We have shown that the average length of an idle period is $\dfrac{1}{\lambda}$. Note that busy periods and idle periods form an alternating sequence. Hence,
$$\rho = \lim_{n\to\infty}\frac{\sum_{i=1}^{n}\tau_{\text{busy},i}}{\sum_{i=1}^{n}\left(\tau_{\text{idle},i}+\tau_{\text{busy},i}\right)}
= \frac{\lim_{n\to\infty}\frac{1}{n}\sum_{i=1}^{n}\tau_{\text{busy},i}}{\lim_{n\to\infty}\frac{1}{n}\sum_{i=1}^{n}\tau_{\text{idle},i}+\lim_{n\to\infty}\frac{1}{n}\sum_{i=1}^{n}\tau_{\text{busy},i}}
= \frac{B}{\frac{1}{\lambda}+B}.$$
Solving for $B$, we get $B = \dfrac{\rho}{\lambda(1-\rho)} = \dfrac{h}{1-\rho} = \dfrac{1}{\mu-\lambda}$.
• Average number of customers served in a busy period $= \dfrac{1}{1-\rho}$.
Idea: it equals $\dfrac{B}{h}$, the mean busy-period length divided by the mean service time.
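A rough simulation sketch of these busy-period results (the parameters lam = 1.2 and a deterministic service time h = 0.5 are illustrative; the formulas hold for any M/G/1 service distribution): the mean busy-period length should approach $h/(1-\rho)$ and the mean number served per busy period should approach $1/(1-\rho)$.

```python
import random

def busy_periods(lam=1.2, h=0.5, n_busy=50_000, seed=2):
    """Simulate M/D/1 busy periods; return (mean length, mean customers served)."""
    random.seed(seed)
    total_len, total_served = 0.0, 0
    next_arrival = random.expovariate(lam)
    for _ in range(n_busy):
        start = next_arrival            # an arrival to an empty system opens a busy period
        workload_end = start
        # serve the initiating customer and everyone arriving before the work drains
        while next_arrival <= workload_end:
            workload_end += h
            total_served += 1
            next_arrival += random.expovariate(lam)
        total_len += workload_end - start
    return total_len / n_busy, total_served / n_busy

lam, h = 1.2, 0.5
rho = lam * h
B, C = busy_periods(lam, h)
print(f"mean busy period      : {B:.3f}   (theory h/(1-rho) = {h/(1-rho):.3f})")
print(f"customers/busy period : {C:.3f}   (theory 1/(1-rho) = {1/(1-rho):.3f})")
```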
M/G/1 analysis based on the concept of the mean residual service time
• $R_i$ = residual service time seen by the $i$th customer.
By this we mean that if customer $j$ is already being served when $i$ arrives, $R_i$ is the remaining time until customer $j$'s service is complete.
If no customer is in service (i.e., the system is empty when $i$ arrives), then $R_i = 0$.
• $R$ = mean residual time $= \lim_{i\to\infty} E R_i$.
• The mean residual service time given that a customer arrives while the server is busy is, by renewal theory, $\dfrac{1}{2}\dfrac{\overline{X^2}}{\overline{X}}$.
• Note: For M/M/1, the service time is exponentially distributed and thus memoryless. Therefore, given that the service has not yet ended, what is left is also exponentially distributed with the same mean, so this conditional mean residual service time is $\dfrac{1}{\mu}$.
Using the renewal-theory expression gives the same result:
$$\frac{1}{2}\frac{\overline{X^2}}{\overline{X}} = \frac{1}{2}\cdot\frac{\frac{1}{\mu^2}+\left(\frac{1}{\mu}\right)^2}{\frac{1}{\mu}} = \frac{1}{\mu}.$$
• $R = \dfrac{1}{2}\lambda\overline{X^2}$
Proof. We know that the probability of the server being busy, for a single server, is $P\{\text{busy}\} = \lambda EX = \rho$. Hence,
$$R = \frac{1}{2}\frac{\overline{X^2}}{\overline{X}}\,P\{\text{busy}\} + 0\cdot P\{\text{idle}\} = \frac{1}{2}\frac{\overline{X^2}}{\overline{X}}\,\rho = \frac{1}{2}\lambda\overline{X^2}.$$
Proof. (Graphical argument)
[Figure: the residual service time $r(\tau)$ versus time; when a service of duration $X_i$ begins, $r(\tau)$ jumps to $X_i$ and then decays linearly to 0 over the next $X_i$ time units, tracing a sequence of triangles.]
Let $M(t)$ be the number of services completed by time $t$ (the partially swept triangle at time $t$ is negligible in the limit). Then
$$\frac{1}{t}\int_0^t r(\tau)\,d\tau = \frac{1}{t}\sum_{i=1}^{M(t)}\frac{1}{2}X_i^2
= \frac{M(t)}{t}\cdot\frac{1}{M(t)}\sum_{i=1}^{M(t)}\frac{1}{2}X_i^2
\;\longrightarrow\; \frac{1}{2}\lambda\overline{X^2}.$$
• $W = \dfrac{\lambda\overline{X^2}}{2(1-\rho)}$
Proof. Note that the time the $i$th customer waits in queue = the residual service time seen by the $i$th customer + the time needed to serve all customers already in the queue:
$$W = R + \frac{1}{\mu}N_q = R + \frac{1}{\mu}\lambda W = R + \rho W.$$
Hence,
$$W = \frac{R}{1-\rho} = \frac{\frac{1}{2}\lambda\overline{X^2}}{1-\rho} = \frac{\lambda\overline{X^2}}{2(1-\rho)}.$$
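A short simulation sketch of the residual-time identity (the Uniform[0.5, 1.5] service distribution and lam = 0.5 are arbitrary illustrative choices): the long-run time average of $r(\tau)$ should approach $\tfrac{1}{2}\lambda\overline{X^2}$.

```python
import random

def mean_residual(lam=0.5, n=300_000, seed=3):
    """Time-average residual service time in an M/G/1 queue with Uniform[0.5,1.5] service."""
    random.seed(seed)
    t_arr, t_dep_prev, area = 0.0, 0.0, 0.0
    for _ in range(n):
        t_arr += random.expovariate(lam)
        start = max(t_arr, t_dep_prev)     # FIFO single server
        x = random.uniform(0.5, 1.5)       # E[X] = 1, E[X^2] = 13/12
        t_dep_prev = start + x
        area += 0.5 * x * x                # triangle swept by r(tau) during this service
    return area / t_dep_prev               # (1/t) * integral of r(tau)

lam, EX2 = 0.5, 13.0 / 12.0
print("simulated R     :", round(mean_residual(lam), 4))
print("lam * E[X^2] / 2:", round(lam * EX2 / 2, 4))
```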
• The average number in queue $N_q$ and the mean residual time $R$ as seen by an arriving customer are also equal to the average number in queue and the mean residual time seen by an outside observer at a random time.
This is due to the Poisson character of the arrival process, which implies that the occupancy distribution upon arrival is typical.
• While busy, the M/G/1 server behaves as a renewal process of service completions; an idle period, an $\mathrm{Exp}(\lambda)$ random variable occurring with probability $1-\rho$, is occasionally inserted into this service-time renewal process.
• An M/G/1 queue can have $\rho < 1$ but infinite $W$ if the second moment $\overline{X^2}$ is infinite.
• The formula is valid for any order of servicing customers, as long as the order is determined independently of the required service time.
To see this, suppose the ith and jth customers are both in the queue and that they
exchange places.
The expected queuing time of customer i will then be exchanged with that for
customer j, but the average, over all customers, is unchanged.
Since any service order can be considered as a sequence of reversals in queue
position, the P-K formula remains valid.
M/G/1 with vacations
• At the end of each busy period, the server goes on “vacation” for some random
interval of time.
A new arrival to an idle system, rather than going into service immediately, waits for
the end of the vacation period.
If the system is still idle at the completion of a vacation, a new vacation starts
immediately.
• Let Vi’s be the durations of the successive vacations taken by the server.
Assume Vi’s are i.i.d. random variables and independent of the customer interarrival
times and service times.
• $\dfrac{1}{2}\dfrac{\overline{X^2}}{\overline{X}}$ = mean residual time given that a customer arrives while the server is serving someone;
$\dfrac{1}{2}\dfrac{\overline{V^2}}{\overline{V}}$ = mean residual time given that a customer arrives while the server is on vacation (idle).
$$R = \frac{1}{2}\frac{\overline{X^2}}{\overline{X}}\,P\{\text{server busy}\} + \frac{1}{2}\frac{\overline{V^2}}{\overline{V}}\,P\{\text{server idle}\}
= \frac{1}{2}\frac{\overline{X^2}}{\overline{X}}\,\rho + \frac{1}{2}\frac{\overline{V^2}}{\overline{V}}(1-\rho)
= \frac{1}{2}\lambda\overline{X^2} + \frac{1}{2}\frac{\overline{V^2}}{\overline{V}}(1-\rho)$$
• $W = \dfrac{R}{1-\rho} = \dfrac{\lambda\overline{X^2}}{2(1-\rho)} + \dfrac{\overline{V^2}}{2\overline{V}}$
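A tiny helper (names and numbers are illustrative) that evaluates the vacation formula above; in the example, exponential service with mean 1 at $\lambda = 0.5$ and deterministic vacations of length 2 add exactly $\overline{V^2}/(2\overline{V}) = 1$ to the plain M/G/1 waiting time.

```python
def mg1_vacation_wait(lam, EX, EX2, EV, EV2):
    """W = lam*E[X^2]/(2(1-rho)) + E[V^2]/(2 E[V]) for M/G/1 with server vacations."""
    rho = lam * EX
    assert rho < 1, "unstable"
    return lam * EX2 / (2 * (1 - rho)) + EV2 / (2 * EV)

# Exponential service (EX=1, EX2=2), deterministic vacations of length 2 (EV=2, EV2=4).
print(mg1_vacation_wait(lam=0.5, EX=1.0, EX2=2.0, EV=2.0, EV2=4.0))   # 1.0 + 1.0 = 2.0
```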
M/M/1
• $\rho := \dfrac{\lambda}{\mu} = 1 - p_0 = N_s = p_1^{\text{server}}$
$p_n = \rho^n(1-\rho)$; $n = 0, 1, \dots$
$$N = \frac{\rho}{1-\rho} = \frac{\lambda}{\mu-\lambda}, \qquad T = \frac{1}{\mu-\lambda}, \qquad W = \frac{\rho}{\mu-\lambda}, \qquad N_q = \frac{\rho^2}{1-\rho}$$
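The same formulas as code, for quick evaluation (the rates in the example call are illustrative):

```python
def mm1(lam: float, mu: float) -> dict:
    """Steady-state M/M/1 quantities."""
    rho = lam / mu
    assert rho < 1, "unstable"
    return {"rho": rho,
            "N":  rho / (1 - rho),
            "T":  1 / (mu - lam),
            "W":  rho / (mu - lam),
            "Nq": rho**2 / (1 - rho)}

print(mm1(lam=0.8, mu=1.0))   # N = 4, T = 5, W = 4, Nq = 3.2
```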
• Transient if $\rho > 1$; null recurrent if $\rho = 1$; ergodic if $\rho < 1$.
• $\rho = 1 - p_0$ = utilization factor = the long-term proportion of time the server is busy.
Proof.
(1) If the system has $\ge 1$ customers, the server is busy (serving exactly 1 customer); this occurs with probability $1 - p_0$. Note also that if the server is busy, then the system has $\ge 1$ customers (at least the one in the server). Hence, $p_0^{\text{system}} = p_0^{\text{server}} = p_0$. If the system has 0 customers, the server is idle (serving 0 customers); this occurs with probability $p_0$. So, the long-term proportion of time the server is busy is $1 - p_0$, and the average number of customers in the server is $N_s = p_0\times 0 + p_1^{\text{server}}\times 1 = p_1^{\text{server}} = 1 - p_0$.
(2) Now apply Little's theorem to the server alone: $N_s = \lambda EX = \lambda\dfrac{1}{\mu} := \rho$.
From (1) and (2), $\rho := \dfrac{\lambda}{\mu} = 1 - p_0 = N_s = p_1^{\text{server}}$.
• State diagram: a birth-death chain on states 0, 1, 2, …, with transition rate $\lambda$ from state $i$ to $i+1$ and rate $\mu$ from state $i$ to $i-1$.
• $p_n = \rho^n(1-\rho)$; $n = 0, 1, \dots$
Proof. This is a birth-and-death process; the detailed balance equations $p_n\lambda = p_{n+1}\mu$ give $p_{n+1} = \rho\,p_n$, and normalizing $\sum_n p_n = 1$ gives $p_0 = 1-\rho$.
• $N = \dfrac{\rho}{1-\rho} = \dfrac{\lambda}{\mu-\lambda}$
Proof. $N = \displaystyle\sum_{n=0}^{\infty} n p_n = \sum_{n=0}^{\infty} n\rho^n(1-\rho) = (1-\rho)\frac{\rho}{(1-\rho)^2} = \frac{\rho}{1-\rho}$.
• $T = \dfrac{N}{\lambda} = \dfrac{\rho/(1-\rho)}{\lambda} = \dfrac{1}{\mu-\lambda}$
• $W = T - \overline{X} = \dfrac{1}{\mu-\lambda} - \dfrac{1}{\mu} = \dfrac{\lambda}{\mu(\mu-\lambda)} = \dfrac{\rho}{\mu-\lambda}$
• $N_q = \lambda W = \dfrac{\lambda\rho}{\mu-\lambda} = \dfrac{\rho\cdot\frac{\lambda}{\mu}}{1-\frac{\lambda}{\mu}} = \dfrac{\rho^2}{1-\rho}$
or
$$N_q = 0\,p_0 + 0\,p_1 + 1\,p_2 + 2\,p_3 + \dots + (i-1)p_i + \dots
= \sum_{i=2}^{\infty}(i-1)p_i = \sum_{i=2}^{\infty}(i-1)\rho^i(1-\rho)
= (1-\rho)\sum_{m=1}^{\infty} m\rho^{m+1}
= (1-\rho)\rho\sum_{m=1}^{\infty} m\rho^{m}
= (1-\rho)\rho\,\frac{\rho}{(1-\rho)^2} = \frac{\rho^2}{1-\rho}.$$
• Alternatively, via the M/G/1 transform with $h^*(s) = \dfrac{\mu}{s+\mu}$, so that $R(z) = h^*(\lambda(1-z)) = \dfrac{\mu}{\lambda(1-z)+\mu}$:
$$N(z) = \frac{(1-\rho)(z-1)R(z)}{z-R(z)}
= \frac{(1-\rho)(z-1)\mu}{(\lambda(1-z)+\mu)z-\mu}
= \frac{(1-\rho)(z-1)\mu}{\lambda(1-z)z+\mu(z-1)}
= \frac{(1-\rho)\mu}{\mu-\lambda z}
= \frac{1-\rho}{1-\rho z}.$$
By expanding $N(z)$ in a power series, we have $N(z) = \displaystyle\sum_{i=0}^{\infty}(1-\rho)(\rho z)^i$.
Since $N(z) = \displaystyle\sum_{i=0}^{\infty}P(n=i)\,z^i$, we get $P(n=i) = (1-\rho)\rho^i$ for $i = 0, 1, 2, \dots$
Similarly, the waiting-time transform becomes
$$w^*(s) = \frac{1-\rho}{1-\frac{\lambda}{s}\left(1-h^*(s)\right)}
= \frac{1-\rho}{1-\frac{\lambda}{s}\left(1-\frac{\mu}{s+\mu}\right)}
= \frac{1-\rho}{1-\frac{\lambda}{s}\cdot\frac{s}{s+\mu}}
= \frac{1-\rho}{1-\frac{\lambda}{s+\mu}}$$
$$= (1-\rho)\frac{s+\mu}{s+\mu-\lambda}
= (1-\rho)\left(\frac{(s+\mu-\lambda)+\lambda}{s+\mu-\lambda}\right)
= (1-\rho)\left(1+\frac{\lambda}{s+(\mu-\lambda)}\right).$$
Inverting,
$$f_W(t) = (1-\rho)\left(\delta(t)+\lambda e^{-(\mu-\lambda)t}\right) = (1-\rho)\delta(t)+(1-\rho)\lambda e^{-\mu(1-\rho)t}; \quad t \ge 0.$$
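A sketch checking this waiting-time distribution by simulation (parameters illustrative): waiting times are generated with the Lindley recursion $W_{k+1} = \max(0, W_k + X_k - \tau_{k+1})$, and the empirical tail is compared with $P(W > t) = \rho\,e^{-\mu(1-\rho)t}$, which follows by integrating $f_W$ above.

```python
import math
import random

def mm1_wait_tail(lam=0.8, mu=1.0, t=2.0, n=400_000, seed=4):
    """Fraction of customers whose queueing delay exceeds t in an M/M/1 simulation."""
    random.seed(seed)
    w, exceed = 0.0, 0
    for _ in range(n):
        if w > t:
            exceed += 1
        # Lindley recursion: next wait = max(0, wait + service - next interarrival)
        w = max(0.0, w + random.expovariate(mu) - random.expovariate(lam))
    return exceed / n

lam, mu, t = 0.8, 1.0, 2.0
rho = lam / mu
print("simulated P(W > t)      :", round(mm1_wait_tail(lam, mu, t), 4))
print("rho * exp(-mu(1-rho) t) :", round(rho * math.exp(-mu * (1 - rho) * t), 4))
```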
M/D/1
• “D” ⇒ deterministic: service time $= h$ for every customer.
• $\overline{h} = h$, $\overline{h^2} = h^2$
$$W = \frac{\lambda\overline{h^2}}{2(1-\rho)} = \frac{\lambda h^2}{2(1-\lambda h)} = \frac{\lambda\frac{1}{\mu^2}}{2(1-\rho)} = \frac{\rho}{2\mu(1-\rho)}$$
$$N_q = \lambda W = \frac{\lambda\rho}{2\mu(1-\rho)} = \frac{\rho^2}{2(1-\rho)}$$
$$T = W + h = \frac{\rho}{2\mu(1-\rho)} + \frac{1}{\mu} = \frac{1}{\mu}\left(\frac{\rho}{2(1-\rho)}+1\right) = \frac{1}{\mu}\left(\frac{\rho+2-2\rho}{2(1-\rho)}\right) = \frac{2-\rho}{2\mu(1-\rho)}$$
$$N = N_q + \rho = \frac{\rho^2}{2(1-\rho)} + \rho = \frac{\rho^2+2\rho-2\rho^2}{2(1-\rho)} = \frac{2\rho-\rho^2}{2(1-\rho)}$$
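A quick numeric comparison of the formulas above with the M/M/1 results (rates illustrative): at the same load, deterministic service gives exactly half the mean queueing delay of exponential service, since $\overline{h^2}$ drops from $2/\mu^2$ to $1/\mu^2$.

```python
def wait_md1(lam, mu):
    """M/D/1 mean wait: rho / (2 mu (1 - rho))."""
    rho = lam / mu
    return rho / (2 * mu * (1 - rho))

def wait_mm1(lam, mu):
    """M/M/1 mean wait: rho / (mu - lam)."""
    return (lam / mu) / (mu - lam)

lam, mu = 0.8, 1.0
print("M/D/1 W:", wait_md1(lam, mu))   # 2.0
print("M/M/1 W:", wait_mm1(lam, mu))   # 4.0
```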
M/M/1/K
• Equilibrium (the chain is always stable, since the state space is finite). Balance equations: $p_n\lambda = p_{n+1}\mu$, so
$$p_{n+1} = \frac{\lambda}{\mu}p_n = \left(\frac{\lambda}{\mu}\right)^{n+1}p_0 = \rho^{n+1}p_0; \quad n = 0, 1, \dots, K-1.$$
• $\displaystyle\sum_{n=0}^{K}p_n = 1 \;\Rightarrow\; \sum_{n=0}^{K}\rho^n p_0 = p_0\,\frac{1-\rho^{K+1}}{1-\rho} = 1 \;\Rightarrow\; p_0 = \frac{1-\rho}{1-\rho^{K+1}}$
• $p_n = \rho^n\,\dfrac{1-\rho}{1-\rho^{K+1}}$; $n = 0, 1, \dots, K$
• $P(\text{blocking or loss}) = p_K$ = proportion of time that the system is full $= \rho^K\,\dfrac{1-\rho}{1-\rho^{K+1}}$
• $T = W + \overline{X}$; as $W \to 0$, $T \to \overline{X}$.
• For $\rho \to \infty$:
• $N \to K$
• $p_K \to 1$
• $T \to \dfrac{K}{\mu} = K\,EX$
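A small helper for the finite-buffer results above (lam, mu, K in the example are illustrative); note that the carried load $\lambda(1 - p_K)$, not $\lambda$, is what the server actually sees.

```python
def mm1k(lam: float, mu: float, K: int) -> dict:
    """Steady-state M/M/1/K occupancy, blocking probability, and throughput."""
    rho = lam / mu
    if rho == 1.0:
        p = [1.0 / (K + 1)] * (K + 1)            # uniform distribution when rho = 1
    else:
        p0 = (1 - rho) / (1 - rho ** (K + 1))
        p = [p0 * rho ** n for n in range(K + 1)]
    N = sum(n * pn for n, pn in enumerate(p))
    return {"p_block": p[K], "N": N, "throughput": lam * (1 - p[K])}

print(mm1k(lam=2.0, mu=1.0, K=5))   # overload: p_block ~ 0.51, N ~ 4.1, throughput ~ mu
```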
M/M/m
• $m$ servers.
[State diagram: birth-death chain with arrival rate $\lambda$ in every state and service rate $i\mu$ in state $i$ for $i \le m$, and $m\mu$ for $i \ge m$.]
• $\rho = \dfrac{\lambda}{m\mu} < 1$
• $p_0 = \dfrac{1}{\displaystyle\sum_{i=0}^{m-1}\dfrac{(m\rho)^i}{i!} + \dfrac{(m\rho)^m}{m!(1-\rho)}}$
$$p_i = \begin{cases} \dfrac{(m\rho)^i}{i!}\,p_0 & 0 \le i \le m \\[8pt] \dfrac{m^m\rho^i}{m!}\,p_0 & i \ge m \end{cases}$$
• Erlang C formula: $P_Q = \dfrac{p_0(m\rho)^m}{m!(1-\rho)} = \dfrac{p_m}{1-\rho} = P\{W > 0\}$
• $N_Q = \dfrac{\rho P_Q}{1-\rho}$
• $W = \dfrac{\rho P_Q}{\lambda(1-\rho)}$
• $T = \dfrac{1}{\mu} + W$
• $N = N_Q + m\rho = m\rho + \dfrac{\rho P_Q}{1-\rho}$
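A sketch of the Erlang C quantities above (the call with lam = 8, mu = 1, m = 10 is illustrative); $a = \lambda/\mu = m\rho$ is the offered load in Erlangs.

```python
from math import factorial

def erlang_c(lam: float, mu: float, m: int) -> dict:
    """M/M/m delay probability (Erlang C) and the associated mean values."""
    a = lam / mu                 # offered load, a = m * rho
    rho = a / m
    assert rho < 1, "need lam < m * mu"
    p0 = 1.0 / (sum(a**i / factorial(i) for i in range(m))
                + a**m / (factorial(m) * (1 - rho)))
    PQ = p0 * a**m / (factorial(m) * (1 - rho))   # P{arrival has to wait}
    W = rho * PQ / (lam * (1 - rho))
    return {"PQ": PQ,
            "W": W,
            "T": 1 / mu + W,
            "NQ": rho * PQ / (1 - rho),
            "N": m * rho + rho * PQ / (1 - rho)}

print(erlang_c(lam=8.0, mu=1.0, m=10))
```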
• For $1 \le i \le m$:
$$i\mu\,p_i = \lambda\,p_{i-1} \;\Rightarrow\; p_i = \frac{1}{i}\frac{\lambda}{\mu}\,p_{i-1} = \frac{1}{i}(m\rho)\,p_{i-1} \;\Rightarrow\; p_i = \frac{(m\rho)^i}{i!}\,p_0, \quad 1 \le i \le m.$$
For $i > m$:
$$m\mu\,p_i = \lambda\,p_{i-1} \;\Rightarrow\; p_i = \frac{1}{m}\frac{\lambda}{\mu}\,p_{i-1} = \rho\,p_{i-1} \;\Rightarrow\; p_i = \rho^{i-m}p_m = \rho^{i-m}\frac{(m\rho)^m}{m!}\,p_0 = \frac{m^m\rho^i}{m!}\,p_0, \quad i \ge m+1.$$
Hence,
$$p_i = \begin{cases} \dfrac{(m\rho)^i}{i!}\,p_0 & 1 \le i \le m \\[8pt] \dfrac{m^m\rho^i}{m!}\,p_0 & i \ge m+1 \end{cases}
= \begin{cases} \dfrac{(m\rho)^i}{i!}\,p_0 & 0 \le i \le m \\[8pt] \dfrac{m^m\rho^i}{m!}\,p_0 & i \ge m+1 \end{cases}
= \begin{cases} \dfrac{(m\rho)^i}{i!}\,p_0 & 0 \le i \le m-1 \\[8pt] \dfrac{m^m\rho^i}{m!}\,p_0 & i \ge m \end{cases}$$
Note: for $i = m$, either expression can be used (they coincide).
• $P_Q = P\{\text{queuing}\}$ = probability that an arrival will find all servers busy and will be forced to wait in queue.
• Erlang C formula: $P_Q = \dfrac{p_0(m\rho)^m}{m!(1-\rho)} = \dfrac{p_m}{1-\rho}$
• Since an arriving (Poisson) customer finds the system in a “typical” state,
$$P_Q = P\{N \ge m\} = \sum_{i=m}^{\infty}p_i = \sum_{i=m}^{\infty}\frac{m^m\rho^i}{m!}\,p_0 = p_0\sum_{k=0}^{\infty}\frac{m^m\rho^{k+m}}{m!} \quad (k = i-m)$$
$$= p_0\,\frac{(m\rho)^m}{m!}\sum_{k=0}^{\infty}\rho^k = \frac{p_0(m\rho)^m}{m!(1-\rho)}.$$
• This is the probability of a call request finding all $m$ circuits of a transmission line busy, assuming that such a call request “remains in queue,” that is, continuously attempts to find a free circuit.
• $p_0 = \dfrac{1}{\displaystyle\sum_{i=0}^{m-1}\dfrac{(m\rho)^i}{i!} + \dfrac{(m\rho)^m}{m!(1-\rho)}}$
Proof.
$$\sum_{i=0}^{\infty}p_i = \sum_{i=0}^{m-1}p_i + \sum_{i=m}^{\infty}p_i = \sum_{i=0}^{m-1}\frac{(m\rho)^i}{i!}\,p_0 + P_Q = \sum_{i=0}^{m-1}\frac{(m\rho)^i}{i!}\,p_0 + \frac{(m\rho)^m}{m!(1-\rho)}\,p_0 = p_0\left(\sum_{i=0}^{m-1}\frac{(m\rho)^i}{i!} + \frac{(m\rho)^m}{m!(1-\rho)}\right).$$
Setting $\displaystyle\sum_{i=0}^{\infty}p_i = 1$ gives $p_0 = \dfrac{1}{\displaystyle\sum_{i=0}^{m-1}\dfrac{(m\rho)^i}{i!} + \dfrac{(m\rho)^m}{m!(1-\rho)}}$.
• $N_Q = \displaystyle\sum_{i=0}^{\infty} i\,p_{i+m} = \sum_{i=0}^{\infty} i\,\frac{m^m\rho^{i+m}}{m!}\,p_0 = \frac{(m\rho)^m}{m!}\,p_0\sum_{i=0}^{\infty} i\rho^{i} = \frac{(m\rho)^m}{m!}\,p_0\,\frac{\rho}{(1-\rho)^2} = \frac{\rho P_Q}{1-\rho}$
• $N = N_Q + N_S$, where $N_S = m\rho = \dfrac{\lambda}{\mu}$; hence $N = N_Q + m\rho$.
• Another way to find $N_S$:
$$N_S = \sum_{i=0}^{m-1} i\,p_i + \sum_{i=m}^{\infty} m\,p_i = \sum_{i=0}^{m-1} i\,\frac{(m\rho)^i}{i!}\,p_0 + m P_Q
= \sum_{i=1}^{m-1}\frac{(m\rho)^i}{(i-1)!}\,p_0 + m\,p_0\,\frac{(m\rho)^m}{m!(1-\rho)}$$
$$= p_0\left(\left(\sum_{i=1}^{m}\frac{(m\rho)^i}{(i-1)!} - \frac{(m\rho)^m}{(m-1)!}\right) + m\,\frac{(m\rho)^m}{m!(1-\rho)}\right)
= p_0\left(\sum_{k=0}^{m-1}\frac{(m\rho)^{k+1}}{k!} + \left(-\frac{m(1-\rho)(m\rho)^m}{m!(1-\rho)} + \frac{m(m\rho)^m}{m!(1-\rho)}\right)\right)$$
$$= p_0\left(m\rho\sum_{k=0}^{m-1}\frac{(m\rho)^k}{k!} + \frac{m\rho(m\rho)^m}{m!(1-\rho)}\right)
= m\rho\;\frac{\displaystyle\sum_{k=0}^{m-1}\frac{(m\rho)^k}{k!} + \frac{(m\rho)^m}{m!(1-\rho)}}{\displaystyle\sum_{i=0}^{m-1}\frac{(m\rho)^i}{i!} + \frac{(m\rho)^m}{m!(1-\rho)}} = m\rho.$$
M/M/m/m (blocked customers are lost)
• $p_n = \dfrac{\rho^n}{n!}\,p_0 = \dfrac{\rho^n/n!}{\displaystyle\sum_{i=0}^{m}\dfrac{\rho^i}{i!}}$, where $\rho = \dfrac{\lambda}{\mu}$
• Erlang B formula: $p_{\text{lost}} = p_m = \dfrac{\rho^m/m!}{\displaystyle\sum_{i=0}^{m}\dfrac{\rho^i}{i!}}$
[State diagram: states 0, 1, …, m; arrival rate $\lambda$ from state $i$ to $i+1$ ($i < m$), service rate $i\mu$ from state $i$ to $i-1$.]
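A sketch of the Erlang B computation (offered load a = 8 Erlangs and m = 10 circuits are illustrative). Instead of summing factorial terms directly, the usual numerically stable recursion $B(0) = 1$, $B(k) = aB(k-1)/(k + aB(k-1))$ is used; it yields the same $p_m$ as the formula above.

```python
def erlang_b(a: float, m: int) -> float:
    """Blocking probability of an M/M/m/m loss system with offered load a = lam/mu."""
    B = 1.0
    for k in range(1, m + 1):
        B = a * B / (k + a * B)   # standard Erlang B recursion
    return B

print(erlang_b(a=8.0, m=10))      # probability an arriving call finds all m circuits busy
```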
M/M/∞
[State diagram: arrival rate $\lambda$ from state $i$ to $i+1$; service rate $i\mu$ from state $i$ to $i-1$.]
• $\rho = \dfrac{\lambda}{\mu}$
• $p_0 = \dfrac{1}{\displaystyle\sum_{i=0}^{\infty}\dfrac{\rho^i}{i!}} = \dfrac{1}{e^{\rho}} = e^{-\rho}$
• $p_i = \dfrac{\rho^i}{i!}\,p_0 = \dfrac{\rho^i}{i!}\,e^{-\rho}$ ⇒ Poisson
Comparison
Engset: states 0, 1, …, c; arrival rate $(k-i)\lambda$ from state $i$ (so $k\lambda$ from state 0, down to $(k-c+1)\lambda$ from state $c-1$), service rate $i\mu$ from state $i$.
Erlang (M/M/c/c): states 0, 1, …, c; arrival rate $\lambda$ from every state, service rate $i\mu$ from state $i$.
M/M/1: states 0, 1, 2, …; arrival rate $\lambda$, service rate $\mu$.
M/M/m: states 0, 1, 2, …; arrival rate $\lambda$, service rate $i\mu$ for $i \le m$ and $m\mu$ for $i \ge m$.
M/M/m/m: states 0, 1, …, m; arrival rate $\lambda$, service rate $i\mu$.
M/M/∞: states 0, 1, 2, …; arrival rate $\lambda$, service rate $i\mu$.
Etc
• Burke's theorem:
For an M/M/1, M/M/c, or M/M/∞ queuing system in steady state with arrival rate $\lambda$:
• The departure process is Poisson with rate $\lambda$.
• At each time $t$, the number of customers in the system $n(t)$ is independent of the sequence of departure times prior to $t$.
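A rough numerical illustration of Burke's theorem (rates illustrative): in a stationary M/M/1 queue the interdeparture times should look exponential with rate $\lambda$, so their mean should be close to $1/\lambda$ and their coefficient of variation close to 1.

```python
import random
import statistics

def interdeparture_stats(lam=0.7, mu=1.0, n=200_000, seed=5):
    """Mean and coefficient of variation of interdeparture times in an M/M/1 simulation."""
    random.seed(seed)
    t_arr, t_dep_prev = 0.0, 0.0
    gaps = []
    for _ in range(n):
        t_arr += random.expovariate(lam)
        dep = max(t_arr, t_dep_prev) + random.expovariate(mu)
        gaps.append(dep - t_dep_prev)
        t_dep_prev = dep
    mean = statistics.fmean(gaps)
    return mean, statistics.stdev(gaps) / mean

mean_gap, cv = interdeparture_stats()
print(f"mean interdeparture time : {mean_gap:.3f}   (1/lambda = {1/0.7:.3f})")
print(f"coefficient of variation : {cv:.3f}   (exponential => 1)")
```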