Queuing Theory: Little's Theorem

Little's Theorem states that the average number of customers in a system (N) equals the average arrival rate (λ) multiplied by the average time a customer spends in the system (T). It holds for any system in steady state. Queuing theory uses terms like interarrival time, service time, number of servers, and maximum occupancy to describe systems. The utilization factor is a measure of how busy servers are on average. For finite systems, there is a distinction between the offered load and the actual carried load. Upon arrival, a customer sees the probabilistic occupancy distribution rather than the unconditional steady-state probabilities.


Queuing Theory

Little’s Theorem: N = λT

  arrival rate λ ⎯→ [ System ] ⎯→ departure rate λ

• Holds for any (ergodic) system with a steady state.
• Def.
  α(t) = the number of arrivals at the system in the interval from time 0 to time t
       = number of arrivals in [0, t]
  β(t) = the number of customer departures in the interval from time 0 to time t
       = number of departures in [0, t]
  N(t) = number of customers in the system at time t = α(t) − β(t)
  N = average (steady-state, long-run, expected) number of customers in the system
      (waiting for service or receiving service) in equilibrium
    = lim_{t→∞} (1/t) ∫₀ᵗ N(t′) dt′   [customers]
  λ = average (long-run) arrival rate of customers
    = lim_{t→∞} (number of arrivals in [0, t]) / t = lim_{t→∞} α(t)/t
  T = average time in system of customers in equilibrium
    = lim_{α(t)→∞} (1/α(t)) Σ_{j=1}^{α(t)} T_j
• System = whole system: N = λT
  System = queue only: N_q = λW
  System = server only: N_s = λE[X] = λ/µ
  Because, by definition, T = W + E[X], we have N = N_q + N_s.
• Let τ denote a generic interarrival time:
  τᵢ = the time between the arrivals of the (i−1)th and the ith customer.
  Assume that all τᵢ’s are i.i.d., and thus have the same mean E[τᵢ] = Eτ.
  λ = long-term arrival rate at the system = 1/Eτ   [customers/second]
    = lim_{n→∞} n / Σ_{i=1}^{n} τᵢ = 1 / lim_{n→∞} (1/n) Σ_{i=1}^{n} τᵢ = 1/Eτ
• T = average time each customer spends in the system
  Tᵢ = time the ith customer spent in the system
     = the time that elapses between the instant when α(t) goes from i−1 to i
       and the instant when β(t) goes from i−1 to i.
  [Figure: staircase sample paths of α(s) and β(s) vs. time s; N(s) is the vertical
  distance between them, and Tᵢ is the horizontal distance at level i, starting at
  the arrival of the ith customer.]
• Let t = a time instant where α(t) = β(t), which implies N(t) = 0.
  The area between α(t) and β(t) from 0 to t can be computed two ways:
  1) horizontally: area = Σ_{j=1}^{α(t)} T_j
     Note: if we define dᵢ = departure time and aᵢ = arrival time of the ith customer,
     then Tᵢ = dᵢ − aᵢ, and
     Σ_{j=1}^{α(t)} T_j = Σ_{j=1}^{α(t)} (d_j − a_j) = Σ_{j=1}^{α(t)} d_j − Σ_{j=1}^{α(t)} a_j;
     so the order of service doesn’t matter.
  2) vertically: area = ∫₀ᵗ N(s) ds = ∫₀ᵗ (α(s) − β(s)) ds
  Hence, we have ∫₀ᵗ N(s) ds = Σ_{j=1}^{α(t)} T_j.
  Thus,
  N = lim_{t→∞} (1/t) ∫₀ᵗ N(s) ds = lim_{t→∞} (1/t) Σ_{j=1}^{α(t)} T_j
    = lim_{t→∞} (α(t)/t) · (1/α(t)) Σ_{j=1}^{α(t)} T_j
    = (lim_{t→∞} α(t)/t)(lim_{t→∞} Σ_{j=1}^{α(t)} T_j / α(t)) = λT
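The area argument above can be checked numerically. The sketch below is a hedged illustration with made-up rates (λ = 0.5, µ = 1 for a single exponential FIFO server; all variable names are ours, not from the notes): it builds one sample path, estimates N, λ, and T from it, and confirms N ≈ λT.

```python
import random

# Illustrative check of Little's theorem on a simulated single-server
# FIFO queue with Poisson arrivals (rate lam) and exponential service
# (rate mu). Rates and names are our assumptions, not from the notes.
random.seed(1)
lam, mu, n_cust = 0.5, 1.0, 200_000

arrivals, departures = [], []
t_arr, prev_dep = 0.0, 0.0
for _ in range(n_cust):
    t_arr += random.expovariate(lam)           # interarrival ~ Exp(lam)
    start = max(t_arr, prev_dep)               # FIFO: wait for the server
    prev_dep = start + random.expovariate(mu)  # service ~ Exp(mu)
    arrivals.append(t_arr)
    departures.append(prev_dep)

horizon = departures[-1]
total_sojourn = sum(d - a for a, d in zip(arrivals, departures))
T_hat = total_sojourn / n_cust    # average time in system
lam_hat = n_cust / horizon        # long-run arrival rate
N_hat = total_sojourn / horizon   # time-average number in system (the "area" Σ T_j / t)
print(N_hat, lam_hat * T_hat)     # the two sides of N = λT
```

The equality N_hat = lam_hat · T_hat holds exactly by construction, which is precisely the horizontal/vertical area identity; T_hat should also be close to the M/M/1 value 1/(µ − λ) = 2.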
Queuing Theory
• Standard queuing theory nomenclature: Arrival process / Service time / Servers / Max occupancy
    Arrival process (interarrival time τ): M = exponential, D = deterministic, G = general
    Service time (service time X): M = exponential, D = deterministic, G = general
    Servers: 1 server, c servers, or ∞
    Max occupancy: K customers; unspecified if unlimited
    Arrival rate: λ = 1/Eτ        Service rate: µ = 1/E[X]

• 1st letter ⇒ nature of the arrival process


• M = Poisson process (Markov, memoryless) ⇒ exponentially distributed
interarrival times.
• G = general distribution of interarrival times
• D = deterministic interarrival times
• 2nd letter ⇒ nature of the probability distribution of the service times.
• M = exponential
• G = general
• D = deterministic
• 3rd letter ⇒ number of servers
• Successive interarrival times and service times are assumed to be statistically
independent of each other.
• Def:
  pₙ = steady-state probability of having n customers in the system, n = 0, 1, …
  N = average number of customers in the system = Σ_{n=0}^{∞} n pₙ
  T = average customer time in the system
  N_q = average number of customers waiting in queue;
        if there are m servers, then N_q = Σ_{n=m+1}^{∞} (n − m) pₙ
  W = average customer waiting time in queue
  N_s(t) = the number of customers being served at time t
  N_s = the average number of busy servers for a system in steady state
  X = service time, a random variable
  h = E[X] = 1/µ = average service time
• Utilization factor:
  • Single server: ρ = proportion of time the server is busy = p₁ = 1 − p₀ = λ/µ = N_s.
    Proof. For single-server systems, (1) "system has ≥ 1 customer" ≡ "server is busy";
    hence p₀,server = p₀,system := p₀, and p₁,server = p₁,system := p₁ = 1 − p₀. (2)
    N_s(t) can only be 0 or 1, so N_s equals the proportion of time that the server is
    busy (p₁,server): N_s = 0·p₀,server + 1·p₁,server = p₁,server = 1 − p₀. (3) From
    Little’s theorem, N_s = λE[X]. Hence 1 − p₀ = N_s = λE[X]. Because 1 − p₀ is the
    proportion of time that the server is busy, the utilization of a single-server
    system is defined by ρ = λE[X] = λ/µ.
  • Similarly, define the utilization of an m-server system by ρ = λE[X]/m = λ/(mµ).
• For finite-capacity systems, it is necessary to distinguish between the traffic
  load offered to a system and the actual load carried by the system.
  • The offered load or traffic intensity is a measure of the demand made on the
    system = λ·E[X].
  • The carried load is the actual demand met by the system = λ(1 − P_b)·E[X],
    where P_b is the blocking probability.

Occupancy Distribution upon Arrival

• Probabilistic characterization of a queuing system as seen by an arriving customer.
• Unconditional steady-state probabilities:
  pₙ = lim_{t→∞} P{N(t) = n}
• Steady-state occupancy probabilities upon arrival:
  aₙ = lim_{t→∞} P{N(t) = n | an arrival occurred just after time t}
• pₙ = aₙ, n = 0, 1, …, for queuing systems, regardless of the distribution of the
  service times, if either
  • the arrival process is Poisson and interarrival times and service times are
    independent, or
  • future arrivals are independent of the current number in the system, i.e., for
    every time t and increment δ > 0, the number of arrivals in the interval (t, t+δ)
    is independent of the number in the system at time t. (In particular, this holds
    if the arrival process is Poisson and, at any time, the service times of
    previously arrived customers and the future interarrival times are independent.)
  Proof. Let A(t, t+δ) be the event that an arrival occurs in the interval (t, t+δ),
  and let pₙ(t) = P[N(t) = n], so pₙ = lim_{t→∞} pₙ(t). Then
  aₙ(t) = P{N(t) = n | an arrival occurred just after time t}
        = lim_{δ→0} P{N(t) = n | A(t, t+δ)}
  If the event A(t, t+δ) is independent of N(t), then
  aₙ(t) = lim_{δ→0} P{N(t) = n} = P{N(t) = n} = pₙ(t)
  Taking the limit as t → ∞, from the definitions of aₙ and pₙ, we obtain aₙ = pₙ.
• Ex. (non-Poisson arrival process)
  Suppose interarrival times are independent and uniformly distributed on [a, b],
  a < b. Customer service times are all equal to c < a sec.
  • Then an arriving customer always finds an empty system (N = 0).
  • On the other hand, the average number in the system as seen by an outside
    observer looking at the system at a random time is N = λT, where
    λ = 1/Eτ = 1/((a + b)/2) = 2/(a + b)  and  T = c.
    Thus, N = λT = 2c/(a + b).
• Ex. (service times and future arrival times are correlated)
  Packet arrivals form a Poisson process. The transmission time of the nth packet
  equals one half the interarrival time between packets n and n+1.
  • Upon arrival, a packet always finds the system empty.
  • On the other hand, the average number in the system as seen by an outside
    observer looking at the system at a random time is
    N = λT = (1/Eτ)(Eτ/2) = 1/2.
Occupancy Distribution upon Departure
• The distribution of customers in the system just after a departure has occurred.
• dₙ(t) = P{N(t) = n | a departure occurred just before time t}
• Steady-state values: dₙ = lim_{t→∞} dₙ(t), n = 0, 1, …
• dₙ = aₙ, n = 0, 1, …, if
  • the system reaches a steady state with all n having positive steady-state
    probabilities, and
  • N(t) changes in unit increments.
  For any sample path of the system and for every n, the number in the system will
  be n infinitely often (with probability 1). For each time the number in the system
  increases from n to n+1 due to an arrival, there will be a corresponding future
  decrease from n+1 to n due to a departure. In the long run, the frequency of
  transitions from n to n+1, out of all transitions from any k to k+1, equals the
  frequency of transitions from n+1 to n, out of all transitions from any k+1 to k,
  which implies that dₙ = aₙ.
M/G/1
• “G” ≡ general (really, GI ≡ general independent)
  Service times are i.i.d.
  P[service time ≤ t] = H(t) ≡ cdf of the service time; it need not be continuous.
  Mean service time h = ∫₀^∞ t dH(t); second moment h₂ = ∫₀^∞ t² dH(t) = E[X²].
  ρ = λh < 1, which is assumed for stability.

M/G/1 analysis based on Pollaczek–Khinchin theory
• Pollaczek–Khinchin transform results:
  • N_k(z) = E[z^{n_k}]
  • h*(s) = ∫₀^∞ e^{−st} dH(t)
  • R(z) = h*(λ(1 − z))
  • N(z) = (1 − ρ)(z − 1)R(z) / (z − R(z))
• Mean-value results:
  N = Σ_{m=0}^{∞} m P(n = m) = N′(1) = ρ + λ²h₂/(2(1 − ρ))
  T = h + λh₂/(2(1 − ρ))
  W_q = T − h = λh₂/(2(1 − ρ))
  N_q = λ²h₂/(2(1 − ρ))
• Distribution of N:
  P(n = m) = 1 − ρ for m = 0, and P(n = m) = (1/m!)·dᵐN(z)/dzᵐ|_{z=0} for m > 0.
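The Pollaczek–Khinchin mean-value formulas above can be collected in a small helper. This is a hedged sketch: the function name `mg1` and the example rates are ours; it takes the arrival rate λ, the mean service time h = E[X], and the second moment h₂ = E[X²].

```python
# Sketch of the P-K mean-value formulas (names are ours, not from the notes).
def mg1(lam, h, h2):
    rho = lam * h
    assert rho < 1, "stability requires rho = lam*h < 1"
    Wq = lam * h2 / (2 * (1 - rho))          # P-K waiting time in queue
    T = h + Wq                               # sojourn time
    Nq = lam * Wq                            # Little's theorem on the queue
    N = rho + lam**2 * h2 / (2 * (1 - rho))  # mean number in system
    return N, T, Wq, Nq

# Sanity check with exponential service (M/M/1): h = 1/mu, h2 = 2/mu^2
lam, mu = 0.8, 1.0
N, T, Wq, Nq = mg1(lam, 1 / mu, 2 / mu**2)
print(N, T)  # should match rho/(1-rho) = 4 and 1/(mu-lam) = 5
```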
• {N(t)} is no longer Markov in the non-exponential service time case. However, we
  can embed a discrete-time Markov chain at the departure instants.
• Define
  n_k = number of customers in the system right after (upon) the departure of
        customer k (so, not including customer k itself)
  s_k = service time of customer k.
  Assume the s_k are i.i.d. with common cdf H(t) = F_s(t) = P(s_k ≤ t).
  Let h = ∫₀^∞ t dH(t) be the mean service time.
  r_k = number of new customers arriving during the service time of customer k.
• Recursion:
  n_k = n_{k−1} + r_k − 1   if n_{k−1} > 0
  n_k = r_k                 if n_{k−1} = 0
  Proof.
  • For n_{k−1} > 0: after the (k−1)th customer leaves, there are n_{k−1} customers
    in the system (the kth customer is included here also). The first one to be
    served, right away, is the kth customer. While the kth customer is served, r_k
    customers arrive. Thus when the kth customer leaves, we have n_{k−1} + r_k − 1
    customers left in the system (the −1 comes from the kth customer leaving).
  • For n_{k−1} = 0: after the (k−1)th customer leaves, there is no customer in the
    system. After a while (an exponentially distributed random duration), the kth
    customer arrives. While the kth customer is served, r_k additional customers
    arrive. Thus when the kth customer leaves, we have 1 + r_k − 1 = r_k customers
    left in the system (the +1 and −1 come from customer k arriving and leaving).
  Another way to think about this: in the first case, n_{k−1} already includes the
  kth customer, so we have to subtract 1 when the kth customer leaves.
• Generating functions:
  N_k(z) = E[z^{n_k}]
  R_k(z) = E[z^{r_k}] = R(z); not a function of k because the s_k are i.i.d.
• Laplace–Stieltjes transform of the service distribution:
  h*(s) = ∫₀^∞ e^{−st} dH(t)
• Let r be a generic r_k. Conditioning on the service time s = t (arrivals during a
  service of length t are Poisson with mean λt),
  R(z) = E[z^{r_k}] = E[z^r] = Σ_{n=0}^{∞} P(r = n) z^n
       = Σ_{n=0}^{∞} (∫₀^∞ P(r = n | s = t) dH(t)) z^n
       = Σ_{n=0}^{∞} ∫₀^∞ ((λt)^n e^{−λt}/n!) dH(t) z^n
       = Σ_{n=0}^{∞} ∫₀^∞ ((λtz)^n e^{−λt}/n!) dH(t)
  Interchanging the sum and the integral, we have
  R(z) = ∫₀^∞ e^{−λt} Σ_{n=0}^{∞} ((λtz)^n/n!) dH(t) = ∫₀^∞ e^{−λt} e^{λtz} dH(t)
       = ∫₀^∞ e^{−λ(1−z)t} dH(t) = h*(λ(1 − z))
  So, R(z) = E[z^{r_k}] = h*(λ(1 − z)).
  Note: the Poisson assumption is required to prove R(z) = h*(λ(1 − z)).
• The quantity of principal interest is lim_{k→∞} N_k(z) = N(z). (We will show
  later that N′(1) = N.)
  Recall that n_k = n_{k−1} + r_k − 1 if n_{k−1} > 0, and n_k = r_k if n_{k−1} = 0;
  hence,
  N_k(z) = E[z^{n_k}]
         = P(n_{k−1} = 0) E[z^{n_k} | n_{k−1} = 0] + P(n_{k−1} > 0) E[z^{n_k} | n_{k−1} > 0]
         = P(n_{k−1} = 0) E[z^{r_k}] + P(n_{k−1} > 0) E[z^{n_{k−1} + r_k − 1} | n_{k−1} > 0]
  We already have E[z^{r_k}] = R(z) = h*(λ(1 − z)). So consider
  E[z^{n_{k−1} + r_k − 1} | n_{k−1} > 0] = (1/z) E[z^{n_{k−1} + r_k} | n_{k−1} > 0].
  Because n_{k−1} and r_k are independent,
  E[z^{n_{k−1} + r_k − 1} | n_{k−1} > 0] = (1/z) E[z^{n_{k−1}} | n_{k−1} > 0] E[z^{r_k} | n_{k−1} > 0]
                                         = (1/z) E[z^{n_{k−1}} | n_{k−1} > 0] E[z^{r_k}]
                                         = (1/z) E[z^{n_{k−1}} | n_{k−1} > 0] R(z)
  Therefore,
  P(n_{k−1} > 0) E[z^{n_{k−1} + r_k − 1} | n_{k−1} > 0]
    = P(n_{k−1} > 0) (1/z) (Σ_{n=1}^{∞} (P(n_{k−1} = n)/P(n_{k−1} > 0)) z^n) R(z)
    = (1/z) (Σ_{n=1}^{∞} P(n_{k−1} = n) z^n) R(z)
    = (1/z) (Σ_{n=0}^{∞} P(n_{k−1} = n) z^n − P(n_{k−1} = 0) z⁰) R(z)
    = (1/z)(N_{k−1}(z) − P(n_{k−1} = 0)) R(z)
  Thus N_k(z) = P(n_{k−1} = 0) R(z) + (1/z)(N_{k−1}(z) − P(n_{k−1} = 0)) R(z).
• As k → ∞ with ρ < 1, we get
  • N_k(z) and N_{k−1}(z) → N(z), and
  • P(n_{k−1} = 0) → p₀.
  We already know that, for a single server, ρ = 1 − p₀. Thus, we have
  N(z) = (1 − ρ)R(z) + (1/z)(N(z) − (1 − ρ))R(z)
  zN(z) = z(1 − ρ)R(z) + N(z)R(z) − (1 − ρ)R(z)
  N(z) = (1 − ρ)(z − 1)R(z) / (z − R(z))
• To find N:
  N = Σ_{m=0}^{∞} m P(n = m)
  N(z) = E[z^n] = Σ_{m=0}^{∞} P(n = m) z^m
  (d/dz)N(z) = N′(z) = Σ_{m=0}^{∞} m P(n = m) z^{m−1}
  N′(1) = Σ_{m=0}^{∞} m P(n = m) = N
• It may be easier to use a Taylor series approach and expand around z = 1.
  Introduce u = z − 1, so we can expand around u = 0.
• Let
  b(u) = R(z)|_{z=u+1} = h*(λ(1 − z)) = h*(−λu) = ∫₀^∞ e^{λut} dH(t), and
  G(u) ≡ N(z)|_{z=u+1} = (1 − ρ) u b(u) / (u + 1 − b(u)).
• By Taylor’s Theorem:
  b(u) = b(0) + b′(0)u + (b′′(0)/2)u² + o(u²) as u → 0.
  Note that b₀ = b(0) = ∫₀^∞ e^{λ·0·t} dH(t) = ∫₀^∞ dH(t) = 1.
  Also, b′(u) = λ∫₀^∞ t e^{λut} dH(t); hence b₁ = b′(0) = λ∫₀^∞ t dH(t) = λh = ρ.
  The second derivative is b′′(u) = λ²∫₀^∞ t² e^{λut} dH(t); therefore
  b′′(0) = λ²∫₀^∞ t² dH(t) = λ²h₂, and b₂ = b′′(0)/2 = λ²h₂/2.
• N = ρ + λ²h₂/(2(1 − ρ))
  Proof. G(u) = (1 − ρ)u b(u) / (u + 1 − b(u))
             = (1 − ρ)u b(u) / (u + 1 − b₀ − b₁u − b₂u² + o(u²))
             = (1 − ρ)b(u) / ((1 − b₁) − b₂u − o(u))      [b₀ = 1; divide through by u]
             = ((1 − ρ)/(1 − b₁)) · b(u) / (1 − (b₂/(1 − b₁))u + o(u))
  Now, note that as x → 0, 1/(1 − x + o(x)) = 1 + x + o(x).
  Pf. First, note that 1/(1 − x) = 1 + x + o(x). We will show that if
  1/f(x) = g(x) + o(x), then 1/(f(x) + o(x)) = g(x) + o(x).
  Starting from 1/f(x) = g(x) + o(x), we have lim_{x→0} (1/(x f(x)) − g(x)/x) = 0.
  Now, lim_{x→0} (1/x)(1/(f(x) + o(x)) − g(x)) = lim_{x→0} (1/(x f(x) + o(x²)) − g(x)/x) = 0.
  Hence, G(u) = ((1 − ρ)/(1 − b₁)) (1 + b₁u + o(u)) (1 + (b₂/(1 − b₁))u + o(u))
             = ((1 − ρ)/(1 − b₁)) (1 + (b₁ + b₂/(1 − b₁))u + o(u))
  From r′(0) = 0 for continuous r(x) = o(x) as x → 0, we then have
  G′(0) = (d/du)G(u)|_{u=0} = ((1 − ρ)/(1 − b₁)) (b₁ + b₂/(1 − b₁))
  Thus N′(1) = G′(0), and with b₁ = ρ and b₂ = λ²h₂/2 we finally have
  N = N′(1) = ((1 − ρ)/(1 − ρ)) (ρ + (λ²h₂/2)/(1 − ρ)) = ρ + λ²h₂/(2(1 − ρ)).
• N_q = N − ρ = λ²h₂/(2(1 − ρ))
• T = N/λ = (ρ + λ²h₂/(2(1 − ρ)))/λ = h + λh₂/(2(1 − ρ))
• W_q = T − h = λh₂/(2(1 − ρ))
• Distribution of N:
  P(n = m) = 1 − ρ for m = 0, and P(n = m) = (1/m!)·dᵐN(z)/dzᵐ|_{z=0} for m > 0.
  Proof. N(z) = Σ_{m=0}^{∞} P(n = m) z^m = P(n = 0) + P(n = 1)z¹ + P(n = 2)z² + …
  P(n = 0) = N(0) = (1 − ρ)(0 − 1)R(0) / (0 − R(0)) = 1 − ρ
  N′(z) = Σ_{m=1}^{∞} m P(n = m) z^{m−1} = P(n = 1) + P(n = 2)z + …, so N′(0) = P(n = 1).
  In general, the coefficient of z^m in the Taylor expansion of N(z) around 0 is
  (1/m!)N⁽ᵐ⁾(0), which gives the formula above.
• Distribution of waiting time:
  w*(λ(1 − z)) = (1 − ρ)(z − 1) / (z − h*(λ(1 − z)))
  w*(s) = (1 − ρ) / (1 − (λ/s)(1 − h*(s)))
  Observe that, in the steady state, the random variable n that represents the
  system population at the point of departure of a customer may also be thought of
  as the number of arrivals during the total system time (sojourn time) of that
  customer. Said sojourn time is the sum of the waiting random variable, w, and the
  service random variable, s.
  The same sort of reasoning that gave us R(z) = h*(λ(1 − z)) can be applied to
  give the moment generating function of the number of arrivals during w + s as
  N(z) = f*(λ(1 − z)),
  where f* is the Laplace transform of the distribution of w + s
  (during s → r, R; during w + s → n, N).
  Since w and s are independent, the pdf of w + s is the convolution of the pdf of
  w and the pdf of s. This implies that the Laplace transform is the product:
  f*(s) = w*(s)h*(s)
  N(z) = f*(λ(1 − z)) = w*(λ(1 − z))h*(λ(1 − z)),
  where w* is the Laplace–Stieltjes transform of the distribution of w. Hence,
  w*(λ(1 − z)) = N(z) / h*(λ(1 − z))
              = [(1 − ρ)(z − 1)h*(λ(1 − z)) / (z − h*(λ(1 − z)))] / h*(λ(1 − z))
              = (1 − ρ)(z − 1) / (z − h*(λ(1 − z)))
  Let s = λ(1 − z), i.e., z = 1 − s/λ:
  w*(s) = (1 − ρ)(−s/λ) / ((1 − s/λ) − h*(s)) = (1 − ρ) / (1 − (λ/s)(1 − h*(s)))
• We did explicitly use the fact that the number of arrivals during a service of
  length s is Poisson with parameter λs. Our justification for equating the
  statistics just after a departure instant in equilibrium to those at a randomly
  chosen instant in equilibrium also depended on the Poisson nature of the arrivals
  (we need dₙ = aₙ = pₙ).
• Average length of an idle period = 1/λ.
  Proof. An idle period occurs when the system is waiting for a customer to arrive
  after the queue becomes empty. At the moment the server becomes empty, by the
  memoryless property, it has to wait an Exp(λ) duration, with mean 1/λ, for the
  next customer to arrive, independent of how long it has already been since the
  last customer arrived.
• Average length of a busy period = 1/(µ − λ).
  Proof. Let B = average length of a busy period. We have shown that the average
  length of an idle period is 1/λ. Note that busy periods and idle periods form an
  alternating sequence. Hence,
  ρ = lim_{n→∞} Σ_{i=1}^{n} τ_{busy,i} / Σ_{i=1}^{n} (τ_{idle,i} + τ_{busy,i}) = B / (B + 1/λ)
  Solving for B, we get B = ρ/(λ(1 − ρ)) = h/(1 − ρ) = 1/(µ − λ).
• Average number of customers served in a busy period = 1/(1 − ρ).
  Idea: B/h = (h/(1 − ρ))/h = 1/(1 − ρ).
M/G/1 analysis based on the concept of the mean residual service time
• Rᵢ = residual service time seen by the ith customer.
  By this we mean that if customer j is already being served when i arrives, Rᵢ is
  the remaining time until customer j’s service is complete. If no customer is in
  service (i.e., the system is empty when i arrives), then Rᵢ = 0.
• R = mean residual time = lim_{i→∞} E[Rᵢ]
• R̂ = mean residual service time given that one arrives when the server is busy
  (written with a hat here to distinguish it from the unconditional R).
  By renewal theory: R̂ = E[X²]/(2E[X]).
• Note: For M/M/1, the service time is exponentially distributed, and thus
  memoryless. Therefore, given that the service time does not end at the arrival
  instant, what’s left is also exponentially distributed with the same mean, so
  R̂ = 1/µ. Using the renewal-theory formula gives the same result:
  R̂ = (2/µ²)/(2·(1/µ)) = 1/µ.
• R = (1/2)λE[X²]
  Proof. We know that the probability of the server being busy in a single-server
  system is p₁ = λE[X] = ρ. Hence,
  R = 0·p₀ + R̂·p₁ = R̂ρ = (E[X²]/(2E[X]))ρ = (1/2)λE[X²]
  Proof (graphical argument). Let r(τ) = the remaining time for completion of the
  customer in service at time τ. When a new service of duration Xᵢ begins, r(τ)
  starts at Xᵢ and decays linearly to 0 over Xᵢ time units, tracing out a right
  triangle of area Xᵢ²/2.
  [Figure: residual service time r(τ) vs. time, a sawtooth of triangles, each of
  height Xᵢ and base Xᵢ.]
  ∫₀ᵗ r(τ)dτ = Σ_{i=1}^{M(t)} (1/2)Xᵢ²,
  where M(t) = number of triangles in [0, t] = number of service completions in [0, t].
  R = lim_{t→∞} (1/t)∫₀ᵗ r(τ)dτ = lim_{t→∞} (1/2)(M(t)/t)Σ_{i=1}^{M(t)} Xᵢ²/M(t)
    = (1/2)(lim_{t→∞} M(t)/t)(lim_{t→∞} Σ_{i=1}^{M(t)} Xᵢ²/M(t)) = (1/2)λE[X²]
• W = λE[X²]/(2(1 − ρ))
  Proof. Note that the time the ith customer waits in queue = the residual service
  time seen by the ith customer + the time used to serve all customers already in
  the queue:
  W = R + (1/µ)N_q = R + (1/µ)λW = R + ρW
  W = R/(1 − ρ) = ((1/2)λE[X²])/(1 − ρ) = λE[X²]/(2(1 − ρ))
• The average number in queue N_q and the mean residual time R as seen by an
  arriving customer are also equal to the average number in queue and mean residual
  time seen by an outside observer at a random time. This is due to the Poisson
  character of the arrival process, which implies that the occupancy distribution
  upon arrival is typical.
• M/G/1 is a renewal process when busy; M/G/1 has occasional (with probability
  1 − ρ of occurrence) Exp(λ) random variables inserted into the service-time
  renewal process.
• An M/G/1 queue can have ρ < 1 but infinite W if the second moment E[X²] → ∞.
• The formula is valid for any order of serving customers, as long as the order is
  determined independently of the required service time. To see this, suppose the
  ith and jth customers are both in the queue and that they exchange places. The
  expected queuing time of customer i will then be exchanged with that of customer
  j, but the average, over all customers, is unchanged. Since any service order can
  be considered as a sequence of reversals in queue position, the P-K formula
  remains valid.
M/G/1 with vacations
• At the end of each busy period, the server goes on “vacation” for some random
  interval of time. A new arrival to an idle system, rather than going into service
  immediately, waits for the end of the vacation period. If the system is still
  idle at the completion of a vacation, a new vacation starts immediately.
• Let Vᵢ be the durations of the successive vacations taken by the server. Assume
  the Vᵢ are i.i.d. random variables, independent of the customer interarrival
  times and service times.
• E[X²]/(2E[X]) = mean residual time given that one arrives while the server is
  serving someone;
  E[V²]/(2E[V]) = mean residual time given that one arrives while the server is on
  vacation (idle).
  R = (E[X²]/(2E[X]))·P{server busy} + (E[V²]/(2E[V]))·P{server idle}
    = (E[X²]/(2E[X]))ρ + (E[V²]/(2E[V]))(1 − ρ)
    = (1/2)λE[X²] + (E[V²]/(2E[V]))(1 − ρ)
• W = R/(1 − ρ) = λE[X²]/(2(1 − ρ)) + E[V²]/(2E[V])
• W_with vacations = λE[X²]/(2(1 − ρ)) + E[V²]/(2E[V]) = W_without vacations + E[V²]/(2E[V])
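The vacation formula above can be sketched as a helper. This is a hedged illustration: `mg1_vacation_W` is our name, and deterministic vacations of length v (so E[V] = v, E[V²] = v²) are an illustrative choice, not from the notes.

```python
# Sketch: M/G/1-with-vacations waiting time, per the formula above:
# W = lam*E[X^2]/(2(1-rho)) + E[V^2]/(2E[V]). Names are ours.
def mg1_vacation_W(lam, h, h2, Ev, Ev2):
    rho = lam * h
    assert rho < 1
    return lam * h2 / (2 * (1 - rho)) + Ev2 / (2 * Ev)

lam, mu, v = 0.5, 1.0, 2.0
# exponential service, deterministic vacations of length v
W_vac = mg1_vacation_W(lam, 1 / mu, 2 / mu**2, v, v**2)
W_novac = lam * (2 / mu**2) / (2 * (1 - lam / mu))
print(W_vac, W_novac)  # vacations add E[V^2]/(2E[V]) = v/2 = 1.0
```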

M/M/1
• ρ := λ/µ = 1 − p₀ = N_s = p₁,server
  pₙ = ρⁿ(1 − ρ); n = 0, 1, …
  N = ρ/(1 − ρ) = λ/(µ − λ)
  T = 1/(µ − λ)
  W = ρ/(µ − λ)
  N_q = ρ²/(1 − ρ)
• Transient if ρ > 1; null recurrent if ρ = 1; ergodic if ρ < 1.
• ρ = 1 − p₀ = utilization factor = the long-term proportion of time the server is busy.
  Proof.
  (1) If the system has ≥ 1 customers, the server is busy (serving exactly 1
  customer); this occurs with probability 1 − p₀. Note also that if the server is
  busy, then the system has ≥ 1 customers (at least the one in the server). Hence,
  p₀,system = p₀,server = p₀. If the system has 0 customers, the server is idle
  (serving 0 customers); this occurs with probability p₀. So the long-term
  proportion of time the server is busy = 1 − p₀, and the average number of
  customers in the server is N_s = p₀·0 + (1 − p₀)·1 = 1 − p₀.
  (2) Now apply Little’s theorem to the server: N_s = λE[X] = λ(1/µ) := ρ.
  From (1) and (2), ρ := λ/µ = 1 − p₀ = N_s = p₁,server.

• State diagram (birth–death chain): from each state i, transitions i → i+1 at
  rate λ and i → i−1 at rate µ, for states 0, 1, 2, ….
• pₙ = ρⁿ(1 − ρ); n = 0, 1, …
  Proof. This is a birth-and-death process: detailed balance pₙλ = p_{n+1}µ gives
  p_{n+1} = ρpₙ, and normalization gives p₀ = 1 − ρ.
• N = ρ/(1 − ρ) = λ/(µ − λ)
  Proof. N = Σ_{n=0}^{∞} n pₙ = (1 − ρ)Σ_{n=0}^{∞} n ρⁿ = (1 − ρ)·ρ/(1 − ρ)² = ρ/(1 − ρ).
• T = N/λ = (ρ/(1 − ρ))/λ = 1/(µ − λ)
• W = T − E[X] = 1/(µ − λ) − 1/µ = λ/(µ(µ − λ)) = ρ/(µ − λ)
• N_q = λW = λρ/(µ − λ) = ρ(λ/µ)/(1 − λ/µ) = ρ²/(1 − ρ)
  or
  N_q = 0·p₀ + 0·p₁ + 1·p₂ + 2·p₃ + … + (i − 1)pᵢ + …
      = Σ_{i=2}^{∞} (i − 1)pᵢ = Σ_{i=2}^{∞} (i − 1)ρⁱ(1 − ρ)
      = (1 − ρ)Σ_{m=1}^{∞} m ρ^{m+1} = (1 − ρ)ρ Σ_{m=1}^{∞} m ρᵐ
      = (1 − ρ)ρ · ρ/(1 − ρ)² = ρ²/(1 − ρ)
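The M/M/1 formulas above can be bundled into one helper for quick use; the function name `mm1` and the example rates are ours. The assertions at the end cross-check Little's theorem and T = W + 1/µ.

```python
# Sketch collecting the M/M/1 formulas derived above (names are ours).
def mm1(lam, mu, nmax=20):
    rho = lam / mu
    assert rho < 1, "ergodicity requires rho < 1"
    N = rho / (1 - rho)
    T = 1 / (mu - lam)
    W = rho / (mu - lam)
    Nq = rho**2 / (1 - rho)
    pmf = [(1 - rho) * rho**n for n in range(nmax)]  # p_n = rho^n (1-rho)
    return N, T, W, Nq, pmf

N, T, W, Nq, pmf = mm1(lam=3.0, mu=4.0)
print(N, T)  # N = 3 customers, T = 1 time unit
```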

• Effect of scale on performance
  • m separate M/M/1 systems, each with rates λ, µ:
    ET = (1/µ)/(1 − ρ)
  • One consolidated system with rates mλ, mµ:
    ρ′ = λ′/µ′ = mλ/(mµ) = λ/µ = ρ
    ET′ = (1/µ′)/(1 − ρ′) = (1/(mµ))/(1 − ρ) = ET/m   (less delay)
• The improved performance of the combined system arises from improved global
  usage of the processors.
  • In the separate systems, some of the queues may be empty while others are not.
    Consequently, some processors can be idle even though there is work to be done
    in the system.
  • In the combined system, the processor will stay busy as long as customers are
    waiting to be served.
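The scaling comparison above is easy to verify numerically. This sketch (our helper name `mm1_T`, our example numbers) contrasts one of the m separate slow systems with the consolidated fast one, which is itself an M/M/1 with rates mλ, mµ.

```python
# Sketch of the scale comparison: m separate M/M/1 queues (each lam, mu)
# vs. one consolidated M/M/1 with rates m*lam, m*mu. Names are ours.
def mm1_T(lam, mu):
    assert lam < mu
    return 1 / (mu - lam)

m, lam, mu = 4, 0.6, 1.0
T_separate = mm1_T(lam, mu)          # delay in each small system
T_combined = mm1_T(m * lam, m * mu)  # delay in the consolidated system
print(T_separate, T_combined)        # combined delay is T_separate / m
```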
Applying M/G/1 analysis to M/M/1
• Mean residual service time: as noted above, the service time for M/M/1 is
  exponentially distributed and thus memoryless. Therefore, given that the service
  time does not end when the packet arrives, what’s left of the service time for
  the currently served packet is also exponentially distributed with the same mean,
  1/µ.
  Thus, W = (ρ(1/µ) + (1 − ρ)·0) + N_Q(1/µ). The average waiting time in the queue
  is the sum of 1) the residual time of the currently served packet, which is 0 if
  the server is idle and has mean 1/µ if the server is busy, and 2) the time
  required to serve the customers already waiting in the queue, which is N_Q times
  the average service time.
  So W = ρ/µ + λW/µ, and W = ρ/(µ − λ) as before.
• dH(t) = µe^{−µt}dt
  h = 1/µ,  h₂ = E[X²] = 2/µ²
  W = λh₂/(2(1 − ρ)) = (λ/µ²)·2/(2(1 − ρ)) = ρ/(µ(1 − ρ)) = ρ/(µ − λ), as expected.
• pmf of N for the M/M/1 system:
  h(t) = µe^{−µt}  ⎯(Laplace)→  h*(s) = µ/(s + µ), so h*(λ(1 − z)) = µ/(λ(1 − z) + µ)
  N(z) = (1 − ρ)(z − 1)h*(λ(1 − z)) / (z − h*(λ(1 − z)))
       = (1 − ρ)(z − 1)µ / ((λ(1 − z) + µ)z − µ)
       = (1 − ρ)(z − 1)µ / (λ(1 − z)z + µ(z − 1))
       = (1 − ρ)µ / (µ − λz) = (1 − ρ)/(1 − ρz)
  By expanding N(z) in a power series, we have N(z) = Σ_{i=0}^{∞} (1 − ρ)(ρz)ⁱ.
  Since N(z) = Σ_{i=0}^{∞} P(n = i)zⁱ, P(n = i) = (1 − ρ)ρⁱ for i = 0, 1, 2, …
• pdf of W for the M/M/1 system: f_W(t) = (1 − ρ)δ(t) + (1 − ρ)λe^{−µ(1−ρ)t}
  w*(s) = (1 − ρ)/(1 − (λ/s)(1 − h*(s))) = (1 − ρ)/(1 − (λ/s)(1 − µ/(s + µ)))
        = (1 − ρ)/(1 − (λ/s)(s/(s + µ))) = (1 − ρ)/(1 − λ/(s + µ))
        = (1 − ρ)(s + µ)/(s + µ − λ) = (1 − ρ)((s + µ − λ) + λ)/(s + µ − λ)
        = (1 − ρ)(1 + λ/(s + (µ − λ)))
  f_W(t) = (1 − ρ)(δ(t) + λe^{−(µ−λ)t}) = (1 − ρ)δ(t) + (1 − ρ)λe^{−µ(1−ρ)t};  t ≥ 0
• pdf of T for the M/M/1 system: f_T(t) = µ(1 − ρ)e^{−µ(1−ρ)t}
  T*(s) = w*(s)h*(s) = ((1 − ρ)(s + µ)/(s + µ − λ))(µ/(s + µ)) = (1 − ρ)µ/(s + µ − λ)
  f_T(t) = (1 − ρ)µe^{−(µ−λ)t} = µ(1 − ρ)e^{−µ(1−ρ)t}

M/D/1
• “D” ⇒ deterministic: service time = h for every customer
• E[X] = h,  h₂ = E[X²] = h²
  W = λh₂/(2(1 − ρ)) = λh²/(2(1 − λh)) = (λ/µ²)/(2(1 − ρ)) = ρ/(2µ(1 − ρ))
  N_q = λW = λρ/(2µ(1 − ρ)) = ρ²/(2(1 − ρ))
  T = W + h = ρ/(2µ(1 − ρ)) + 1/µ = (1/µ)(ρ/(2(1 − ρ)) + 1)
    = (1/µ)((ρ + 2 − 2ρ)/(2(1 − ρ))) = (2 − ρ)/(2µ(1 − ρ))
  N = N_q + ρ = ρ²/(2(1 − ρ)) + ρ = (ρ² + 2ρ − 2ρ²)/(2(1 − ρ)) = (2ρ − ρ²)/(2(1 − ρ))
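A useful consequence of the P-K formula, visible by comparing the M/D/1 results above with M/M/1: at the same ρ, deterministic service (h₂ = h²) gives exactly half the waiting time of exponential service (h₂ = 2h²). A hedged sketch (helper name `pk_W` is ours):

```python
# Sketch comparing P-K waiting times for deterministic vs. exponential
# service at the same utilization. Names and rates are ours.
def pk_W(lam, h, h2):
    rho = lam * h
    assert rho < 1
    return lam * h2 / (2 * (1 - rho))

lam, h = 0.8, 1.0
W_md1 = pk_W(lam, h, h**2)      # M/D/1: second moment h^2
W_mm1 = pk_W(lam, h, 2 * h**2)  # M/M/1: second moment 2h^2
print(W_md1, W_mm1)             # W_md1 = W_mm1 / 2
```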

M/M/1/K
• Equilibrium (balance) equations:
  pₙλ = p_{n+1}µ
  p_{n+1} = (λ/µ)pₙ = (λ/µ)^{n+1} p₀ = ρ^{n+1} p₀;  n = 0, 1, …, K−1
• Σ_{n=0}^{K} pₙ = 1 ⇒ Σ_{n=0}^{K} ρⁿ p₀ = p₀(1 − ρ^{K+1})/(1 − ρ) = 1
  ⇒ p₀ = (1 − ρ)/(1 − ρ^{K+1})
• pₙ = ρⁿ(1 − ρ)/(1 − ρ^{K+1});  n = 0, 1, …, K
• P(blocking or loss) = p_K = proportion of time that the system is full
  = ρᴷ(1 − ρ)/(1 − ρ^{K+1})
  Proof. This is a truncated birth-and-death process.
• For ρ < 1 (λ < µ):
  • the probabilities decrease exponentially as n increases
  • N tends to cluster around n = 0
  • adding more buffers (increasing K) is beneficial, since the result is a
    reduction in the loss probability
• For ρ = 1:
  • all states are equally probable: pₙ = 1/(K + 1)
• For ρ > 1 (λ > µ):
  • pₙ increases with n
  • pₙ tends to cluster toward n = K ⇒ the system tends to be full
  • adding buffers is counterproductive, since the system will fill up the
    additional buffers
• N = ρ/(1 − ρ) − (K + 1)ρ^{K+1}/(1 − ρ^{K+1})  for ρ ≠ 1;  N = K/2 for ρ = 1
• For ρ ≠ 1,
  N = Σ_{n=0}^{K} n pₙ = ((1 − ρ)/(1 − ρ^{K+1})) Σ_{n=0}^{K} n ρⁿ
  Let s = 1·ρ + 2·ρ² + 3·ρ³ + … + K·ρᴷ. Then
  ρs = 1·ρ² + 2·ρ³ + 3·ρ⁴ + … + K·ρ^{K+1}
  s(1 − ρ) = ρ + ρ² + ρ³ + … + ρᴷ − Kρ^{K+1}
  s = (1/(1 − ρ))((ρ − ρ^{K+1})/(1 − ρ)) − Kρ^{K+1}/(1 − ρ)
  Hence,
  N = ((1 − ρ)/(1 − ρ^{K+1})) s = (1/(1 − ρ^{K+1}))((ρ − ρ^{K+1})/(1 − ρ) − Kρ^{K+1})
    = (1/(1 − ρ^{K+1}))((ρ − ρρ^{K+1} + ρρ^{K+1} − ρ^{K+1})/(1 − ρ) − Kρ^{K+1})
    = (1/(1 − ρ^{K+1}))((ρ(1 − ρ^{K+1}) + (ρ − 1)ρ^{K+1})/(1 − ρ) − Kρ^{K+1})
    = (1/(1 − ρ^{K+1}))(ρ(1 − ρ^{K+1})/(1 − ρ) − (K + 1)ρ^{K+1})
    = ρ/(1 − ρ) − (K + 1)ρ^{K+1}/(1 − ρ^{K+1})
• For ρ = 1,
  Σ_{n=0}^{K} n pₙ = Σ_{n=0}^{K} n/(K + 1) = (1/(K + 1))·K(K + 1)/2 = K/2
• T = N/(λ(1 − p_K))   (Little’s theorem with the accepted arrival rate λ(1 − p_K))
• For ρ → 0:
  • N = ρ/(1 − ρ) − (K + 1)ρ^{K+1}/(1 − ρ^{K+1}) → 0
  • W → 0
  • p_K = P_loss = ρᴷ(1 − ρ)/(1 − ρ^{K+1}) → 0
  • T = W + E[X] → E[X]
• For ρ → ∞:
  • N → K
  • p_K → 1
  • T → K/µ = K·E[X]
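The M/M/1/K quantities above (pmf, N, and T through the accepted arrival rate) can be sketched as a helper; the name `mm1k` and the example parameters are ours.

```python
# Sketch of the M/M/1/K formulas above: occupancy pmf, mean number N,
# and T = N/(lam*(1 - p_K)). Names are ours, not from the notes.
def mm1k(lam, mu, K):
    rho = lam / mu
    if rho == 1:
        p = [1 / (K + 1)] * (K + 1)          # all states equally probable
    else:
        p0 = (1 - rho) / (1 - rho**(K + 1))
        p = [p0 * rho**n for n in range(K + 1)]
    N = sum(n * pn for n, pn in enumerate(p))
    T = N / (lam * (1 - p[K]))               # p[K] is the blocking probability
    return p, N, T

p, N, T = mm1k(lam=1.0, mu=1.0, K=4)  # rho = 1 case
print(N)  # K/2 = 2.0
```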
M/M/m
• m servers. State diagram: from state i, transitions i → i+1 at rate λ, and
  i → i−1 at rate iµ for i ≤ m and at rate mµ for i > m.
• ρ = (1/m)(λ/µ) = λ/(mµ) < 1
• p₀ = 1 / (Σ_{i=0}^{m−1} (mρ)ⁱ/i! + (mρ)ᵐ/(m!(1 − ρ)))
• pᵢ = ((mρ)ⁱ/i!) p₀ for 0 ≤ i ≤ m;  pᵢ = (mᵐρⁱ/m!) p₀ for i ≥ m
• Erlang C formula: P_Q = p₀(mρ)ᵐ/(m!(1 − ρ)) = p_m/(1 − ρ) = P{W > 0}
• N_Q = ρP_Q/(1 − ρ)
• W = ρP_Q/(λ(1 − ρ))
• T = 1/µ + W
• N = N_Q + mρ = mρ + ρP_Q/(1 − ρ)
• For 1 ≤ i ≤ m:
  iµpᵢ = λp_{i−1} ⇒ pᵢ = (1/i)(λ/µ)p_{i−1} = (1/i)(mρ)p_{i−1}
  pᵢ = ((mρ)ⁱ/i!) p₀,  1 ≤ i ≤ m
  For i > m:
  mµpᵢ = λp_{i−1} ⇒ pᵢ = (1/m)(λ/µ)p_{i−1} = ρp_{i−1}
  pᵢ = ρ^{i−m} p_m = ρ^{i−m}((mρ)ᵐ/m!) p₀ = (mᵐρⁱ/m!) p₀,  i ≥ m + 1
  So pᵢ = ((mρ)ⁱ/i!) p₀ for 0 ≤ i ≤ m, and pᵢ = (mᵐρⁱ/m!) p₀ for i ≥ m.
  Note: for i = m, either expression can be used.
• P_Q = P{Queuing} = probability that an arrival will find all servers busy and
  will be forced to wait in queue.
• Erlang C formula: P_Q = p₀(mρ)ᵐ/(m!(1 − ρ)) = p_m/(1 − ρ)
• Since an arriving customer finds the system in a “typical” state,
  P_Q = P{N ≥ m} = Σ_{i=m}^{∞} pᵢ = Σ_{i=m}^{∞} (mᵐρⁱ/m!) p₀
      = p₀ Σ_{k=0}^{∞} mᵐρ^{k+m}/m!   (substituting k = i − m)
      = p₀(ρᵐmᵐ/m!)Σ_{k=0}^{∞} ρᵏ = p₀(mρ)ᵐ/(m!(1 − ρ))
• This is the probability of a call request finding all m circuits of a
  transmission line busy, assuming that such a call request “remains in queue,”
  that is, continuously attempts to find a free circuit.
• p₀ = 1 / (Σ_{i=0}^{m−1} (mρ)ⁱ/i! + (mρ)ᵐ/(m!(1 − ρ)))
  Proof.
  Σ_{i=0}^{∞} pᵢ = Σ_{i=0}^{m−1} pᵢ + Σ_{i=m}^{∞} pᵢ
                = Σ_{i=0}^{m−1} ((mρ)ⁱ/i!) p₀ + P_Q
                = Σ_{i=0}^{m−1} ((mρ)ⁱ/i!) p₀ + p₀(mρ)ᵐ/(m!(1 − ρ))
                = p₀ (Σ_{i=0}^{m−1} (mρ)ⁱ/i! + (mρ)ᵐ/(m!(1 − ρ)))
  Setting Σ pᵢ = 1 gives p₀ = 1 / (Σ_{i=0}^{m−1} (mρ)ⁱ/i! + (mρ)ᵐ/(m!(1 − ρ))).
• N_Q = Σ_{i=0}^{∞} i p_{i+m} = Σ_{i=0}^{∞} i (mᵐρ^{i+m}/m!) p₀
      = (ρᵐmᵐ/m!) p₀ Σ_{i=0}^{∞} i ρⁱ = (ρᵐmᵐ/m!) p₀ ρ/(1 − ρ)² = P_Q ρ/(1 − ρ)
• N = N_Q + N_S, where N_S = mρ = λ/µ; hence N = N_Q + mρ.
• Another way to find N_S:
  N_S = Σ_{i=0}^{m−1} i pᵢ + Σ_{i=m}^{∞} m pᵢ = Σ_{i=0}^{m−1} i ((mρ)ⁱ/i!) p₀ + mP_Q
      = Σ_{i=1}^{m−1} ((mρ)ⁱ/(i − 1)!) p₀ + m p₀ (mρ)ᵐ/(m!(1 − ρ))
      = p₀ ((Σ_{i=1}^{m} (mρ)ⁱ/(i − 1)! − (mρ)ᵐ/(m − 1)!) + m(mρ)ᵐ/(m!(1 − ρ)))
      = p₀ (Σ_{k=0}^{m−1} (mρ)^{k+1}/k! + (−m(1 − ρ)(mρ)ᵐ/(m!(1 − ρ)) + m(mρ)ᵐ/(m!(1 − ρ))))
      = p₀ (mρ Σ_{k=0}^{m−1} (mρ)ᵏ/k! + mρ(mρ)ᵐ/(m!(1 − ρ)))
      = mρ (Σ_{k=0}^{m−1} (mρ)ᵏ/k! + (mρ)ᵐ/(m!(1 − ρ))) / (Σ_{i=0}^{m−1} (mρ)ⁱ/i! + (mρ)ᵐ/(m!(1 − ρ)))
      = mρ
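The Erlang C quantities above can be sketched as a small helper (the function name `erlang_c` and the example rates are ours). A useful sanity check: for m = 1 the system reduces to M/M/1, where P{W > 0} = ρ and W = ρ/(µ − λ).

```python
from math import factorial

# Sketch of the Erlang C formula for M/M/m: probability an arrival must
# wait, plus the mean wait W = rho*P_Q/(lam*(1-rho)). Names are ours.
def erlang_c(lam, mu, m):
    rho = lam / (m * mu)
    assert rho < 1
    a = m * rho  # offered load in Erlangs, = lam/mu
    p0 = 1 / (sum(a**i / factorial(i) for i in range(m))
              + a**m / (factorial(m) * (1 - rho)))
    PQ = p0 * a**m / (factorial(m) * (1 - rho))
    W = rho * PQ / (lam * (1 - rho))
    return PQ, W

PQ, W = erlang_c(lam=0.5, mu=1.0, m=1)  # m = 1 reduces to M/M/1
print(PQ, W)  # PQ = rho = 0.5, W = rho/(mu - lam) = 1.0
```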

M/M/m/m ⇒ Erlang model
• ρ = λ/µ
• p₀ = 1 / Σ_{i=0}^{m} ρⁱ/i!
• pₙ = (ρⁿ/n!) p₀ = (ρⁿ/n!) / Σ_{i=0}^{m} ρⁱ/i!
• Erlang B formula: p_lost = p_m = (ρᵐ/m!) / Σ_{i=0}^{m} ρⁱ/i!
• State diagram: states 0, 1, …, m; from state i, transitions i → i+1 at rate λ
  (for i < m) and i → i−1 at rate iµ.
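The Erlang B loss probability above can be sketched directly from the formula; `erlang_b` is our name, and ρ = λ/µ is the offered load in Erlangs. A numerically stable recursion over m, commonly used in practice and equivalent to the closed form, is shown alongside as a cross-check.

```python
from math import factorial

# Sketch of the Erlang B (M/M/m/m) loss formula above. Names are ours.
def erlang_b(rho, m):
    return (rho**m / factorial(m)) / sum(rho**i / factorial(i)
                                         for i in range(m + 1))

# Equivalent recursion (avoids large factorials for big m).
def erlang_b_rec(rho, m):
    b = 1.0
    for k in range(1, m + 1):
        b = rho * b / (k + rho * b)
    return b

print(erlang_b(2.0, 3), erlang_b_rec(2.0, 3))  # both 4/19 ≈ 0.2105
```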

M/M/∞
• State diagram: states 0, 1, 2, …; from state i, transitions i → i+1 at rate λ
  and i → i−1 at rate iµ.
• ρ = λ/µ
• p₀ = 1 / Σ_{i=0}^{∞} ρⁱ/i! = 1/e^ρ = e^{−ρ}
• pᵢ = (ρⁱ/i!) p₀ = e^{−ρ} ρⁱ/i!  ⇒ Poisson distribution with mean ρ
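The Poisson occupancy above is easy to verify numerically; `mminf_pmf` is our name, and truncating the pmf at a large nmax is an illustrative choice (the Poisson tail beyond it is negligible for the ρ used).

```python
from math import exp, factorial

# Sketch: M/M/inf occupancy is Poisson with mean rho = lam/mu, as derived
# above. Check that the (truncated) pmf sums to ~1 with mean ~rho.
def mminf_pmf(rho, nmax):
    return [exp(-rho) * rho**i / factorial(i) for i in range(nmax)]

rho = 3.0
p = mminf_pmf(rho, 40)
mean = sum(i * pi for i, pi in enumerate(p))
print(sum(p), mean)  # ~1.0 and ~rho
```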
Comparison of birth–death state diagrams (birth rate on i → i+1, death rate on i → i−1):
• Engset: states 0, …, c; birth rate (k − i)λ from state i (finite source of k
  customers); death rate iµ.
• Erlang (M/M/c/c): states 0, …, c; birth rate λ; death rate iµ.
• M/M/1: states 0, 1, …; birth rate λ; death rate µ.
• M/M/m: states 0, 1, …; birth rate λ; death rate iµ for i ≤ m and mµ for i > m.
• M/M/m/m: states 0, …, m; birth rate λ; death rate iµ.
• M/M/∞: states 0, 1, …; birth rate λ; death rate iµ.

Etc
• Burke’s theorem:
  For an M/M/1, M/M/c, or M/M/∞ queuing system in steady state with arrival rate λ:
  • The departure process is Poisson with rate λ.
  • At each time t, the number of customers in the system n(t) is independent of
    the sequence of departure times prior to t.
