Chapter 2
November 2, 2023
Continuous-Time Markov Chains
DTMCs are totally synchronized, in that the state only changes
at discrete time steps, whereas in CTMCs the state can
change at any time. This makes CTMCs more realistic for
modeling computer systems, where events can occur at any
time. In preparation for CTMCs, we need to discuss the
exponential distribution and the Poisson arrival process.
Let us consider the residence time $T_i$ in state $i$ in the context
of a continuous-time process $\{X_t\}_{t \geq 0}$.
By hypothesis, we have the following property: the residence
time $T_i$ is a continuous random variable with strictly positive
values, thus prohibiting instantaneous visits. In fact, it follows
an exponential law with parameter $0 \leq \lambda_i < \infty$, with $\lambda_i = 0$
when $i$ is an absorbing state.
Exponential Distribution
Definition
We say that a random variable X is distributed exponentially
with rate $\lambda$, written $X \sim \mathcal{E}(\lambda)$, if X has the probability density
function
\[
f(x) = \begin{cases} \lambda e^{-\lambda x}, & x \geq 0 \\ 0, & x < 0 \end{cases}
\]
and the cumulative distribution function
\[
F(x) = \left(1 - e^{-\lambda x}\right) \mathbf{1}_{\{x \geq 0\}}, \quad \text{so } \bar{F}(x) = 1 - F(x) = e^{-\lambda x}, \; x \geq 0.
\]
In addition,
\[
E[X] = \frac{1}{\lambda}, \qquad E\left[X^2\right] = \frac{2}{\lambda^2}, \qquad \mathrm{Var}(X) = \frac{1}{\lambda^2},
\]
\[
E\left[e^{tX}\right] = \frac{\lambda}{\lambda - t} \quad \text{for } t < \lambda.
\]
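As a quick numerical sanity check, the mean and the moment generating function above can be estimated by simulation; the rate $\lambda = 2$ and the point $t = 0.5$ below are arbitrary illustrative choices.

```python
import math
import random

random.seed(4)
lam, t = 2.0, 0.5                 # any t < lam keeps E[e^{tX}] finite
n = 200_000
samples = [random.expovariate(lam) for _ in range(n)]

# Sample estimates of E[X] and E[e^{tX}].
mean_est = sum(samples) / n                        # should be near 1/lam
mgf_est = sum(math.exp(t * x) for x in samples) / n
mgf_theory = lam / (lam - t)                       # lambda / (lambda - t)
```
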
Remark
The squared coefficient of variation of a random variable X is
defined as
\[
CV_X^2 = \frac{\mathrm{Var}(X)}{E[X]^2}.
\]
This can be thought of as the scaled or normalized variance.
When $X \sim \mathcal{E}(\lambda)$, $CV_X^2 = 1$. A random variable X is said to
be memoryless if
\[
P(X > s + t \mid X > s) = P(X > t), \quad \forall s, t \geq 0.
\]
The exponential is the only continuous distribution with this
property.
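Both facts lend themselves to a short simulation check; this is a sketch in Python, and the rate and the thresholds s and t are arbitrary choices.

```python
import random

random.seed(1)
lam = 2.0
n = 200_000
samples = [random.expovariate(lam) for _ in range(n)]

# Squared coefficient of variation Var(X) / E[X]^2, about 1 here.
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n
cv2 = var / mean ** 2

# Memoryless property: P(X > s + t | X > s) should match P(X > t).
s, t = 0.3, 0.5
p_cond = sum(1 for x in samples if x > s + t) / sum(1 for x in samples if x > s)
p_uncond = sum(1 for x in samples if x > t) / n
```
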
Proposition
Let $X_1, X_2, \ldots, X_n$ be independent random variables of exponential law with
respective parameters $\lambda_1, \lambda_2, \ldots, \lambda_n$. We define
$X = \min(X_1, X_2, \ldots, X_n)$.
The variables $X_1, X_2, \ldots, X_n$ represent the times before different
events occur, so that X is the time until the first of these
events occurs. We have the following properties:
Theorem
Let $X_1, X_2, \ldots, X_n$ be independent with $X_i \sim \mathcal{E}(\lambda_i)$, $1 \leq i \leq n$, and let
$X = \min(X_1, X_2, \ldots, X_n)$. Then
a)
\[
X \sim \mathcal{E}(\lambda_1 + \lambda_2 + \cdots + \lambda_n);
\]
b)
\[
P(X = X_i) = \frac{\lambda_i}{\lambda_1 + \lambda_2 + \cdots + \lambda_n}, \quad \forall i = 1, 2, \ldots, n;
\]
c) the event $\{X = X_i\}$ is independent of the random variable X.
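Parts (a) and (b) can be illustrated by a short simulation; this is a sketch in Python, and the three rates are arbitrary choices.

```python
import random

random.seed(0)
rates = [1.0, 2.0, 3.0]              # lambda_1, lambda_2, lambda_3
total = sum(rates)
n = 200_000

mins = []
argmin_counts = [0] * len(rates)
for _ in range(n):
    xs = [random.expovariate(r) for r in rates]
    m = min(xs)
    mins.append(m)
    argmin_counts[xs.index(m)] += 1

# (a): the minimum is exponential with rate total, so its mean is 1/total.
mean_min = sum(mins) / n
# (b): P(X = X_i) should be lambda_i / total.
probs = [c / n for c in argmin_counts]
```
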
Poisson Process
A Poisson process is a continuous-time Markov chain
$\{X_t, t \geq 0\}$ which counts the number of arrivals of an event
that occur randomly over time at a constant rate, for example
customers or accidents. The intensity of a Poisson process is
the instantaneous rate $\lambda > 0$ at which arrivals occur.
A flow of random events can be described mathematically in
two different ways:
1. By the number of events $X_t$ occurring in $[0, t]$, for which we
seek the probability law of this discrete random
variable. The process $\{X_t, t \geq 0\}$ is called the
"counting process".
2. By the time intervals that separate the instants of
occurrence of two consecutive events, called
"inter-arrival times". These are independent and
identically distributed random variables with law $\mathcal{E}(\lambda)$. This process is
called the "birth process".
Definition
A Poisson process with rate $\lambda$ is a sequence of events such
that
\[
P(X(t) = k) = p_k(t) = \frac{(\lambda t)^k}{k!} e^{-\lambda t}, \quad \lambda > 0, \; k \geq 0;
\]
\[
E[X(t)] = \lambda t \quad \text{and} \quad \mathrm{Var}[X(t)] = \lambda t.
\]
Definition
These relations define the transient state of the Poisson
process. No stationary state exists, since
\[
p_k = \lim_{t \to \infty} p_k(t) = 0, \quad \forall k \geq 0.
\]
Definition
The inter-arrival time $T_i$ ($i \geq 1$) that separates any event
from the next is a random variable distributed according
to an $\mathcal{E}(\lambda)$ law.
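The two descriptions can be checked against each other numerically: building each path from i.i.d. $\mathcal{E}(\lambda)$ inter-arrival times, the count over $[0, t]$ should follow the Poisson law above. This is a sketch in Python; the rate, the horizon, and the reference value k = 5 are arbitrary choices.

```python
import math
import random

random.seed(2)
lam, t_end, n_runs = 3.0, 2.0, 100_000

# Generate paths from exponential inter-arrival times and count
# the arrivals that fall in [0, t_end].
counts = []
for _ in range(n_runs):
    t, k = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t > t_end:
            break
        k += 1
    counts.append(k)

mean_count = sum(counts) / n_runs        # should be close to lam * t_end
p5_empirical = counts.count(5) / n_runs
p5_theory = (lam * t_end) ** 5 / math.factorial(5) * math.exp(-lam * t_end)
```
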
Particular Markov Processes
Birth and Death Processes
The processes in question can generally be used to describe the
temporal evolution of the size of a population of a given type.
They are widely used to model waiting phenomena or systems
subject to repairable failures. They are obtained by
superimposing a birth process and a death process. This
Markov process $X_t$ represents the size of a population at time
t. These are stochastic processes with continuous time and a
discrete state space ($S = \{0, 1, 2, \ldots\}$). They are
characterized by two important conditions:
- they are memoryless;
- transitions are only possible to one or other of the
neighboring states: from state n the possible
transitions are to $n - 1$ and $n + 1$, with $n \geq 1$.
Birth and Death Process
Let $X_t = N(t)$ be the size of a population at time t (the number
of individuals present). Define
\[
p_n(t) = P(N(t) = n) \quad \text{(during } [0, t]\text{)};
\]
and let $p_{ij}(t)$ be the probability that at time t the number
of individuals is j, given that there were i individuals in
the population. We have that
$p_{i,j}(t) = P(N(t+s) = j \mid N(s) = i)$ does not depend on s
(the process is homogeneous), so
\[
p_{i,i+1}(\Delta t) = \lambda_i \Delta t + o(\Delta t),
\]
\[
p_{i,i-1}(\Delta t) = \mu_i \Delta t + o(\Delta t),
\]
\[
p_{i,i}(\Delta t) = 1 - (\lambda_i + \mu_i)\Delta t + o(\Delta t),
\]
\[
p_{i,j}(\Delta t) = o(\Delta t) \text{ if } |i - j| \geq 2, \qquad p_{i,j}(0) = \delta_{ij} = \begin{cases} 1, & i = j \\ 0, & i \neq j \end{cases}
\]
with $\lambda_i > 0$, $\mu_i > 0$ and $\mu_0 = 0$.
Transient State
\[
P(t + \Delta t) = P(t)\, M.
\]
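The relation $P(t + \Delta t) = P(t)\,M$ can be iterated numerically. The sketch below (Python) truncates the state space to $\{0, \ldots, N\}$ and uses constant rates $\lambda_i = \lambda$ and $\mu_i = \mu$; these choices, together with the step size, are illustrative assumptions rather than part of the model above.

```python
# Truncated birth-death chain: states 0..N, constant birth rate lam
# and death rate mu (with mu_0 = 0); all parameters are illustrative.
N = 30
dt = 0.001
lam, mu = 2.0, 3.0

# Start with the whole probability mass in state 0 and iterate
# P(t + dt) = P(t) M, exploiting the tridiagonal structure of M.
P = [1.0] + [0.0] * N
for _ in range(50_000):                      # evolve up to t = 50
    Q = [0.0] * (N + 1)
    for i in range(N + 1):
        up = lam * dt if i < N else 0.0      # p_{i,i+1} ~ lam * dt
        down = mu * dt if i > 0 else 0.0     # p_{i,i-1} ~ mu * dt (mu_0 = 0)
        Q[i] += P[i] * (1.0 - up - down)     # p_{i,i}
        if i < N:
            Q[i + 1] += P[i] * up
        if i > 0:
            Q[i - 1] += P[i] * down
    P = Q
```

With $\lambda < \mu$ the distribution settles toward a stationary regime; for these constant rates the chain is the M/M/1 queue, whose stationary probability of an empty system is $1 - \lambda/\mu$.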
\[
n = \lambda W_s, \qquad n_q = \lambda W,
\]
and
\[
n = n_q + \frac{\lambda}{\mu}, \qquad W_s = W + \frac{1}{\mu},
\]
where $\lambda$ is the rate at which customers enter the system and $1/\mu$ is
the mean service time ($\mu > 0$). Another important measure
for a queueing system, the one that measures the degree of
saturation of the system, is the traffic intensity $\rho$. It is defined
by
\[
\rho = \frac{\text{mean service time}}{\text{mean time between two successive arrivals}}.
\]
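The relations $n = \lambda W_s$ and $W_s = W + 1/\mu$ can be observed in a simulation of a FIFO single-server Markovian queue via Lindley's recursion. This is a sketch in Python; the rates, and the reference value $W_s = 1/(\mu - \lambda)$ for the M/M/1 queue, are standard assumptions not derived above.

```python
import random

random.seed(3)
lam, mu = 2.0, 3.0        # arrival rate < service rate, so rho = 2/3 < 1
n_cust = 400_000

# Lindley's recursion for a FIFO single-server queue:
# wait_{k+1} = max(0, wait_k + service_k - interarrival_{k+1}).
wait = 0.0
total_wait = total_sojourn = 0.0
for _ in range(n_cust):
    service = random.expovariate(mu)
    total_wait += wait
    total_sojourn += wait + service
    wait = max(0.0, wait + service - random.expovariate(lam))

W = total_wait / n_cust         # mean waiting time in queue
Ws = total_sojourn / n_cust     # mean time in system
n_mean = lam * Ws               # mean number in system via n = lam * Ws
```

Here `Ws - W` estimates the mean service time $1/\mu$, and for the M/M/1 queue `Ws` should approach $1/(\mu - \lambda) = 1$.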
Queueing Model M/M/1
Model Description
\[
\{N(t),\ t \geq 0\}, \tag{3}
\]
where
\[
p_0 = \left[\, \sum_{n=0}^{s-1} \frac{1}{n!} \left(\frac{\lambda}{\mu}\right)^n + \frac{\left(\frac{\lambda}{\mu}\right)^s}{s!\left(1 - \frac{\lambda}{s\mu}\right)} \right]^{-1}.
\]
Queueing Model M/M/s
Stationary state
We get
\[
p_n = \frac{1}{n!}\left(\frac{\lambda}{\mu}\right)^n p_0, \quad 1 \leq n < s,
\]
\[
p_n = \rho^{n-s}\, \frac{1}{s!}\left(\frac{\lambda}{\mu}\right)^s p_0, \quad n \geq s,
\]
where $\rho = \lambda/(s\mu)$. We can write
\[
p_n = \rho^{n-s}\, p_s, \quad n \geq s.
\]
The probability that an arriving customer will have to wait for
service, i.e., that the number of customers in the system is $n \geq s$,
called the Erlang formula, is given by
\[
P(\text{wait}) = \sum_{n \geq s} p_n = \frac{p_s}{1 - \rho} = \frac{\left(\frac{\lambda}{\mu}\right)^s}{s!\left(1 - \frac{\lambda}{s\mu}\right)}\, p_0 = \frac{(s\rho)^s}{s!\,(1 - \rho)}\, p_0.
\]
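The stationary formulas above translate directly into code. This is a sketch in Python; the helper names and the test parameters are illustrative.

```python
from math import factorial

def mm_s_p0(lam, mu, s):
    """Normalization constant p0 of the M/M/s queue (requires lam < s * mu)."""
    a = lam / mu
    head = sum(a ** n / factorial(n) for n in range(s))
    tail = a ** s / (factorial(s) * (1 - lam / (s * mu)))
    return 1.0 / (head + tail)

def mm_s_pn(lam, mu, s, n):
    """Stationary probability p_n of the M/M/s queue."""
    a, rho = lam / mu, lam / (s * mu)
    p0 = mm_s_p0(lam, mu, s)
    if n < s:
        return a ** n / factorial(n) * p0
    return rho ** (n - s) * a ** s / factorial(s) * p0

def erlang_c(lam, mu, s):
    """Probability that an arriving customer must wait (Erlang formula)."""
    rho = lam / (s * mu)
    return (s * rho) ** s / (factorial(s) * (1 - rho)) * mm_s_p0(lam, mu, s)
```

For s = 1 these reduce to the M/M/1 values $p_0 = 1 - \rho$ and $P(\text{wait}) = \rho$, which gives a convenient sanity check, and the probabilities $p_n$ should sum to 1.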
Performance Measures