Week 3: Stochastic Processes

Stochastic Process, Poisson Process and Markov Chain


Introduction
• Stochastic Process
• Poisson Process
• Markov Chain
Stochastic Process
Stochastic Processes
• A stochastic process takes on random values X(t) at times t ∈ T
• A stochastic process X = {X(t), t ∈ T} is a collection of random variables
(rvs)
• one random variable X(t) for each t ∈ T
• Index set T: the set of possible values of the time index t
• If T is countable, then X is a discrete-time process; we use the notation
X = {Xn, n ∈ T}
• If T is continuous, then X = {X(t), t ∈ T} is a continuous-time process
• X(t) can take values from a discrete or a continuous state space
• State space E: the set of possible values of X(t)

If the set E is countable then X is called a discrete-space stochastic process;

if the set E is continuous then X is called a continuous-space stochastic process.

• Thus, a stochastic process is characterized by the type of its time index (discrete or continuous) and of its state space (discrete or continuous)
Examples
1. Number of requests processed by a web-server system during the interval (0, t): continuous-time, discrete-space
2. Xn = response time of the n-th inquiry to the central processing system of an interactive computer system. The stochastic process {Xn, n = 1, 2, ...} is a discrete-time, continuous-space stochastic process.
3. Number of requests served during the n-th hour of the day, {Xn, n = 1, 2, ..., 24}: discrete-time, discrete-space
4. Response time of a request to a web server given that it arrives at time t, {X(t), t > 0}: continuous-time, continuous-space stochastic process
Counting Process
A stochastic process that represents the number of events that have occurred by time t; a continuous-time, discrete-state process {N(t), t ≥ 0}
Definition: {N(t), t ≥ 0} is a counting process if

• N(0) = 0

• N(t) ≥ 0

• N(t) is non-decreasing in t

• N(t) − N(s) is the number of events in the interval [s, t], for s ≤ t


Counting Process (2)
• A counting process has independent increments if the numbers of events in disjoint
intervals are independent:

P(N1 = n1, N2 = n2) = P(N1 = n1) P(N2 = n2)

if N1 and N2 count the events in disjoint intervals.

• A counting process has stationary increments if the number of events in

[t1 + s, t2 + s] has the same distribution as the number of events in [t1, t2], for all s > 0
• The Bernoulli process has stationary and independent increments (verify; see the sketch below)
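A minimal Monte Carlo sketch of that verification (not part of the original slides; the success probability p, the window sizes, and the point checked are illustrative choices): counts in two disjoint windows should factorize, and a shifted window of the same length should have the same count statistics.

```python
# Monte Carlo check that a Bernoulli counting process has
# independent and stationary increments (illustrative parameters).
import numpy as np

rng = np.random.default_rng(0)
p, n_trials, n_runs = 0.3, 40, 100_000

X = rng.random((n_runs, n_trials)) < p          # Bernoulli(p) trials
N1 = X[:, 0:10].sum(axis=1)                     # events in trials 1..10
N2 = X[:, 10:20].sum(axis=1)                    # events in trials 11..20 (disjoint window)
N1_shift = X[:, 20:30].sum(axis=1)              # same-length window, shifted by s = 20

# Independent increments: the joint probability factorizes (checked at one point).
joint = np.mean((N1 == 3) & (N2 == 3))
product = np.mean(N1 == 3) * np.mean(N2 == 3)
print(f"P(N1=3, N2=3) = {joint:.4f}  vs  P(N1=3)P(N2=3) = {product:.4f}")

# Stationary increments: the shifted window has (approximately) the same distribution.
print("mean/var of N1      :", N1.mean(), N1.var())
print("mean/var of shifted :", N1_shift.mean(), N1_shift.var())
```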
Little o notation
Definition: f(h) is o(h) if

lim_{h→0} f(h)/h = 0

• f(h) = h^2 is o(h)

• f(h) = h is not o(h)

• f(h) = h^r, r > 1, is o(h); sin(h) is not

• if f, g are o(h), then f(h) + g(h) is o(h)
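A quick numeric illustration of the definition (the sample points h are arbitrary; this only suggests the limits, it does not prove them):

```python
# Evaluate f(h)/h at shrinking h: it tends to 0 for o(h) functions.
import math

for f, name in [(lambda h: h**2, "h^2"),
                (lambda h: h**1.5, "h^1.5"),
                (math.sin, "sin(h)")]:
    ratios = [f(h) / h for h in (1e-1, 1e-3, 1e-6)]
    print(f"{name:7s} f(h)/h at h = 1e-1, 1e-3, 1e-6:", [f"{r:.3g}" for r in ratios])
# h^2 and h^1.5 shrink toward 0 (they are o(h)); sin(h)/h -> 1, so sin(h) is not o(h).
```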


Example: Exponential random variable

An exponential random variable X with parameter λ has distribution P(X ≤ h) = 1 - e^{-λh}, h > 0.

By the memoryless property of the exponential distribution,

P[X ≤ t + h | X > t] = P[X ≤ h]
                     = 1 - e^{-λh}
                     = 1 - (1 - λh + Σ_{n≥2} (-λh)^n / n!)
                     = λh + o(h)
Poisson Process
Poisson Process
A stochastic process is a Poisson process if:

• It is a counting process {N(t), t ≥ 0} with rate λ > 0

• Probability(one event in an interval of duration h) = λh + o(h)

• Probability(more than one event in an interval of duration h) = o(h)

⇒ P(N(h) = 0) = 1 - λh + o(h)

• The numbers of events occurring in non-overlapping intervals of time are independent of each other.

• It has independent and stationary increments


One of the original applications of the Poisson process in communications was to
model the arrivals of calls to a telephone exchange (the work of A. K. Erlang in 1919).
The aggregate use of telephones, at least in a first analysis, can be modeled as a
Poisson process.
• Let Pn(t) be the probability that exactly n events occur in an interval of length t, namely, Pn(t) = P(N(t) = n). We have, for each n ∈ ℕ, t ≥ 0,

Pn(t) = ((λt)^n / n!) e^{-λt}

• Proof sketch (see the simulation check below):
• Derive a differential equation for P0(t) by expanding P0(t + h)
• Derive a differential equation for Pn(t) by expanding Pn(t + h)
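As a sanity check rather than a proof, the sketch below simulates the process from i.i.d. exponential inter-arrival times and compares the empirical distribution of N(t) with the formula above (λ, t, and the number of runs are illustrative choices):

```python
# Simulate a Poisson process via Exponential(lam) inter-arrival times and
# compare the empirical law of N(t) with P_n(t) = (lam*t)^n e^{-lam*t} / n!.
import math
import numpy as np

rng = np.random.default_rng(1)
lam, t, runs = 3.0, 2.0, 200_000

# N(t) = number of arrivals in (0, t]; 40 inter-arrival times per run is plenty here.
gaps = rng.exponential(1.0 / lam, size=(runs, 40))
arrival_times = gaps.cumsum(axis=1)
N_t = (arrival_times <= t).sum(axis=1)

for n in range(0, 11):
    empirical = np.mean(N_t == n)
    theory = (lam * t) ** n * math.exp(-lam * t) / math.factorial(n)
    print(f"n={n:2d}  empirical={empirical:.4f}  formula={theory:.4f}")
```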
Properties of Poisson Process
• The time between successive events is exponentially distributed (with rate λ)

• if N(t) is a Poisson process and exactly one event occurs in [0, t], then the time of that event is uniformly distributed in [0, t]

• if N1(t) and N2(t) are independent Poisson processes with rates λ1 and λ2, then N(t) = N1(t) + N2(t) is a Poisson process with rate λ1 + λ2

• N(t) is Poisson with rate λ, and {Mn} is a Bernoulli counting process with success probability p. Construct a new process L(t) by counting the n-th event of N(t) only when Mn > Mn-1 (i.e., a success at trial n). Then L(t) is Poisson with rate λp

• Exhibits the memoryless property (show)


• For each t > 0, show that the mean number of events in an interval of length t is
E[N(t)] = λt

Solution hint: use the fact that

Σ_{n≥0} (λt)^n / n! = e^{λt}
Markov Models
Markov Process
A Markov process is a stochastic process {X(t), t ∈ T}, X(t) ∈ ℝ, such that

P(X(t) ≤ x | X(t1) = x1, . . . , X(tn) = xn) = P(X(t) ≤ x | X(tn) = xn),

for all x1, . . . , xn, x ∈ ℝ and t1, . . . , tn, t ∈ T with t1 < t2 < . . . < tn < t.

That is, the probabilistic future of the process depends only on the current state and not upon the history of the process.
Types:
• Markov Process with continuous state-space
• Markov process with discrete state-space: Markov Chain
Discrete-time Markov Chain (MC)

• P[Xn+1 = j | Xn = i, Xn-1 = in-1, Xn-2 = in-2, . . . , X0 = i0] = P[Xn+1 = j | Xn = i]

the transition probability from state i to state j, for all i0, . . . , in-1, i, j ∈ I (the state space)
• Characterized by states and the transitions between states
• Homogeneous Markov chain: the transition probabilities are independent of n (time). If so, we can write

pij = P(Xn+1 = j | Xn = i), for all i, j ∈ I

• Irreducible Markov chain: every state can be reached from every other state
Periodic Markov Chain
• Periodic Markov chain: a MC in which all states are periodic

A state i has period k if any return to state i must occur in multiples of k time steps. For example, if it is only possible to return to state i in an even number of steps, then i is periodic with period 2. Formally, the period of a state is defined as

k = gcd{n ≥ 1 : P(Xn = i | X0 = i) > 0}

(where "gcd" is the greatest common divisor)

• If k = 1, then the state is said to be aperiodic; otherwise (k > 1), the state is said to be periodic with period k.

• A Markov chain is aperiodic if all its states are aperiodic (a sketch for computing periods follows below)
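A small sketch (an assumed helper, not from the slides) that estimates the period of each state as gcd{n : (P^n)[i, i] > 0}, truncating n at a finite horizon, which is adequate for small examples:

```python
# Estimate the period of every state of a finite Markov chain from powers of P.
import math
import numpy as np
from functools import reduce

def periods(P, max_n=50):
    """Period of state i: gcd of all n in 1..max_n with (P^n)[i, i] > 0."""
    k = P.shape[0]
    return_times = [[] for _ in range(k)]
    Pn = np.eye(k)
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        for i in range(k):
            if Pn[i, i] > 1e-12:
                return_times[i].append(n)
    return [reduce(math.gcd, times) if times else 0 for times in return_times]

# A two-state chain that alternates deterministically: both states have period 2.
P_cycle = np.array([[0.0, 1.0],
                    [1.0, 0.0]])
print(periods(P_cycle))   # -> [2, 2]
```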


Transition Matrix
• Define P = [pij] to be the transition matrix of a M.C., namely

P = | p00  p01  ...  p0j  ... |
    | p10  p11  ...  p1j  ... |
    | ...  ...  ...  ...  ... |
    | pi0  pi1  ...  pij  ... |
    | ...  ...  ...  ...  ... |

where

pij ≥ 0 for all i, j ∈ I,   and   Σ_{j∈I} pij = 1 for all i ∈ I
• The n-step transition probability is defined by

pij^(n) = P(Xn = j | X0 = i), for all i, j ∈ I, n ≥ 0

the probability of going from state i to state j in n steps

• Chapman-Kolmogorov equation: for all n ≥ 0, m ≥ 0, i, j ∈ I, we have

pij^(n+m) = Σ_{k∈I} pik^(n) pkj^(m)

or, in matrix notation, with P^(n) = [pij^(n)],

P^(n+m) = P^(n) P^(m)

Therefore,
P^(n) = P^n, where P^n is the n-th power of the matrix P
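A numeric check of the Chapman-Kolmogorov equation, using the 3-state example matrix that appears later in these slides (the values of n and m are arbitrary):

```python
# Verify P^(n+m) = P^(n) P^(m) for a concrete transition matrix.
import numpy as np

P = np.array([[0.0,  0.75, 0.25],
              [0.25, 0.0,  0.75],
              [0.25, 0.25, 0.5 ]])

n, m = 3, 4
lhs = np.linalg.matrix_power(P, n + m)                              # P^(n+m)
rhs = np.linalg.matrix_power(P, n) @ np.linalg.matrix_power(P, m)   # P^(n) P^(m)
print(np.allclose(lhs, rhs))   # True
print(lhs[0, 2])               # p_{02}^{(7)}: probability of going from state 0 to 2 in 7 steps
```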
• pij is a conditional probability

• πn(i) is the unconditional probability of the system being in state i at time n,

πn(i) = P(Xn = i)

• Assuming that π0(i) = P(X0 = i) is known, by the law of total probability we get

P(Xn = j) = Σ_{i∈I} P(Xn = j | X0 = i) π0(i) = Σ_{i∈I} pij^(n) π0(i)

• Then, for all n ≥ 1,

πn = π0 P^n,

where πm = (πm(0), πm(1), . . .) for all m ≥ 0. From the above relationship, we can deduce that

πn+1 = πn P, for all n ≥ 0 (see the sketch below)


• Limiting solution (steady-state probability)

Let π^(n) = (π0^(n), π1^(n), ..., πk^(n)).

Given π^(0), we can perform

π^(1) = π^(0) P
π^(2) = π^(1) P = π^(0) P^2
... ...
π^(n) = π^(0) P^n
... ...
π = π P

The limiting solution does not depend on the initial distribution


How to compute the steady-state solution π:

Solve the linear equations defined by π = πP

plus the normalization condition

Σ_{i∈I} π(i) = 1

0 3/4 1/4 
P 
• Example:
1/4 0 3/4 

1/4 1/4 1/2 

Draw the M.C.


Compute  = (0 , 1 , 2)

Answer: 0 = 0.2 , 1 = 0.28, 2 = 0.52


• State j is reachable from state i if pij^(n) > 0 for some n ≥ 1
• If j is reachable from i and i is reachable from j, then i and j communicate (i ↔ j)

A M.C. is irreducible if i ↔ j for all i, j ∈ I

• For every state i ∈ I define the integer d(i) as the greatest common divisor of all integers n such that pii^(n) > 0. If d(i) = 1 then the state i is aperiodic

A Markov chain is aperiodic if all its states are aperiodic


• That is: if a Markov chain with transition matrix P is irreducible and aperiodic, and if the system of equations

π = πP
π·1 = 1   (i.e., Σ_i π(i) = 1)

has a strictly positive solution (i.e., for all i, π(i), the i-th element of the row vector π, is strictly positive), then

π(i) = lim_{n→∞} πn(i)

for all i, independently of the initial distribution (illustrated in the sketch below)

• π is the invariant (stationary) measure of the M.C.
………….END……….
Any Questions?
