
INTRODUCTION TO STOCHASTIC PROCESSES

Review of concepts of a sum of random variables

(1) Probability generating functions (p.g.f.)

Let X_1, X_2, ..., X_N be independent and identically distributed (i.i.d.) random variables, each with p.g.f. G(s).

Let S_N = X_1 + X_2 + ... + X_N be the sum of the N random variables.

(a) If N is a fixed number, then the p.g.f. of S_N is [G(s)]^N.

(b) If N is a random number, independent of the X_i, with p.g.f. P(s), then the p.g.f. of S_N is P(G(s)).
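The result in (b) can be checked numerically. Below is a minimal Python sketch (not from the slides), assuming for illustration that each X_i ~ Poisson(μ) and N ~ Poisson(ν); the Monte Carlo estimate of E[s^(S_N)] should be close to P(G(s)).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (assumptions): X_i ~ Poisson(mu), N ~ Poisson(nu)
mu, nu, s = 1.5, 3.0, 0.7

def G(s):   # p.g.f. of each X_i: G(s) = exp(mu*(s - 1))
    return np.exp(mu * (s - 1.0))

def P(s):   # p.g.f. of N: P(s) = exp(nu*(s - 1))
    return np.exp(nu * (s - 1.0))

# Monte Carlo estimate of E[s^(S_N)] where S_N = X_1 + ... + X_N
n_sim = 100_000
N = rng.poisson(nu, size=n_sim)
S = np.array([rng.poisson(mu, size=n).sum() for n in N])

print("simulated E[s^S_N]:", np.mean(s ** S))
print("P(G(s))           :", P(G(s)))
```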
(2) Mean and variance

Let X_1, X_2, ..., X_N be i.i.d. random variables and let S_N be the sum of the N random variables.

(a) If N is a fixed number,
E[S_N] = N E[X] and Var[S_N] = N Var[X].

(b) If N is a random number, independent of the X_i,
E[S_N] = E[N] E[X] and Var[S_N] = E[N] Var[X] + Var[N] (E[X])².
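As a quick check of the random-N formulas, here is a small simulation sketch with assumed illustrative choices X_i ~ Exp(rate 2) and N ~ Poisson(4); the sample mean and variance of S_N should be close to E[N]E[X] and E[N]Var[X] + Var[N](E[X])².

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative choices (assumptions): X_i ~ Exponential(rate), N ~ Poisson(nu)
rate, nu, n_sim = 2.0, 4.0, 100_000
EX, VarX = 1.0 / rate, 1.0 / rate**2   # mean and variance of each X_i
EN, VarN = nu, nu                      # mean and variance of N (Poisson)

N = rng.poisson(nu, size=n_sim)
S = np.array([rng.exponential(1.0 / rate, size=n).sum() for n in N])

print("simulated mean:", S.mean(), "   formula:", EN * EX)
print("simulated var :", S.var(),  "   formula:", EN * VarX + VarN * EX**2)
```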
Exponential distribution

• It is a continuous distribution, often used to model the amount of time until some specific event occurs.

• Values of an exponentially distributed random variable occur in such a way that there are fewer large values and more small values.

• The distribution is defined by a single parameter referred to as the rate parameter.

A continuous random variable X is said to have the exponential distribution with rate parameter λ if its probability density function is given by

  e  x x  0
f ( x) 
0, elsewhere
The mean and variance of a random variable having exponential distribution
with rate parameter  are given by

1 1
E[ X ]  and Var[ X ] 
 2
Alternatively, the probability density function can be defined in terms of the mean parameter θ = 1/λ as

f(x) = (1/θ) e^(−x/θ),  x ≥ 0;  f(x) = 0, elsewhere

The mean and variance are then given by

E[X] = θ  and  Var[X] = θ²

The cumulative distribution function is given by

F(x) = P(X ≤ x) = 1 − e^(−λx),  x ≥ 0

and hence

P(X > x) = e^(−λx),  x ≥ 0
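These formulas can be verified by simulation. The following minimal sketch uses an assumed illustrative rate λ = 1.5; note that NumPy's exponential sampler takes the scale 1/λ rather than the rate.

```python
import numpy as np

rng = np.random.default_rng(2)

lam = 1.5                                      # rate parameter (illustrative value, an assumption)
x = rng.exponential(1.0 / lam, size=500_000)   # NumPy uses the scale 1/lambda, not the rate

print("sample mean    :", x.mean(), "   1/lambda  :", 1 / lam)
print("sample variance:", x.var(),  "   1/lambda^2:", 1 / lam**2)

t = 2.0
print("P(X <= t) empirical:", (x <= t).mean(), "   1 - exp(-lambda t):", 1 - np.exp(-lam * t))
print("P(X >  t) empirical:", (x > t).mean(),  "   exp(-lambda t)    :", np.exp(-lam * t))
```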
Properties of exponential distribution

1. It has the memoryless property:
P(X > s + t | X > s) = P(X > t) for all s, t ≥ 0.

2. The sum of a fixed number of i.i.d. exponential random variables is a random variable having the gamma distribution:
if X_1, X_2, ..., X_n are i.i.d. Exp(λ) and S = X_1 + X_2 + ... + X_n, then S ~ Gamma(n, λ).
The probability density function of S is defined by
f_S(s) = λⁿ s^(n−1) e^(−λs) / (n − 1)!,  s ≥ 0
The mean and variance of S are given by
E[S] = n/λ  and  Var[S] = n/λ²

3. The smallest of independent exponential random variables is exponentially distributed with rate parameter equal to the sum of the rate parameters of all the random variables considered:
if X_1, ..., X_n are independent with X_i ~ Exp(λ_i), then min(X_1, ..., X_n) ~ Exp(λ_1 + ... + λ_n).
For any two independent exponentially distributed random variables X_1 ~ Exp(λ_1) and X_2 ~ Exp(λ_2),
min(X_1, X_2) ~ Exp(λ_1 + λ_2)  and  P(X_1 < X_2) = λ_1 / (λ_1 + λ_2).
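The three properties above can be checked numerically. Below is a minimal Python sketch with assumed illustrative parameters (λ = 1.2 for the memoryless and gamma checks, n = 5 terms in the sum, and rates λ₁ = 0.7, λ₂ = 1.8 for the minimum).

```python
import numpy as np

rng = np.random.default_rng(3)
n_sim = 200_000

# (1) Memoryless property: P(X > s + t | X > s) = P(X > t) = exp(-lambda*t)
lam, s, t = 1.2, 1.0, 0.5
x = rng.exponential(1.0 / lam, size=n_sim)
print("P(X > s+t | X > s):", (x > s + t).sum() / (x > s).sum(),
      "   exp(-lambda*t):", np.exp(-lam * t))

# (2) Sum of n i.i.d. Exp(lambda) variables ~ Gamma(n, lambda)
n = 5
S = rng.exponential(1.0 / lam, size=(n_sim, n)).sum(axis=1)
print("E[S]  :", S.mean(), "   n/lambda  :", n / lam)
print("Var[S]:", S.var(),  "   n/lambda^2:", n / lam**2)

# (3) Minimum of independent exponentials, and P(X1 < X2)
lam1, lam2 = 0.7, 1.8
x1 = rng.exponential(1.0 / lam1, size=n_sim)
x2 = rng.exponential(1.0 / lam2, size=n_sim)
print("E[min(X1, X2)]:", np.minimum(x1, x2).mean(),
      "   1/(lambda1+lambda2):", 1 / (lam1 + lam2))
print("P(X1 < X2)    :", (x1 < x2).mean(),
      "   lambda1/(lambda1+lambda2):", lam1 / (lam1 + lam2))
```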
Basics of stochastic processes

• A stochastic or random process is a sequence of events occurring over time, where each step follows from the last after a random choice.

• A stochastic process is a mathematical model used to describe the time (or space) dependence of a random phenomenon, usually referred to as a system.

• The complete evolution of the system is modelled by assigning a random variable to each point in time.

• The collection of all these random variables is called a stochastic process.


Definition of stochastic process

A stochastic process is a family of time-indexed random variables X_t or X(t), where t belongs to an index set T.

The process is usually written as

{X(t), t ∈ T} or {X_t, t ∈ T}

• The possible values X(t) can take on are referred to as states or realizations.

• At every time t a random variable X(t) is observed, so that a sequence of random variables

X(0), X(1), X(2), X(3), ...

is observed over time, leading to a collection of random variables that are functions of time and hence governed by probabilistic laws; a small simulation sketch follows.
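The sketch below (purely illustrative, not from the notes) simulates a simple random walk, a discrete state-discrete time process {X_t, t = 0, 1, 2, ...}, and prints the observed sequence X(0), X(1), X(2), ...

```python
import numpy as np

rng = np.random.default_rng(4)

# Simple random walk (the walk and its +/-1 steps are illustrative assumptions)
T = 20                                        # number of time steps observed
steps = rng.choice([-1, 1], size=T)           # random +1/-1 increments
X = np.concatenate(([0], np.cumsum(steps)))   # the observed sequence X(0), X(1), ..., X(T)

for t, x in enumerate(X):
    print(f"t = {t:2d}   X(t) = {x:3d}")
```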


Stochastic processes are described by three main features:

(a) Parameter space: also referred to as the index set, this is the set of all possible values of time t, e.g. T = [0, ∞) when time is continuous, or T = {0, 1, 2, …} when time is discrete.

(b) State space: the set of all possible states or realizations; these are either continuous or discrete.
Stochastic processes are classified as:

(i) Discrete space-discrete time processes

(ii) Discrete space-continuous time processes

(iii) Continuous space-discrete time processes

(iv) Continuous space-continuous time processes


Examples of stochastic processes:

(i) Let {X(t): t ∈ T} be a stochastic process that models the state of health of policyholders of a life insurance company. Suppose the company classifies its policyholders as either healthy, sick or dead; then the state space, usually denoted by S, is S = {Healthy, Sick, Dead}. If the health states of policyholders are observed continuously, then the index set is T = [0, ∞) and the process is a discrete state-continuous time process. However, if the health states of policyholders are observed daily, the index set is T = {0, 1, 2, ...} and the process is a discrete state-discrete time process.
(ii) Let {X_t : t ∈ T} be a stochastic process that models the stock price of a company. Then the state space is S = ℝ⁺ if we assume that the stock price can take any positive real number. The index set T can be either discrete or continuous, so this process can be either a continuous state-continuous time process or a continuous state-discrete time process.
(iii) Consider a model {X_t : t ∈ T} for the daily maximum temperatures observed in Nairobi; X_t is the temperature reading observed on day t. The index set T = {0, 1, 2, ...} is discrete and the state space is continuous. Therefore, the model is a continuous state-discrete time process.
(c) Dependence relationship:
The random variables are interdependent; this dependence is described using the joint distribution of the variables or via conditional probabilities.

To deal with the dependence of the random variables, we usually need to make some simplifying assumptions about the nature of the dependence.
The most common assumptions made are:

(i) Markov property.

(ii) Stationarity.

(iii) Independent and stationary increments.


Markov property

A random process that evolves in such a way that the future depends only on the present state and not on the past history is said to have the Markov property; i.e. the process is memoryless:

P[X(t_{n+1}) = x | X(t_n) = x_n, X(t_{n−1}) = x_{n−1}, ..., X(t_0) = x_0] = P[X(t_{n+1}) = x | X(t_n) = x_n].
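A minimal sketch of a process with the Markov property: a two-state Markov chain with an assumed (illustrative) transition matrix P, where each new state is drawn using only the current state.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two-state Markov chain on S = {0, 1} with an assumed transition matrix.
# The next state is drawn using only the current state -- the Markov property.
P = np.array([[0.9, 0.1],      # P[i, j] = Pr(X_{t+1} = j | X_t = i)
              [0.4, 0.6]])

T, x = 10, 0                   # run for T steps starting from state 0
path = [x]
for _ in range(T):
    x = rng.choice(2, p=P[x])  # depends on the present state only, not on the history
    path.append(x)

print("sample path:", path)
```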
Stationary and independent increments

Process with independent increments:

A continuous time process is said to have independent increments if, for parameters t_j, j = 0, 1, ..., n, such that t_0 < t_1 < t_2 < ... < t_n, the random variables

X(t_1) − X(t_0), X(t_2) − X(t_1), ..., X(t_n) − X(t_{n−1})

are independent of each other.
[Figure: a sample path of the process plotted against time t_0 < t_1 < t_2 < t_3 < t_4, illustrating the successive increments X(t_1)−X(t_0), X(t_2)−X(t_1), X(t_3)−X(t_2) and X(t_4)−X(t_3); vertical axis: number of realizations.]
Process with stationary increments:
A continuous time process is said to have stationary increments if the random variables X(t + s) − X(t) have the same distribution for all t; i.e. the distribution of an increment depends only on the length s of the time interval.
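Both increment properties can be illustrated with a Poisson process, which has independent and stationary increments; the sketch below uses an assumed rate λ = 2 and compares counts over the disjoint, equal-length intervals (0, 2] and (3, 5].

```python
import numpy as np

rng = np.random.default_rng(6)

# Poisson process N(t) with rate lam (illustrative value): paths are built
# from cumulative sums of exponential inter-arrival times.
lam, n_sim = 2.0, 100_000
arrivals = np.cumsum(rng.exponential(1.0 / lam, size=(n_sim, 60)), axis=1)

def count(t0, t1):
    """Increment N(t1) - N(t0): number of arrivals in (t0, t1]."""
    return ((arrivals > t0) & (arrivals <= t1)).sum(axis=1)

inc1 = count(0.0, 2.0)   # N(2) - N(0)
inc2 = count(3.0, 5.0)   # N(5) - N(3): a disjoint interval of the same length

# Stationary increments: both counts should look Poisson(2*lam), i.e. mean = var = 4
print("means:", inc1.mean(), inc2.mean())
print("vars :", inc1.var(),  inc2.var())

# Independent increments: counts over disjoint intervals are uncorrelated
print("correlation:", np.corrcoef(inc1, inc2)[0, 1])
```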
Stationarity

Stationary process:
If the joint probability distribution of the random variables [X(t_1), X(t_2), ..., X(t_n)] is the same as the joint distribution of the variables [X(t_1 + h), X(t_2 + h), ..., X(t_n + h)] for every h > 0; i.e.

Pr[X(t_1) ≤ x_1, X(t_2) ≤ x_2, ..., X(t_n) ≤ x_n] = Pr[X(t_1 + h) ≤ x_1, X(t_2 + h) ≤ x_2, ..., X(t_n + h) ≤ x_n]

then the process is said to be stationary, and in particular the distribution of X(t) is the same for all t.
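A minimal sketch of a stationary process, using an assumed (illustrative) AR(1) model started from its stationary distribution: shifting all time points by h leaves means, variances and covariances unchanged.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stationary AR(1) process (an illustrative assumption, not from the notes):
#   X_{t+1} = phi * X_t + e_t,   e_t ~ N(0, sigma^2),
# started from its stationary distribution N(0, sigma^2 / (1 - phi^2)).
phi, sigma, T, n_sim = 0.8, 1.0, 50, 100_000

X = np.empty((n_sim, T + 1))
X[:, 0] = rng.normal(0.0, sigma / np.sqrt(1 - phi**2), size=n_sim)
for t in range(T):
    X[:, t + 1] = phi * X[:, t] + rng.normal(0.0, sigma, size=n_sim)

# Shifting every time point by h should leave the joint distribution unchanged.
t1, t2, h = 5, 10, 20
print("means      :", X[:, t1].mean(), X[:, t1 + h].mean())
print("variances  :", X[:, t1].var(),  X[:, t1 + h].var())
print("cov(t1, t2):", np.cov(X[:, t1], X[:, t2])[0, 1])
print("cov shifted:", np.cov(X[:, t1 + h], X[:, t2 + h])[0, 1])
```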
