
Module-3

Random Processes – Temporal Characteristics

1 Dr. R.K.Mugelan, Associate Professor, SENSE, VIT


Synopsis
 Introduction
 Random Process: Classifications.
 Stationarity and Independence.
 Time Averages and Ergodic Random process.
 Characterizing a Random Process: The Mean, Correlation Functions,
Covariance Functions, and their Properties
 Different processes:
 Gaussian Random Process
 Poisson Random Process,
 Wiener Process,
 Markov process,
 Complex Random Process.



Introduction
 In the previous two modules, we were concerned with the outcomes of random experiments and the random variables used to represent them.

 A random variable is a mapping of an outcome 𝒔 𝝐 𝑺, where S is the sample space, to some real number X(s).

 The random-variable approach is applied to random problems that are not functions of time.

 However, in certain random experiments, the outcome may be a function of time.

 Especially in engineering, many random problems are time dependent.



Introduction
 A speech signal, which is a variation in voltage due to a speech utterance, is a function of time.

 In a communication system, a set of messages that are to be transmitted over a channel is also a function of time.

 Such time functions are called random processes.

 In communication systems, a desired deterministic signal is often accompanied by an undesired random waveform known as noise, which limits the performance of the system.

 Since the noise is a function of time and cannot be represented by a deterministic mathematical equation, it is treated as a random process.



Introduction
 We will study such random processes, which may be viewed as a collection of random variables with ′𝒕′ as a parameter.

 That is, instead of a single number X(s), we deal with X(s, t), where t 𝝐 𝑻 and T is called the parameter set of the process.



Introduction
 For a random process X(s, t), the sample space is a collection of time functions, also called sample functions or member functions.

 A random process becomes a random variable when time is fixed at some particular value.

 For example, if ′𝒕′ is fixed to a value 𝒕𝟏, then the value of 𝑿(𝒔, 𝒕) at 𝒕𝟏 is a random variable 𝑿(𝒔, 𝒕𝟏).

 On the other hand, if we fix the sample point at 𝒔𝟏, then 𝑿(𝒔𝟏, 𝒕) is a single real function of time (a sample realization).

 Letting s range over the whole sample space, 𝑿(𝒔, 𝒕) can be viewed as a collection of such time functions.



Introduction

 𝑿(𝒔, 𝒕𝟏) is a random variable for a fixed time 𝒕𝟏.

 𝑿(𝒔𝟏, 𝒕) is a sample realization for a fixed point 𝒔𝟏 in the sample space S.

 𝑿(𝒔𝟏, 𝒕𝟏) is a number.

 𝑿(𝒔, 𝒕) is a collection of realizations and is called a random process.



Example
 Consider a random experiment in which we toss a die and observe the dots on the top face.

Outcomes of a die toss

The set of the waveforms 𝑥₁(𝑡), 𝑥₂(𝑡), …, 𝑥₆(𝑡) represents the random process.


Example

 In communication systems, the carrier signal is often modelled as a sinusoid with random phase, given by
𝒙(𝒕) = 𝑨 𝐜𝐨𝐬(𝟐𝝅𝒇𝒕 + 𝜽)

 The phase 𝜃 is a random variable with uniform distribution between −𝜋 and 𝜋.

 The random phase models the fact that the receiver knows neither the time when the transmitter was turned ON nor the distance from the transmitter to the receiver.
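This random-phase carrier is easy to simulate. A minimal sketch (NumPy; the values A = 1 and f = 2 Hz are illustrative assumptions, not from the slides) that draws many realizations of θ and checks that the ensemble mean at any fixed time is near zero:

```python
import numpy as np

rng = np.random.default_rng(0)
A, f = 1.0, 2.0            # amplitude and frequency (illustrative values)
n_real = 100_000           # number of realizations of the random phase
theta = rng.uniform(-np.pi, np.pi, size=n_real)

def X(t):
    """One sample of x(t) = A cos(2*pi*f*t + theta) per realization."""
    return A * np.cos(2 * np.pi * f * t + theta)

# Ensemble mean at two different times: both should be close to 0,
# since E[cos(c + theta)] = 0 when theta is uniform on (-pi, pi).
m1, m2 = X(0.1).mean(), X(0.7).mean()
print(m1, m2)
```

Because θ is uniform over a full period, E[cos(2πft + θ)] = 0 for every t, which is why the estimates at the two different times agree.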





Classification of Random Processes
 In general, a random process or stochastic process 𝑿(𝒔, 𝒕) can be
defined as a random time function assigned to each outcome.

 Depending on the way the values t and s are defined, the random
process can be classified into

1. Continuous random process

2. Discrete random process

3. Continuous random sequence

4. Discrete random sequence



Continuous Random Process
 A continuous random process is one in which the random variable X
is continuous and time ‘t’ can have any of a continuum of values.

 The examples of continuous random process are thermal noise in


conductors, shot noise in transistors and wind velocity.



Discrete Random Process
 A discrete random process is one in which the random variable X
assumes discrete values while t is continuous.

 A random telegraph process is an example of a discrete random process.

 At any time instant, the random telegraph signal takes one of two
possible states, either 0 or 1.

 Figure shows one possible realization of random telegraph signal in


which the signal takes the value 1 for the time interval T1 and 0 for
the time interval T2.
Continuous Random Sequence
 In a continuous random sequence, the random variable X is
continuous but the time has only discrete values.

 A continuous random sequence can be obtained by sampling the


continuous random process.

It is also known as a discrete-time random process.


Discrete Random Sequence
 In a discrete random sequence, both random variables X and time ‘t’
are discrete.

 The discrete random sequence can be obtained by sampling the


discrete random process or by quantizing the continuous random
sequence.

 This process is also known as a digital process.



Deterministic and Non-Deterministic
Process

 A random process is said to be a non-deterministic random process


if future values of any sample function cannot be exactly predicted
from the observed past values.

 Almost all natural random processes are nondeterministic.

 A random process is said to be deterministic if future values of any


sample function can be predicted from past values.



Distribution and Density Functions
 A random process is completely characterized by the joint pdf.

 The value of a random process 𝑋(𝑡) at a particular time 𝑡1 is a


random variable 𝑋(𝑡1 ).

 Let us define the distribution function associated with this random variable as

F_X(x1; t1) = P{X(t1) ≤ x1}

for any real number x1.

 Similarly, we can write the joint distribution function for the random variables X(t1), X(t2), …, X(t_n) as

F_X(x1, …, x_n; t1, …, t_n) = P{X(t1) ≤ x1, …, X(t_n) ≤ x_n}



Distribution and Density Functions

 The function F_X(x1; t1) is known as the first-order distribution function of X(t).



Distribution and Density Functions

 For two random variables X(t1) = X1 and X(t2) = X2, the joint distribution function, known as the second-order joint distribution function of the random process X(t), is given by

F_X(x1, x2; t1, t2) = P{X(t1) ≤ x1, X(t2) ≤ x2}



Joint Density Function of a Random Process
 The joint density function of a random process can be obtained by
differentiating the corresponding distribution functions:

 The first-order density function is

f_X(x1; t1) = ∂F_X(x1; t1) / ∂x1

 The second-order density function is

f_X(x1, x2; t1, t2) = ∂²F_X(x1, x2; t1, t2) / (∂x1 ∂x2)



Mean of a Random Process
 The mean value of a random process X(t) is an ensemble average of the random process X(t).

 It is a function of time and is denoted by

μ_X(t) = E[X(t)]

 where E[X(t)] is the expected value of X(t).

 If the probability density function of the random process is f_X(x, t), then the mean of X(t) is

μ_X(t) = ∫_{−∞}^{∞} x f_X(x, t) dx

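The ensemble mean can be estimated by averaging across realizations at each fixed t. A short sketch (the process X(t) = A·t with a random slope A ~ Normal(1, 1) is an illustrative assumption, chosen so that μ_X(t) actually depends on time):

```python
import numpy as np

rng = np.random.default_rng(1)
# Illustrative process X(t) = A * t with random slope A ~ Normal(1, 1):
# its ensemble mean is mu_X(t) = E[A] * t = t, i.e. a function of time.
A = rng.normal(1.0, 1.0, size=200_000)   # one value of A per realization

est_half = (A * 0.5).mean()   # Monte Carlo estimate of E[X(0.5)], about 0.5
est_two = (A * 2.0).mean()    # Monte Carlo estimate of E[X(2.0)], about 2.0
print(est_half, est_two)
```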


Autocorrelation Function of a Random
Process
 Autocorrelation provides the similarity between two observations of
a random process.

 Consider a random process 𝑋(𝑡) for different times 𝑡1 and 𝑡2 . Let


the joint density function of these two random variables be
𝑓𝑋 (𝑥1 , 𝑥2 ; 𝑡1 , 𝑡2 ).

 The autocorrelation function of X1 and X2 is defined as

R_XX(t1, t2) = E[X1 X2] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 f_X(x1, x2; t1, t2) dx1 dx2



Autocorrelation Function of a Random
Process

 If we set t1 = t and t2 = t + τ, then

R_XX(t, t + τ) = E[X(t) X(t + τ)]

 The parameter τ is called the delay time or lag time.
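The ensemble autocorrelation can likewise be estimated by averaging the product X(t)X(t + τ) over many realizations. A sketch for the random-phase sinusoid, for which the result (A²/2)cos(ωτ) is well known; the values of A, ω, t, τ here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
A, w = 2.0, 3.0                       # illustrative amplitude and angular frequency
theta = rng.uniform(-np.pi, np.pi, size=400_000)

def X(t):
    return A * np.cos(w * t + theta)

# Ensemble estimate of R_XX(t, t + tau) = E[X(t) X(t + tau)]
t, tau = 0.4, 0.9
R_est = (X(t) * X(t + tau)).mean()
R_theory = (A**2 / 2) * np.cos(w * tau)   # known result for a random-phase sinusoid
print(R_est, R_theory)
```

The product expands to (A²/2)[cos(ωτ) + cos(2ωt + ωτ + 2θ)], and the second term averages to zero over the uniform phase, so the estimate depends on τ only.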



STATISTICAL INDEPENDENCE
 Consider the random processes X(t) and Y(t) with random variables
𝑋 𝑡1 , 𝑋 𝑡2 , … , 𝑋 𝑡𝑁 and 𝑌 𝑡1′ , 𝑌 𝑡2′ , … , 𝑌 𝑡𝑁′

 Let the joint density function of the random variable X(t) be


𝑓𝑋 (𝑥1 , 𝑥2 … , 𝑥𝑁 ; 𝑡1 , 𝑡2 , … , 𝑡𝑁 )

 Let the joint density function of the random variable Y(t) be


𝑓𝑌 (𝑦1 , 𝑦2 … , 𝑦𝑁 ; 𝑡′1 , 𝑡′2 , … , 𝑡′𝑁 )

 The two random processes X(t) and Y(t) are said to be statistically independent if the joint density function of X(t) and Y(t) is equal to the product of the individual joint density functions:

f_XY(x1, …, x_N, y1, …, y_N; t1, …, t_N, t′1, …, t′_N) = f_X(x1, …, x_N; t1, …, t_N) f_Y(y1, …, y_N; t′1, …, t′_N)



STATIONARY- Strict Sense Stationary
 A random process is said to be stationary if all its statistical
properties such as mean, variance, etc., do not change with time.

 For such processes, the marginal and joint density function are
invariant for any time shift 𝜏.

 Consider a random process X(t) with random variables X(t_i), i = 1, 2, …, N, having the joint pdf f_X(x1, x2, …, x_N; t1, t2, …, t_N).

 The above process is said to be stationary if it satisfies the relation


𝒇𝑿 𝒙𝟏 , 𝒙𝟐 … , 𝒙𝑵 ; 𝒕𝟏 , … , 𝒕𝑵 = 𝒇𝑿 (𝒙𝟏 , 𝒙𝟐 … , 𝒙𝑵 ; 𝒕𝟏 + 𝝉, … , 𝒕𝑵 + 𝝉)
 If any of the probability density functions do change with the choice of
time origin, then the process is non-stationary. For such processes, the
mean and variance also depend on time.



First-order Stationary
 A random process is first-order stationary if its first-order
distribution and density functions do not change with shift in time.

 That is,

F_X(x1; t1) = F_X(x1; t1 + Δ) and f_X(x1; t1) = f_X(x1; t1 + Δ) for any time shift Δ.

 When a random process satisfies the above condition, the mean value of the process does not change with time shift.

 The mean value of the random process at X(t1) is

E[X(t1)] = ∫_{−∞}^{∞} x1 f_X(x1; t1) dx1



First-order Stationary
 Similarly, the mean value of the random variable X(t2) is

E[X(t2)] = ∫_{−∞}^{∞} x1 f_X(x1; t2) dx1

 Since t2 = t1 + τ and the first-order density is shift-invariant, we have f_X(x1; t2) = f_X(x1; t1 + τ) = f_X(x1; t1).

 Therefore we can write E[X(t2)] = E[X(t1)] = μ_X.

 That is, the mean value of the random process is constant and does not change with a shift in time origin.



Second-order And Wide-Sense Stationary
 A random process is second-order stationary if its second-order
distribution and density functions do not change with shift in time.

 That is,

f_X(x1, x2; t1, t2) = f_X(x1, x2; t1 + Δ, t2 + Δ) for any time shift Δ.

 For a second-order stationary process, the mean value is constant


and does not change with change in time shift.

 The autocorrelation function of a second-order stationary process is


a function of time difference and not absolute time.



Second-order And Wide-Sense Stationary
 Consider a random process X(t) with second-order density function
𝒇𝑿 (𝒙𝟏 , 𝒙𝟐 ; 𝒕𝟏 , 𝒕𝟐 )

 The autocorrelation function of the random process is given by

R_XX(t1, t2) = E[X(t1) X(t2)]

 For a second-order stationary process, the above expression is equal to

R_XX(t1, t2) = R_XX(t1 + Δ, t2 + Δ) for any Δ.



Second-order And Wide-Sense Stationary
 For the arbitrary choice Δ = −t1,

R_XX(t1, t2) = R_XX(0, t2 − t1)

 Now let t2 − t1 = τ and t1 = t, so that

R_XX(t, t + τ) = R_XX(τ)

 The second-order stationary process is also known as a Wide-Sense Stationary (WSS) process.



Definition: Wide Sense Stationary (WSS)
 A random process is Wide-Sense Stationary if its mean function and autocorrelation function are invariant to a time shift.

 That is,

𝝁𝑿 𝒕 = 𝝁𝑿 = 𝑪𝒐𝒏𝒔𝒕𝒂𝒏𝒕
𝑹𝑿𝑿 𝒕, 𝒕 + 𝝉 = 𝑹𝑿𝑿 (𝝉)
 A strict-sense stationary process is also WSS provided that mean
and autocorrelation functions exist.

 However, a WSS process need not be stationary in the strict sense.



JOINTLY WIDE-SENSE STATIONARY

 Two random processes X(t) and Y(t) are said to be jointly wide-sense stationary if each is WSS and their cross-correlation function R_XY(t, t + τ) is a function of the time difference τ only.

 That is
𝑹𝑿𝒀 𝒕, 𝒕 + 𝝉 = 𝑬 𝑿 𝒕 𝒀(𝒕 + 𝝉) = 𝑹𝑿𝒀 𝝉



Time Averages and Ergodicity
 For a random process X(t), the mean and autocorrelation functions
are calculated using the ensemble average.
 To obtain ensemble average, we have to consider all possible
realizations.
 However, for most processes we cannot observe all realizations; in practice, we may have to be satisfied with a single realization.
 This will happen in the experiments where it may not be possible to
take replicate measurements.
 In these cases, the mean and autocorrelation function are obtained
using a single realization of X(t), and these values are assigned to
statistical average of the ensemble.



Time Averages and Ergodicity
 In some other random processes, the mean and autocorrelation functions of every sample function are the same as those of the ensemble.

 This implies that it is possible to determine the statistical behavior of


the ensemble from a sample function.

 In both cases, statistical behavior of the ensemble can be obtained


from only one sample function.

 These processes where the time average and statistical average are
equal are known as ergodic processes.
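For the random-phase sinusoid, the time average of a single realization equals the ensemble mean (zero), illustrating ergodicity in the mean. A minimal sketch (illustrative ω; the limit T → ∞ is approximated by a long but finite window, and the integral by a sample mean):

```python
import numpy as np

rng = np.random.default_rng(3)
w = 2.0
theta = rng.uniform(-np.pi, np.pi)         # ONE realization: a single fixed phase
t = np.linspace(-500.0, 500.0, 2_000_001)  # time grid approximating T -> infinity
x = np.cos(w * t + theta)                  # one sample function x(t)

# (1/2T) * integral of x(t) dt, approximated by the sample mean on a uniform grid
time_avg = x.mean()
print(time_avg)   # close to the ensemble mean, which is 0
```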



Time Averages and Ergodicity
Definitions

 Ergodic Process: A random process is said to be ergodic if all time


averages are equal to the corresponding statistical averages.

 Jointly Ergodic Processes: Two random processes are jointly ergodic


if they are individually ergodic and also have a time correlation
function that equals the statistical cross-correlation function.



Mean Ergodic Processes
 A random process X(t) is said to be mean ergodic or ergodic in
mean if time average of a sample function 𝝁𝒙 is equal to the
statistical average 𝝁𝑿 with probability 1 for all x(t).

 That is, 𝝁𝑿 = 𝝁𝒙

 Treating the time average A_X = lim_{T→∞} (1/2T) ∫_{−T}^{T} X(t) dt as a random variable, the sufficient condition for mean ergodicity of a process X(t) is

lim_{T→∞} Var[A_X] = 0

 It can also be stated as: E[A_X] = μ_X with vanishing variance, so that A_X = μ_X with probability 1.



Correlation Ergodic Processes

 A stationary continuous random process X(t) with autocorrelation function R_XX(τ) is said to be autocorrelation ergodic, or ergodic in the autocorrelation, if and only if

lim_{T→∞} (1/2T) ∫_{−T}^{T} X(t) X(t + τ) dt = R_XX(τ)
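This limit can be approximated numerically over a large but finite window. A sketch using the random-phase sinusoid, for which the ensemble autocorrelation ½cos(ωτ) is known (illustrative ω and τ; a Riemann sum stands in for the integral):

```python
import numpy as np

rng = np.random.default_rng(4)
w, tau = 2.0, 0.6
theta = rng.uniform(-np.pi, np.pi)        # one fixed phase -> one realization
dt = 0.001
t = np.arange(-500.0, 500.0, dt)          # long window approximating T -> infinity

x = np.cos(w * t + theta)
x_shift = np.cos(w * (t + tau) + theta)   # x(t + tau) on the same grid

# (1/2T) * integral_{-T}^{T} x(t) x(t + tau) dt, via a Riemann sum
R_time = np.mean(x * x_shift)
R_theory = 0.5 * np.cos(w * tau)          # ensemble R_XX(tau) for this process
print(R_time, R_theory)
```

The agreement of the time-average estimate with the ensemble value is exactly what autocorrelation ergodicity asserts.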



Properties of Autocorrelation function
Property:1

 Autocorrelation function is bounded by its value at the origin.

 That is, |R_XX(τ)| ≤ R_XX(0)

 In other words, autocorrelation function 𝑅𝑋𝑋 𝜏 is maximum at


𝜏=0

Property:2

 Autocorrelation function 𝑅𝑋𝑋 𝜏 is an even function


𝑅𝑋𝑋 𝜏 = 𝑅𝑋𝑋 −𝜏
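Properties 1 and 2 can be checked on a numerically estimated autocorrelation. A sketch using the random-phase sinusoid (illustrative parameters):

```python
import numpy as np

rng = np.random.default_rng(5)
theta = rng.uniform(-np.pi, np.pi, size=300_000)
w = 2.0

def R(tau, t=0.3):
    """Ensemble estimate of R_XX(t, t + tau) for X(t) = cos(w t + theta)."""
    return np.mean(np.cos(w * t + theta) * np.cos(w * (t + tau) + theta))

taus = np.linspace(-3.0, 3.0, 61)
R_vals = np.array([R(tau) for tau in taus])
R0 = R(0.0)

print(np.max(np.abs(R_vals)) <= R0 + 0.02)           # Property 1: bounded by R(0)
print(np.max(np.abs(R_vals - R_vals[::-1])) < 0.03)  # Property 2: even in tau
```

Reversing `R_vals` corresponds to evaluating at −τ, so the second check is the evenness property up to Monte Carlo noise.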



Properties of Autocorrelation function
Property:3

 The autocorrelation function R_XX(τ) at τ = 0 is equal to the mean-square value:

R_XX(0) = E[X²(t)]

 The mean square value is also known as average power of the


process.



Properties of Autocorrelation function
Property:4

 If X(t) has a periodic component then 𝑅𝑋𝑋 𝜏 will have a periodic


component with the same period.



Properties of Autocorrelation function
Property:4 (contd.)

 If X(t) has a dc component then R_XX(τ) will have a dc component.

Property:5

 The autocorrelation function 𝑅𝑋𝑋 𝜏 cannot have an arbitrary shape.

Property:6

 If X(t) is ergodic, zero mean, and has no periodic component, then

lim_{|τ|→∞} R_XX(τ) = 0



Properties of Cross-correlation function
Property:1

 The cross-correlation function R_XY(τ) is not, in general, an even function of τ. But it satisfies the condition R_XY(τ) = R_YX(−τ)

 The above equation shows that R_XY(τ) and R_YX(τ) are mirror images of each other.



Properties of Cross-correlation function
Property:2

 Let X(t) and Y(t) be two random processes with respective autocorrelation functions R_XX(τ) and R_YY(τ).

 Then |R_XY(τ)| ≤ sqrt( R_XX(0) R_YY(0) )

 That is, the cross-correlation is bounded by the geometric mean of R_XX(0) and R_YY(0).

Property:3

 The cross-correlation is bounded by the arithmetic average of R_XX(0) and R_YY(0):

|R_XY(τ)| ≤ (1/2) [ R_XX(0) + R_YY(0) ]



Properties of Cross-correlation function
Property:4

 If X(t) and Y(t) are uncorrelated random processes, then

R_XY(t, t + τ) = E[X(t)] E[Y(t + τ)]

Property:5

 If X(t) and Y(t) are orthogonal random processes, then


𝑅𝑋𝑌 𝑡, 𝑡 + 𝜏 = 0



Relationship b/w Two Random Processes
 They are jointly wide-sense stationary if X(t) and Y(t) are both WSS
and their cross-correlation 𝑅𝑋𝑌 𝑡, 𝑡 + 𝜏 and 𝑅𝑌𝑋 𝑡, 𝑡 + 𝜏 depends
only on the time difference 𝜏
𝑅𝑋𝑌 𝑡, 𝑡 + 𝜏 = 𝑅𝑋𝑌 𝜏 & 𝑅𝑌𝑋 𝑡, 𝑡 + 𝜏 = 𝑅𝑌𝑋 𝜏

 The two random processes are uncorrelated if their cross-correlation is equal to the product of their mean functions:

R_XY(t, t + τ) = E[X(t)] E[Y(t + τ)]

 Equivalently, two random processes are uncorrelated if their covariance C_XY(t, t + τ) = R_XY(t, t + τ) − E[X(t)] E[Y(t + τ)] = 0



Tutorial Example

 Given the random process X 𝑡 = 𝐴 sin(𝜔𝑡 + 𝜃) where A and 𝜔 are


constants and 𝜃 is uniformly distributed between −𝜋 and +𝜋 .

 Define a new random process 𝑌 𝑡 = 𝑋 2 (𝑡)

 Are 𝑋 𝑡 and 𝑌 𝑡 WSS ?

 Are 𝑋 𝑡 and 𝑌 𝑡 jointly WSS?



Tutorial Example
SOLUTION:

 Given: X 𝑡 = 𝐴 sin(𝜔𝑡 + 𝜃)

 θ is uniformly distributed between −π and +π; therefore its pdf is f_Θ(θ) = 1/(2π) for −π ≤ θ ≤ π, and 0 otherwise.



Tutorial Example

 For a WSS process, the mean value is constant, and autocorrelation


function is a function of time difference and not absolute time.





Tutorial Example

 A random process 𝑌 𝑡 = 𝑋 𝑡 − 𝑋(𝑡 + 𝜏) is defined in terms


of a process X(t) that is at least wide-sense stationary.
 Show that mean value of Y(t) is zero even if X(t) has a non-
zero mean value.
 Show that 𝜎𝑌2 = 2[𝑅𝑋𝑋 0 − 𝑅𝑋𝑋 𝜏 ]
 If 𝑌 𝑡 = 𝑋 𝑡 + 𝑋(𝑡 + 𝜏), find E[Y(t)] and 𝜎𝑌2



Tutorial Example
SOLUTION:

 The given random process is 𝑌 𝑡 = 𝑋 𝑡 − 𝑋(𝑡 + 𝜏)

 where X(t) is a wide-sense stationary random process.

 The mean value of the random process Y(t) is

E[Y(t)] = E[X(t) − X(t + τ)] = E[X(t)] − E[X(t + τ)] = μ_X − μ_X = 0

since X(t) is at least wide-sense stationary, so its mean is the same constant at t and at t + τ.


Tutorial Example

 Thus, the mean value of the random process Y(t) is zero, even if X(t)
has a non-zero mean value.





Tutorial Example
 A random process X(t) = sin(ωt + ∅), where ∅ is a random variable uniformly distributed over (0, 2π).

 Prove that

 SOLUTION:





Tutorial Example
 A die is tossed and, corresponding to the dots S = {1, 2, 3, 4, 5, 6}, a random process X(t) is formed with the following time functions

 SOLUTION





Tutorial Example
 A stochastic process is described by 𝑋(𝑡) = 𝐴 sin 𝑡 + 𝐵 cos 𝑡 where
A and B are independent random variables with zero mean and equal
standard deviation.

 Show that the process is second-order stationary.

 SOLUTION

 Given: X(t) = A sin t + B cos t





Tutorial Example
 Is the process X(t) with

mean ergodic?

SOLUTION





Tutorial Example

Therefore X(t) is mean ergodic



Tutorial Example

 Let N(t) be a zero mean wide-sense stationary noise process for


which

 where

 N0 > 0 is a finite constant. Determine if N(t) is mean ergodic.



Tutorial Example
 SOLUTION:

 The process N(t) is mean ergodic if the variance of its time average is zero



Tutorial Example

 The second term in the above equation is zero at 𝜏 = 0.

 Hence, we can write



Tutorial Example

 Statistically independent zero-mean random processes X(t) and Y(t) have autocorrelation functions



Tutorial Example
 SOLUTION





Time Averages and Ergodicity
 Ergodic Process: A random process is said to be ergodic if all time averages are equal to the corresponding statistical averages. That is,

E[X(t)] = ⟨X(t)⟩ and R_XX(τ) = 𝓡_XX(τ)

 E[X(t)] – statistical (ensemble) average of X(t)

 ⟨X(t)⟩ – time average of X(t)

 R_XX(τ) – statistical average of the autocorrelation of X(t)

 𝓡_XX(τ) – time average of the autocorrelation of X(t)







Tutorial Example

 Let N(t) be a zero mean wide-sense stationary noise process for


which

 where

 N0 > 0 is a finite constant. Determine if N(t) is mean ergodic.





Tutorial Example

Since the variance is a non-zero value, N(t) is not mean ergodic.





Tutorial Example
 Given the auto correlation function for a stationary ergodic process
with no periodic component is,
R_XX(τ) = 25 + 4 / (1 + 6τ²)
 Find the mean and variance of the random process X(t)

SOLUTION:

 From property:4

 If X(t) has no periodic components and is ergodic, and μ_X ≠ 0, then lim_{|τ|→∞} R_XX(τ) = μ_X²



Tutorial Example
μ_X² = lim_{τ→∞} R_XX(τ) = lim_{τ→∞} [ 25 + 4 / (1 + 6τ²) ] = 25 + 0 = 25

μ_X = 5

 Variance, σ_X² = E[X(t)²] − μ_X²

E[X(t)²] = R_XX(0) = 25 + 4 / (1 + 6(0)²) = 25 + 4 = 29

σ_X² = 29 − 25 = 4



Tutorial Example
 If X(t) is ergodic with
R_XX(τ) = 18 + [ 2 / (6 + τ²) ] [ 1 + 4 cos 12τ ]
 Find the mean, does X(t) has any periodic component and find the
average power of X(t)

SOLUTION:

 From property:4

 If X(t) has no periodic components and is ergodic, and μ_X ≠ 0, then lim_{|τ|→∞} R_XX(τ) = μ_X²



Tutorial Example
μ_X² = lim_{τ→∞} R_XX(τ) = lim_{τ→∞} { 18 + [ 2 / (6 + τ²) ] [ 1 + 4 cos 12τ ] } = 18

μ_X = √18 = 3√2

 Since lim_{τ→∞} R_XX(τ) = μ_X², the random process doesn't have any periodic component.

 Average power, R_XX(0) = E[X(t)²] = 18 + [ 2 / (6 + 0²) ] [ 1 + 4 cos 12(0) ] = 18 + (1/3)(1 + 4) = 18 + 5/3 = 118/6 W





Tutorial Example

 Hence the autocorrelation and cross-correlation functions depend on τ and not on time 't'.

 Therefore, X(t) and Y(t) are jointly stationary random processes.



Gaussian Random Process



Gaussian Random Process

 Gaussian processes are important in the study of random processes, since many random processes can be modeled as Gaussian processes.

 Any random process can be considered as the cumulative effect of many random phenomena.

 According to the central limit theorem, the sum of a large number of independent random processes with different distributions tends to a random process with a normal (Gaussian) distribution.



Gaussian Random Process
 If samples of a random process X 𝑡1 , X 𝑡2 , … , X(𝑡𝑛 ) taken at
arbitrary points in time 𝑡1 , 𝑡2 , … , 𝑡𝑛 form a set of jointly Gaussian
random variables for any 𝑛 = 1,2,3, …

 Then the real-valued random process {𝑋(𝑡)} is called a Gaussian


random process.

 The random variables X 𝑡1 , X 𝑡2 , … , X(𝑡𝑛 ) will have a joint pdf


given by
f_{X1, X2, …, Xn}(x1, x2, …, xn) = [ 1 / ( (2π)^{n/2} |C_XX|^{1/2} ) ] exp[ −(1/2) (X − X̄)ᵀ C_XX⁻¹ (X − X̄) ]

 where C_XX is the covariance matrix and X̄ is the vector of means E[X(t_k)].
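Joint samples with a prescribed covariance matrix can be drawn and checked against C_XX. A sketch (the zero mean, the time points, and the covariance C(t_i, t_j) = exp(−|t_i − t_j|) are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)
# Illustrative zero-mean Gaussian process with covariance C(t_i, t_j) = exp(-|t_i - t_j|)
t = np.array([0.0, 0.5, 1.0, 2.0])
C = np.exp(-np.abs(t[:, None] - t[None, :]))     # covariance matrix C_XX

# Draw many joint samples (X(t1), ..., X(t4)) and compare the empirical covariance to C
samples = rng.multivariate_normal(mean=np.zeros(4), cov=C, size=200_000)
C_emp = np.cov(samples, rowvar=False)
print(np.max(np.abs(C_emp - C)))                 # small for a large sample size
```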





Gaussian Random Process
 If the random variables X(t1), X(t2), …, X(tn) are uncorrelated, then C_XX is a diagonal matrix with the variances σ_i² on the principal diagonal. That is,

C_XX = diag(σ_1², σ_2², …, σ_n²)





Properties of Gaussian Processes

1. A Gaussian process that is wide-sense stationary is also strict-sense stationary (SSS).

2. If the random variables X 𝑡1 , X 𝑡2 , … , X(𝑡𝑛 ) of a Gaussian


process X(t) are uncorrelated then they are independent.

3. If the input to a linear system is a Gaussian process, the output is


also a Gaussian process.



Tutorial Example
 X(t) is a WSS process with ACF

R_XX(τ) = 9 + e^{−2|τ|}

 Determine the covariance matrix for the random variables X(0),


X(1), X(2) and X(3).

 SOLUTION



Tutorial Example
 Let 𝑋 𝑡1 = 𝑋 0 ; 𝑋 𝑡2 = 𝑋 1 ; 𝑋(𝑡3) = 𝑋(2) 𝑎𝑛𝑑 𝑋(𝑡4) = 𝑋(3)

 The elements of the covariance matrix are given by

C_ij = R_XX(t_j − t_i) − μ_X² = 9 + e^{−2|j−i|} − 9 = e^{−2|j−i|}

 (here μ_X² = lim_{|τ|→∞} R_XX(τ) = 9).



Tutorial Example
C_XX = [ e^{−2|j−i|} ], i, j = 0, 1, 2, 3
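The matrix from this example can be formed directly from the index differences:

```python
import numpy as np

# Covariance matrix from the example: C_ij = exp(-2|j - i|) for times t_i = 0, 1, 2, 3
idx = np.arange(4)
C = np.exp(-2.0 * np.abs(idx[:, None] - idx[None, :]))
print(np.round(C, 4))
```

The diagonal entries equal 1 (the variance R_XX(0) − μ_X² = 10 − 9), and the off-diagonals decay as e^{−2}, e^{−4}, e^{−6}.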



Tutorial Example
 Let X(t) be a zero-mean Gaussian random process with auto-
covariance function given by
C_XX(t1, t2) = 4 e^{−|t1 − t2|}

 Find the joint pdf of 𝑋(𝑡) and 𝑋(𝑡 + 𝜏)

 SOLUTION



Tutorial Example
 The covariance matrix for X(t) and X(t + τ) is

C_XX = [ [4, 4e^{−|τ|}], [4e^{−|τ|}, 4] ]





Poisson Random Process



Poisson Random Process
 Poisson random process is a discrete random process.

 It is widely used to model the processes that count the number of


occurrences of an event in the interval (0, t).

 The event might be

1. The number of phone calls arriving at a Base station,

2. Customers entering a bank or supermarket,

3. The random failures of equipment in a system

4. The emission of electrons from the surface of a photodetector.



Poisson Random Process

Counting Process

 In all the above examples, the random process X(t) represents the
total number of events that occurred in the time interval (0, t).

 Such types of processes are known as counting processes.

 So we can define the counting process as a random process X(t) that


represents the total number of events that have occurred in the
interval (0, t).



Poisson Random Process

A counting process satisfies the following conditions.

1. The counting of events begins at time t = 0; that is, X(0) = 0.

2. X(t) is an integer value and also X(t) ≥ 0 which means that it has
non-negative values.

3. X(t) is a non-decreasing function of time. That is, if t1 < t2 then


𝑋(𝑡1 ) ≤ 𝑋(𝑡2 )

4. X(t2) − X(t1) represents the number of events that occurred in the interval (t1, t2) for t1 < t2.
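A counting process with these properties can be simulated from exponential inter-arrival times (this construction yields a Poisson process, as developed below; the rate and horizon are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
lam, T = 3.0, 10.0                      # illustrative rate and observation horizon
arrivals = np.cumsum(rng.exponential(1 / lam, size=200))
arrivals = arrivals[arrivals <= T]      # event times in (0, T]

def X(t):
    """Counting process: number of events in (0, t]."""
    return int(np.searchsorted(arrivals, t, side="right"))

counts = [X(t) for t in np.linspace(0.0, T, 50)]
print(X(0.0), counts == sorted(counts))  # X(0) = 0 and X(t) is non-decreasing
```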



Poisson Random Process
 The sample function of a counting process is shown below



Poisson Random Process
Independent Increment Processes

 Let us consider two disjoint time intervals (0, 𝑡1) and (𝑡2, 𝑡3)

 Let the number of events occurring in the time interval (0, 𝑡1) be E1
and the number of events occurring in the time interval (𝑡2, 𝑡3)is E2.

 If E1 is independent of E2 then the process is an independent


increment process.

A counting process is said to be an independent increment process if the numbers of events occurring in disjoint time intervals form an independent random vector. (In the figure, E1 and E2 denote the events counted in the two disjoint intervals.)



Poisson Random Process
Stationary Increments

 For every set of time instants 𝑡0 = 0 < 𝑡1 < 𝑡2 … < 𝑡𝑛

 If the increments 𝑋 𝑡1 − 𝑋 𝑡0 , 𝑋 𝑡2 − 𝑋 𝑡1 , … , 𝑋 𝑡𝑛 − 𝑋 𝑡𝑛−1


are identically distributed then the process is said to possess
stationary increments.



Poisson Random Process
Definition of a Poisson Process
 Let Xi(t) be defined as the number of events that occur during a
specified time interval (0, t).
 The random variable Xi(t) defined above may assume the values
0,1,2, ….
The following five assumptions will now be made:
1. The numbers of events occurring during non-overlapping time
intervals are independent random variables.
2. If Xi(t) is the number of events that occurred during (0, 𝑡1) and
𝑌𝑖(𝑡) is the number of events that occurred during (𝑡1, 𝑡1 + 𝑡), then
for any 𝑡1 > 0 the random variables 𝑋𝑖 and 𝑌𝑖 have the same
probability distribution.



Poisson Random Process
3. If the interval is infinitely small, say ∆𝑡, then the probability of
exactly one event during the interval is directly proportional to the
length of that interval. That is,

𝑝1(∆𝑡) = 𝑃[𝑋𝑖(𝑡 + ∆𝑡) − 𝑋𝑖(𝑡) = 1] = 𝜆∆𝑡 + 𝑂(∆𝑡)

 Where 𝑂(∆𝑡) is a function of ∆𝑡 that goes to zero faster than
∆𝑡 does. That is,

lim∆𝑡→0 𝑂(∆𝑡)/∆𝑡 = 0

 We can write

𝒑𝟏(∆𝒕) = 𝑷[𝑿𝒊(𝒕 + ∆𝒕) − 𝑿𝒊(𝒕) = 𝟏] ≈ 𝝀∆𝒕


Poisson Random Process
4. The probability of obtaining two or more events during a
sufficiently small interval is negligible.

5. The probability that no event occurs in the interval ∆𝑡 is

𝑝0(∆𝑡) = 𝑃[𝑋𝑖(𝑡 + ∆𝑡) − 𝑋𝑖(𝑡) = 0] = 1 − 𝜆∆𝑡 + 𝑂(∆𝑡) ≈ 1 − 𝜆∆𝑡


Poisson Random Process

 The probability of ‘k’ occurrences in the interval (0, t) is given by

𝑃[𝑋(𝑡) = 𝑘] = ((𝜆𝑡)^𝑘 / 𝑘!) 𝑒^(−𝜆𝑡),  𝑘 = 0, 1, 2, …

 The probability density of the number of occurrences is

𝑓𝑋(𝑥) = Σ𝑘=0..∞ ((𝜆𝑡)^𝑘 / 𝑘!) 𝑒^(−𝜆𝑡) 𝛿(𝑥 − 𝑘)


Poisson Random Process
Joint Probability Density Function

 The joint probability for the Poisson process in the interval
0 < t1 < t2, with k2 ≥ k1, follows from the independent increments:

𝑃[𝑋(𝑡1) = 𝑘1, 𝑋(𝑡2) = 𝑘2]
= ((𝜆𝑡1)^𝑘1 / 𝑘1!) 𝑒^(−𝜆𝑡1) · ((𝜆(𝑡2 − 𝑡1))^(𝑘2−𝑘1) / (𝑘2 − 𝑘1)!) 𝑒^(−𝜆(𝑡2−𝑡1))

 Mean of the Poisson process: 𝐸[𝑋(𝑡)] = 𝜆𝑡

 Variance of the Poisson process: 𝑉𝑎𝑟[𝑋(𝑡)] = 𝜆𝑡


Poisson Random Process
 The autocorrelation function of a Poisson random process is given
by
𝑹𝑿𝑿(𝒕𝟏, 𝒕𝟐) = 𝝀𝟐𝒕𝟏𝒕𝟐 + 𝝀 min(𝒕𝟏, 𝒕𝟐)  (= 𝝀𝟐𝒕𝟏𝒕𝟐 + 𝝀𝒕𝟏 for 𝒕𝟏 ≤ 𝒕𝟐)

 The autocovariance function of a Poisson random process is given
by
𝐶𝑋𝑋(𝑡1, 𝑡2) = 𝑅𝑋𝑋(𝑡1, 𝑡2) − 𝐸[𝑋(𝑡1)]𝐸[𝑋(𝑡2)] = 𝜆 min(𝑡1, 𝑡2)


Poisson Random Process
 Correlation coefficient of the Poisson process:

𝜌𝑋𝑋(𝑡1, 𝑡2) = 𝐶𝑋𝑋(𝑡1, 𝑡2) / √(𝑉𝑎𝑟[𝑋(𝑡1)] 𝑉𝑎𝑟[𝑋(𝑡2)])
           = 𝜆 min(𝑡1, 𝑡2) / (𝜆√(𝑡1𝑡2)) = √(𝑡1/𝑡2) for 𝑡1 ≤ 𝑡2


Properties of the Poisson Process

1. Sum of two independent Poisson processes is also a Poisson


process

2. Difference of two independent Poisson processes is not a Poisson


process

3. The Poisson process is not a stationary process

4. The interval between two successive occurrences of a Poisson

   process with parameter 𝜆 follows an exponential distribution with
   mean 1/𝜆.



Properties of the Poisson Process
 Let T be the time at which the first event occurs. Since T is a
continuous random variable, the CDF of T is

𝐹𝑇(𝑡) = 𝑃(𝑇 ≤ 𝑡) = 1 − 𝑃(𝑇 > 𝑡) = 1 − 𝑃[𝑋(𝑡) = 0] = 1 − 𝑒^(−𝜆𝑡),  𝑡 ≥ 0

so T is exponentially distributed with mean 1/𝜆.
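Since F_T(t) = 1 − e^(−λt), the mean E[T] = 1/λ can be recovered by numerically integrating the survival function P(T > t) = e^(−λt); a small sketch (λ = 2 is illustrative):

```python
import math

lam = 2.0

def survival(t):
    """P(T > t) = P[X(t) = 0] = exp(-lam * t)."""
    return math.exp(-lam * t)

# E[T] equals the integral of the survival function over (0, infinity);
# a left Riemann sum over (0, 20) is enough here since the tail is tiny
dt = 1e-4
mean_T = sum(survival(i * dt) * dt for i in range(200_000))
print(round(mean_T, 3))  # ≈ 1/lam = 0.5
```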


Tutorial Example
 Customers arrive at a bank according to a Poisson process with a
mean rate of 5 per minute.

 Find the probability that during a 1-minute interval, no customer


arrives.

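Worked numerically: with rate 5 per minute and a 1-minute interval, λt = 5, so the required probability is P[X(1) = 0] = e^(−5):

```python
import math

lam_t = 5 * 1  # rate 5 per minute × 1 minute
p_no_customer = math.exp(-lam_t)  # P[X(1) = 0] = e^{-5}
print(round(p_no_customer, 5))  # 0.00674
```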


Tutorial Example
 The particles are emitted from a radioactive source at the rate of 50
per hour.

 Find the probability that exactly 10 particles are emitted during a 10-
minute period.

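Worked numerically: the per-minute rate is 50/60, so over 10 minutes λt = 25/3 ≈ 8.33, and the answer is the Poisson pmf at k = 10:

```python
import math

lam_t = (50 / 60) * 10  # 50 per hour expressed per minute, times 10 minutes
p10 = lam_t ** 10 * math.exp(-lam_t) / math.factorial(10)
print(round(p10, 4))  # ≈ 0.107
```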


Tutorial Example
 Telephone calls are initiated through an exchange at the average rate
of 75 per minute and are described by a Poisson process.

 Find the probability that more than 3 calls are initiated in any 5-
second period.

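Worked numerically: λt = 75 × (5/60) = 6.25, and P[X > 3] = 1 − Σ_{k=0..3} P[X = k]:

```python
import math

lam_t = 75 * (5 / 60)  # 75 per minute × 5 seconds
p_more_than_3 = 1 - sum(lam_t ** k * math.exp(-lam_t) / math.factorial(k)
                        for k in range(4))
print(round(p_more_than_3, 4))  # ≈ 0.8697
```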




Markov Process



Markov Process : Markov Chain
 A Markov process, named after the mathematician Andrey Markov, is
a stochastic process that satisfies the Markov property,
 which states that the future state of the process depends only on its
current state and not on the sequence of events that preceded it.
 In other words, given the present state, the future behavior of the
process is independent of how it arrived at its current state.
Applications:
 Weather Forecasting: Modeling weather patterns using Markov
processes.
 Financial Modeling: Analyzing stock prices or market behaviors.
 Genetics: Modeling DNA sequences and evolutionary processes.



Markov Process : Markov Chain
What is a Markov Process?

 A Markov process is a mathematical framework used to model


systems that transition from one state to another in a probabilistic
manner.

PrX t 1  xt 1 | X 1  X t  x1  xt   PrX t 1  xt 1 | X t  xt 

X1 X2 X3 X4 X5

 Stationary Assumption: Transition probabilities are independent


of time (t)
Pr  X t 1  b | X t  a   pab

3 Dr. R.K.Mugelan, Associate Professor, SENSE, VIT


Markov Process : Markov Chain
Basic Terminology:

 States: The possible conditions or configurations that the system


can be in.

 Transition Probabilities: The probabilities of moving from one


state to another in a single step.

 Transition Matrix: A matrix representing the transition


probabilities between states.

 Stochastic Finite State Machine (FSM): It is a mathematical


model used to describe systems with discrete states where
transitions between states occur probabilistically.



Markov Process : Markov Chain
 States: The possible conditions or configurations that the system
can be in.

 Classification of states:



Markov Process : Markov Chain
 Simple Example
Weather:
• raining today 40% rain tomorrow
60% no rain tomorrow
• not raining today 20% rain tomorrow
80% no rain tomorrow

Stochastic FSM: two states, rain and no rain, with self-loops
rain → rain 0.4 and no rain → no rain 0.8, and cross edges
rain → no rain 0.6 and no rain → rain 0.2.
Markov Process : Markov Chain
The transition matrix for the weather example:

    P = | 0.4  0.6 |
        | 0.2  0.8 |

• Stochastic matrix: rows sum to 1
• Doubly stochastic matrix: rows and columns sum to 1
Markov Process : Markov Chain
 Markov process - described by a stochastic FSM
 Markov chain - a random walk on this graph
 (distribution over paths)

 Edge weights give us Pr[X_{t+1} = b | X_t = a] = p_ab

 We can ask more complex questions, like

Pr[X_{t+2} = a | X_t = b] = ?


Markov Process : Markov Chain
Coke vs. Pepsi Example

 Given that a person’s last cola purchase was Coke, there is a 90%
chance that his next cola purchase will also be Coke.

 If a person’s last cola purchase was Pepsi, there is an 80% chance


that his next cola purchase will also be Pepsi.
Transition matrix (states ordered Coke, Pepsi):

    P = | 0.9  0.1 |
        | 0.2  0.8 |


Markov Process : Markov Chain
Given that a person is currently a Pepsi purchaser, what is the
probability that he will purchase Coke two purchases from now?

Pr[Pepsi → ? → Coke] = Pr[Pepsi → Coke → Coke] + Pr[Pepsi → Pepsi → Coke]
                     = 0.2 × 0.9 + 0.8 × 0.2 = 0.34

    P² = | 0.9  0.1 |²  =  | 0.83  0.17 |
         | 0.2  0.8 |      | 0.34  0.66 |

The (Pepsi, Coke) entry of P² gives the answer: 0.34.
Markov Process : Markov Chain
 Given that a person is currently a Coke purchaser, what is the
probability that he will purchase Pepsi three purchases from now?

    P³ = | 0.9  0.1 | · | 0.83  0.17 |  =  | 0.781  0.219 |
         | 0.2  0.8 |   | 0.34  0.66 |     | 0.438  0.562 |

The (Coke, Pepsi) entry of P³ gives the answer: 0.219.


Markov Process : Markov Chain
 Assume each person makes one cola purchase per week

 Suppose 60% of all people now drink Coke, and 40% drink Pepsi

 What fraction of people will be drinking Coke three weeks from


now?
0.9 0.1  0.781 0.219 
P  P 3

 0.2 0.8  0.438 0.562 
Pr[X3=Coke] = 0.6 * 0.781 + 0.4 * 0.438 = 0.6438

Qi - the distribution in week i


Q0=(0.6,0.4) - initial distribution
Q3= Q0 * P3 =(0.6438,0.3562)
12 Dr. R.K.Mugelan, Associate Professor, SENSE, VIT
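The matrix powers and the three-week distribution above can be reproduced in a few lines of plain Python (no external libraries; states are ordered Coke, Pepsi):

```python
def matmul(A, B):
    """Multiply two small matrices stored as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

P = [[0.9, 0.1],   # Coke  -> Coke, Pepsi
     [0.2, 0.8]]   # Pepsi -> Coke, Pepsi

P2 = matmul(P, P)    # two-step transition probabilities
P3 = matmul(P2, P)   # three-step transition probabilities

Q0 = [[0.6, 0.4]]    # initial distribution: 60% Coke, 40% Pepsi
Q3 = matmul(Q0, P3)  # distribution after three weeks

print(round(P2[1][0], 2))  # Pepsi -> Coke in two steps: 0.34
print(round(P3[0][1], 3))  # Coke -> Pepsi in three steps: 0.219
print(round(Q3[0][0], 4))  # fraction drinking Coke after 3 weeks: 0.6438
```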
Markov Process : Markov Chain
 Simulation: as the week index i grows, Pr[Xi = Coke] converges to
the stationary distribution (2/3, 1/3), i.e. Pr[Xi = Coke] → 2/3.
(Figure: Pr[Xi = Coke] plotted against week i, approaching 2/3.)




Wiener Process
(Brownian Motion Process)


Wiener / Brownian Process
 Brownian motion and the Wiener process are closely related
concepts in mathematics and stochastic calculus.
 In fact, they are often used interchangeably.
 Brownian motion, named after the botanist Robert Brown, who
observed the erratic movement of pollen particles in water, is a
continuous stochastic process.
 It describes the random motion of particles suspended in a fluid
(usually a gas or liquid), subject to random collisions with the
molecules of the fluid.
 Mathematically, it is characterized by properties such as being
continuous, Markovian, and having stationary independent
increments.


Wiener / Brownian Process
 The Wiener process, named after mathematician Norbert Wiener, is
a specific type of continuous-time stochastic process that serves as a
mathematical model for Brownian motion.

 It is defined by three key properties:

1. Continuity: The Wiener process is a continuous function of time.

2. Stationary independent increments: Increments of the Wiener


process over disjoint time intervals are independent and identically
distributed (i.i.d.).

3. Normal distribution of increments: Increments of the Wiener


process over a time interval are normally distributed with mean
zero and variance proportional to the length of the time interval.

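These properties can be checked by simulation. A minimal sketch that builds W(t) as a sum of independent Gaussian increments and verifies that W(t) has mean zero and variance proportional to t (the path count, t, and step count are illustrative):

```python
import random
import statistics

random.seed(42)

def sample_W(t, n_steps):
    """One sample of W(t): the sum of independent N(0, dt) increments."""
    dt = t / n_steps
    return sum(random.gauss(0.0, dt ** 0.5) for _ in range(n_steps))

t = 2.0
W = [sample_W(t, n_steps=4) for _ in range(50_000)]
print(round(statistics.mean(W), 2))      # ≈ 0: increments are zero-mean
print(round(statistics.variance(W), 1))  # ≈ t = 2.0: variance grows with t
```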


Complex Random Processes



Complex Random Processes
Z(t) = X(t) + jY(t)
Z(t) stationary ⇔ X(t) and Y(t) jointly stationary
Z(t) w.s.s. ⇔ X(t) and Y(t) jointly w.s.s.

E[Z(t)] = E[X(t)] + jE[Y(t)]

Autocorrelation function: R_ZZ(t, t + τ) = E[Z*(t) Z(t + τ)]

Autocovariance function:
C_ZZ(t, t + τ) = E[{Z(t) − E[Z(t)]}* {Z(t + τ) − E[Z(t + τ)]}]
              = R_ZZ(t, t + τ) − E[Z(t)]* E[Z(t + τ)]
Complex Random Processes
Z(t) w.s.s. ⇒ Z̄ = X̄ + jȲ = const,
R_ZZ(t, t + τ) = R_ZZ(τ),
C_ZZ(t, t + τ) = C_ZZ(τ)

Cross-correlation function: R_{Zi,Zj}(t, t + τ) = E[Zi*(t) Zj(t + τ)]

Cross-covariance function:
C_{Zi,Zj}(t, t + τ) = E[{Zi(t) − E[Zi(t)]}* {Zj(t + τ) − E[Zj(t + τ)]}]

Jointly w.s.s. ⇒ R_{Zi,Zj}(t, t + τ) = R_{Zi,Zj}(τ) and C_{Zi,Zj}(t, t + τ) = C_{Zi,Zj}(τ)

Zi(t) and Zj(t) uncorrelated ⇔ C_{Zi,Zj}(t, t + τ) = 0
Zi(t) and Zj(t) orthogonal ⇔ R_{Zi,Zj}(t, t + τ) = 0
Tutorial Example
Example:

 A complex random process V(t) is the sum of N complex signals:

V(t) = Σ_{n=1}^{N} A_n e^{jω0 t + jΘn}

where ω0 is a constant, the random variables {A_n, Θn : n = 1, …, N}
are independent, and each Θn is uniformly distributed on (0, 2π).

 Find the autocorrelation function of V(t).


Tutorial Example

R_VV(t, t + τ) = E[V*(t) V(t + τ)]
             = E[ Σ_{n=1}^{N} A_n e^{−jω0 t − jΘn} · Σ_{m=1}^{N} A_m e^{jω0 t + jω0 τ + jΘm} ]
             = e^{jω0 τ} Σ_{n=1}^{N} Σ_{m=1}^{N} E[A_n A_m] E[e^{j(Θm − Θn)}]

For the phase factor,

E[e^{j(Θm − Θn)}] = ∫₀^{2π} ∫₀^{2π} (1/(2π)²) e^{j(θm − θn)} dθn dθm = 0,  m ≠ n
E[e^{j(Θn − Θn)}] = ∫₀^{2π} (1/(2π)) dθn = 1,                        m = n

so only the m = n terms survive:

R_VV(t, t + τ) = e^{jω0 τ} Σ_{n=1}^{N} E[A_n²] = R_VV(τ)
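The closed form can be checked by Monte Carlo. A sketch that, for simplicity, fixes the amplitudes A_n as constants (so E[A_n²] = A_n²); the amplitude values, ω0, τ, and sample size are illustrative:

```python
import cmath
import math
import random

random.seed(7)

A = [1.0, 2.0, 0.5]               # fixed amplitudes: E[A_n^2] = A_n^2
w0, tau, t = 2 * math.pi, 0.3, 1.7
N = len(A)

def sample_V(time, thetas):
    """One realization of V(time) for a given draw of the phases."""
    return sum(A[n] * cmath.exp(1j * (w0 * time + thetas[n])) for n in range(N))

# Monte Carlo estimate of R_VV(tau) = E[V*(t) V(t + tau)]
M = 20_000
acc = 0j
for _ in range(M):
    thetas = [random.uniform(0.0, 2 * math.pi) for _ in range(N)]
    acc += sample_V(t, thetas).conjugate() * sample_V(t + tau, thetas)
est = acc / M

theory = cmath.exp(1j * w0 * tau) * sum(a * a for a in A)
print(abs(est - theory) < 0.2)  # estimate agrees with e^{jω0τ} Σ E[A_n²]
```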


Summary
 Introduction
 Random Process: Classifications.
 Stationarity and Independence.
 Time Averages and Ergodic Random process.
 Characterizing a Random Process: The Mean, Correlation Functions,
Covariance Functions, and their Properties
 Different processes:
 Gaussian Random Process
 Poisson Random Process,
 Wiener Process,
 Markov process,
 Complex Random Process.

