Random Processes
Dr. Ali Hussein Muqaibel
Introduction
• Real life: time (or space) waveforms (desired + undesired).
• Our progress and development rely on our ability to deal with such waveforms.
• The set of all the functions that are available (the "menu") is called the ensemble of the random process.
• The graph of the function $X(t,s)$ versus $t$, for $s$ fixed, is called a realization, sample path, or sample function of the random process.
• For each fixed $t_k$ from the index set, $X(t_k,s)$ is a random variable.
3/31/2013
Formal Definition
• Consider a random experiment specified by the outcomes $s$ from some sample space $S$, and by the probabilities on these events.
• Suppose that to every outcome $s \in S$ we assign a function of time according to some rule: $X(t,s)$, $t \in I$.
• We have created an indexed family of random variables, $\{X(t,s),\ t \in I\}$.
• This family is called a random process (or stochastic process).
• We usually suppress the $s$ and use $X(t)$ to denote a random process.
• A stochastic process is said to be discrete-time if the index set $I$ is a countable set (e.g., the set of integers or the set of nonnegative integers).
• A continuous-time stochastic process is one in which the index set $I$ is continuous (e.g., thermal noise).
Deterministic and Non-deterministic Processes
• Non-deterministic: future values cannot be predicted from current ones.
  – Most random processes are non-deterministic.
• Deterministic: future values can be predicted from past observations, for example:
  $X(t,s) = s\cos(2\pi t)$, a sinusoid with random amplitude $s \in [-1,1]$
  $Y(t,s) = \cos(2\pi t + s)$, a sinusoid with random phase $s$
Distribution and Density Functions
• A r.v. is fully characterized by a pdf or CDF. How do we characterize random processes?
• To fully define a random process, we need the $N$-dimensional joint distribution (or density) function.
• First-order distribution function:
  $F_X(x_1; t_1) = P[X(t_1) \le x_1]$
• Second-order joint distribution function:
  $F_X(x_1, x_2; t_1, t_2) = P[X(t_1) \le x_1,\ X(t_2) \le x_2]$
• $N$th-order joint distribution function:
  $F_X(x_1, \dots, x_N; t_1, \dots, t_N) = P[X(t_1) \le x_1, \dots, X(t_N) \le x_N]$
• The corresponding joint density function:
  $f_X(x_1, \dots, x_N; t_1, \dots, t_N) = \dfrac{\partial^N F_X(x_1, \dots, x_N; t_1, \dots, t_N)}{\partial x_1 \cdots \partial x_N}$
Stationarity and Independence
• Statistical independence: processes $X(t)$ and $Y(t)$ are statistically independent if
  $f_{X,Y}(x_1,\dots,x_N,\,y_1,\dots,y_M;\ t_1,\dots,t_N,\,t'_1,\dots,t'_M) = f_X(x_1,\dots,x_N;\ t_1,\dots,t_N)\, f_Y(y_1,\dots,y_M;\ t'_1,\dots,t'_M)$
• Stationary:
  – All statistical properties do not change with time.
• First-order stationary process:
  $f_X(x_1; t_1) = f_X(x_1; t_1 + \Delta)$ for any $t_1$ and $\Delta$ (stationary to order one)
  $\Rightarrow$ the mean $E[X(t)] = \bar{X}$ is constant.
• Proof:
  $E[X(t_1)] = \int_{-\infty}^{\infty} x\, f_X(x; t_1)\, dx$
  Let $t_2 = t_1 + \Delta$. Since $f_X(x; t_1 + \Delta) = f_X(x; t_1)$,
  $E[X(t_2)] = \int_{-\infty}^{\infty} x\, f_X(x; t_1 + \Delta)\, dx = E[X(t_1)]$, a constant.
Cyclostationary
A discrete-time or continuous-time random process $X(t)$ is said to be cyclostationary if the joint cumulative distribution function of any set of samples is invariant with respect to shifts of the origin by integer multiples of some period $T$.
Cross-Correlation Function and its Properties
• $R_{XY}(t, t+\tau) = E[X(t)\,Y(t+\tau)]$
• If $X$ and $Y$ are jointly w.s.s. we may write $R_{XY}(\tau) = E[X(t)\,Y(t+\tau)]$.
• Orthogonal processes: $R_{XY}(t, t+\tau) = 0$.
• If $X$ and $Y$ are statistically independent:
  $R_{XY}(t, t+\tau) = E[X(t)]\,E[Y(t+\tau)]$
• If, in addition to being independent, they are at least w.s.s.:
  $R_{XY}(\tau) = \bar{X}\,\bar{Y}$
Some Properties of $R_{XY}(\tau)$
• $R_{XY}(-\tau) = R_{YX}(\tau)$
• $|R_{XY}(\tau)| \le \sqrt{R_{XX}(0)\,R_{YY}(0)}$
• $|R_{XY}(\tau)| \le \frac{1}{2}\,[R_{XX}(0) + R_{YY}(0)]$
• The geometric-mean bound is tighter than the arithmetic-mean bound:
  $\sqrt{R_{XX}(0)\,R_{YY}(0)} \le \frac{1}{2}\,[R_{XX}(0) + R_{YY}(0)]$
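These bounds are easy to check numerically. The Python sketch below (not part of the original slides; the signal model and lag range are illustrative assumptions) estimates $R_{XY}(\tau)$ from a simulated jointly w.s.s. pair and verifies both inequalities, including that the geometric-mean bound is the tighter one.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
# Illustrative jointly w.s.s. pair: Y is a delayed copy of X plus white noise.
X = rng.standard_normal(N)
Y = np.roll(X, 3) + 0.5 * rng.standard_normal(N)

def corr(a, b, lag):
    """Time-average estimate of R_ab(lag) = E[a(n) b(n + lag)]."""
    return float(np.mean(a[:N - lag] * b[lag:])) if lag else float(np.mean(a * b))

Rxx0, Ryy0 = corr(X, X, 0), corr(Y, Y, 0)
geo = np.sqrt(Rxx0 * Ryy0)        # geometric-mean bound
arith = 0.5 * (Rxx0 + Ryy0)       # arithmetic-mean bound
for lag in range(10):
    Rxy = corr(X, Y, lag)
    assert abs(Rxy) <= geo <= arith   # |R_XY| <= sqrt(Rxx(0)Ryy(0)) <= (Rxx(0)+Ryy(0))/2
```

The cross-correlation peaks near the built-in delay of 3 samples, yet still stays under both bounds.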
Measurement of the Correlation Function
• In real life, we can never measure the true correlation $R_{XX}(\tau)$.
• We assume ergodicity and use a portion of the available time.
• Ergodicity is assumed in a "physical sense"; there is no need to prove it mathematically.
• Jointly ergodic implies stationary.
• The time-average estimate over an observation window of length $2T$ is
  $R_o(\tau) = \dfrac{1}{2T}\int_{-T}^{T} x(t)\,x(t+\tau)\,dt \approx R_{XX}(\tau)$
• Similarly, we may estimate $R_{YY}(\tau)$ and $R_{XY}(\tau)$.
[Figure: correlator block diagram — the input $x(t)$ and a delayed replica are multiplied, and the product $x(t)\,x(t+\tau)$ is integrated over the window and scaled by $1/2T$.]
Example
• Use the above system to measure $R_{XX}(\tau)$ for $X(t) = A\cos(\omega_0 t + \Theta)$.
• $R_o(\tau) = \dfrac{1}{2T}\int_{-T}^{T} A^2 \cos(\omega_0 t + \theta)\,\cos(\omega_0 (t+\tau) + \theta)\, dt$
• $R_o(\tau) = \dfrac{A^2}{2}\cos(\omega_0 \tau) + \dfrac{A^2}{2}\,\dfrac{\sin(2\omega_0 T)}{2\omega_0 T}\cos(\omega_0 \tau + 2\theta)$
• where the true autocorrelation is $R_{XX}(\tau) = \dfrac{A^2}{2}\cos(\omega_0 \tau)$
• and the error term is $\varepsilon = \dfrac{A^2}{2}\,\dfrac{\sin(2\omega_0 T)}{2\omega_0 T}\cos(\omega_0 \tau + 2\theta)$.
• If we require the error to be at least 20 times less than the largest value of the true autocorrelation, $|\varepsilon| \le 0.05\, R_{XX}(0)$:
  $\dfrac{1}{2\omega_0 T} \le 0.05 \ \Rightarrow\ T \ge \dfrac{10}{\omega_0}$
• Wait long enough! The required $T$ depends on the frequency $\omega_0$.
% Measurement of the correlation function (MATLAB)
% Dr. Ali Muqaibel
clear all
close all
clc
T=100;
A=1;
omeg=0.2;
t=-T:T;
thet=2*pi*rand(1,1);
X=A*cos(omeg*t+thet);
[R,tau]=xcorr(X,'unbiased');
%R=R/(2*T);
True_R=A^2/2*cos(omeg*tau);
Err=A^2/2*cos(omeg*tau+2*thet)*sin(2*omeg*T)/(2*omeg*T);
subplot(3,1,1)
plot(tau,True_R,tau,R+Err,':')
title('True')
subplot(3,1,2)
plot(tau,R,':')
title('Measured')
subplot(3,1,3)
plot(tau,Err)
title('Error')
[Figure: True, Measured, and Error curves for T=20, T=50, and T=100 with omeg=0.2. Note the error is less than 5%.]
Gaussian Random Process
A random process $X(t)$ is a Gaussian random process if the samples $X_1 = X(t_1), \dots, X_k = X(t_k)$ are jointly Gaussian random variables for all $k$ and all choices of $t_1, \dots, t_k$:
$f_{X_1,\dots,X_k}(x_1,\dots,x_k) = \dfrac{\exp\!\left\{-\frac{1}{2}(\mathbf{x}-\mathbf{m})^T \mathbf{C}^{-1} (\mathbf{x}-\mathbf{m})\right\}}{(2\pi)^{k/2}\,|\mathbf{C}|^{1/2}}$
where $\mathbf{m}$ is the mean vector and $\mathbf{C}$ is the covariance matrix of the samples.
For iid Gaussian samples, $\{C_X(t_i, t_j)\} = \{\sigma^2 \delta_{ij}\} = \sigma^2 \mathbf{I}$, so
$f_{X_1,\dots,X_k}(x_1,\dots,x_k) = \dfrac{1}{(2\pi\sigma^2)^{k/2}} \exp\!\left\{-\sum_{i=1}^{k}\dfrac{(x_i - m)^2}{2\sigma^2}\right\} = f_X(x_1)\,f_X(x_2)\cdots f_X(x_k)$
Example of a Gaussian Random Process
• A Gaussian random process which is w.s.s. with $\bar{X} = 4$ and $R_{XX}(\tau) = 25\,e^{-3|\tau|} + 16$.
• Specify the joint density function for three random variables $X(t_i)$, $i = 1, 2, 3$.
• The mean is the same for all samples: $E[X(t_i)] = \bar{X} = 4$.
• The covariances follow from the autocorrelation:
  $C_X(t_i, t_k) = R_{XX}(t_k - t_i) - \bar{X}^2 = 25\,e^{-3|t_k - t_i|} + 16 - 16 = 25\,e^{-3|t_k - t_i|}$
• In particular, the variance is $C_X(0) = R_{XX}(0) - \bar{X}^2 = 25 + 16 - 16 = 25$.
• The mean vector $[4\ 4\ 4]^T$ and the covariance matrix $[C_X(t_i, t_k)]$ fully specify the third-order joint Gaussian density.
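The mean vector and covariance matrix above can be assembled directly. The Python sketch below is not part of the original slides: the three sampling instants $t = 0, 1, 2$ are an illustrative assumption, and it relies on the reconstructed autocorrelation $R_{XX}(\tau) = 25e^{-3|\tau|} + 16$.

```python
import numpy as np

mean = 4.0
t = np.array([0.0, 1.0, 2.0])     # assumed sampling instants, for illustration
# C_X(t_i, t_k) = R_XX(t_k - t_i) - mean^2 = 25 * exp(-3 |t_k - t_i|)
C = 25.0 * np.exp(-3.0 * np.abs(t[:, None] - t[None, :]))
m = np.full(3, mean)

def joint_gaussian_pdf(x, m, C):
    """f(x) = exp(-(x-m)^T C^{-1} (x-m) / 2) / sqrt((2 pi)^k |C|)."""
    k = len(m)
    d = x - m
    quad = d @ np.linalg.solve(C, d)
    return float(np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** k * np.linalg.det(C)))

p_at_mean = joint_gaussian_pdf(m, m, C)   # the density is maximal at the mean vector
```

Evaluating the density at the mean vector gives its peak value; any other sample vector yields a smaller density.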
Complex Random Processes
• A complex random process is given by $Z(t) = X(t) + jY(t)$, where $X(t)$ and $Y(t)$ are real random processes.
• Autocorrelation: $R_{ZZ}(t, t+\tau) = E[Z^*(t)\,Z(t+\tau)]$
• Autocovariance: $C_{ZZ}(t, t+\tau) = R_{ZZ}(t, t+\tau) - \bar{Z}^*(t)\,\bar{Z}(t+\tau)$
• Cross-correlation: $R_{Z_1 Z_2}(t, t+\tau) = E[Z_1^*(t)\,Z_2(t+\tau)]$
• Note the conjugate.
• There could be a factor of $\frac{1}{2}$ in some books.
• See example in Peebles.
Example: Signal Plus Noise
Suppose we observe a process $X(t) = S(t) + N(t)$, which consists of a desired signal $S(t)$ plus noise $N(t)$.
Find the cross-correlation between the observed signal and the desired signal, assuming that $S(t)$ and $N(t)$ are independent random processes.
Solution:
$R_{SX}(t_1, t_2) = E[S(t_1)\,X(t_2)] = E[S(t_1)S(t_2)] + E[S(t_1)N(t_2)] = R_S(t_1, t_2) + m_S(t_1)\,m_N(t_2)$
using independence in the last step.
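The result can be sanity-checked by simulation. In the Python sketch below (not from the original slides; the AR(1) signal model is an illustrative assumption), both the signal and the noise are zero mean, so $R_{SX}$ should match $R_S$.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
# Illustrative zero-mean desired signal: an AR(1) process.
w = rng.standard_normal(N)
s = np.empty(N)
s[0] = w[0]
for n in range(1, N):
    s[n] = 0.9 * s[n - 1] + w[n]
noise = rng.standard_normal(N)    # independent, zero-mean noise
x = s + noise                     # observed process X = S + N

# Zero-lag check: R_SX = R_S + m_S m_N = R_S, since both means are zero.
Rsx0 = float(np.mean(s * x))
Rs0 = float(np.mean(s * s))
assert abs(Rsx0 - Rs0) < 0.05
```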
See Leon-Garcia,
Probability, Statistics, and Random Processes for Electrical Engineers, 3rd Edition,
Section 9.5: Gaussian Random Processes, Wiener Process, and Brownian Motion
iid Random Process
Let $X_n$ be a discrete-time random process consisting of a sequence of independent, identically distributed (iid) random variables with common cdf $F_X(x)$, mean $m$, and variance $\sigma^2$. The sequence is called the iid random process.
The joint cdf for any time instants $n_1, \dots, n_k$ is given by
$F_{X_{n_1},\dots,X_{n_k}}(x_1,\dots,x_k) = F_X(x_1)\,F_X(x_2)\cdots F_X(x_k)$
The mean is constant:
$m_X(n) = E[X_n] = m \quad \text{for all } n$
The autocovariance function is obtained as follows. If $n_1 \ne n_2$, then
$C_X(n_1, n_2) = E[(X_{n_1} - m)(X_{n_2} - m)] = E[X_{n_1} - m]\,E[X_{n_2} - m] = 0$
since $X_{n_1}$ and $X_{n_2}$ are independent. If $n_1 = n_2 = n$, then
$C_X(n_1, n_2) = E[(X_n - m)^2] = \sigma^2$
Combining the two cases:
$C_X(n_1, n_2) = \sigma^2\,\delta_{n_1 n_2}$
where $\delta_{n_1 n_2} = 1$ if $n_1 = n_2$ and 0 otherwise.
The autocorrelation function of the iid process is:
$R_X(n_1, n_2) = C_X(n_1, n_2) + m^2$
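These three relations can be verified on simulated data. The Python sketch below (not from the original slides; the mean and variance values are illustrative) estimates the autocovariance of an iid Gaussian sequence.

```python
import numpy as np

rng = np.random.default_rng(2)
m, sigma = 1.5, 2.0
X = m + sigma * rng.standard_normal(500_000)   # iid sequence, mean m, std sigma

def autocov(x, lag):
    """Sample autocovariance at the given lag."""
    d = x - x.mean()
    return float(np.mean(d[:len(d) - lag] * d[lag:])) if lag else float(np.mean(d * d))

assert abs(autocov(X, 0) - sigma**2) < 0.05    # C_X(n, n) = sigma^2
assert abs(autocov(X, 5)) < 0.05               # C_X(n1, n2) = 0 for n1 != n2
R1 = float(np.mean(X[:-1] * X[1:]))            # R_X at lag 1
assert abs(R1 - m**2) < 0.05                   # R_X = C_X + m^2 = m^2 here
```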
Example: Bernoulli Random Process
Let $I_n$ be a sequence of independent Bernoulli random variables. $I_n$ is then an iid random process taking on values from the set {0, 1}. A realization of such a process is shown in the figure.
For example, $I_n$ could be an indicator function for the event "a light bulb fails and is replaced on day n."
Since $I_n$ is a Bernoulli random variable, it has mean and variance
$m_I = p, \quad \sigma_I^2 = p(1-p)$
The independence of the $I_n$ makes probabilities easy to compute. For example, the probability that the first 4 bits in the sequence are 1001 is
$P[I_1 = 1, I_2 = 0, I_3 = 0, I_4 = 1] = P[I_1 = 1]\,P[I_2 = 0]\,P[I_3 = 0]\,P[I_4 = 1] = p^2 (1-p)^2$
Similarly, the probability that the second bit is 0 and the seventh is 1 is
$P[I_2 = 0, I_7 = 1] = P[I_2 = 0]\,P[I_7 = 1] = (1-p)\,p$
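The pattern probabilities above follow directly from independence, as this short Python sketch shows (not part of the original slides; the value $p = 0.3$ is an arbitrary illustration).

```python
from itertools import product

p = 0.3  # assumed success probability, for illustration only

def seq_prob(bits, p):
    """Probability of an exact iid Bernoulli bit pattern."""
    prob = 1.0
    for b in bits:
        prob *= p if b == 1 else (1 - p)
    return prob

# First four bits equal 1001: p (1-p) (1-p) p = p^2 (1-p)^2
p_1001 = seq_prob([1, 0, 0, 1], p)
assert abs(p_1001 - p**2 * (1 - p)**2) < 1e-12

# Second bit 0 and seventh bit 1: (1-p) * p, by independence
p_two_bits = (1 - p) * p

# Sanity check: the probabilities of all length-4 patterns sum to 1.
total = sum(seq_prob(bits, p) for bits in product([0, 1], repeat=4))
assert abs(total - 1.0) < 1e-12
```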
Sum Processes: The Binomial Counting and Random Walk Processes
Many interesting random processes are obtained as the sum of a sequence of iid random variables $X_1, X_2, \dots$:
$S_n = X_1 + X_2 + \dots + X_n = S_{n-1} + X_n, \quad n = 1, 2, \dots$
with $S_0 = 0$. The sum process can be generated recursively in this way.
[Figure (b): Realization of a binomial process. $S_n$ denotes the number of light bulbs that have failed up to time n.]
Let the $I_n$ be the sequence of independent Bernoulli random variables of the previous example, and let $S_n$ be the corresponding sum process. $S_n$ is then the counting process that gives the number of successes in the first $n$ Bernoulli trials. The sample function for $S_n$ corresponding to a particular sequence of $I_n$ is shown in the figure above. If $I_n$ indicates that a light bulb fails and is replaced on day $n$, then $S_n$ denotes the number of light bulbs that have failed up to day $n$.
Since $S_n$ is the sum of $n$ independent Bernoulli random variables, $S_n$ is a binomial random variable with parameters $n$ and $p$:
$P[S_n = j] = \dbinom{n}{j}\, p^j (1-p)^{n-j} \quad \text{for } 0 \le j \le n$
and zero otherwise. Thus $S_n$ has mean $np$ and variance $np(1-p)$.
Note that the mean and variance of this process grow linearly with time ($n$).
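A quick simulation confirms the binomial mean and variance. The Python sketch below (not from the original slides; $p$, $n$, and the number of trials are illustrative choices) generates many sample paths of $S_n$.

```python
import numpy as np

rng = np.random.default_rng(3)
p, n, trials = 0.4, 50, 200_000
# Sum process S_n: the number of successes in the first n Bernoulli trials.
X = (rng.random((trials, n)) < p).astype(float)
S = X.cumsum(axis=1)            # each row is one sample path S_1, ..., S_n
Sn = S[:, -1]

assert abs(Sn.mean() - n * p) < 0.2            # E[S_n] = np = 20
assert abs(Sn.var() - n * p * (1 - p)) < 0.3   # VAR[S_n] = np(1-p) = 12
```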
Example: One-Dimensional Random Walk
Let $D_n$ be the position after $n$ steps of $\pm 1$, where each step is $+1$ with probability $p$. The event $D_n = j$ requires $(n+j)/2$ steps of $+1$ and $(n-j)/2$ steps of $-1$, so
$P[D_n = j] = \dbinom{n}{(n+j)/2}\, p^{(n+j)/2} (1-p)^{(n-j)/2}$
where $j$ and $n$ must have the same parity.
For a sum process with iid zero-mean Gaussian increments of variance $\sigma^2$, the independent-increments property gives the joint pdf of $S_{n_1}$ and $S_{n_2}$ ($n_1 < n_2$):
$f_{S_{n_1}, S_{n_2}}(y_1, y_2) = f_{S_{n_2} - S_{n_1}}(y_2 - y_1)\, f_{S_{n_1}}(y_1) = \dfrac{e^{-(y_2 - y_1)^2 / [2\sigma^2 (n_2 - n_1)]}}{\sqrt{2\pi\,\sigma^2 (n_2 - n_1)}}\;\dfrac{e^{-y_1^2 / 2 n_1 \sigma^2}}{\sqrt{2\pi\, n_1 \sigma^2}}$
Poisson Process
Consider a situation in which events occur at random instants of time at an average rate of $\lambda$ events per second, for example, the arrival of customers to a service station or the breakdown of a component in some system. Let $N(t)$ be the number of event occurrences in the time interval $[0, t]$. $N(t)$ is then a nondecreasing, integer-valued, continuous-time random process as shown in the figure.
[Figure: A sample path of the Poisson counting process. The event occurrence times are denoted by $S_1, S_2, \dots$. The $j$th interevent time is denoted by $X_j$.]
Poisson Process from the Binomial
Divide $[0, t]$ into $n$ subintervals and replace $p$ with $\lambda t / n$. If the probability of an event occurrence in each subinterval is $p$, then the expected number of event occurrences in the interval $[0, t]$ is $np$. Since events occur at a rate of $\lambda$ events per second, the average number of events in the interval $[0, t]$ is also $\lambda t$, so $np = \lambda t$. Letting $n \to \infty$ in the binomial pmf,
$P[N(t) = k] = \lim_{n \to \infty} \dbinom{n}{k} p^k (1-p)^{n-k} = \dfrac{(\lambda t)^k}{k!}\, e^{-\lambda t}, \quad k = 0, 1, \dots$
For this reason $N(t)$ is called the Poisson process.
For a detailed derivation, please see
http://www.vosesoftware.com/ModelRiskHelp/index.htm#Probability_theory_and_statistics/Stochastic_processes/Deriving_the_Poisson_distribution_from_the_Binomial.htm
Poisson Random Process
• Also known as the Poisson counting process.
• Models arrivals of customers, failures of parts, lightning strikes, internet traffic, ... for $t > 0$.
• Two conditions:
  – Events do not coincide.
  – The number of occurrences in any given time interval is independent of the number in any non-overlapping time interval (independent increments).
• Average rate of occurrence = $\lambda$.
• $P[X(t) = k] = \dfrac{(\lambda t)^k}{k!}\, e^{-\lambda t}, \quad k = 0, 1, 2, \dots, \quad t \ge 0$
• Mean: $E[X(t)] = \sum_{k=0}^{\infty} k\,\dfrac{(\lambda t)^k}{k!}\, e^{-\lambda t} = \lambda t$
• Second moment: $E[X^2(t)] = \lambda t\,(1 + \lambda t)$
• Variance: $\mathrm{VAR}[X(t)] = \lambda t$
• The probability distribution of the waiting time until the next occurrence is an exponential distribution.
• The occurrences are distributed uniformly on any interval of time.
http://en.wikipedia.org/wiki/Poisson_process
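The bulleted properties can be demonstrated by building $N(t)$ from exponential interarrival times, as the Poisson model implies. The Python sketch below is not from the original slides; the rate and horizon are illustrative assumptions.

```python
import math
import numpy as np

rng = np.random.default_rng(4)
lam, t, trials = 2.0, 3.0, 200_000     # illustrative rate and time horizon
# Build N(t) from iid exponential interarrival times with mean 1/lam.
k_max = 40                              # far above the mean count lam*t = 6
arrivals = rng.exponential(1.0 / lam, size=(trials, k_max)).cumsum(axis=1)
counts = (arrivals <= t).sum(axis=1)    # N(t) for each trial

assert abs(counts.mean() - lam * t) < 0.05    # E[N(t)] = lam * t
pmf6 = float(np.mean(counts == 6))
exact6 = (lam * t) ** 6 * math.exp(-lam * t) / math.factorial(6)
assert abs(pmf6 - exact6) < 0.01              # matches the Poisson pmf at k = 6
```

Exponential waiting times thus reproduce the Poisson pmf, illustrating the last bullets above.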
Joint Probability Density Function for the Poisson Random Process
• The first-order pmf of the Poisson process at time $t_1 > 0$ is
  $P[X(t_1) = k] = \dfrac{(\lambda t_1)^k}{k!}\, e^{-\lambda t_1}, \quad k = 0, 1, 2, \dots$
• The probability of $j - k$ occurrences over $(t_1, t_2]$, given that $k$ events occurred over $[0, t_1]$, is just the probability that $j - k$ events occurred over $(t_1, t_2]$ (independent increments):
  $\dfrac{[\lambda (t_2 - t_1)]^{j-k}}{(j-k)!}\, e^{-\lambda (t_2 - t_1)}$
• For $j \ge k$, the joint probability is given by
  $P[X(t_1) = k,\ X(t_2) = j] = P[X(t_1) = k]\; P[X(t_2) - X(t_1) = j - k] = \dfrac{(\lambda t_1)^k}{k!}\, e^{-\lambda t_1}\;\dfrac{[\lambda (t_2 - t_1)]^{j-k}}{(j-k)!}\, e^{-\lambda (t_2 - t_1)}$
• The joint density becomes
  $f_{X(t_1), X(t_2)}(x_1, x_2) = \sum_{k}\sum_{j} P[X(t_1) = k,\ X(t_2) = j]\;\delta(x_1 - k)\,\delta(x_2 - j)$
• Example: demonstrate the higher-dimensional pdf.
Example I
Inquiries arrive at a recorded message device according to a Poisson process of rate 15 inquiries per minute. Find the probability that in a 1-minute period, 3 inquiries arrive during the first 10 seconds and 2 inquiries arrive during the last 15 seconds.
The arrival rate in seconds is $\lambda = 15/60 = 1/4$ inquiries per second.
Writing time in seconds, the probability of interest is
$P[N(10) = 3 \text{ and } N(60) - N(45) = 2]$
By applying first the independent increments property, and then the stationary increments property, we obtain
$P[N(10) = 3]\; P[N(60) - N(45) = 2] = P[N(10) = 3]\; P[N(15) = 2] = \dfrac{(10\lambda)^3}{3!}\, e^{-10\lambda}\;\dfrac{(15\lambda)^2}{2!}\, e^{-15\lambda}$
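The final expression evaluates to a small number. A minimal Python check (not part of the original slides) computes it directly:

```python
import math

lam = 15 / 60            # 1/4 inquiries per second

def poisson_pmf(k, mu):
    """P[N = k] for a Poisson count with mean mu."""
    return mu**k * math.exp(-mu) / math.factorial(k)

# Independent + stationary increments:
# P[N(10) = 3, N(60) - N(45) = 2] = P[N(10) = 3] * P[N(15) = 2]
p = poisson_pmf(3, lam * 10) * poisson_pmf(2, lam * 15)
assert 0.034 < p < 0.036   # roughly 0.0353
```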
Example II
Find the mean and variance of the time until the arrival of the tenth inquiry in the previous example. The arrival rate is $\lambda = 1/4$ inquiries per second, so the inter-arrival times are exponential random variables with parameter $\lambda$.
From tables, the mean and variance of an inter-arrival time are $1/\lambda$ and $1/\lambda^2$, respectively.
The time of the tenth arrival is the sum of ten such iid random variables, thus
$E[S_{10}] = 10\, E[T] = \dfrac{10}{\lambda} = 40 \text{ sec}$
$\mathrm{VAR}[S_{10}] = 10\,\mathrm{VAR}[T] = \dfrac{10}{\lambda^2} = 160 \text{ sec}^2$
Example: Random Telegraph Signal
Consider a random process $X(t)$ that assumes the values $\pm 1$. Suppose that $X(0) = \pm 1$ with probability $\frac{1}{2}$ each, and suppose that $X(t)$ then changes polarity with each occurrence of an event in a Poisson process of rate $\alpha$. The next figure shows a sample function of $X(t)$.
[Figure: Sample path of a random telegraph signal. The times between transitions are iid exponential random variables.]
$X(t)$ will have the same sign as $X(0)$ if the number of events in $(0, t]$ is even:
$P[X(t) = \pm 1 \mid X(0) = \pm 1] = \sum_{j=0}^{\infty} \dfrac{(\alpha t)^{2j}}{(2j)!}\, e^{-\alpha t} = e^{-\alpha t}\,\dfrac{1}{2}\{e^{\alpha t} + e^{-\alpha t}\} = \dfrac{1}{2}\{1 + e^{-2\alpha t}\}$
$X(t)$ and $X(0)$ will differ in sign if the number of events in $t$ is odd:
$P[X(t) = \mp 1 \mid X(0) = \pm 1] = \sum_{j=0}^{\infty} \dfrac{(\alpha t)^{2j+1}}{(2j+1)!}\, e^{-\alpha t} = e^{-\alpha t}\,\dfrac{1}{2}\{e^{\alpha t} - e^{-\alpha t}\} = \dfrac{1}{2}\{1 - e^{-2\alpha t}\}$
Mean and Variance of the Random Telegraph Signal
We obtain the pmf for $X(t)$ by conditioning on $X(0)$:
$P[X(t) = 1] = P[X(t) = 1 \mid X(0) = 1]\,P[X(0) = 1] + P[X(t) = 1 \mid X(0) = -1]\,P[X(0) = -1]$
$P[X(t) = 1] = \dfrac{1}{2}\cdot\dfrac{1}{2}\{1 + e^{-2\alpha t}\} + \dfrac{1}{2}\cdot\dfrac{1}{2}\{1 - e^{-2\alpha t}\} = \dfrac{1}{2}$
$P[X(t) = -1] = 1 - P[X(t) = 1] = \dfrac{1}{2}$
Thus the random telegraph signal is equally likely to be $\pm 1$ at any time $t > 0$.
The mean and variance of $X(t)$ are
$m_X(t) = 0, \quad \mathrm{VAR}[X(t)] = E[X^2(t)] = 1$
Autocovariance of the Random Telegraph Signal
The autocovariance of $X(t)$ is found as follows:
$C_X(t_1, t_2) = E[X(t_1)\,X(t_2)] = 1 \cdot P[X(t_1) = X(t_2)] + (-1)\cdot P[X(t_1) \ne X(t_2)]$
$= \dfrac{1}{2}\{1 + e^{-2\alpha |t_2 - t_1|}\} - \dfrac{1}{2}\{1 - e^{-2\alpha |t_2 - t_1|}\} = e^{-2\alpha |t_2 - t_1|}$
Thus time samples of $X(t)$ become less and less correlated as the time between them increases.
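The exponential decay of the covariance can be confirmed by simulation. The Python sketch below (not in the original slides; rate, lag, and trial count are illustrative) uses the fact that the sign flips once per Poisson event.

```python
import numpy as np

rng = np.random.default_rng(5)
alpha, T, trials = 1.0, 2.0, 100_000
# X(0) = +/-1 equally likely; the sign flips at each Poisson(alpha) event.
x0 = rng.choice([-1, 1], size=trials)
n_events = rng.poisson(alpha * T, size=trials)   # number of events in (0, T]
xT = x0 * (-1) ** n_events                       # X(T) for each trial

# C_X(0, T) = E[X(0) X(T)] = exp(-2 * alpha * T)
emp = float(np.mean(x0 * xT))
assert abs(emp - np.exp(-2 * alpha * T)) < 0.02
assert abs(float(xT.mean())) < 0.02              # the mean stays zero
```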
Wiener Process and Brownian Motion
• A continuous-time Gaussian random process obtained as a limit of a discrete-time process.
• Suppose that the symmetric random walk process (i.e., $p = 0.5$) takes steps of magnitude $\pm h$ every $\delta$ seconds.
• We obtain a continuous-time process $X_\delta(t)$ by letting $X_\delta(t)$ be the accumulated sum of the random step process up to time $t$.
• $X_\delta(t)$ is a staircase function of time that takes jumps of $\pm h$ every $\delta$ seconds.
• At time $t$, the process will have taken $n = \lfloor t/\delta \rfloor$ jumps, so it is equal to
  $X_\delta(t) = h\,(D_1 + D_2 + \dots + D_n)$
  where the $D_i = \pm 1$ are the iid steps of the symmetric random walk.
• The mean and variance of $X_\delta(t)$:
  $E[X_\delta(t)] = 0$
  $\mathrm{VAR}[X_\delta(t)] = h^2\,\mathrm{VAR}[D_1 + \dots + D_n] = h^2\, n = h^2 \lfloor t/\delta \rfloor$
  We used the fact that $\mathrm{VAR}[D_i] = 4\,p(1-p) = 4 \cdot \tfrac{1}{2} \cdot \tfrac{1}{2} = 1$.
• Shrink the time between jumps, letting $\delta \to 0$ and $h \to 0$ with $h = \sqrt{\alpha\,\delta}$.
• Then $X(t)$ has mean and variance
  $E[X(t)] = 0, \quad \mathrm{VAR}[X(t)] = \dfrac{h^2\, t}{\delta} = \alpha t$
• $X(t)$ is called the Wiener random process. It is used to model Brownian motion, the motion of particles suspended in a fluid that move under the rapid and random impact of neighboring particles.
Wiener Process
• As $\delta \to 0$, $X_\delta(t)$ approaches the sum of an infinite number of random variables since $n = \lfloor t/\delta \rfloor \to \infty$:
  $X(t) = \lim_{\delta \to 0} X_\delta(t) = \lim_{\delta \to 0} h\,(D_1 + \dots + D_{\lfloor t/\delta \rfloor})$
• By the central limit theorem, the pdf of $X(t)$ therefore approaches that of a Gaussian random variable with mean zero and variance $\alpha t$:
  $f_{X(t)}(x) = \dfrac{1}{\sqrt{2\pi\,\alpha t}}\, e^{-x^2 / 2\alpha t}$
• $X(t)$ inherits the property of independent and stationary increments from the random walk process from which it is derived.
• The independent-increments property and the same sequence of steps used for the Poisson process can be used to show that the autocovariance of $X(t)$ is given by
  $C_X(t_1, t_2) = \alpha\,\min(t_1, t_2)$
• The Wiener and Poisson processes have the same covariance function despite the fact that they are very different processes.
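The limiting construction above can be simulated directly. The Python sketch below (not in the original slides; $\alpha$, $\delta$, and the trial count are illustrative) builds the scaled random walk with $h = \sqrt{\alpha\delta}$ and checks the limiting mean and variance.

```python
import numpy as np

rng = np.random.default_rng(6)
alpha, t, delta = 2.0, 1.0, 0.01
h = np.sqrt(alpha * delta)           # step size chosen so h^2 = alpha * delta
n = int(t / delta)                   # number of jumps up to time t
trials = 50_000
# Each row is one sample path of the scaled symmetric random walk.
steps = h * rng.choice([-1.0, 1.0], size=(trials, n))
Xt = steps.sum(axis=1)               # X_delta(t) for each trial

assert abs(float(Xt.mean())) < 0.03                  # E[X(t)] = 0
assert abs(float(Xt.var()) - alpha * t) < 0.06       # VAR[X(t)] -> alpha * t = 2
```

Shrinking $\delta$ further only tightens the Gaussian approximation promised by the central limit theorem.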
Practice Problem: Poisson Process
• Suppose that a secretary receives calls that arrive according to a Poisson process with a rate of 10 calls per hour.
• What is the probability that no calls go unanswered if the secretary is away from the office for the first and last 15 minutes of an hour?
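One way to check an answer to this problem numerically is with the independent-increments property: the two absence windows are non-overlapping, so their call counts are independent Poisson counts. This Python sketch (not part of the original slides) estimates the probability by simulation and compares it with the closed form.

```python
import math
import numpy as np

rng = np.random.default_rng(7)
rate = 10.0          # calls per hour
trials = 400_000
# The secretary misses only calls in the first and last quarter hour.
n_first = rng.poisson(rate * 0.25, trials)
n_last = rng.poisson(rate * 0.25, trials)   # independent increments
emp = float(np.mean((n_first == 0) & (n_last == 0)))
exact = math.exp(-rate * 0.5)               # e^{-5}: zero calls in 30 minutes total
assert abs(emp - exact) < 0.001
```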
In-Class Practice: Wide-Sense Stationary Random Process
• Let $X_n$ be an iid sequence of Gaussian random variables with zero mean and variance $\sigma^2$, and let $Y_n$ be the average of two consecutive values of $X_n$:
  $Y_n = \dfrac{X_n + X_{n-1}}{2}$
• Find the mean of $Y_n$.
• Find the covariance $C_Y(n_1, n_2)$.
• What is the distribution of the random variable $Y_n$? Is it stationary?
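After working the practice problem by hand, the answers can be checked by simulation. The Python sketch below (not from the original slides; $\sigma$ and the sequence length are illustrative) estimates the mean and covariance of $Y_n$.

```python
import numpy as np

rng = np.random.default_rng(8)
sigma = 1.5
X = sigma * rng.standard_normal(500_000)   # iid zero-mean Gaussian sequence
Y = 0.5 * (X[1:] + X[:-1])                 # Y_n = (X_n + X_{n-1}) / 2

def cov(y, k):
    """Sample covariance C_Y(n, n + k)."""
    d = y - y.mean()
    return float(np.mean(d[:len(d) - k] * d[k:])) if k else float(np.mean(d * d))

assert abs(float(Y.mean())) < 0.01                 # E[Y_n] = 0
assert abs(cov(Y, 0) - sigma**2 / 2) < 0.02        # C_Y(0) = sigma^2 / 2
assert abs(cov(Y, 1) - sigma**2 / 4) < 0.02        # C_Y(1) = sigma^2 / 4
assert abs(cov(Y, 2)) < 0.02                       # C_Y(k) = 0 for |k| >= 2
```

The covariance depends only on the lag, which is consistent with $Y_n$ being wide-sense stationary.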