Lecture-2
CUMULATIVE JOINT PDF AND JOINT PDF
Joint distributions are often encountered when dealing with combined
experiments or repeated trials of a single experiment.
Multiple random variables are essentially multidimensional functions
defined on the sample space of a combined experiment.
Experiment 1
S1 = {x1, x2, …,xm}
Experiment 2
S2 = {y1, y2, …, yn}
If we take any one element from each of S1 and S2:
0 \le P(x_i, y_j) \le 1 (joint probability of two or more outcomes)
Marginal probability distributions:
\sum_{j=1}^{n} P(x_i, y_j) = P(x_i)
\sum_{i=1}^{m} P(x_i, y_j) = P(y_j)
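As a quick illustration, here is a minimal Python sketch (assuming NumPy is available, with a made-up 2-by-3 joint PMF) that recovers both marginals by summing over rows and columns:

import numpy as np

# Hypothetical joint PMF P(x_i, y_j) for m = 2, n = 3; entries sum to 1.
P = np.array([[0.10, 0.20, 0.10],
              [0.25, 0.15, 0.20]])

P_x = P.sum(axis=1)   # sum over j: marginal P(x_i) -> [0.4, 0.6]
P_y = P.sum(axis=0)   # sum over i: marginal P(y_j) -> [0.35, 0.35, 0.30]

print(P_x, P_y, P.sum())   # the joint PMF itself sums to 1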
EXPECTATION OF RANDOM VARIABLES
(STATISTICAL AVERAGES)
Statistical averages, or moments,
play an important role in the
characterization of the random
variable.
The first moment of the probability
distribution of a random variable X
is called the mean value m_X, or
expected value, of X:
m_X = E[X] = \int_{-\infty}^{\infty} x\, p_X(x)\, dx
The second moment of a
probability distribution is the mean-square
value of X:
E[X^2] = \int_{-\infty}^{\infty} x^2\, p_X(x)\, dx
Central moments are the moments of the
difference between X and m_X; the second
central moment is the variance of X:
\mathrm{var}(X) = \sigma_X^2 = E[(X - m_X)^2]
The variance is equal to the difference
between the mean-square value
and the square of the mean:
\sigma_X^2 = E[X^2] - m_X^2
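A minimal numerical sketch (assuming NumPy, with a hypothetical discrete distribution) of the first moment, second moment, and the variance identity above:

import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])   # values taken by X (hypothetical)
p = np.array([0.1, 0.4, 0.3, 0.2])   # P(X = x); sums to 1

m_x = np.sum(x * p)                  # first moment (mean): 1.6
mean_square = np.sum(x**2 * p)       # second moment E[X^2]: 3.4
variance = mean_square - m_x**2      # E[X^2] - m_X^2 = 0.84

print(m_x, mean_square, variance)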
Contd
The variance provides a measure of the variable’s
“randomness”.
The mean and variance of a random variable give a
partial description of its pdf.
TIME AVERAGING AND ERGODICITY
An ergodic process is one in which any member of the ensemble
exhibits the same statistical behavior as the whole ensemble.
For an ergodic process: to measure the various statistical averages,
it is sufficient to look at only one realization of the process and
find the corresponding time average.
For a process to be ergodic it must be stationary;
the converse is not true.
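A minimal simulation sketch (assuming NumPy; white Gaussian samples stand in for an ergodic process): the time average over one realization and the ensemble average at a fixed instant both approach the true mean:

import numpy as np

rng = np.random.default_rng(0)
# 1000 realizations, 10000 time samples each; true mean 1.0, std 2.0
ensemble = rng.normal(loc=1.0, scale=2.0, size=(1000, 10000))

time_avg = ensemble[0].mean()         # average over time, one realization
ensemble_avg = ensemble[:, 0].mean()  # average across realizations, one instant

print(time_avg, ensemble_avg)         # both close to 1.0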
GAUSSIAN (OR NORMAL) RANDOM VARIABLE
(PROCESS)
A continuous random variable whose pdf is:
p_X(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(x - m_X)^2}{2\sigma^2}\right)
where m_X is the mean and \sigma^2 is the variance.
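A minimal check (assuming NumPy and SciPy are available) that the formula above matches a library implementation of the normal pdf:

import numpy as np
from scipy.stats import norm

m_x, sigma = 0.0, 1.0
x = np.linspace(-4.0, 4.0, 9)

pdf_manual = np.exp(-(x - m_x)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
pdf_scipy = norm.pdf(x, loc=m_x, scale=sigma)

print(np.allclose(pdf_manual, pdf_scipy))   # True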
CENTRAL LIMIT THEOREM
The CLT provides justification for using a Gaussian
process as a model, provided that:
The random variables are statistically independent
The random variables have probability distributions with the same
mean and variance
CLT
The central limit theorem states that
"The probability distribution of V_n approaches a
normalized Gaussian distribution N(0, 1) in the limit as
the number of random variables approaches infinity,"
where V_n = \frac{1}{\sqrt{n}}\sum_{i=1}^{n}\frac{X_i - m_X}{\sigma_X}
is the normalized sum of the n random variables.
When n is finite, it may provide a poor
approximation of the actual probability
distribution.
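A minimal simulation sketch (assuming NumPy; uniform variables chosen arbitrarily as the i.i.d. inputs) showing the normalized sum V_n acquiring zero mean and unit variance:

import numpy as np

rng = np.random.default_rng(1)
n, trials = 100, 20000

# i.i.d. Uniform(0, 1): mean 0.5, variance 1/12
X = rng.uniform(0.0, 1.0, size=(trials, n))
V = (X - 0.5).sum(axis=1) / np.sqrt(n / 12.0)   # normalized sum V_n

print(V.mean(), V.var())   # close to 0 and 1, as N(0, 1) predicts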
AUTOCORRELATION
Autocorrelation of Energy Signals
Correlation is a matching process; autocorrelation refers to the
matching of a signal with a delayed version of itself
The autocorrelation function of a real-valued energy signal x(t) is
defined as:
R_x(\tau) = \int_{-\infty}^{\infty} x(t)\, x(t + \tau)\, dt \quad \text{for } -\infty < \tau < \infty
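A minimal discrete approximation of the integral above (assuming NumPy; a rectangular pulse chosen as the energy signal):

import numpy as np

dt = 1e-3
t = np.arange(0.0, 1.0, dt)
x = np.where(t < 0.5, 1.0, 0.0)            # rectangular pulse (energy signal)

R = np.correlate(x, x, mode='full') * dt   # R_x(tau) on a grid of lags
lags = (np.arange(len(R)) - (len(x) - 1)) * dt

print(R[len(x) - 1])               # R_x(0) = 0.5, the pulse energy
print(np.argmax(R) == len(x) - 1)  # True: maximum occurs at tau = 0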
AUTOCORRELATION
R_x(\tau) = R_x(-\tau) — symmetrical about zero
|R_x(\tau)| \le R_x(0) — maximum value occurs at the origin
R_x(\tau) \leftrightarrow \psi_x(f) — autocorrelation and ESD form a
Fourier transform pair, as designated by the double-headed arrow
R_x(0) = \int_{-\infty}^{\infty} x^2(t)\, dt — value at the origin is
equal to the energy of the signal
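A minimal numerical check of the Fourier-pair property (assuming NumPy; a decaying exponential as the energy signal): the DFT of the sampled autocorrelation matches |X(f)|^2:

import numpy as np

dt = 1e-3
t = np.arange(0.0, 1.0, dt)
x = np.exp(-5.0 * t)                        # energy signal

N = 2 * len(x) - 1                          # length holding all linear lags
X = np.fft.fft(x, n=N) * dt                 # approximate spectrum X(f)
esd = np.abs(X)**2                          # ESD: |X(f)|^2

R = np.correlate(x, x, mode='full') * dt    # R_x(tau), zero lag at the center
esd_from_R = np.fft.fft(np.fft.ifftshift(R)) * dt  # rotate zero lag to index 0

print(np.allclose(esd, esd_from_R.real))    # True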
AUTOCORRELATION OF A PERIODIC (POWER)
SIGNAL
When the power signal x(t) is periodic with period T0, the
autocorrelation function can be expressed as:
R_x(\tau) = \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} x(t)\, x(t + \tau)\, dt \quad \text{for } -\infty < \tau < \infty
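A minimal sketch (assuming NumPy; a sinusoid as the periodic signal): the one-period average reproduces R_x(tau) = (A^2/2) cos(2*pi*tau/T0):

import numpy as np

T0, N = 1.0, 1000
dt = T0 / N
t = np.arange(N) * dt
x = 2.0 * np.cos(2 * np.pi * t / T0)        # A = 2, period T0

def R(tau):
    shift = int(round(tau / dt))
    # periodic shift, then average over one period: (1/T0) * integral
    return np.mean(x * np.roll(x, -shift))

print(R(0.0))    # ~2.0 = A^2/2 (the average power)
print(R(0.25))   # ~0.0 at a quarter period
print(R(0.5))    # ~-2.0 at a half period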
AUTOCORRELATION OF POWER SIGNALS
The autocorrelation function of a real-valued periodic signal
has properties similar to those of an energy signal:
R_x(\tau) = R_x(-\tau) — symmetrical about zero
|R_x(\tau)| \le R_x(0) — maximum value occurs at the origin
R_x(\tau) \leftrightarrow G_x(f) — autocorrelation and PSD form a
Fourier transform pair, as designated by the double-headed arrow
R_x(0) = \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} x^2(t)\, dt — value at the
origin is equal to the average power of the signal
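A minimal check of the discrete counterpart of this Fourier pair (assuming NumPy; a sampled cosine): the DFT of the circular autocorrelation equals |X[k]|^2 / N:

import numpy as np

N = 200
n = np.arange(N)
x = np.cos(2 * np.pi * 5 * n / N)           # 5 cycles per record

# circular autocorrelation: R[k] = (1/N) * sum_n x[n] x[(n+k) mod N]
R = np.array([np.mean(x * np.roll(x, -k)) for k in range(N)])

G_from_R = np.fft.fft(R).real               # = |X[k]|^2 / N
G_direct = np.abs(np.fft.fft(x))**2 / N

print(np.allclose(G_from_R, G_direct))      # True
print(R[0], G_direct.sum() / N)             # average power both ways: ~0.5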
SPECTRAL DENSITY
The spectral density of a signal characterizes the
distribution of the signal's energy or power in the
frequency domain.
This concept is particularly important when considering
filtering in communication systems, where the signal and
noise at the filter output must be evaluated.
The energy spectral density (ESD) or the power spectral
density (PSD) is used in this evaluation.
We need to determine how the average power or energy of
the process is distributed in frequency.
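A minimal periodogram sketch (assuming NumPy; a hypothetical 50 Hz sinusoid in white Gaussian noise) showing how a PSD estimate reveals where the power is concentrated in frequency:

import numpy as np

rng = np.random.default_rng(2)
fs, N = 1000.0, 4096
t = np.arange(N) / fs
x = np.sin(2 * np.pi * 50.0 * t) + rng.normal(0.0, 1.0, N)

X = np.fft.rfft(x)
psd = np.abs(X)**2 / (fs * N)               # one-sided periodogram estimate
freqs = np.fft.rfftfreq(N, d=1.0 / fs)

print(freqs[np.argmax(psd)])                # ~50 Hz: the tone dominates the PSD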