Lecture-2

The document discusses key concepts in probability density functions (pdf), joint pdfs, and statistical averages of random variables, including the mean and variance. It also covers ergodicity, Gaussian random variables, the Central Limit Theorem, and autocorrelation of energy and power signals. Additionally, it highlights the importance of spectral density in characterizing signal energy distribution in the frequency domain.

PROBABILITY DENSITY FUNCTION

• The pdf is defined as the derivative of the cdf:

  f_X(x) = d/dx F_X(x)

• It follows that the cdf is recovered by integration and that total probability is one:

  F_X(x) = ∫_{−∞}^{x} f_X(u) du,   ∫_{−∞}^{∞} f_X(x) dx = 1

• Note that, in the discrete case with probabilities p_i, one has p_i ≥ 0 for all i and ∑_i p_i = 1.

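As a quick numerical illustration of the pdf–cdf relationship (not from the original slides; the exponential cdf F(x) = 1 − e^(−x) is an arbitrary choice), the following Python sketch differentiates a cdf numerically and checks that the resulting pdf integrates to one:

```python
import numpy as np

# Exponential cdf F(x) = 1 - exp(-x), x >= 0 (illustrative choice).
x = np.linspace(0.0, 10.0, 10_001)
cdf = 1.0 - np.exp(-x)

# pdf as the numerical derivative of the cdf: f(x) = dF/dx.
pdf = np.gradient(cdf, x)

# Total probability: the pdf should integrate to ~1 over the support.
print(np.trapz(pdf, x))   # ~1.0
print(pdf[0])             # ~1.0, matching the exact pdf f(x) = exp(-x) at x = 0
```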
CUMULATIVE JOINT PDF / JOINT PDF

• Joint distributions are often encountered when dealing with combined experiments or repeated trials of a single experiment.
• Multiple random variables are basically multidimensional functions defined on the sample space of a combined experiment.
• Experiment 1: S1 = {x1, x2, …, xm}
• Experiment 2: S2 = {y1, y2, …, yn}
• If we take any one element xi from S1 and any one element yj from S2, their joint probability satisfies:
  0 ≤ P(xi, yj) ≤ 1 (the joint probability of two or more outcomes)
• Marginal probability distributions are obtained by summing the joint probabilities over the other variable (illustrated in the sketch below):
  ∑_j P(xi, yj) = P(xi)
  ∑_i P(xi, yj) = P(yj)
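A minimal NumPy sketch of joint and marginal probabilities (the 2×3 table is hypothetical, chosen only so that all entries sum to one):

```python
import numpy as np

# Hypothetical joint probability table P(xi, yj), i = 1..2, j = 1..3.
P = np.array([[0.10, 0.20, 0.10],
              [0.25, 0.15, 0.20]])

assert np.isclose(P.sum(), 1.0)   # all joint probabilities sum to 1

P_x = P.sum(axis=1)   # marginal P(xi): sum over j
P_y = P.sum(axis=0)   # marginal P(yj): sum over i

print("P(xi):", P_x)  # [0.4 0.6]
print("P(yj):", P_y)  # [0.35 0.35 0.3]
```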
EXPECTATION OF RANDOM VARIABLES (STATISTICAL AVERAGES)

• Statistical averages, or moments, play an important role in the characterization of the random variable.
• The first moment of the probability distribution of a random variable X is called the mean value m_X, or expected value, of X:

  m_X = E[X] = ∫_{−∞}^{∞} x f_X(x) dx

• The second moment of a probability distribution is the mean-square value of X:

  E[X²] = ∫_{−∞}^{∞} x² f_X(x) dx

• Central moments are the moments of the difference between X and m_X; the second central moment is the variance of X:

  var(X) = σ_X² = E[(X − m_X)²]

• The variance is equal to the difference between the mean-square value and the square of the mean:

  σ_X² = E[X²] − m_X²

CONTD.

• The variance provides a measure of the variable's "randomness".
• The mean and variance of a random variable give a partial description of its pdf.
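A short NumPy check of the variance identity σ_X² = E[X²] − m_X² on simulated data (the sample distribution and its parameters are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=100_000)   # m_X = 2, sigma_X = 3

mean = x.mean()                # first moment m_X
mean_square = np.mean(x**2)    # second moment E[X^2]
variance = x.var()             # second central moment

# Variance = mean-square value minus the square of the mean.
print(variance, mean_square - mean**2)   # both ~9.0
```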

TIME AVERAGING AND ERGODICITY

• An ergodic process is one in which any member of the ensemble exhibits the same statistical behavior as the whole ensemble.
• For an ergodic process, to measure the various statistical averages it is sufficient to look at only one realization of the process and find the corresponding time average.
• For a process to be ergodic it must be stationary; the converse is not true.
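The following sketch illustrates ergodicity on a random-phase sinusoid, a standard example of an ergodic process (the choice of signal is mine, not the slides'): the time average of x²(t) over one realization matches the ensemble average at a fixed instant.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 100.0, 100_001)

def realization(rng):
    """One member of the ensemble: a sinusoid with random phase."""
    phase = rng.uniform(0.0, 2.0 * np.pi)
    return np.cos(2.0 * np.pi * t + phase)

# Time average of x^2 over a single realization...
time_avg = np.mean(realization(rng) ** 2)

# ...versus the ensemble average of x(t0)^2 at fixed t0 = 0
# over many realizations.
ensemble_avg = np.mean([realization(rng)[0] ** 2 for _ in range(5_000)])

print(time_avg, ensemble_avg)   # both ~0.5
```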

GAUSSIAN (OR NORMAL) RANDOM VARIABLE (PROCESS)

• A continuous random variable whose pdf is:

  f_X(x) = (1 / √(2πσ²)) exp(−(x − μ)² / (2σ²))

• μ (the mean) and σ² (the variance) are its parameters; the distribution is usually denoted N(μ, σ²).
• It is the most important and most frequently encountered random variable in communications.
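A small numerical check of this density (the parameter values μ = 1, σ² = 4 are arbitrary): it integrates to one and reproduces its mean and variance.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma2):
    """N(mu, sigma2) density."""
    return np.exp(-(x - mu) ** 2 / (2.0 * sigma2)) / np.sqrt(2.0 * np.pi * sigma2)

x = np.linspace(-20.0, 20.0, 40_001)
f = gaussian_pdf(x, mu=1.0, sigma2=4.0)

mean = np.trapz(x * f, x)
print(np.trapz(f, x))                    # ~1.0: total probability
print(mean)                              # ~1.0: the mean mu
print(np.trapz(x**2 * f, x) - mean**2)   # ~4.0: the variance sigma^2
```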

CENTRAL LIMIT THEOREM

• The CLT provides justification for using a Gaussian process as a model, provided that:
  • the random variables are statistically independent, and
  • the random variables have the same probability distribution, with the same mean and variance.

CLT

• The central limit theorem states that:
• "The probability distribution of V_n approaches a normalized Gaussian distribution N(0, 1) in the limit as the number of random variables approaches infinity," where V_n = (1/(σ√n)) ∑_{i=1}^{n} (X_i − μ) is the normalized sum of the n random variables.
• When n is finite, N(0, 1) may provide a poor approximation of the actual probability distribution.
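A quick Monte-Carlo illustration of the theorem (the uniform summands and sample sizes are arbitrary choices): the normalized sum of i.i.d. uniform variables behaves like N(0, 1).

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 30, 200_000

# X_i ~ Uniform(0, 1): i.i.d. with mean 1/2 and variance 1/12.
x = rng.uniform(0.0, 1.0, size=(trials, n))
mu, sigma = 0.5, np.sqrt(1.0 / 12.0)

# Normalized sum V_n; by the CLT its distribution approaches N(0, 1).
v = (x.sum(axis=1) - n * mu) / (sigma * np.sqrt(n))

print(v.mean(), v.std())          # ~0.0 and ~1.0
print(np.mean(np.abs(v) < 1.96))  # ~0.95, the N(0, 1) value
```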

AUTOCORRELATION

Autocorrelation of Energy Signals
• Correlation is a matching process; autocorrelation refers to the matching of a signal with a delayed version of itself.
• The autocorrelation function of a real-valued energy signal x(t) is defined as:

  R_x(τ) = ∫_{−∞}^{∞} x(t) x(t + τ) dt,   for −∞ < τ < ∞

• The autocorrelation function R_x(τ) provides a measure of how closely the signal matches a copy of itself as the copy is shifted τ units in time.
• R_x(τ) is not a function of time; it is only a function of the time difference τ between the waveform and its shifted copy.
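A discrete approximation of this definition in NumPy (the rectangular pulse is an illustrative energy signal; np.correlate computes the shifted-product sums, which the factor dt turns into an integral):

```python
import numpy as np

dt = 0.001
t = np.arange(-2.0, 2.0, dt)
x = np.where(np.abs(t) <= 0.5, 1.0, 0.0)   # unit rectangular pulse

# R_x(tau) ~ sum over t of x(t) x(t + tau) * dt, for all lags tau.
R = np.correlate(x, x, mode="full") * dt
tau = np.arange(-(len(x) - 1), len(x)) * dt

print(R[len(x) - 1])              # value at tau = 0: ~1.0, the pulse energy
print(R.max() == R[len(x) - 1])   # True: the maximum occurs at the origin
```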

AUTOCORRELATION

Properties of the autocorrelation function of a real-valued energy signal:
• symmetrical in τ about zero: R_x(τ) = R_x(−τ)
• maximum value occurs at the origin: |R_x(τ)| ≤ R_x(0) for all τ
• autocorrelation and ESD form a Fourier transform pair, as designated by the double-headed arrow: R_x(τ) ↔ ψ_x(f)
• value at the origin is equal to the energy of the signal: R_x(0) = ∫_{−∞}^{∞} x²(t) dt
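A numerical check of the Fourier-pair property (the Gaussian pulse is chosen only for convenience): the FFT of the autocorrelation matches |X(f)|², the energy spectral density.

```python
import numpy as np

dt = 0.001
t = np.arange(-2.0, 2.0, dt)
x = np.exp(-np.pi * t**2)                  # Gaussian pulse (an energy signal)

R = np.correlate(x, x, mode="full") * dt   # autocorrelation R_x(tau)

# ESD two ways: as the Fourier transform of R_x, and directly as |X(f)|^2.
esd_from_R = np.abs(np.fft.fft(R)) * dt
esd_direct = np.abs(np.fft.fft(x, n=len(R)) * dt) ** 2

print(np.allclose(esd_from_R, esd_direct))   # True
```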

AUTOCORRELATION OF A PERIODIC (POWER) SIGNAL

• The autocorrelation function of a real-valued power signal x(t) is defined as:

  R_x(τ) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) x(t + τ) dt

• When the power signal x(t) is periodic with period T0, the autocorrelation function can be expressed as an average over a single period:

  R_x(τ) = (1/T0) ∫_{−T0/2}^{T0/2} x(t) x(t + τ) dt
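Evaluating this one-period average numerically for a sinusoid (amplitude and frequency are arbitrary) recovers the classical result R_x(τ) = (A²/2) cos(2πf0τ):

```python
import numpy as np

A, f0, dt = 2.0, 5.0, 1e-4
T0 = 1.0 / f0
t = np.arange(0.0, T0, dt)              # exactly one period
x = A * np.cos(2.0 * np.pi * f0 * t)

def Rx(tau):
    """(1/T0) * integral over one period of x(t) x(t + tau) dt."""
    return np.mean(x * A * np.cos(2.0 * np.pi * f0 * (t + tau)))

for tau in (0.0, T0 / 4, T0 / 2):
    print(tau, Rx(tau))   # 2.0, 0.0, -2.0, i.e. (A^2/2) cos(2 pi f0 tau)
```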

AUTOCORRELATION OF POWER SIGNALS
The autocorrelation function of a real-valued periodic signal has properties similar to those of an energy signal:
• symmetrical in τ about zero: R_x(τ) = R_x(−τ)
• maximum value occurs at the origin: |R_x(τ)| ≤ R_x(0) for all τ
• autocorrelation and PSD form a Fourier transform pair, as designated by the double-headed arrow: R_x(τ) ↔ G_x(f)
• value at the origin is equal to the average power of the signal: R_x(0) = (1/T0) ∫_{−T0/2}^{T0/2} x²(t) dt
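These properties can be checked on the same sinusoid as in the previous sketch: the value at the origin equals the average power, and the FFT of the periodic autocorrelation yields the PSD's spectral lines of weight A²/4 at ±f0.

```python
import numpy as np

A, f0, dt = 2.0, 5.0, 1e-3
t = np.arange(0.0, 1.0 / f0, dt)                 # one period, 200 samples
x = A * np.cos(2.0 * np.pi * f0 * t)

# Circular autocorrelation over one period (np.roll is the periodic shift).
R = np.array([np.mean(x * np.roll(x, -k)) for k in range(len(x))])

print(R[0], np.mean(x**2))                       # R_x(0) = average power = A^2/2

# Fourier transform of R_x(tau) gives the PSD: a line of weight A^2/4 at f0.
G = np.fft.fft(R) / len(R)
freqs = np.fft.fftfreq(len(R), dt)
print(G[np.argmin(np.abs(freqs - f0))].real)     # ~1.0 = A^2/4
```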

SPECTRAL DENSITY

• The spectral density of a signal characterizes the distribution of the signal's energy or power in the frequency domain.
• This concept is particularly important when considering filtering in communication systems, where the signal and noise at the filter output must be evaluated.
• The energy spectral density (ESD) or the power spectral density (PSD) is used in the evaluation.
• We need to determine how the average power or energy of the process is distributed in frequency.
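As a closing illustration (the signal and parameters are my own choices, not the slides'), a periodogram PSD estimate whose integral over frequency recovers the signal's average power:

```python
import numpy as np

rng = np.random.default_rng(3)
fs, N = 1000.0, 4096                     # sample rate (Hz), record length
t = np.arange(N) / fs
x = np.sin(2 * np.pi * 50.0 * t) + 0.5 * rng.standard_normal(N)   # tone + noise

# One-sided periodogram estimate of the PSD.
X = np.fft.rfft(x)
psd = np.abs(X) ** 2 / (fs * N)
psd[1:-1] *= 2.0                         # fold in the negative frequencies

# The area under the PSD equals the average power of the signal.
freqs = np.fft.rfftfreq(N, 1.0 / fs)
print(np.trapz(psd, freqs), np.mean(x**2))   # both ~0.75 (0.5 tone + 0.25 noise)
```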
