Random Variables and Processes

Contents
 Introduction,
 Probability
 Conditional Probability
 Random variables
 Several Random Variables
 Statistical Averages: Moments
 Random Processes
 Mean, Correlation and Covariance function
 Properties of autocorrelation function
 Properties of Cross–correlation functions
Introduction
 Signal: a signal is a function of one or more independent variables that conveys
information about the nature of a physical phenomenon.
Example: speech signal, video signal.
 Deterministic signals: the class of signals that may be modelled as completely
specified functions of time; they can be defined in terms of a mathematical
expression.
Ex: x(t) = cos(20t)
 Fourier transform is a mathematical tool used for the representation of
deterministic signals.
 Random Signal: its future values cannot be predicted in advance, and it
cannot be represented by a mathematical expression.
Ex: noise generated at the receiver.
 Probability Theory: The branch of mathematics which deals with the
statistical characteristics of random signals is called Probability theory.
Probability theory
 Probability theory provides a model for phenomena that can be described, explicitly or
implicitly, by an experiment whose outcome is subject to chance.
Ex: tossing a coin; the outcomes are head and tail.
 Random Experiment: if an experiment is repeated, the outcome can differ
because of the influence of a random phenomenon; such an experiment is called a
random experiment.
 Experiment: a set of rules which governs an operation to be performed.
 Sample Space: The set of all possible outcomes of the experiment is called the
sample space, which we denote by S.
Ex: tossing a coin, S={ H, T}
 Event: an event corresponds to either a single sample point or a collection of
outcomes.
Ex: a coin is tossed 5 times, and the outcomes are E = {H, H, T, H, T}
 Elementary event: A single sample point is called an elementary event.
Ex: EL = {H}
Contd…
 Sure Event: The entire sample space S is called the sure event.
 Null event: the null set is called the null or impossible event.
 Mutually Exclusive Events: two events are said to be mutually
exclusive if they cannot occur at the same time; mutually
exclusive events are also called disjoint events.
Ex: in tossing a coin, head and tail cannot occur simultaneously.
 Probability: probability P is a function that assigns a non-negative
number to an event A in the sample space.
 Relative Frequency Definition: if a random experiment is
conducted n times and the event A occurs nA times, the probability of
occurrence of event A is defined as
P(A) = lim_{n→∞} (nA / n)
 Ex: E = {H, T, H, H, T}, P(H) = 3/5, P(T) = 2/5
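As an illustration of the relative-frequency definition, the short sketch below (assuming Python with NumPy; the variable names are illustrative) estimates P(H) by simulating repeated coin tosses:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000                              # number of repetitions of the experiment
    flips = rng.choice(["H", "T"], size=n)   # simulate n fair coin tosses
    p_head = np.mean(flips == "H")           # relative frequency n_H / n
    print(p_head)                            # approaches 0.5 as n grows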


Properties of Probability:
1. 0 ≤ P(A) ≤ 1
2. P(S) = 1 and P(∅) = 0
3. If A and B are mutually exclusive events, then P(A ∪ B) = P(A) + P(B)

Extended Properties of Probability:
1. P(Ā) = 1 − P(A), where Ā is the complement of A
2. For any two events A and B, P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Conditional probability:
 Conditional probability is used to determine how two events are related;
that is, we can determine the probability of one event given the occurrence
of another related event.
 Let P[B|A] denote the probability of event B, given that event A has
occurred. The probability P[B|A] is called the conditional probability of B
given A and is defined as
P[B|A] = P[A ∩ B] / P[A]

 Bayes’ rule: we may rewrite the above equation as
P[A ∩ B] = P[B|A] P[A], where P[A ∩ B] is the joint probability of A and B.
 We may also write P[A ∩ B] = P[A|B] P[B].
 Equating the two expressions, we may determine P[B|A] as
P[B|A] = P[A|B] P[B] / P[A]
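A small numerical sketch of Bayes’ rule; the probability values below are assumed for illustration only:

    # Assumed example values for P[A], P[B] and P[A|B]
    p_A = 0.3
    p_B = 0.4
    p_A_given_B = 0.5

    p_joint = p_A_given_B * p_B      # P[A ∩ B] = P[A|B] P[B]
    p_B_given_A = p_joint / p_A      # Bayes' rule: P[B|A] = P[A|B] P[B] / P[A]
    print(p_joint, p_B_given_A)      # 0.2 and roughly 0.667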
Statistically independent events:
 Statistically independent events: the occurrence of one event does not depend on the
occurrence of the other event.
 Suppose that the conditional probability P[B|A] is simply equal to the elementary
probability of occurrence of event B, that is
P[B|A] = P[B]
 Then the joint probability P[A ∩ B] is equal to the product of the individual
probabilities P[A] and P[B],
P[A ∩ B] = P[A] P[B]
and such events are called statistically independent events. A small numerical check is
sketched below.
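The product rule for independent events can be checked empirically; this sketch (an assumed two-dice example) compares P[A ∩ B] with P[A] P[B]:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000
    die1 = rng.integers(1, 7, size=n)        # first die
    die2 = rng.integers(1, 7, size=n)        # second die, thrown independently

    A = (die1 == 6)                          # event A: first die shows 6
    B = (die2 == 6)                          # event B: second die shows 6
    print(np.mean(A & B), np.mean(A) * np.mean(B))   # both close to 1/36 ≈ 0.028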
Random Variables
 The outcomes of a random experiment are not always in a form convenient for mathematical analysis.
Ex: head or tail.
 It is more convenient to assign a number or a range of values to the outcomes of the random
experiment.
Ex: ‘1’ is assigned for head and ‘0’ for tail.
 A function whose domain is a sample space and whose range is a set of real numbers is called a
Random Variable.
 If the outcome of an experiment is s, the random variable is denoted as X(s), or just X; X is a
function of the outcome and is called a random variable.
Cumulative distribution Function:
 Consider the random variable X and the probability of the event X ≤ x. We
denote this probability by P[X ≤ x].
 The probability of this event is a number that depends on x; that is, it defines a
function FX(x). This function describes how the probabilities are distributed over
the real values x and is called the cumulative distribution function of the random
variable:
FX(x) = P[X ≤ x]
 Properties of Cumulative distribution function FX(x):


1. The distribution function FX(x) is bounded between 0 and 1,
i.e. the probability of any event always lies between 0 and 1: 0 ≤ FX(x) ≤ 1.
2. FX(−∞) = 0 and FX(+∞) = 1.
X = −∞ means no possible event, so P[X ≤ −∞] = 0; X = +∞ includes all
possible events, so P[X ≤ ∞] = 1.
3. The distribution function is a monotonic non-decreasing function of x:
if x1 < x2, then FX(x1) ≤ FX(x2).
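These three properties can be verified on an empirical distribution function built from samples; a minimal sketch (assuming NumPy, with an arbitrary random variable):

    import numpy as np

    rng = np.random.default_rng(2)
    samples = rng.normal(size=10_000)                     # samples of some random variable X
    x = np.linspace(-4.0, 4.0, 200)
    F = np.array([np.mean(samples <= xi) for xi in x])    # F_X(x) = P[X <= x]

    print(F.min() >= 0.0 and F.max() <= 1.0)              # property 1: bounded in [0, 1]
    print(F[0], F[-1])                                    # property 2: near 0 and 1 at the extremes
    print(np.all(np.diff(F) >= 0.0))                      # property 3: monotone non-decreasing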
Probability density function
 The derivative of the cumulative distribution function with respect to x is called the
probability density function, denoted fX(x):
fX(x) = d FX(x) / dx

 Properties
1. The PDF is a non-negative function for all values of x,
i.e. fX(x) ≥ 0 for all x.
2. The total area under the PDF curve is equal to 1:
∫_{−∞}^{∞} fX(x) dx = 1
3. The CDF is obtained by integrating the PDF:
FX(x) = ∫_{−∞}^{x} fX(u) du
4. The name density function arises from the fact that the probability of the event x1 < X ≤ x2 equals
P[x1 < X ≤ x2] = FX(x2) − FX(x1) = ∫_{x1}^{x2} fX(x) dx
Uniform Distribution Function:
 A random variable X is said to be uniformly distributed over an interval (a, b) if its PDF is
constant over that interval and zero elsewhere:
fX(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise.
This function satisfies the requirements of a probability density function, and the area under
the curve is unity.
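A sketch of the uniform PDF on an assumed interval (a, b) = (2, 5), checking the unit-area property and computing P[x1 < X ≤ x2] by integrating the PDF:

    import numpy as np

    a, b = 2.0, 5.0
    x = np.linspace(a, b, 1_001)
    pdf = np.full_like(x, 1.0 / (b - a))        # f_X(x) = 1/(b - a) on (a, b), 0 elsewhere

    print(np.trapz(pdf, x))                     # area under the PDF ≈ 1
    x1, x2 = 3.0, 4.0
    mask = (x >= x1) & (x <= x2)
    print(np.trapz(pdf[mask], x[mask]))         # P[x1 < X <= x2] = (x2 - x1)/(b - a) ≈ 1/3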
Several Random Variables:
 Consider two random variables X and Y. We define the joint distribution function
FXY(x, y) as the probability that the random variable X is less than or equal to a
specified value x and that the random variable Y is less than or equal to a specified
value y:
FXY(x, y) = P[X ≤ x, Y ≤ y]
 Properties:
1. The joint distribution function is always a non-negative function, i.e. FXY(x, y) ≥ 0.
2. The joint distribution function is a monotone non-decreasing function of both x
and y.
3. The joint distribution function is continuous everywhere in the x-y plane.
Joint probability density function:
 The partial derivative of the joint distribution function FXY(x, y) with respect to x
and y is called the joint probability density function:
fXY(x, y) = ∂^2 FXY(x, y) / (∂x ∂y)
 Properties of Joint PDF:


1. The joint PDF is always a non-negative function,
fXY(x, y) ≥ 0
2. The total volume under the surface of the joint PDF is equal to 1:
∫_{−∞}^{∞} ∫_{−∞}^{∞} fXY(x, y) dx dy = 1
3. The joint PDF is continuous everywhere.
Marginal Densities:

 When the probability density functions fX(x) and fY(y) of a single random
variable are obtained from the joint PDF fXY(x, y), then fX(x) and fY(y) are
called marginal PDFs or marginal densities. They are given by

 fX(x) = ∫_{−∞}^{∞} fXY(x, y) dy

 fY(y) = ∫_{−∞}^{∞} fXY(x, y) dx
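Numerically, a marginal density can be obtained by integrating the joint PDF over the other variable on a grid; a sketch using an assumed joint PDF of two independent standard Gaussian variables:

    import numpy as np

    x = np.linspace(-5.0, 5.0, 401)
    y = np.linspace(-5.0, 5.0, 401)
    X, Y = np.meshgrid(x, y, indexing="ij")
    f_xy = np.exp(-(X**2 + Y**2) / 2.0) / (2.0 * np.pi)   # assumed joint PDF f_XY(x, y)

    f_x = np.trapz(f_xy, y, axis=1)                       # f_X(x) = integral of f_XY(x, y) dy
    f_y = np.trapz(f_xy, x, axis=0)                       # f_Y(y) = integral of f_XY(x, y) dx
    print(np.trapz(f_x, x), np.trapz(f_y, y))             # each marginal integrates to ≈ 1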
Conditional probability density function
 Let X and Y be two continuous random variables with joint probability density
function fXY(x, y). The conditional PDF of Y given that X = x is defined by
fY(y|x) = fXY(x, y) / fX(x)
 Properties of the Conditional PDF:

1. The conditional PDF is always a non-negative function:
fY(y|x) ≥ 0 and fX(x|y) ≥ 0.
2. The area under the conditional PDF is equal to 1:
∫_{−∞}^{∞} fX(x|y) dx = 1 and ∫_{−∞}^{∞} fY(y|x) dy = 1
3. If the random variables X and Y are statistically independent, then
fX(x|y) = fX(x) and fY(y|x) = fY(y)
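A sketch checking that the conditional PDF fY(y|x) = fXY(x, y) / fX(x) integrates to 1 over y, reusing the same assumed Gaussian joint PDF on a grid:

    import numpy as np

    x = np.linspace(-5.0, 5.0, 401)
    y = np.linspace(-5.0, 5.0, 401)
    X, Y = np.meshgrid(x, y, indexing="ij")
    f_xy = np.exp(-(X**2 + Y**2) / 2.0) / (2.0 * np.pi)   # assumed joint PDF

    f_x = np.trapz(f_xy, y, axis=1)                       # marginal PDF of X
    i = 250                                               # pick a particular value x = x[i]
    f_y_given_x = f_xy[i, :] / f_x[i]                     # conditional PDF f_Y(y | x)
    print(np.trapz(f_y_given_x, y))                       # area under the conditional PDF ≈ 1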
Statistical Averages
 The expected value or mean of a random variable X is defined by
E[X] = µX = ∫_{−∞}^{∞} x fX(x) dx
Function of a Random Variable :


 Let X denote a random variable, and let g(X) denote a real-valued function
defined on the real line. We denote the new random variable as
Y = g(X)
 The expected value of the random variable Y is
E[Y] = E[g(X)] = ∫_{−∞}^{∞} g(x) fX(x) dx
 Example: let Y = g(X) = cos(X), where
 X is a random variable uniformly distributed in the interval (−π, π), so that
fX(x) = 1/(2π) on (−π, π); then E[Y] = ∫_{−π}^{π} cos(x) (1/(2π)) dx = 0.
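A Monte Carlo sketch of this example: for X uniform on (−π, π), the sample mean of cos(X) approaches the analytical value E[Y] = 0:

    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.uniform(-np.pi, np.pi, size=1_000_000)   # X uniform on (-pi, pi)
    Y = np.cos(X)                                    # Y = g(X) = cos(X)
    print(np.mean(Y))                                # ≈ 0, the expected value of Y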
Moments:
 For the special case of g(X) = X^n,
 we obtain the nth moment of the probability distribution of the random variable X, given
by
E[X^n] = ∫_{−∞}^{∞} x^n fX(x) dx
 If n = 1, this is the mean of the random variable X, E[X] = µX.
 If n = 2, E[X^2] is called the mean square value.

Central Moments:
Central moments are the moments of the difference between the random variable X and its mean µX.
The nth central moment is defined as
E[(X − µX)^n] = ∫_{−∞}^{∞} (x − µX)^n fX(x) dx
For n = 1, the first central moment is equal to zero.


Variance:
 For n = 2, the second central moment is referred to as the variance of the random
variable X, given by
var[X] = σX^2 = E[(X − µX)^2] = E[X^2] − µX^2
 Variance = mean square value − square of the mean value.


 Standard Deviation: the standard deviation σX is the square root of the variance of the random variable X.
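A short sketch verifying the variance relation on sample data (assumed Gaussian samples with mean 2 and standard deviation 3):

    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.normal(loc=2.0, scale=3.0, size=1_000_000)

    mean = np.mean(X)                    # first moment (mean)
    mean_sq = np.mean(X**2)              # second moment (mean square value)
    var = mean_sq - mean**2              # variance = mean square value - (mean)^2
    print(var, np.sqrt(var))             # variance ≈ 9, standard deviation ≈ 3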
Gaussian Random Variable:
 Let X denote a Gaussian-distributed (bell-shaped) random variable having
mean µX and variance σX^2. The probability density function of such a Gaussian
random variable is defined by
fX(x) = (1 / (√(2π) σX)) exp(−(x − µX)^2 / (2σX^2))
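A sketch evaluating this PDF on a grid and checking that it integrates to 1 (assumed values µX = 1 and σX = 2):

    import numpy as np

    mu, sigma = 1.0, 2.0
    x = np.linspace(mu - 8 * sigma, mu + 8 * sigma, 2_001)
    pdf = np.exp(-(x - mu)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
    print(np.trapz(pdf, x))              # area under the Gaussian PDF ≈ 1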
Joint Moments
 Consider a pair of random variables X and Y. Their joint moments are the expected
values of X^i Y^k, where i and k are positive integers:
E[X^i Y^k] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x^i y^k fXY(x, y) dx dy
 For i = k = 1, the expectation E[XY] is called the correlation of X and Y.
 The random variables X and Y are said to be orthogonal if E[XY] = 0.


 Covariance: the covariance is a measure of the joint variability of two random variables, defined as
the expected product of their deviations from their individual expected values:
cov[X, Y] = E[(X − µX)(Y − µY)] = E[XY] − µX µY
Correlation coefficient
 Let σX^2 and σY^2 denote the variances of X and Y respectively. The covariance of X and Y
normalised with respect to σX and σY is called the correlation coefficient, given by
ρ = cov[X, Y] / (σX σY)
 Two random variables X and Y are said to be uncorrelated if and only if cov[X, Y] = 0.
 Note that if X and Y are statistically independent, then they are uncorrelated.
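A sketch estimating the covariance and correlation coefficient from samples; the linear relationship between X and Y below is an assumed example:

    import numpy as np

    rng = np.random.default_rng(5)
    X = rng.normal(size=100_000)
    Y = 2.0 * X + rng.normal(size=100_000)            # Y is correlated with X

    cov = np.mean((X - X.mean()) * (Y - Y.mean()))    # cov[X, Y] = E[(X - mu_X)(Y - mu_Y)]
    rho = cov / (X.std() * Y.std())                   # correlation coefficient
    print(cov, rho)                                   # cov ≈ 2, rho ≈ 2/sqrt(5) ≈ 0.894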
Random Process:
 A random process or stochastic process is defined as an ensemble (collection) of time
functions together with a probability rule.
 For a random variable, the outcome of a random experiment is mapped into a number.
 For a random process, the outcome of a random experiment is mapped into a waveform,
which is a function of time.
 Averages for a stochastic process are called ensemble averages.
 Stationary Random Process: a random process is said to be stationary to the first
order if the first-order distribution function of X(t) does not vary with time.
 Mean: the mean of the random process X(t) is the expectation of the random
variable obtained by observing the process at some time t, given by
µX(t) = E[X(t)] = ∫_{−∞}^{∞} x fX(t)(x) dx
where fX(t)(x) is the first-order PDF of the process at time t.
 Autocorrelation Function: the autocorrelation function of the random process X(t) is the
expectation of the product of the two random variables X(t1) and X(t2) obtained by
observing X(t) at times t1 and t2 respectively:
RX(t1, t2) = E[X(t1) X(t2)]
 If X(t) is a stationary random process, then the autocorrelation of X(t) depends only on the
time difference t2 − t1, given by
RX(t1, t2) = RX(t2 − t1)
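A sketch estimating the ensemble mean and the autocorrelation of a simple stationary process; the random-phase sinusoid used as the ensemble is an assumed example:

    import numpy as np

    rng = np.random.default_rng(6)
    n_real, n_time, dt = 2_000, 500, 0.01
    t = np.arange(n_time) * dt
    phase = rng.uniform(0, 2 * np.pi, size=(n_real, 1))     # random phase per realization
    X = np.cos(2 * np.pi * 5 * t + phase)                   # ensemble of sample functions

    print(X.mean(axis=0)[:3])                               # ensemble mean at each t, ≈ 0
    k = 10                                                  # lag of k samples, tau = k*dt
    R_tau = np.mean(X[:, :-k] * X[:, k:])                   # estimate of R_X(tau) = E[X(t+tau)X(t)]
    print(R_tau)                                            # ≈ 0.5*cos(2*pi*5*tau) = -0.5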
Properties of Autocorrelation
 For convenience of notation, we redefine the autocorrelation function of a stationary
process X(t) as
RX(τ) = E[X(t + τ) X(t)]
 Property 1: The mean square value of the process is obtained from the autocorrelation
function by letting τ = 0,
RX(0) = E[X^2(t)]
 Property 2: The autocorrelation function RX(τ) is an even function of τ,
RX(τ) = RX(−τ)
Proof: RX(−τ) = E[X(t − τ) X(t)]; substituting t′ = t − τ gives E[X(t′) X(t′ + τ)] = RX(τ).
 Property 3: The autocorrelation function RX(τ) has its maximum value at τ = 0,
|RX(τ)| ≤ RX(0)
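A quick numerical check of properties 1 to 3 with the same assumed random-phase sinusoid ensemble:

    import numpy as np

    rng = np.random.default_rng(7)
    t = np.arange(400) * 0.01
    phase = rng.uniform(0, 2 * np.pi, size=(5_000, 1))
    X = np.cos(2 * np.pi * 5 * t + phase)                   # stationary ensemble

    def R(k):                                               # estimate R_X(tau) at lag k samples
        if k == 0:
            return np.mean(X * X)
        if k > 0:
            return np.mean(X[:, :-k] * X[:, k:])
        return np.mean(X[:, -k:] * X[:, :k])

    print(R(0))                                             # property 1: mean square value ≈ 0.5
    print(R(12), R(-12))                                    # property 2: R_X(tau) = R_X(-tau)
    print(max(abs(R(k)) for k in range(1, 60)), R(0))       # property 3: |R_X(tau)| stays below R_X(0)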
 Autocovariance Function: the autocovariance function of a stationary random process X(t) is defined as
CX(τ) = E[(X(t + τ) − µX)(X(t) − µX)] = RX(τ) − µX^2
Cross correlation function:


 Let us consider two random processes X(t) and Y(t), with autocorrelation
functions RX(t, u) and RY(t, u) respectively.
 The cross-correlation function of the random processes X(t) and Y(t) is given by
RXY(t, u) = E[X(t) Y(u)]
Properties:
 Property 1: The cross-correlation function is not generally an even function of τ.
 Property 2: It does not necessarily have a maximum at the origin.
 Property 3: It obeys the symmetry relationship RXY(τ) = RYX(−τ).
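A sketch illustrating these properties with two assumed jointly stationary processes, where Y(t) is a delayed copy of X(t); the cross-correlation then peaks at a lag equal to the delay rather than at the origin, and satisfies RXY(τ) = RYX(−τ):

    import numpy as np

    rng = np.random.default_rng(8)
    n_real, n_time, d = 5_000, 400, 5                # d = delay in samples
    W = rng.normal(size=(n_real, n_time + d))        # white-noise ensemble
    X = W[:, d:]                                     # X(t)
    Y = W[:, :-d]                                    # Y(t) = X(t - d samples), a delayed copy

    def R(A, B, k):                                  # estimate R_AB(tau) at lag k samples
        if k == 0:
            return np.mean(A * B)
        if k > 0:
            return np.mean(A[:, :-k] * B[:, k:])
        return np.mean(A[:, -k:] * B[:, :k])

    print(R(X, Y, d), R(X, Y, -d), R(X, Y, 0))       # ≈ 1, ≈ 0, ≈ 0: not even, peak off the origin
    print(R(X, Y, d), R(Y, X, -d))                   # symmetry: R_XY(tau) = R_YX(-tau)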
Questions….???
