Random Variables and Processes
Contents
Introduction
Probability
Conditional Probability
Random variables
Several Random Variables
Statistical Averages: Moments
Random Processes
Mean, Correlation and Covariance Functions
Properties of the Autocorrelation Function
Properties of Cross-Correlation Functions
Introduction
Signal: a function of one or more independent variables that conveys information about the nature of a physical phenomenon.
Example: speech signals, video signals.
Deterministic signals: the class of signals that may be modelled as completely specified functions of time; they can be defined in terms of a mathematical expression.
Ex: x(t) = cos(20t)
The Fourier transform is a mathematical tool used for the representation of deterministic signals.
Random signal: its future values cannot be predicted in advance, and it cannot be represented by an exact mathematical expression.
Ex: noise generated at the receiver.
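As a minimal illustration (a sketch assuming Python with NumPy; the sampling choices are arbitrary), the lines below contrast a deterministic cosine, fully specified by x(t) = cos(20t), with a noise signal whose values cannot be predicted in advance.

import numpy as np

t = np.linspace(0.0, 1.0, 1000)                       # time axis: 1 s sampled at 1 kHz
x_det = np.cos(20.0 * t)                              # deterministic: completely specified by its expression
x_rnd = np.random.default_rng().normal(size=t.size)  # random: receiver-noise-like samples

print(x_det[:5])   # identical on every run
print(x_rnd[:5])   # different on every run: not predictable in advance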
Probability theory: the branch of mathematics that deals with the statistical characterization of random signals.
Probability theory
Probability theory deals with phenomena that can be modelled, explicitly or implicitly, by an experiment whose outcome is subject to chance.
Ex: tossing a coin, outcomes are head and tail.
Random Experiment: if an experiment is repeated, the outcome can differ because of the influence of random phenomena; such an experiment is called a random experiment.
Experiment: a set of rules that governs an operation to be performed.
Sample Space: The set of all possible outcomes of the experiment is called the
sample space, which we denote by S.
Ex: tossing a coin, S = {H, T}
Event: an event corresponds to either a single sample point or a collection of sample points.
Ex: a coin is tossed 5 times, and the observed outcome is E = {H, H, T, H, T}
Elementary event: A single sample point is called an elementary event.
Ex: EL = {H}
Sure Event: The entire sample space S is called the sure event.
Null event: the null set is called the null or impossible event.
Mutually Exclusive Events: two events are said to be mutually exclusive if they cannot occur at the same time; mutually exclusive events are also called disjoint events.
Ex: in tossing a coin, head and tail cannot occur simultaneously (see the set-based sketch below).
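As a tiny illustration (a sketch in plain Python; the variable names are chosen here for convenience), events can be treated as subsets of the sample space S, and mutually exclusive events have an empty intersection.

S = {"H", "T"}            # sample space for a single coin toss
head = {"H"}              # elementary event
tail = {"T"}              # elementary event
sure_event = S            # sure event: the entire sample space
null_event = set()        # null (impossible) event

print(head & tail)        # set(): head and tail are mutually exclusive (disjoint)
print(head | tail == S)   # True: together they make up the sure event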
Probability: probability P is a function that assigns a non-negative number to an event A in the sample space.
Relative Frequency Definition: if a random experiment is conducted n times and the event A occurs nA times, the probability of occurrence of the event A is defined as
P(A) = lim(n→∞) nA / n
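A small simulation (a sketch assuming Python with NumPy; the event A = {head} and the fair coin are chosen for illustration) shows the relative frequency nA/n settling near the true probability as n grows.

import numpy as np

rng = np.random.default_rng(0)
for n in (100, 10_000, 1_000_000):
    tosses = rng.integers(0, 2, size=n)   # 1 = head, 0 = tail for a fair coin
    n_A = tosses.sum()                    # number of times event A = {head} occurred
    print(n, n_A / n)                     # relative frequency nA/n approaches P(A) = 0.5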
Properties
1. The PDF is a non-negative function of x, i.e. fX(x) ≥ 0 for all x.
2. The total area under the PDF curve is equal to 1, i.e. ∫(-∞ to ∞) fX(x) dx = 1.
3. The distribution function is the integral of the PDF: FX(x) = ∫(-∞ to x) fX(ξ) dξ.
4. The name density function arises from the fact that the probability of the event x1 < X ≤ x2 equals
P(x1 < X ≤ x2) = FX(x2) - FX(x1) = ∫(x1 to x2) fX(x) dx
(These properties are checked numerically in the sketch below.)
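As a rough numerical check (a sketch assuming Python with NumPy; the standard Gaussian PDF is used purely as an example density), the area under the PDF is close to 1 and the probability of x1 < X ≤ x2 equals the integral of fX over that interval.

import numpy as np

x = np.linspace(-8.0, 8.0, 20_001)
f = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)   # example PDF: standard Gaussian

print(np.trapz(f, x))                            # property 2: total area, ≈ 1.0

x1, x2 = -1.0, 1.0
mask = (x > x1) & (x <= x2)
print(np.trapz(f[mask], x[mask]))                # property 4: P(x1 < X <= x2), ≈ 0.6827 here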
Uniform Distribution Function:
A random variable X is said to be uniformly distributed over an interval (a, b) if its PDF is constant over that interval and zero elsewhere, i.e. fX(x) = 1/(b - a) for a < x < b. This satisfies the requirements of a probability density function, since fX(x) ≥ 0 and the area under the curve is unity.
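A minimal sketch (assuming Python with NumPy; the interval (a, b) = (2, 5) and the sub-interval (3, 4) are arbitrary choices for illustration) checks that the uniform PDF integrates to unity and that P(x1 < X ≤ x2) = (x2 - x1)/(b - a).

import numpy as np

a, b = 2.0, 5.0
x = np.linspace(a, b, 10_001)
f = np.full_like(x, 1.0 / (b - a))   # uniform PDF: constant 1/(b - a) on (a, b), zero elsewhere

print(np.trapz(f, x))                # area under the PDF, ≈ 1.0

x1, x2 = 3.0, 4.0
mask = (x > x1) & (x <= x2)
print(np.trapz(f[mask], x[mask]))    # P(x1 < X <= x2), ≈ (x2 - x1)/(b - a) = 1/3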
Several Random Variables:
Consider two random variables X and Y. We define the joint distribution function FXY(x, y) as the probability that the random variable X is less than or equal to a specified value x and that the random variable Y is less than or equal to a specified value y, i.e. FXY(x, y) = P(X ≤ x, Y ≤ y).
Properties:
1. The joint distribution function is always a non-negative function, i.e. FXY(x, y) ≥ 0.
2. The joint distribution function is a monotone non-decreasing function of both x and y.
3. The joint distribution function is continuous everywhere in the x-y plane.
(An empirical check of these properties is sketched below.)
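As an illustration (a sketch assuming Python with NumPy; the two dependent Gaussian variables are an arbitrary construction), the empirical joint distribution function below estimates FXY(x, y) = P(X ≤ x, Y ≤ y) from samples and exhibits the non-decreasing behaviour of property 2.

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=100_000)
Y = 0.5 * X + rng.normal(size=100_000)      # a second random variable that depends on X

def F_XY(x, y):
    # empirical joint distribution: fraction of samples with X <= x and Y <= y
    return np.mean((X <= x) & (Y <= y))

print(F_XY(0.0, 0.0))                       # estimate of P(X <= 0, Y <= 0)
print(F_XY(-1.0, 0.0) <= F_XY(1.0, 0.0))    # True: non-decreasing in x (property 2)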
Joint probability density function:
The joint probability density function fXY(x, y) is the second partial derivative of the joint distribution function FXY(x, y) with respect to x and y:
fXY(x, y) = ∂²FXY(x, y) / ∂x ∂y
Two random variables X and Y are said to be uncorrelated if and only if their covariance is zero, i.e. cov[X, Y] = E[(X - μX)(Y - μY)] = 0.
Note that if X and Y are statistically independent, then they are uncorrelated; the converse is not necessarily true.
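A short numerical sketch (assuming Python with NumPy; the pair X standard normal and Y = X² is a textbook-style example chosen here for illustration) shows that the sample covariance can be near zero even though Y depends completely on X, so uncorrelated does not imply independent.

import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=1_000_000)
Y = X**2                                            # Y is fully determined by X, yet cov[X, Y] = 0 here

cov_XY = np.mean((X - X.mean()) * (Y - Y.mean()))   # sample covariance
print(cov_XY)                                       # ≈ 0: uncorrelated despite being dependent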
Random Process:
A random process or stochastic process is defined as an ensemble (collection) of time functions together with a probability rule.
For a random variable, the outcome of a random experiment is mapped into a number.
For a random process, the outcome of a random experiment is mapped into a waveform,
which is a function of time.
Averages for a stochastic process are called ensemble averages.
Stationary Random Process: a random process X(t) is said to be stationary to the first order if its first-order distribution function does not vary with time, i.e. FX(t)(x) = FX(t+τ)(x) for all t and all time shifts τ.
Mean: the mean of the random process X(t) is the expectation of the random variable obtained by observing the process at some time t, given by
μX(t) = E[X(t)] = ∫(-∞ to ∞) x fX(t)(x) dx
For a process that is stationary to the first order, the mean is a constant: μX(t) = μX for all t.
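As an illustration (a sketch assuming Python with NumPy; the random-phase cosine X(t) = cos(2πt + Θ), with Θ uniform on (0, 2π), is a standard example process chosen here), the ensemble average across many sample functions is computed at each observation time and stays essentially constant.

import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 200)                           # observation times
theta = rng.uniform(0.0, 2.0 * np.pi, size=(5_000, 1))   # one random phase per sample function

X = np.cos(2.0 * np.pi * t + theta)                      # ensemble of sample functions, one per row

mu = X.mean(axis=0)                                      # ensemble average at each time t
print(mu.min(), mu.max())                                # both ≈ 0: the mean does not vary with t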
Properties of the cross-correlation function RXY(τ):
Property 1: The cross-correlation function is not generally an even function of τ.
Property 2: It does not necessarily have a maximum at the origin.
Property 3: It obeys the symmetry relationship RXY(τ) = RYX(-τ).
(These properties are checked numerically in the sketch below.)
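A numerical check (a sketch assuming Python with NumPy; the pair of signals, one a 10-sample delayed copy of the same white-noise source, is an arbitrary construction, and a time-average estimate stands in for the ensemble average) illustrates all three properties.

import numpy as np

rng = np.random.default_rng(4)
n = 200_000
w = rng.normal(size=n + 10)
x = w[10:]                     # x(t): white-noise sample function
y = w[:-10]                    # y(t) = x(t - 10): the same source delayed by 10 samples

def R(a, b, tau):
    # time-average estimate of R_ab(tau) = E[a(t) b(t + tau)] for an integer lag tau
    if tau >= 0:
        return np.mean(a[:n - tau] * b[tau:])
    return np.mean(a[-tau:] * b[:n + tau])

print(R(x, y, 10), R(x, y, -10))   # not an even function of tau (Property 1)
print(R(x, y, 0), R(x, y, 10))     # the maximum occurs away from the origin here (Property 2)
print(R(x, y, 10), R(y, x, -10))   # RXY(tau) ≈ RYX(-tau) (Property 3)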
Questions?