
Random Signals and Processes

Chapter 4: Pairs of Random Variables

Dr. Mohammad Rakibul Islam


Professor, EEE Department,
Islamic University of Technology
• Introduction:
• Chapters 2 and 3 analyze experiments in which an outcome is a single number.
• This chapter analyzes experiments that produce two random variables, X and Y.
• An example of two random variables that we encounter all the time in our research is the signal X emitted by a radio transmitter and the corresponding signal Y that eventually arrives at a receiver.
• Joint Cumulative Distribution Function:
• The joint cumulative distribution function of random variables X and Y is
  F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y].
• Which region of the (X, Y) plane corresponds to the joint cumulative distribution function F_{X,Y}(x, y)? It is the semi-infinite quadrant {(u, v) : u ≤ x, v ≤ y}, everything below and to the left of the point (x, y).
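The definition lends itself to a quick numerical check. Below is a minimal Python sketch that estimates F_{X,Y}(x, y) as the fraction of sample pairs with X ≤ x and Y ≤ y; the dependent Gaussian pair is an illustrative assumption for the demo, not something from the slides.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Illustrative dependent pair: Y tracks X plus noise (demo assumption).
x_s = rng.normal(0.0, 1.0, n)
y_s = x_s + rng.normal(0.0, 0.5, n)

def joint_cdf(x, y):
    # Monte Carlo estimate of F_{X,Y}(x, y) = P[X <= x, Y <= y]
    return np.mean((x_s <= x) & (y_s <= y))

print(joint_cdf(0.0, 0.0))  # close to P[X <= 0, Y <= 0]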
• Theorem 4.1: For any pair of random variables X, Y:
  (a) 0 ≤ F_{X,Y}(x, y) ≤ 1;
  (b) F_X(x) = F_{X,Y}(x, ∞) and F_Y(y) = F_{X,Y}(∞, y);
  (c) F_{X,Y}(−∞, y) = F_{X,Y}(x, −∞) = 0;
  (d) if x ≤ x₁ and y ≤ y₁, then F_{X,Y}(x, y) ≤ F_{X,Y}(x₁, y₁);
  (e) F_{X,Y}(∞, ∞) = 1.
• Quiz 4.1:

• Solution:
• Joint Probability Mass Function:
  – The joint probability mass function of discrete random variables X and Y is
    P_{X,Y}(x, y) = P[X = x, Y = y].
• Example 4.1
Test two integrated circuits one after the other. On each
test, the possible outcomes are a (accept) and r (reject).
Assume that all circuits are acceptable with probability
0.9 and that the outcomes of successive tests are
independent. Count the number of acceptable circuits X
and count the number of successful tests Y before you
observe the first reject. (If both tests are successful, let Y
= 2.) Draw a tree diagram for the experiment and find the
joint PMF of X and Y.
• Solution:
• Solution (continued):
• The joint PMF can be represented in the following three ways
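Those representations do not survive the export, but the joint PMF itself is easy to rebuild. Here is a short Python sketch that enumerates the tree of Example 4.1 (p = 0.9, two independent tests) and accumulates P_{X,Y}(x, y):

from itertools import product

p = 0.9  # probability a circuit is acceptable (Example 4.1)
pmf = {}
for outcome in product("ar", repeat=2):  # tree leaves: aa, ar, ra, rr
    prob = 1.0
    for result in outcome:
        prob *= p if result == "a" else 1 - p
    x = outcome.count("a")  # number of acceptable circuits
    # Y = number of successful tests before the first reject
    y = 2 if "r" not in outcome else outcome.index("r")
    pmf[(x, y)] = pmf.get((x, y), 0.0) + prob

for (x, y), prob in sorted(pmf.items()):
    print(f"P[X={x}, Y={y}] = {prob:.2f}")
# P[X=0,Y=0]=0.01, P[X=1,Y=0]=0.09, P[X=1,Y=1]=0.09, P[X=2,Y=2]=0.81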
• Theorem 4.2: For discrete random variables X and Y and any event B consisting of pairs (x, y),
  P[B] = Σ_{(x,y)∈B} P_{X,Y}(x, y).
• Example 4.2
• Continuing Example 4.1, find the probability of the event B that X,
the number of acceptable circuits, equals Y, the number of tests
before observing the first failure.
• Solution:
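A sketch of the computation, reusing the joint PMF derived above; the event B = {X = Y} collects the pairs (0, 0), (1, 1) and (2, 2):

# Joint PMF of Example 4.1 (computed in the earlier sketch)
pmf = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}
prob_B = sum(p for (x, y), p in pmf.items() if x == y)
print(prob_B)  # 0.01 + 0.09 + 0.81 = 0.91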
• Quiz 4.2
• Solution
• Marginal PMF
• In an experiment that produces two random variables X
and Y, it is always possible to consider one of the random
variables, Y, and ignore the other one, X.
• Two ways:
  – use the methods of Chapter 2 to analyze the experiment and derive P_Y(y) directly;
  – if we have already analyzed the experiment to derive the joint PMF P_{X,Y}(x, y), it is convenient to derive P_Y(y) from P_{X,Y}(x, y).
• Theorem 4.3: For discrete random variables X and Y with joint PMF P_{X,Y}(x, y), the marginal PMFs are
  P_X(x) = Σ_{y∈S_Y} P_{X,Y}(x, y),  P_Y(y) = Σ_{x∈S_X} P_{X,Y}(x, y).
• Example 4.3

• Solution:
• Solution (continued):
• For the PMF of Y
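Since Example 4.3's table is missing from the export, here is the same marginalization applied to the Example 4.1 joint PMF: summing over the other variable's values gives each marginal.

from collections import defaultdict

pmf = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}
px, py = defaultdict(float), defaultdict(float)
for (x, y), p in pmf.items():
    px[x] += p  # P_X(x) = sum over y of P_{X,Y}(x, y)
    py[y] += p  # P_Y(y) = sum over x of P_{X,Y}(x, y)
print(dict(px))  # P_X ≈ {0: 0.01, 1: 0.18, 2: 0.81}
print(dict(py))  # P_Y ≈ {0: 0.10, 1: 0.09, 2: 0.81}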
• Joint Probability Density Function:
  – The joint PDF of continuous random variables X and Y is a function f_{X,Y}(x, y) with the property
    F_{X,Y}(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(u, v) dv du,
    or equivalently, where the derivative exists, f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y)/∂x∂y.
• Example 4.4
• Solution:
• Marginal PDF:
  – The marginal PDFs of X and Y are obtained by integrating out the other variable:
    f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy,  f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx.
• Example 4.7
• Solution:
• Solution (continued):
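Example 4.7's density is missing from the export, so the sketch below uses a hypothetical joint PDF f_{X,Y}(x, y) = x + y on the unit square (it integrates to 1) to show a marginal PDF computed by numerical integration; analytically f_X(x) = x + 1/2.

from scipy.integrate import quad

def f_xy(x, y):
    # Hypothetical joint PDF: f(x, y) = x + y on 0 <= x, y <= 1
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def f_x(x):
    # Marginal PDF: integrate the joint PDF over all y
    return quad(lambda y: f_xy(x, y), 0.0, 1.0)[0]

print(f_x(0.3))  # ≈ 0.8, matching x + 1/2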
• Functions of Two Random Variables:
  – A function W = g(X, Y) of two random variables is itself a random variable. For discrete X and Y, its PMF is
    P_W(w) = Σ_{(x,y): g(x,y)=w} P_{X,Y}(x, y).
• Example 4.8
• Solution:
• Theorem 4.11:

• Example 4.9:

• Solution:
• Solution (continued):
• Expected Values:
  – For random variables X and Y, the expected value of W = g(X, Y) is
    E[W] = Σ_{x∈S_X} Σ_{y∈S_Y} g(x, y) P_{X,Y}(x, y)   (discrete case),
    E[W] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f_{X,Y}(x, y) dx dy   (continuous case).
• Example 4.11: Self study
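A minimal sketch of the discrete formula, applied to the Example 4.1 joint PMF; the helper works for any g(x, y):

pmf = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}  # Example 4.1

def expected_value(g, pmf):
    # E[g(X, Y)] = sum of g(x, y) * P_{X,Y}(x, y) over the support
    return sum(g(x, y) * p for (x, y), p in pmf.items())

print(expected_value(lambda x, y: x + y, pmf))  # E[X + Y] = 3.51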


• Theorem 4.14: For any two random variables X and Y, the expected value of the sum equals the sum of the expected values:
  E[X + Y] = E[X] + E[Y].
• Theorem 4.15: The variance of the sum of two random variables is
  Var[X + Y] = Var[X] + Var[Y] + 2 E[(X − μ_X)(Y − μ_Y)],
  where the cross term is the covariance defined below.
• Proof: Self study
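The identity is easy to check numerically; here is a sketch using the Example 4.1 joint PMF:

pmf = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}  # Example 4.1
E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())  # E[g(X, Y)]
mx, my = E(lambda x, y: x), E(lambda x, y: y)  # means of X and Y
var_x = E(lambda x, y: (x - mx) ** 2)
var_y = E(lambda x, y: (y - my) ** 2)
cov = E(lambda x, y: (x - mx) * (y - my))
var_sum = E(lambda x, y: (x + y - mx - my) ** 2)  # Var[X + Y] directly
print(abs(var_sum - (var_x + var_y + 2 * cov)) < 1e-12)  # True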


• Covariance:
  – The covariance of two random variables X and Y is
    Cov[X, Y] = E[(X − μ_X)(Y − μ_Y)].
• Correlation:
  – The correlation of X and Y is
    r_{X,Y} = E[XY].
• Theorem 4.16: Cov[X, Y] = r_{X,Y} − μ_X μ_Y = E[XY] − E[X] E[Y].
• Example 4.12:

• Solution:
• Orthogonal Random Variables:
  – Random variables X and Y are orthogonal if r_{X,Y} = E[XY] = 0.
• Uncorrelated Random Variables:
– Random variables X and Y are uncorrelated if Cov[X, Y] = 0.
• Correlation Coefficient:
  – The correlation coefficient of two random variables X and Y is
    ρ_{X,Y} = Cov[X, Y] / (σ_X σ_Y).
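A sketch computing all three quantities for the Example 4.1 pair; since Y closely tracks X, the coefficient comes out near 1:

pmf = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}  # Example 4.1
E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())
mx, my = E(lambda x, y: x), E(lambda x, y: y)
r_xy = E(lambda x, y: x * y)  # correlation r_{X,Y} = E[XY]
cov_xy = r_xy - mx * my  # Theorem 4.16: Cov = r - mu_X * mu_Y
sx = E(lambda x, y: (x - mx) ** 2) ** 0.5  # sigma_X
sy = E(lambda x, y: (y - my) ** 2) ** 0.5  # sigma_Y
print(r_xy, cov_xy, cov_xy / (sx * sy))  # ≈ 3.33, 0.252, 0.93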
• Independent Random Variables:
  – Random variables X and Y are independent if and only if
    discrete: P_{X,Y}(x, y) = P_X(x) P_Y(y) for all x, y;
    continuous: f_{X,Y}(x, y) = f_X(x) f_Y(y) for all x, y.
• Example 4.23

• Solution:
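Example 4.23's figures are lost in the export; as a substitute, here is a generic factorization check applied to the Example 4.1 PMF, which fails because X and Y are clearly dependent:

pmf = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}  # Example 4.1
xs = {x for x, _ in pmf}
ys = {y for _, y in pmf}
px = {x: sum(pmf.get((x, y), 0.0) for y in ys) for x in xs}
py = {y: sum(pmf.get((x, y), 0.0) for x in xs) for y in ys}
# Independence requires P_{X,Y}(x, y) = P_X(x) P_Y(y) for every (x, y)
independent = all(abs(pmf.get((x, y), 0.0) - px[x] * py[y]) < 1e-12
                  for x in xs for y in ys)
print(independent)  # False: e.g. P[2,2] = 0.81 but P_X(2) P_Y(2) = 0.6561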
• Theorem 4.27: If X and Y are independent random variables, then
  (a) E[g(X) h(Y)] = E[g(X)] E[h(Y)];
  (b) r_{X,Y} = E[XY] = E[X] E[Y];
  (c) Cov[X, Y] = ρ_{X,Y} = 0;
  (d) Var[X + Y] = Var[X] + Var[Y].
• Example 4.25:

• Solution:
That is the end of Chapter 4.

The syllabus for the mid-semester exam covers material up to this slide.

Best wishes for your mid-semester exam!
