03-Multiple Random Variables

This document covers Chapter 3 of a course on Probability and Random Processes, focusing on multiple random variables. It discusses concepts such as joint distribution functions, marginal statistics, independence, conditional distributions, and correlation. Additionally, it includes examples and properties related to joint probability density functions and mass functions for both continuous and discrete random variables.


Addis Ababa Science & Technology University

Department of Electrical & Computer Engineering

Probability and Random Process (EEEg-2114)

Chapter 3: Multiple Random Variables


Outline
• Introduction
• Bivariate Random Variables
• Joint Distribution Functions
• Discrete Random Variables – Joint pmf
• Continuous Random Variables – Joint pdf
• Marginal Statistics (obtaining individual distributions from the joint)
• Independence
• Conditional Distributions
• Correlation, Covariance, and the Correlation Coefficient
Introduction
In many cases it is more natural to describe the outcome of a random experiment by two or more numerical values simultaneously. For example:
• the characterization of both the weight and the height of individuals in a given population,
• the study of temperature and pressure variations in a physical experiment.
In these situations, two or more random variables are considered jointly, and the description of their joint behavior is our concern.
Joint descriptions also help us compute probabilities involving the output of systems with two (and sometimes three or more) random inputs.
Vector Random Variables
• Suppose two random variables X and Y are defined on a sample space S, where specific values of X and Y are denoted by x and y, respectively.
• Then any ordered pair of numbers (x, y) may conveniently be considered a random point in the xy-plane.
• The point may be taken as a specific value of a vector random variable, or random vector.
• The plane of all points (x, y) in the ranges of X and Y may be considered a new sample space.
• It is in reality a vector space where the components of any vector are the values of the random variables X and Y.
NB:
• A bivariate random variable, like the above, maps the sample space to a point in a plane (i.e., 2-D).
• A trivariate random variable maps outcomes to a point in 3-D space.
• The new space has been called the range sample space or the two-dimensional product space.
• As in the case of one random variable, we define an event A for X and a similar event B for Y:
A = \{X \le x\}, \quad B = \{Y \le y\}
• Events A and B refer to the sample space S, while events of the form \{X \le x, Y \le y\} refer to the joint sample space.
Fig. 3.1 (X, Y) as a function from Ω to the plane.
Fig. 3.2 Comparison of events in Ω with those in the joint sample space.
The Joint Cumulative Distribution Function (JCDF)
• The joint cdf of two random variables X and Y, denoted by F_XY(x, y), is the function defined by:
F_{XY}(x, y) = P[X(\zeta) \le x \ \text{and}\ Y(\zeta) \le y]
\Rightarrow F_{XY}(x, y) = P(X \le x, Y \le y) = P(A \cap B)
where x and y are arbitrary real numbers.

Properties of the joint cdf F_XY(x, y):
i. 0 \le F_{XY}(x, y) \le 1
ii. \lim_{x \to \infty,\, y \to \infty} F_{XY}(x, y) = F_{XY}(\infty, \infty) = 1
iii. \lim_{x \to -\infty,\, y \to -\infty} F_{XY}(x, y) = F_{XY}(-\infty, -\infty) = 0
The Joint Probability Density Function (JPDF)
• The joint probability density function (JPDF) of two continuous random variables X and Y, in terms of the JCDF, is defined as:
f_{XY}(x, y) = \frac{\partial^2 F_{XY}(x, y)}{\partial x \, \partial y}
• Thus, the joint cumulative distribution function (JCDF) in terms of the joint pdf is given by:
F_{XY}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{XY}(u, v) \, dv \, du
The Joint Probability Density Function Cont'd…

Properties of the joint pdf f_XY(x, y):
1. f_{XY}(x, y) \ge 0
2. \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y) \, dx \, dy = 1
3. P(x_1 < X \le x_2, \, y_1 < Y \le y_2) = \int_{y_1}^{y_2} \int_{x_1}^{x_2} f_{XY}(x, y) \, dx \, dy
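These properties can be checked numerically for a sample density. As a sketch (using f(x, y) = 4xy on the unit square, which is the density of Example 1 later in these slides with its constant already filled in), scipy's `dblquad` verifies the normalization and a rectangle probability:

```python
# Numerical check of joint-pdf properties for the sample density
# f(x, y) = 4xy on the unit square.
from scipy.integrate import dblquad

f = lambda y, x: 4 * x * y  # dblquad integrates over y first

# Property 2: total probability integrates to 1
total, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)

# Property 3: P(0 < X <= 1/2, 0 < Y <= 1/2)
p_rect, _ = dblquad(f, 0, 0.5, lambda x: 0, lambda x: 0.5)

print(total)   # ~1.0
print(p_rect)  # ~0.0625, i.e. (1/4)(1/4)
```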
The Joint Probability Mass Function (JPMF)
• The joint probability mass function (pmf) of two discrete random variables X and Y is defined as:
P_{XY}(x_i, y_j) = P(X = x_i, Y = y_j)
• The joint cdf can be written as:
F_{XY}(x, y) = \sum_{x_i \le x} \sum_{y_j \le y} P_{XY}(x_i, y_j)

Properties of the joint pmf P_XY(x_i, y_j):
1. 0 \le P_{XY}(x_i, y_j) \le 1
2. \sum_{x_i} \sum_{y_j} P_{XY}(x_i, y_j) = 1
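For a finite pmf these properties are easy to verify in code. A minimal sketch, storing the joint pmf as a dict keyed by (x, y) and using the table P(x, y) = (2x + y)/18 for x, y ∈ {1, 2} (this is the pmf of Example 2, ahead):

```python
# Sanity checks on a joint pmf stored as a dict: (x, y) -> probability.
# Uses P(x, y) = (2x + y)/18 for x, y in {1, 2} (Example 2 in these slides).
pmf = {(x, y): (2 * x + y) / 18 for x in (1, 2) for y in (1, 2)}

assert all(0 <= p <= 1 for p in pmf.values())   # property 1
assert abs(sum(pmf.values()) - 1.0) < 1e-12     # property 2
```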
Marginal Distribution Functions
• The distribution function of one random variable can be obtained by setting the value of the other variable to infinity in F_XY(x, y). The functions F_X(x) or F_Y(y) obtained in this manner are called marginal distribution functions.

• Thus, we conclude the following:
i. Marginal cdf of X and Y
F_X(x) = \lim_{y \to \infty} F_{XY}(x, y) = F_{XY}(x, \infty) \qquad [\,= P(A \cap S) = P(A)\,]
F_Y(y) = \lim_{x \to \infty} F_{XY}(x, y) = F_{XY}(\infty, y)

ii. Marginal pdf of X and Y
f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y) \, dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y) \, dx

iii. Marginal pmf of X and Y
P(X = x_i) = P_X(x_i) = \sum_{y_j} P_{XY}(x_i, y_j)
P(Y = y_j) = P_Y(y_j) = \sum_{x_i} P_{XY}(x_i, y_j)
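Continuing the dict-based sketch (same table P(x, y) = (2x + y)/18 from Example 2, ahead), the marginal pmfs fall out by summing the joint pmf over the other variable:

```python
# Marginal pmfs: sum the joint pmf over the other variable.
pmf = {(x, y): (2 * x + y) / 18 for x in (1, 2) for y in (1, 2)}

p_x = {x: sum(p for (xi, _), p in pmf.items() if xi == x) for x in (1, 2)}
p_y = {y: sum(p for (_, yj), p in pmf.items() if yj == y) for y in (1, 2)}

print(p_x[1], p_x[2])  # 7/18 and 11/18, i.e. (4x + 3)/18
print(p_y[1], p_y[2])  # 8/18 and 10/18, i.e. (2y + 6)/18
```

These match the closed forms derived in Example 2 below.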
Independence
• If two random variables X and Y are independent, then:
i. from the joint cdf:
F_{XY}(x, y) = F_X(x) F_Y(y) \qquad [\,P(A \cap B) = P(A) P(B)\,]

ii. from the joint pdf:
f_{XY}(x, y) = f_X(x) f_Y(y)

iii. from the joint pmf:
P_{XY}(x_i, y_j) = P_X(x_i) P_Y(y_j)
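For a finite pmf, independence can be tested by brute force: compare the joint value against the product of the marginals at every point. A sketch, again assuming the table P(x, y) = (2x + y)/18:

```python
# Brute-force independence check: does the joint pmf factor into marginals?
from math import isclose

pmf = {(x, y): (2 * x + y) / 18 for x in (1, 2) for y in (1, 2)}
p_x = {x: sum(p for (xi, _), p in pmf.items() if xi == x) for x in (1, 2)}
p_y = {y: sum(p for (_, yj), p in pmf.items() if yj == y) for y in (1, 2)}

independent = all(
    isclose(pmf[(x, y)], p_x[x] * p_y[y]) for x in (1, 2) for y in (1, 2)
)
print(independent)  # False: this joint pmf does not factor
```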
Conditional Distributions
i. Conditional probability density functions:
f_{X|Y}(x|y) = \frac{f_{XY}(x, y)}{f_Y(y)}, \quad f_Y(y) > 0 \qquad \left[\,P(A|B) = \frac{P(A \cap B)}{P(B)}\,\right]
f_{Y|X}(y|x) = \frac{f_{XY}(x, y)}{f_X(x)}, \quad f_X(x) > 0

ii. Conditional probability mass functions:
P_{X|Y}(x_i|y_j) = \frac{P_{XY}(x_i, y_j)}{P_Y(y_j)}, \quad P_Y(y_j) > 0
P_{Y|X}(y_j|x_i) = \frac{P_{XY}(x_i, y_j)}{P_X(x_i)}, \quad P_X(x_i) > 0
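The discrete formula translates directly to code. A sketch of P(X = x | Y = y), still assuming the table P(x, y) = (2x + y)/18; note that each conditional pmf must itself sum to 1:

```python
# Conditional pmf P(X = x | Y = y) = P(x, y) / P_Y(y).
pmf = {(x, y): (2 * x + y) / 18 for x in (1, 2) for y in (1, 2)}
p_y = {y: sum(p for (_, yj), p in pmf.items() if yj == y) for y in (1, 2)}

def cond_x_given_y(x, y):
    return pmf[(x, y)] / p_y[y]

# Each conditional pmf sums to 1 over x (up to rounding):
total = cond_x_given_y(1, 1) + cond_x_given_y(2, 1)
print(total)
```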
Joint Moments
• The (k, n)th moment of a bivariate r.v. (X, Y) is defined by:
m_{kn} = E[X^k Y^n]
• If n = 0, we obtain the kth moment of X, and if k = 0, we obtain the nth moment of Y.
• If (X, Y) is a discrete bivariate r.v., the mean of X is given by:
E[X] = m_{10} = \sum_{x_i} \sum_{y_j} x_i \, P_{XY}(x_i, y_j) = \sum_{x_i} x_i \, P_X(x_i)
• Similarly, we have:
E[Y] = m_{01} = \sum_{x_i} \sum_{y_j} y_j \, P_{XY}(x_i, y_j) = \sum_{y_j} y_j \, P_Y(y_j)
• If (X, Y) is a continuous bivariate r.v., the mean of each variable is:
E[X] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x \, f_{XY}(x, y) \, dx \, dy = \int_{-\infty}^{\infty} x \, f_X(x) \, dx
• Similarly, we have:
E[Y] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y \, f_{XY}(x, y) \, dx \, dy = \int_{-\infty}^{\infty} y \, f_Y(y) \, dy
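In the discrete case the (k, n)th moment is a single sum over the pmf table. A sketch, again using the hypothetical running example P(x, y) = (2x + y)/18:

```python
# Joint moments m_kn = E[X^k Y^n] of a discrete bivariate r.v.,
# computed over the pmf P(x, y) = (2x + y)/18 (Example 2 in these slides).
pmf = {(x, y): (2 * x + y) / 18 for x in (1, 2) for y in (1, 2)}

def moment(k, n):
    return sum(x**k * y**n * p for (x, y), p in pmf.items())

mean_x = moment(1, 0)  # n = 0: a moment of X alone
mean_y = moment(0, 1)  # k = 0: a moment of Y alone
print(mean_x, mean_y, moment(1, 1))  # 29/18, 14/9, and E[XY] = 2.5
```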
Correlation and Covariance
• Correlation measures how much X and Y are related to each other. It is important when one is (approximately) a linear function of the other:
R_{XY} = m_{11} = \mathrm{Cor}(X, Y) = E[XY]
If R_XY = 0, X and Y are called orthogonal.

• Covariance measures how much X and Y vary together around their means:
\sigma_{XY} = \mathrm{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = E[XY] - \mu_X \mu_Y
If σ_XY = 0, X and Y are uncorrelated.

• Sometimes we look at the correlation coefficient, which is the correlation of the "normalized" versions of the random variables:
\rho_{XY} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}, \qquad -1 \le \rho_{XY} \le 1
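These definitions can be exercised numerically. A sketch computing the covariance and correlation coefficient for the hypothetical running pmf P(x, y) = (2x + y)/18:

```python
# Covariance and correlation coefficient for the pmf
# P(x, y) = (2x + y)/18 over x, y in {1, 2}.
from math import sqrt

pmf = {(x, y): (2 * x + y) / 18 for x in (1, 2) for y in (1, 2)}
E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())

mx, my = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - mx * my          # E[XY] - mu_X mu_Y
sx = sqrt(E(lambda x, y: x * x) - mx**2)       # std of X
sy = sqrt(E(lambda x, y: y * y) - my**2)       # std of Y
rho = cov / (sx * sy)
print(cov, rho)  # slightly negative: X and Y are (weakly) correlated
```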
Examples on Two Random Variables

Example-1: The joint pdf of two continuous random variables X and Y is:
f_{XY}(x, y) = kxy for 0 < x < 1, 0 < y < 1, and 0 otherwise,
where k is a constant.
a. Find the value of k.
b. Find the marginal pdfs of X and Y.
c. Are X and Y independent?
d. Find P(X + Y < 1).
e. Find the conditional pdfs of X and Y.
f. Find the mean and the variance of X and Y.
g. Find the correlation and covariance of X and Y.
h. Find the correlation coefficient of X and Y.
Solution:
a. \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y) \, dx \, dy = 1 \;\Rightarrow\; \int_0^1 \int_0^1 kxy \, dx \, dy = 1
\Rightarrow k \int_0^1 y \left[ \frac{x^2}{2} \right]_0^1 dy = 1 \;\Rightarrow\; \frac{k}{2} \left[ \frac{y^2}{2} \right]_0^1 = \frac{k}{4} = 1
\Rightarrow k = 4
b. Marginal pdfs of X and Y
i. Marginal pdf of X:
f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y) \, dy = \int_0^1 4xy \, dy = 4x \left[ \frac{y^2}{2} \right]_0^1 = 2x
\Rightarrow f_X(x) = 2x for 0 < x < 1, and 0 otherwise.
ii. Marginal pdf of Y:
f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y) \, dx = \int_0^1 4xy \, dx = 4y \left[ \frac{x^2}{2} \right]_0^1 = 2y
\Rightarrow f_Y(y) = 2y for 0 < y < 1, and 0 otherwise.
c. f_{XY}(x, y) = 4xy = (2x)(2y) = f_X(x) f_Y(y)
\Rightarrow X and Y are independent.

d. P(X + Y < 1) = \int_0^1 \int_0^{1-y} 4xy \, dx \, dy = \int_0^1 4y \left[ \frac{x^2}{2} \right]_0^{1-y} dy
= \int_0^1 2y (1 - y)^2 \, dy = \int_0^1 2(y - 2y^2 + y^3) \, dy
= 2 \left[ \frac{y^2}{2} - \frac{2y^3}{3} + \frac{y^4}{4} \right]_0^1 = \frac{1}{6}
\Rightarrow P(X + Y < 1) = 1/6
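Part (d) can be sanity-checked by Monte Carlo simulation (a sketch, not part of the original solution). Since X and Y are independent with marginal cdf F(t) = t² on (0, 1), inverse-transform sampling gives samples as the square root of a uniform variate:

```python
# Monte Carlo check of P(X + Y < 1) = 1/6 for independent X, Y with
# pdf 2t on (0, 1): cdf F(t) = t^2, so inverse transform gives t = sqrt(u).
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = np.sqrt(rng.random(n))
y = np.sqrt(rng.random(n))

est = np.mean(x + y < 1)
print(est)  # close to 1/6 ~ 0.1667
```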
e. Conditional pdfs of X and Y
i. Conditional pdf of X:
f_{X|Y}(x|y) = \frac{f_{XY}(x, y)}{f_Y(y)} = \frac{4xy}{2y} = 2x
\Rightarrow f_{X|Y}(x|y) = 2x for 0 < x < 1, 0 < y < 1, and 0 otherwise.
ii. Conditional pdf of Y:
f_{Y|X}(y|x) = \frac{f_{XY}(x, y)}{f_X(x)} = \frac{4xy}{2x} = 2y
\Rightarrow f_{Y|X}(y|x) = 2y for 0 < x < 1, 0 < y < 1, and 0 otherwise.
Example-2: The joint pmf of two discrete random variables X and Y is:
P_{XY}(x_i, y_j) = k(2x_i + y_j) for x_i = 1, 2 and y_j = 1, 2, and 0 otherwise,
where k is a constant.
a. Find the value of k.
b. Find the marginal pmfs of X and Y.
c. Are X and Y independent?
d. Find the mean and the variance of X.
e. Find the mean and the variance of Y.
f. Find the covariance of X and Y.
g. Find the correlation coefficient of X and Y.
Solution:
a. \sum_{x_i} \sum_{y_j} P_{XY}(x_i, y_j) = 1 \;\Rightarrow\; \sum_{x_i=1}^{2} \sum_{y_j=1}^{2} k(2x_i + y_j) = 1
\Rightarrow k[(2+1) + (2+2) + (4+1) + (4+2)] = 1 \;\Rightarrow\; 18k = 1
\Rightarrow k = 1/18
b. Marginal pmfs of X and Y
i. Marginal pmf of X:
P_X(x_i) = \sum_{y_j} P_{XY}(x_i, y_j) = \sum_{y_j=1}^{2} \frac{1}{18}(2x_i + y_j) = \frac{1}{18}(2x_i + 1) + \frac{1}{18}(2x_i + 2)
\Rightarrow P_X(x_i) = \frac{1}{18}(4x_i + 3) for x_i = 1, 2, and 0 otherwise.
ii. Marginal pmf of Y:
P_Y(y_j) = \sum_{x_i} P_{XY}(x_i, y_j) = \sum_{x_i=1}^{2} \frac{1}{18}(2x_i + y_j) = \frac{1}{18}(2 + y_j) + \frac{1}{18}(4 + y_j)
\Rightarrow P_Y(y_j) = \frac{1}{18}(2y_j + 6) for y_j = 1, 2, and 0 otherwise.
c. P_{XY}(x_i, y_j) \ne P_X(x_i) P_Y(y_j)
\Rightarrow X and Y are not independent.
Example-3: The joint pdf of two continuous random variables X and Y is given by:
f_{XY}(x, y) = k for 0 < y < x < 1, and 0 otherwise,
where k is a constant.
a. Determine the value of k.
b. Find the marginal pdfs of X and Y.
c. Are X and Y independent?
d. Find P(0 < X < 1/2).
e. Find the conditional pdfs of X and Y.
Solution:
a. \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y) \, dx \, dy = 1 \;\Rightarrow\; \int_0^1 \int_y^1 k \, dx \, dy = 1
\Rightarrow k \int_0^1 (1 - y) \, dy = k \left[ y - \frac{y^2}{2} \right]_0^1 = \frac{k}{2} = 1
\Rightarrow k = 2
b. Marginal pdfs of X and Y
i. Marginal pdf of X:
f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y) \, dy = \int_0^x 2 \, dy = \left[ 2y \right]_0^x = 2x
\Rightarrow f_X(x) = 2x for 0 < x < 1, and 0 otherwise.
ii. Marginal pdf of Y:
f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y) \, dx = \int_y^1 2 \, dx = \left[ 2x \right]_y^1 = 2(1 - y)
\Rightarrow f_Y(y) = 2(1 - y) for 0 < y < 1, and 0 otherwise.
c. f_{XY}(x, y) \ne f_X(x) f_Y(y)
\Rightarrow X and Y are not independent.

d. P(0 < X < 1/2) = \int_0^{1/2} \int_0^x f_{XY}(x, y) \, dy \, dx = \int_0^{1/2} \int_0^x 2 \, dy \, dx
= \int_0^{1/2} \left[ 2y \right]_0^x dx = \int_0^{1/2} 2x \, dx = \left[ x^2 \right]_0^{1/2} = 1/4
\Rightarrow P(0 < X < 1/2) = 1/4
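Part (d) can be confirmed numerically by integrating the constant density over the triangular region (a sketch using scipy, with the y-limits depending on x exactly as in the derivation above):

```python
# Numerical check of P(0 < X < 1/2) for the uniform-over-triangle density
# f(x, y) = 2 on 0 < y < x < 1 (Example 3).
from scipy.integrate import dblquad

# dblquad integrates y first: 0 < y < x for each x in (0, 1/2).
p, _ = dblquad(lambda y, x: 2.0, 0, 0.5, lambda x: 0, lambda x: x)
print(p)  # ~0.25
```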
e. Conditional pdfs of X and Y
i. Conditional pdf of X:
f_{X|Y}(x|y) = \frac{f_{XY}(x, y)}{f_Y(y)} = \frac{2}{2(1 - y)} = \frac{1}{1 - y}
\Rightarrow f_{X|Y}(x|y) = \frac{1}{1 - y} for 0 < y < x < 1, and 0 otherwise.
ii. Conditional pdf of Y:
f_{Y|X}(y|x) = \frac{f_{XY}(x, y)}{f_X(x)} = \frac{2}{2x} = \frac{1}{x}
\Rightarrow f_{Y|X}(y|x) = \frac{1}{x} for 0 < y < x < 1, and 0 otherwise.
Example: The joint cdf of a bivariate r.v. (X, Y) is given by:
[cdf expression not reproduced in this extract]
a. Find the marginal cdfs of X and Y.
b. Show that X and Y are independent.
c. Find P(X ≤ 1, Y ≤ 1), P(X ≤ 1), and P(Y > 1).
Solution:
Example: Consider the binary communication channel shown in Fig. 3-4 (Prob. 1.52). Let (X, Y) be a bivariate r.v., where X is the input to the channel and Y is the output of the channel. Let P(X = 0) = 0.5, P(Y = 1 | X = 0) = 0.1, and P(Y = 0 | X = 1) = 0.2.

[Fig. 3-4: Binary communication channel]

a. Find the joint pmf of (X, Y).
b. Find the marginal pmfs of X and Y.
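As a sketch of part (a) (the slides do not reproduce the worked solution here): the joint pmf follows from P(X = x, Y = y) = P(Y = y | X = x) P(X = x), using only the stated channel probabilities:

```python
# Joint pmf of (X, Y) for the binary channel: P(x, y) = P(y | x) P(x).
p_x = {0: 0.5, 1: 0.5}                      # input distribution
p_y_given_x = {(0, 0): 0.9, (1, 0): 0.1,    # transition probabilities,
               (0, 1): 0.2, (1, 1): 0.8}    # keyed as (y, x)

pmf = {(x, y): p_y_given_x[(y, x)] * p_x[x] for x in (0, 1) for y in (0, 1)}
print(pmf)  # (0,0): 0.45, (0,1): 0.05, (1,0): 0.1, (1,1): 0.4
```

The marginal pmf of Y in part (b) then follows by summing over x, exactly as in the marginal-pmf formulas earlier in the chapter.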
Exercises

4. Let X and Y be two random variables with means m_X and m_Y, variances σ_X² and σ_Y², and correlation coefficient ρ. Suppose X cannot be observed, but we are able to measure Y. We wish to estimate X by using the quantity aY, where a is a suitable constant. Assuming m_X = m_Y = 0, find the constant a that minimizes the mean squared error E[(X − aY)²]. Your answer should depend on σ_X, σ_Y, and ρ.