
CHAPTER 5

Multiple Random Variables

5.1. Joint Distributions: Two Random Variables

In real life, we are often interested in several random variables that are related to each other. For
example, suppose that we choose a random family, and we would like to study the number of
people in the family, the household income, the ages of the family members, etc. Each of these is a
random variable, and we suspect that they are dependent. In this chapter, we develop tools to study
joint distributions of random variables. The concepts are similar to those introduced in the
previous chapter; the only difference is that instead of one random variable, we consider two or
more. We will focus on two random variables here, but once you understand the theory for two,
the extension to n random variables is straightforward. We will first
discuss joint distributions of discrete random variables and then extend the results to continuous
random variables.

5.1.1 Joint Probability Mass Function (pmf) and Joint Probability Density Function (pdf)

Suppose X and Y are random variables on the probability space (S, A, P(·)). Their individual
distribution functions f_X(x) and f_Y(y) contain information about their separate probabilities, but
how do we describe their behavior relative to each other? We think of X and Y as components of a
random vector (X, Y) taking values in R², rather than as unrelated random variables each taking
values in R.

Remember that for a discrete random variable X, we define the PMF as P_X(x) = P(X = x). Now, if
we have two random variables X and Y, and we would like to study them jointly, we define the
joint probability mass function as follows:

The joint probability mass function of two discrete random variables X and Y is defined as

P_XY(x, y) = P(X = x, Y = y)

Properties of the joint pmf and pdf

i) 0 ≤ P_XY(x, y) ≤ 1 for all x, y
ii) Total probability is 1, i.e.

∑_x ∑_y P_XY(x, y) = 1
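To make these properties concrete, here is a minimal Python sketch (not part of the original notes); the joint PMF is stored as a dictionary keyed by (x, y), and the probability values are invented purely for illustration.

# Minimal sketch: a made-up joint PMF P_XY stored as a dictionary keyed by (x, y).
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.20, (1, 1): 0.30, (1, 2): 0.10,
}

# Property (i): every probability lies between 0 and 1.
assert all(0.0 <= p <= 1.0 for p in joint_pmf.values())

# Property (ii): the probabilities sum to 1 over all (x, y) pairs.
assert abs(sum(joint_pmf.values()) - 1.0) < 1e-9

# A joint probability such as P(X = 0, Y = 1) is then a simple lookup.
print(joint_pmf[(0, 1)])   # 0.2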

Continuous case: probability density function (pdf)


i) f(x, y) ≥ 0 for all x, y
ii) Total probability is 1, i.e.

∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1
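For the continuous case, the same two properties can be checked numerically. The sketch below (again not from the original notes) assumes the hypothetical density f(x, y) = x + y on the unit square and verifies that it integrates to 1 using scipy.

# Sketch: verify the pdf properties for an assumed density f(x, y) = x + y on [0, 1] x [0, 1].
from scipy.integrate import dblquad

def f(x, y):
    return x + y   # nonnegative on the unit square, so property (i) holds there

# Property (ii): the double integral over the support equals 1.
# dblquad integrates func(y, x), hence the swapped arguments in the lambda.
total, _err = dblquad(lambda y, x: f(x, y), 0, 1, lambda x: 0, lambda x: 1)
print(total)   # approximately 1.0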

Marginal PMFs

The joint PMF contains all the information regarding the distributions of X and Y. This means
that, for example, we can obtain the PMF of X from its joint PMF with Y. Indeed, we can write

P_X(x) = P(X = x)

       = ∑_y P_XY(x, y)     (law of total probability)

Here, we call P_X(x) the marginal PMF of X. Similarly, we can find the marginal PMF of Y as

P_Y(y) = ∑_x P_XY(x, y)
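As a sketch of how marginalization works in practice (using the same made-up joint PMF as in the earlier sketch, not a table from the notes), summing the joint PMF over y gives P_X and summing over x gives P_Y:

from collections import defaultdict

# Same made-up joint PMF as in the earlier sketch.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.20, (1, 1): 0.30, (1, 2): 0.10,
}

def marginals(joint_pmf):
    """Marginal PMFs of X and Y obtained from the joint PMF."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint_pmf.items():
        px[x] += p   # P_X(x) = sum over y of P_XY(x, y)
        py[y] += p   # P_Y(y) = sum over x of P_XY(x, y)
    return dict(px), dict(py)

px, py = marginals(joint_pmf)
print(px)   # approximately {0: 0.4, 1: 0.6}
print(py)   # approximately {0: 0.3, 1: 0.5, 2: 0.2}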

Marginal PDFs

The joint PDF contains all the information regarding the distributions of X and Y. This means
that, for example, we can obtain the PDF of X from its joint PDF with Y. Indeed, we can write

f_X(x) = ∫_{-∞}^{∞} f_XY(x, y) dy

Similarly, we can find the marginal PDF of Y as

f_Y(y) = ∫_{-∞}^{∞} f_XY(x, y) dx
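The continuous analogue can be carried out symbolically. The sketch below (assumed density f_XY(x, y) = x + y on the unit square, not a density from the notes) integrates out the other variable with sympy:

import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f_xy = x + y   # assumed joint pdf on 0 <= x <= 1, 0 <= y <= 1

f_x = sp.integrate(f_xy, (y, 0, 1))   # marginal pdf of X: x + 1/2
f_y = sp.integrate(f_xy, (x, 0, 1))   # marginal pdf of Y: y + 1/2
print(f_x, f_y)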

5.1.2 Conditioning and Independence

We have discussed conditional probability before, and you have already seen some problems
regarding random variables and conditional probability. Here, we will discuss conditioning for
random variables in more detail and introduce the conditional PMF, conditional CDF, and
conditional expectation. We would like to emphasize that there is only one main formula regarding
conditional probability, which is

P(A | B) = P(A ∩ B) / P(B),     where P(B) > 0

Any other formula regarding conditional probability can be derived from the above formula.
Specifically, if you have two random variables X and Y, you can write

P(X ∈ C | Y ∈ D) = P(X ∈ C, Y ∈ D) / P(Y ∈ D),     where C, D ⊆ R

Conditional PMF

Remember that the PMF is by definition a probability measure, i.e., it is P(X = x_i). Thus, we can
talk about the conditional PMF. Specifically, the conditional PMF of X given an event A is defined
as

P_{X|A}(x_i) = P(X = x_i | A) = P(X = x_i and A) / P(A)
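As an illustration (same made-up joint PMF as in the earlier sketches), the conditional PMF of X given the event A = {Y ≤ 1} can be computed directly from this formula:

# Made-up joint PMF from the earlier sketches.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.20, (1, 1): 0.30, (1, 2): 0.10,
}

# P(A) for the conditioning event A = {Y <= 1}.
p_A = sum(p for (_x, y), p in joint_pmf.items() if y <= 1)

# P_{X|A}(x) = P(X = x and A) / P(A) for each value of X.
cond_pmf = {
    x0: sum(p for (x, y), p in joint_pmf.items() if x == x0 and y <= 1) / p_A
    for x0 in sorted({x for (x, _y) in joint_pmf})
}
print(cond_pmf)   # approximately {0: 0.375, 1: 0.625}, since P(A) = 0.8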

Conditional PMF of X Given Y

In some problems, we have observed the value of a random variable Y, and we need to update the
PMF of another random variable X whose value has not yet been observed. In these problems, we
use the conditional PMF of X given Y. The conditional PMF of X given Y is defined as

P_{X|Y}(x | y) = P_XY(x, y) / P_Y(y)

             = P(X = x, Y = y) / P(Y = y)

Similarly, we can define the conditional PMF of Y given X:

P_{Y|X}(y | x) = P_XY(x, y) / P_X(x)

             = P(X = x, Y = y) / P(X = x)
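The sketch below (again with the made-up joint PMF, not data from the notes) computes the conditional PMF of X given an observed value Y = 1; the Y-given-X direction is obtained the same way with the roles of x and y swapped:

# Made-up joint PMF from the earlier sketches.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.20, (1, 1): 0.30, (1, 2): 0.10,
}

def cond_pmf_x_given_y(joint_pmf, y_obs):
    """P_{X|Y}(x | y_obs) = P_XY(x, y_obs) / P_Y(y_obs)."""
    p_y = sum(p for (_x, y), p in joint_pmf.items() if y == y_obs)
    return {x: p / p_y for (x, y), p in joint_pmf.items() if y == y_obs}

print(cond_pmf_x_given_y(joint_pmf, 1))   # approximately {0: 0.4, 1: 0.6}, since P_Y(1) = 0.5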

Independent Random Variables

We have defined independent random variables previously. Now that we have seen the joint PMF,
we can restate the definition in terms of it. Two discrete random variables X and Y are independent
if

P_XY(x, y) = P_X(x) P_Y(y)

i.e., P(X = x, Y = y) = P(X = x) P(Y = y)     for all x, y

So, if X and Y are independent, we have

P_{X|Y}(x | y) = P_XY(x, y) / P_Y(y)

             = P_X(x) P_Y(y) / P_Y(y)

             = P_X(x)

As we expect, for independent random variables, the conditional PMF is equal to the marginal
PMF. In other words, knowing the value of Y does not provide any information about X.
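A direct way to test this numerically is to compare the joint PMF with the product of the marginals at every (x, y) pair. The sketch below does this for the made-up joint PMF used in the earlier sketches, which turns out not to be independent:

from math import isclose

# Made-up joint PMF from the earlier sketches.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.20, (1, 1): 0.30, (1, 2): 0.10,
}

px = {x0: sum(p for (x, _y), p in joint_pmf.items() if x == x0) for x0 in (0, 1)}
py = {y0: sum(p for (_x, y), p in joint_pmf.items() if y == y0) for y0 in (0, 1, 2)}

# X and Y are independent iff P_XY(x, y) = P_X(x) * P_Y(y) for every pair.
independent = all(isclose(p, px[x] * py[y]) for (x, y), p in joint_pmf.items())
print(independent)   # False: e.g. P_XY(0, 0) = 0.10 but P_X(0) * P_Y(0) = 0.4 * 0.3 = 0.12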

Example:

Consider two random variables X and Y with joint PMF given below

X\Y      Y = 0     Y = 1     Y = 2

X = 0

X = 1

(the numerical entries of the table were not recovered)

A) Find P(X = 0, Y ≤ 1).
B) Find the marginal PMFs of X and Y.
C) Find P(Y = 1 | X = 0).
D) Are X and Y independent?
Example: Let X and Y be jointly continuous random variables with pdf

f_XY(x, y) = {   (piecewise definition not recovered)

A) Find the constant C.
B) Find ( , ).
C) Find the marginal PDFs of X and Y.
D) ( )
E)
F) Find E(X), E(Y), E(XY).
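Since the pdf for this example did not survive in the source, here is a sketch of how parts A, C and F would be worked for an assumed density f_XY(x, y) = C·x·y on the unit square (the density and the resulting numbers are illustrative only):

import sympy as sp

x, y, C = sp.symbols('x y C', positive=True)
f_xy = C * x * y   # assumed joint pdf on 0 <= x <= 1, 0 <= y <= 1

# A) Choose C so that the total probability is 1.
C_val = sp.solve(sp.Eq(sp.integrate(f_xy, (x, 0, 1), (y, 0, 1)), 1), C)[0]   # C = 4
f_xy = f_xy.subs(C, C_val)

# C) Marginal pdfs by integrating out the other variable.
f_x = sp.integrate(f_xy, (y, 0, 1))   # 2*x
f_y = sp.integrate(f_xy, (x, 0, 1))   # 2*y

# F) Expectations E(X), E(Y) and E(XY).
E_X = sp.integrate(x * f_x, (x, 0, 1))                      # 2/3
E_Y = sp.integrate(y * f_y, (y, 0, 1))                      # 2/3
E_XY = sp.integrate(x * y * f_xy, (x, 0, 1), (y, 0, 1))     # 4/9
print(C_val, f_x, f_y, E_X, E_Y, E_XY)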

Exercise: Let X and Y be jointly continuous random variables with pdf

f_XY(x, y) = {   (piecewise definition not recovered)

A) Find the constant C.
B) Find the marginal PDFs f_X(x) and f_Y(y).
C) Find P(Y ≤ ).
D) Find P(Y ≤  | Y ≤ ).
