
ENGINEERING DATA ANALYSIS


JOINT PROBABILITY DISTRIBUTION

 So far we have studied probability models for a single discrete or continuous random variable.
 In many practical cases it is appropriate to take more than one measurement of a random observation. For
example:
1. Height and weight of a medical subject.
2. Grade on quiz 1, quiz 2, and quiz 3 of a math student.
 How are these variables related?
 This type of situation, in which several variables are measured on the same random observation, is very important and is the foundation of much of inferential statistics.

Joint Probability Distribution


 In general, if X and Y are two random variables, the probability distribution that defines their simultaneous behavior
is called a joint probability distribution.
 If X and Y are discrete, this distribution can be described with a joint probability mass function.
 If X and Y are continuous, this distribution can be described with a joint probability density function.

Joint Probability Mass function


 Definition: Let X and Y be two discrete random variables, and let S denote the two-dimensional support of X and Y.
Then the function f_XY(x, y) = P(X = x, Y = y) is a joint probability mass function (jpmf) if it satisfies the
following conditions:
1. 0 ≤ f(x, y) ≤ 1
2. Σ_(x,y)∈S f(x, y) = 1

Marginal Probability Mass Function


 Let X be a discrete random variable with support S_1, and let Y be a discrete random variable with support S_2. Let X
and Y have the joint probability mass function f(x, y) with support S. Then the probability mass function of X
alone, which is called the marginal probability mass function of X, is defined by:
f_X(x) = P(X = x) = Σ_y f(x, y),  x ∈ S_1
where, for each x in the support S_1, the summation is taken over all possible values of y. Similarly, the
probability mass function of Y alone, which is called the marginal probability mass function of Y, is defined by:
f_Y(y) = P(Y = y) = Σ_x f(x, y),  y ∈ S_2
where, for each y in the support S_2, the summation is taken over all possible values of x.

Example #1:
Suppose we toss a pair of fair, four sided dice, in which one of the dice is red and the other is black. Let X be the
outcome on the red die, and Y be the outcome on the black die.
a. Find the joint probability mass distribution of red and black die.
b. Find the marginal probability mass function of X
c. Find the marginal probability mass function of Y
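As a check on Example #1, the joint pmf and both marginals can be tabulated directly. A minimal Python sketch, using exact fractions (the dice faces 1 through 4 come from the example; the variable names are ours):

```python
from fractions import Fraction

# Joint pmf of X (red die) and Y (black die) for two fair four-sided dice.
# All 16 outcome pairs are equally likely, so f(x, y) = 1/16 for each.
f = {(x, y): Fraction(1, 16) for x in range(1, 5) for y in range(1, 5)}

# Marginal pmf of X: sum the joint pmf over all values of y (and vice versa).
f_X = {x: sum(f[(x, y)] for y in range(1, 5)) for x in range(1, 5)}
f_Y = {y: sum(f[(x, y)] for x in range(1, 5)) for y in range(1, 5)}

print(f_X)  # each value is 1/4
```

Each marginal comes out uniform on {1, 2, 3, 4}, as expected for a fair die.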

EDA: Joint Probability Distribution



Example#2:
Measurements of the length and width of rectangular plastic covers for CDs are rounded to the nearest mm (so
they are discrete). Let X denote the length and Y denote the width. The possible values of X are 129, 130, and 131
mm. The possible values of Y are 15 and 16 mm (thus, both X and Y are discrete). The joint probability mass
distribution table is given below.

a. Determine if the jpmf is valid or not.


b. Find the marginal probability mass function of X
c. Find the marginal probability mass function of Y

Expected Value and Variance


 Let X be a discrete random variable with support S_1, and let Y be a discrete random variable with support S_2. Let X
and Y have the joint probability mass function f(x, y) with support S. Then the expected values of X and Y are given
by:
E(X) = Σ_x Σ_y x f(x, y)

E(Y) = Σ_x Σ_y y f(x, y)

 Under the same assumptions, the variances of X and Y are given by:
V(X) = E(X²) − [E(X)]²
V(Y) = E(Y²) − [E(Y)]²

Example#3:
a. Find the mean of X and the mean of Y in example#1
b. Find the variance of X and the variance of Y in example#2
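For part (a), the mean and variance of X for the dice of Example #1 can be computed straight from the definitions above (part (b) needs the Example #2 table, which is not reproduced here). A short Python sketch with exact fractions; by symmetry Y gives the same values:

```python
from fractions import Fraction

# Joint pmf of the two fair four-sided dice from Example #1: f(x, y) = 1/16.
f = {(x, y): Fraction(1, 16) for x in range(1, 5) for y in range(1, 5)}

# E(X) = sum over x, y of x * f(x, y); E(X^2) analogously.
EX  = sum(x * p for (x, y), p in f.items())
EX2 = sum(x**2 * p for (x, y), p in f.items())
VX  = EX2 - EX**2   # V(X) = E(X^2) - [E(X)]^2

print(EX, VX)  # 5/2 and 5/4
```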

Joint Probability Density Function


 Let X and Y be two continuous random variables, and let S denote the two-dimensional support of X and Y. Then
the function 𝑓 (𝑥, 𝑦) is a joint probability density function (jpdf) if it satisfies the following conditions:
1. f(x, y) ≥ 0
2. ∫_−∞^∞ ∫_−∞^∞ f(x, y) dx dy = 1

Example#4: Determine if the joint pdf given below is valid or not.


1. f(x, y) = (2/21) x² y,  1 ≤ x ≤ 2 and 0 ≤ y ≤ 3.

2. f_XY(x, y) = 4y − 2x,  0 < y < 1; 0 < x < 1
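Both conditions can be checked symbolically. A sketch using sympy (assumed available); note that the normalization condition alone is not enough, since nonnegativity must hold over the whole support as well:

```python
import sympy as sp

x, y = sp.symbols('x y')

# Candidate pdf 1: f(x, y) = (2/21) x^2 y on 1 <= x <= 2, 0 <= y <= 3.
f1 = sp.Rational(2, 21) * x**2 * y
total1 = sp.integrate(f1, (x, 1, 2), (y, 0, 3))

# Candidate pdf 2: f(x, y) = 4y - 2x on the unit square.
f2 = 4*y - 2*x
total2 = sp.integrate(f2, (x, 0, 1), (y, 0, 1))

print(total1, total2)  # both integrate to 1

# Condition 1 (nonnegativity) must also be checked: f2 is negative near
# y = 0, e.g. at (x, y) = (1, 0) it equals -2, so condition 1 fails there.
print(f2.subs({x: 1, y: 0}))  # -2
```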


Marginal Probability Density Function


 The marginal probability density functions of the continuous random variables X and Y are given, respectively, by:

f_X(x) = ∫_−∞^∞ f(x, y) dy

f_Y(y) = ∫_−∞^∞ f(x, y) dx

In practice each integral reduces to an integral over the support of the variable being integrated out, since
f(x, y) = 0 outside the joint support.

Example#5: Find the marginal probability density function of the following:


1. f(x, y) = (2/21) x² y,  1 ≤ x ≤ 2 and 0 ≤ y ≤ 3
2. f(x, y) = 4y − 2x,  0 < y < 1; 0 < x < 1
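For part 1, the marginals follow by integrating out the other variable over its support. A sympy sketch (sympy assumed available):

```python
import sympy as sp

x, y = sp.symbols('x y')

# Joint pdf from part 1: f(x, y) = (2/21) x^2 y on 1 <= x <= 2, 0 <= y <= 3.
f = sp.Rational(2, 21) * x**2 * y

# Marginal of X: integrate y out over [0, 3]; marginal of Y likewise over [1, 2].
fX = sp.integrate(f, (y, 0, 3))   # (3/7) x^2 on [1, 2]
fY = sp.integrate(f, (x, 1, 2))   # (2/9) y  on [0, 3]

# Sanity check: each marginal must itself integrate to 1 over its own support.
print(sp.integrate(fX, (x, 1, 2)), sp.integrate(fY, (y, 0, 3)))  # 1 1
```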

Expected Value and Variance


 The expected value of a continuous random variable X can be found from the joint pdf of X and Y by:

E(X) = ∫_−∞^∞ x f_X(x) dx

 The expected value of a continuous random variable Y can be found from the joint pdf of X and Y by:

E(Y) = ∫_−∞^∞ y f_Y(y) dy

 The variance of a continuous random variable X can be found from the joint pdf of X and Y by:
V(X) = E(X²) − [E(X)]²
 The variance of a continuous random variable Y can be found from the joint pdf of X and Y by:
V(Y) = E(Y²) − [E(Y)]²

Example#6: Find the mean and variance of X and Y


1. f(x, y) = (2/21) x² y,  1 ≤ x ≤ 2 and 0 ≤ y ≤ 3
2. f(x, y) = 4y − 2x,  0 < y < 1; 0 < x < 1
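For part 1, the means and variances follow from the marginals computed in Example #5. A sympy sketch (sympy assumed available):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(2, 21) * x**2 * y   # joint pdf from part 1

# Marginals, then E and V from the one-dimensional definitions.
fX = sp.integrate(f, (y, 0, 3))     # (3/7) x^2 on [1, 2]
fY = sp.integrate(f, (x, 1, 2))     # (2/9) y  on [0, 3]

EX = sp.integrate(x * fX, (x, 1, 2))              # 45/28
VX = sp.integrate(x**2 * fX, (x, 1, 2)) - EX**2   # 291/3920
EY = sp.integrate(y * fY, (y, 0, 3))              # 2
VY = sp.integrate(y**2 * fY, (y, 0, 3)) - EY**2   # 1/2

print(EX, sp.simplify(VX), EY, VY)
```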


CONDITIONAL DISTRIBUTION FOR DISCRETE RANDOM VARIABLE

In the last two lessons, we've concerned ourselves with how two random variables X and Y behave jointly. We'll now turn to
investigating how one of the random variables, say Y, behaves given that another random variable, say X, has already
behaved in a certain way.

Definition. A conditional probability distribution is a probability distribution for a sub-population. That is, a conditional
probability distribution describes the probability that a randomly selected member of a sub-population has a
characteristic of interest.

Definition. The conditional probability mass function of X, given that Y = y, is defined by:

g(x|y) = f(x, y) / f_Y(y),  provided f_Y(y) > 0

Similarly, the conditional probability mass function of Y, given that X = x, is defined by:

h(y|x) = f(x, y) / f_X(x),  provided f_X(x) > 0

Example
Let X be a discrete random variable with support S1 = {0, 1}, and let Y be a discrete random variable with support S2 = {0, 1,
2}. Suppose, in tabular form, that X and Y have the following joint probability distribution f(x,y):

           x = 0    x = 1
y = 0       1/8      2/8
y = 1       2/8      1/8
y = 2       1/8      1/8
a. What is the conditional distribution of X given Y? That is, what is 𝑔(𝑥|𝑦)?
b. What is the conditional distribution of Y given X? That is, what is h(y|x)?
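The conditional pmfs for this table can be checked numerically. A Python sketch with exact fractions (the dictionary layout is our own encoding of the table):

```python
from fractions import Fraction

F = Fraction
# Joint pmf f(x, y) from the table above; keys are (x, y).
f = {(0, 0): F(1, 8), (1, 0): F(2, 8),
     (0, 1): F(2, 8), (1, 1): F(1, 8),
     (0, 2): F(1, 8), (1, 2): F(1, 8)}

# Marginals of X and Y.
fX = {x: sum(p for (a, b), p in f.items() if a == x) for x in (0, 1)}
fY = {y: sum(p for (a, b), p in f.items() if b == y) for y in (0, 1, 2)}

# Conditional pmfs: g(x|y) = f(x, y) / fY(y), h(y|x) = f(x, y) / fX(x).
g = {(x, y): f[(x, y)] / fY[y] for (x, y) in f}
h = {(y, x): f[(x, y)] / fX[x] for (x, y) in f}

print(g[(0, 0)], h[(0, 0)])  # 1/3 and 1/4
```

As property (1) below the example states, each conditional pmf sums to 1 over its sub-population.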

Conditional Distribution holds the following properties:

(1) Conditional distributions are valid probability mass functions in their own right. That is, the conditional
probabilities are between 0 and 1, inclusive:
0 ≤ 𝑔(𝑥|𝑦) ≤ 1 𝑎𝑛𝑑 0 ≤ ℎ(𝑦|𝑥) ≤ 1
and, for each subpopulation, the conditional probabilities sum to 1:

Σ_x g(x|y) = 1  and  Σ_y h(y|x) = 1
(2) In general, the conditional distribution of X given Y does not equal the conditional distribution of Y given X. That
is:
𝑔(𝑥|𝑦) ≠ ℎ(𝑦|𝑥)


Conditional Means and Variances

 Suppose X and Y are discrete random variables. Then, the conditional mean of Y given X = x is defined as:

μ_{Y|x} = E[Y|x] = Σ_y y h(y|x)

 The conditional mean of X given Y = y is defined as:

μ_{X|y} = E[X|y] = Σ_x x g(x|y)

 The conditional variance of Y given X = x is:

σ²_{Y|x} = V(Y|x) = E[Y²|x] − (E[Y|x])²

 The conditional variance of X given Y = y is:

σ²_{X|y} = V(X|y) = E[X²|y] − (E[X|y])²

Example: Calculate the following using the previous data

a. What is the conditional mean of Y given X = x?


b. What is the conditional mean of X given Y = y?
c. What is the conditional variance of Y given X = 0?
d. What is the conditional variance of X given Y = 1?
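Parts (c) and (d) follow directly from the definitions above applied to the table in the previous example. A Python sketch with exact fractions (the joint pmf dictionary re-encodes that table):

```python
from fractions import Fraction

F = Fraction
f = {(0, 0): F(1, 8), (1, 0): F(2, 8),
     (0, 1): F(2, 8), (1, 1): F(1, 8),
     (0, 2): F(1, 8), (1, 2): F(1, 8)}
fX = {x: sum(p for (a, b), p in f.items() if a == x) for x in (0, 1)}
fY = {y: sum(p for (a, b), p in f.items() if b == y) for y in (0, 1, 2)}

# E[Y|X=0] = sum_y y * h(y|0), with h(y|0) = f(0, y) / fX(0).
EY_given_X0  = sum(y * f[(0, y)] / fX[0] for y in (0, 1, 2))
EY2_given_X0 = sum(y**2 * f[(0, y)] / fX[0] for y in (0, 1, 2))
VY_given_X0  = EY2_given_X0 - EY_given_X0**2

# E[X|Y=1] and V(X|Y=1) analogously, with g(x|1) = f(x, 1) / fY(1).
EX_given_Y1  = sum(x * f[(x, 1)] / fY[1] for x in (0, 1))
EX2_given_Y1 = sum(x**2 * f[(x, 1)] / fY[1] for x in (0, 1))
VX_given_Y1  = EX2_given_Y1 - EX_given_Y1**2

print(EY_given_X0, VY_given_X0)  # 1 and 1/2
print(EX_given_Y1, VX_given_Y1)  # 1/3 and 2/9
```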

Conditional Distributions for Continuous Random Variables

Definition. Suppose X and Y are continuous random variables with joint probability density function f(x,y) and marginal
probability density functions fX(x) and fY(y), respectively.

 Then, the conditional probability density function of X given Y = y is defined as:


g(x|y) = f(x, y) / f_Y(y),  provided f_Y(y) > 0.

 Then, the conditional probability density function of Y given X = x is defined as:

h(y|x) = f(x, y) / f_X(x),  provided f_X(x) > 0.
 The conditional mean of X given Y = y is defined as:

E(X|y) = ∫_−∞^∞ x g(x|y) dx

 The conditional mean of Y given X = x is defined as:

E(Y|x) = ∫_−∞^∞ y h(y|x) dy

 The conditional variance of X given Y = y is defined as:
Var(X|y) = E[X²|y] − [E(X|y)]²


 The conditional variance of Y given X = x is defined as:


Var(Y|x) = E[Y²|x] − [E(Y|x)]²
Example
1. Suppose the continuous random variables X and Y have the following joint probability density function:
f(x, y) = 3/2,  for x² ≤ y ≤ 1 and 0 < x < 1.
a. What is the conditional probability density function of Y given X = x?
b. If X = 1/4, what is the conditional probability density function of Y?
c. If X = 1/2, what is the conditional probability density function of Y?

2. What is the conditional mean of Y given X=x?


a. What is the conditional mean of Y given X = 1/2?

3. What is the conditional variance of X given Y = y?
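These questions can be worked symbolically from the definitions above. A sympy sketch (sympy assumed available; variable names are ours):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

# Joint pdf: f(x, y) = 3/2 on the region x^2 <= y <= 1, 0 < x < 1.
f = sp.Rational(3, 2)

# Marginal of X: integrate y out over [x^2, 1].
fX = sp.integrate(f, (y, x**2, 1))        # (3/2)(1 - x^2)

# Conditional pdf of Y given X = x: uniform on [x^2, 1].
h = sp.simplify(f / fX)                   # 1/(1 - x^2)

# Conditional mean of Y given X = x, then its value at x = 1/2.
EY_x = sp.integrate(y * h, (y, x**2, 1))  # simplifies to (1 + x^2)/2
print(sp.simplify(EY_x).subs(x, sp.Rational(1, 2)))  # 5/8

# Conditional variance of X given Y = y: g(x|y) is uniform on (0, sqrt(y)).
fY = sp.integrate(f, (x, 0, sp.sqrt(y))) # (3/2) sqrt(y)
g = f / fY                                # 1/sqrt(y)
EX_y  = sp.integrate(x * g, (x, 0, sp.sqrt(y)))
EX2_y = sp.integrate(x**2 * g, (x, 0, sp.sqrt(y)))
VX_y  = sp.simplify(EX2_y - EX_y**2)      # y/12
```

Because f(x, y) is constant on its support, both conditional densities come out uniform, which makes the conditional means and variances those of uniform distributions.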
