
IE 101

ENGINEERING DATA ANALYSIS


Module 2 – Part 1
Joint Probability Distribution
Contents
I. Two or More Random Variables
i. Joint Probability Distribution
ii. Marginal Probability Distribution
iii. Conditional Probability Distribution
iv. More than Two Random Variables*
II. Linear Functions of Random Variables*
III. General Functions of Random Variables*

Note: *Optional topics; we will discuss only an overview of the concepts and of the solution approach.
Joint Probability Distribution
In Module 1 we studied probability distributions for a single random variable.
However, it is often useful to have more than one random variable defined in a
random experiment. For example:
• In the classification of transmitted and received signals, each signal can be
classified as high, medium, or low quality. We might define the random
variable X to be the number of high-quality signals received and the random
variable Y to be the number of low-quality signals received.
• The continuous random variable X can denote the length of one dimension
of an injection-molded part, and the continuous random variable Y might
denote the length of another dimension. We might be interested in
probabilities that can be expressed in terms of both X and Y.
Joint Probability Distribution
If the specifications for X are 2.95 to 3.05 and for Y are 7.60 to 7.80, we might be
interested in the probability that a part satisfies both specifications, that is,
P(2.95 < X < 3.05 and 7.60 < Y < 7.80)

If X and Y are two random variables, the probability distribution that defines
their simultaneous behavior is called a joint probability distribution.

The joint probability distribution of two random variables is sometimes referred to
as the bivariate probability distribution, or bivariate distribution, of the random
variables, and is written as
f_XY(x, y) = P(X = x, Y = y)
Joint Probability Distribution

A joint probability mass function f_XY(x, y) satisfies:
(1) For all (x, y), the probabilities are non-negative.
(2) The probabilities over all possible pairs (x, y) in the range add up to 1.
(3) The value of the function at (x, y) is the probability that X = x and Y = y occur simultaneously, P(X = x, Y = y).
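Stated formally (the standard conditions for a joint probability mass function):

(1) \; f_{XY}(x, y) \ge 0, \qquad (2) \; \sum_{x}\sum_{y} f_{XY}(x, y) = 1, \qquad (3) \; f_{XY}(x, y) = P(X = x, Y = y)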
Joint Probability Distribution
Example 1:
• X, the number of hours spent watching TV, and Y, the number of hours spent exercising.
• X, the number of pages in a report, and Y, the grade given by the teacher.
• X, the minutes spent on a sports activity, and Y, the number of blisters formed.
• X, the hours spent studying, and Y, the student's grade in that course.
Joint Probability Distribution
Example 2: Calls are made to check the airline schedule at your departure city.
You monitor the number of bars of signal strength on your cell phone and the
number of times you have to state the name of your departure city before the
voice system recognizes the name.
Let
X = the number of bars of signal strength on your cell phone
Y = the number of times you need to state your departure city
Joint Probability Distribution
Example 3: Two ballpoint pens are selected at random from a box that contains
3 blue pens, 2 red pens, and 3 green pens. If X is the number of blue pens
selected and Y is the number of red pens selected, what are the possible pairs
of values? What are the probabilities of each pair?
Recall: Multivariate
hypergeometric
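As an illustration (not part of the original slides), a minimal Python sketch that enumerates this joint probability mass function with the multivariate hypergeometric formula; the loop bounds simply encode that only 2 pens are drawn:

from math import comb

# Box: 3 blue, 2 red, 3 green pens; 2 pens drawn without replacement.
# X = number of blue pens drawn, Y = number of red pens drawn.
total = comb(8, 2)                # 28 equally likely samples of size 2

for x in range(3):                # blue pens drawn: 0, 1, or 2
    for y in range(3 - x):        # red pens drawn; x + y cannot exceed 2
        green = 2 - x - y         # the remaining draws are green
        p = comb(3, x) * comb(2, y) * comb(3, green) / total
        print(f"P(X={x}, Y={y}) = {p:.4f}")

The six printed probabilities sum to 1, as required of a joint probability mass function.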
Joint Probability Distribution
Example 4: From a sack of fruit containing 3 oranges, 2 apples, and 3 bananas,
a random sample of 4 pieces of fruit is selected. If X is the number of oranges
and Y is the number of apples in the sample, what are the possible pairs of
values? What are the probabilities of each pair?
Recall: Multivariate
hypergeometric
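The same multivariate hypergeometric reasoning gives (a sketch, with C(n, k) denoting the binomial coefficient "n choose k"):

f_{XY}(x, y) = \frac{C(3, x)\, C(2, y)\, C(3, 4 - x - y)}{C(8, 4)}

where 4 − x − y is the number of bananas in the sample, so x ≤ 3, y ≤ 2, 1 ≤ x + y ≤ 4, and C(8, 4) = 70.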
Joint Probability Distribution
Example 5: Consider two continuous random variables X and Y with joint p.d.f.

f(x, y) = (x + y)/2,   for x > 0, y > 0, 3x + y < 3
f(x, y) = 0,           otherwise

Find the probability P (X < Y ).


Recall: Graph the support region; the line x = y intersects the boundary 3x + y = 3 at (3/4, 3/4), the equality portion of the constraints.
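One way to set up this probability (a worked sketch, not from the original slides, assuming the support written above): for 0 < x < 3/4 the part of the support with x < y runs from y = x up to y = 3 − 3x, so

P(X < Y) = \int_0^{3/4} \int_x^{3-3x} \frac{x + y}{2} \, dy \, dx = \int_0^{3/4} \frac{9 - 12x}{4} \, dx = \frac{27}{32}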
Marginal Probability Distribution
If more than one random variable is defined in a random experiment, it is
important to distinguish between the joint probability distribution of X and Y and
the probability distribution of each variable individually. The individual
probability distribution of a random variable is referred to as its marginal
probability distribution.

For example, consider discrete random variables X and Y. To determine P(X=x),


we sum P(X=x, Y=y) over all points in the range of (X,Y) for which X=x. Subscripts
on the probability mass functions distinguish between the random variables.
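In symbols, for discrete X and Y the marginal probability mass functions are

f_X(x) = \sum_{y} f_{XY}(x, y), \qquad f_Y(y) = \sum_{x} f_{XY}(x, y)

and for continuous random variables the sums are replaced by integrals over y and x, respectively.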
Marginal Probability Distribution
Example 6: Using the table given in Example 2, evaluate the marginal
probability distribution of X and Y.

Sample Computation:

f_X(3) = P(X = 3) = P(X = 3, Y = 1) + P(X = 3, Y = 2) + P(X = 3, Y = 3) + P(X = 3, Y = 4)
       = 0.25 + 0.20 + 0.05 + 0.05 = 0.55
Marginal Probability Distribution
Example 7: Using the table given in Example 3, evaluate the marginal
probability distribution of X and Y.
Marginal Probability Distribution
Example 8: Using the table given in Example 4, evaluate the marginal
probability distribution of X and Y.
Marginal Probability Distribution
Example 9: From Example #5, let the joint probability density
function for ( X, Y ) be
f(x, y) = (x + y)/2,   for x > 0, y > 0, 3x + y < 3
f(x, y) = 0,           otherwise

a.) Find the marginal probability density function of X, f_X(x).
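A worked sketch for part (a), assuming the support written above (integrate out y from 0 to 3 − 3x):

f_X(x) = \int_0^{3-3x} \frac{x + y}{2} \, dy = \frac{9 - 12x + 3x^2}{4} = \frac{3(1 - x)(3 - x)}{4}, \quad 0 < x < 1

and f_X(x) = 0 otherwise.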


Marginal Probability Distribution
Example 9: From Example #5, let the joint probability density
function for ( X, Y ) be
f(x, y) = (x + y)/2,   for x > 0, y > 0, 3x + y < 3
f(x, y) = 0,           otherwise

b.) Find the marginal probability density function of Y, f_Y(y).
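Similarly, a worked sketch for part (b), integrating out x from 0 to (3 − y)/3:

f_Y(y) = \int_0^{(3-y)/3} \frac{x + y}{2} \, dx = \frac{(3 - y)(3 + 5y)}{36}, \quad 0 < y < 3

and f_Y(y) = 0 otherwise.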


Conditional Probability Distribution
When two random variables are defined in a random experiment, knowledge
of one can change the probabilities that we associate with the values of the
other.
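For reference, the conditional distribution of Y given X = x is defined from the joint and marginal distributions as

f_{Y|x}(y) = \frac{f_{XY}(x, y)}{f_X(x)} \quad \text{for } f_X(x) > 0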
Homework #2
• One whole yellow paper. Black permanent pen.
• Write your complete details (Name, Student Number, Course, Section, Date).
• Submission: October 3, 2023 (face-to-face)

1. Consider two continuous random variables X and Y with joint p.d.f.


f(x, y) = x²y²/81,   for 0 < x < 3, 0 < y < 3
f(x, y) = 0,         otherwise

a.) Find P(X > 3Y)
b.) Find F_X(x)
c.) Find F_Y(y)

Note: Show complete integral solution


Homework #2
2. At a men’s clothing store, 12 men purchased blue golf sweaters, 8 purchased
green sweaters, and 4 purchased gray sweaters. If 3 customers are selected at
random, let X be the number of customers who purchased blue sweaters and Y be
the number who purchased gray sweaters. What are the possible pairs of values?
Construct the joint and marginal probability distribution table.
Conditional Probability Distribution
Example 10: Using the table from Example 3, find the conditional probability
distribution of X given Y = 1, and use it to determine P(X = 0 | Y = 1).
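A sketch of the computation, using the joint probability mass function implied by the Example 3 setup (3 blue, 2 red, 3 green pens, 2 drawn):

f_{XY}(0, 1) = 6/28, \quad f_{XY}(1, 1) = 6/28, \quad \text{so } f_Y(1) = 12/28

P(X = 0 \mid Y = 1) = \frac{f_{XY}(0, 1)}{f_Y(1)} = \frac{6/28}{12/28} = \frac{1}{2}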
More than Two Random Variables*
More than two random variables can be defined in a random experiment.
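For example, the joint probability mass function of p discrete random variables is written

f_{X_1 X_2 \cdots X_p}(x_1, x_2, \ldots, x_p) = P(X_1 = x_1, X_2 = x_2, \ldots, X_p = x_p)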
Linear Functions of Random Variables
A random variable is sometimes defined as a function of one or more random
variables. In this section, results for linear functions are highlighted.
• For example, if the random variables 𝑿𝟏 and 𝑿𝟐 denote the length and
width, respectively, of a manufactured part, 𝒀 = 𝟐𝑿𝟏 + 𝟐𝑿𝟐 is a random
variable that represents the perimeter of the part.

Thus, we develop results for random variables that are linear combinations of
random variables.
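For reference, the key results for a linear combination Y = c_1 X_1 + c_2 X_2 + \cdots + c_p X_p are

E(Y) = c_1 E(X_1) + c_2 E(X_2) + \cdots + c_p E(X_p)

V(Y) = c_1^2 V(X_1) + \cdots + c_p^2 V(X_p) + 2 \sum_{i<j} c_i c_j \, \mathrm{cov}(X_i, X_j)

and, when X_1, ..., X_p are independent, V(Y) = c_1^2 V(X_1) + \cdots + c_p^2 V(X_p). For the perimeter example, E(Y) = 2E(X_1) + 2E(X_2).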
General Functions of Random Variables
In many situations in statistics, it is necessary to derive the probability
distribution of a function of one or more random variables.
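For reference, the standard change-of-variable result: if X is a continuous random variable with probability density function f_X(x) and Y = h(X) is one-to-one with inverse x = u(y), then

f_Y(y) = f_X[u(y)] \left| \frac{du}{dy} \right|

For a one-to-one function of a discrete random variable, f_Y(y) = f_X[u(y)].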
