
JOINT PROBABILITY DISTRIBUTION
MATH142 Engineering Data Analysis
Outcomes
Compute the probability distribution of a random variable for both discrete and continuous data.

At the end of the lesson, the students are expected to
• Use joint probability mass functions and joint probability density functions to calculate probabilities;
• Calculate marginal and conditional probability distributions from joint probability distributions; and
• Interpret and calculate covariance and correlation between random variables.
Joint Probability Mass Function
The joint probability mass function of the discrete random variables X and Y, denoted as fXY(x, y), satisfies
(1) fXY(x, y) ≥ 0
(2) Σx Σy fXY(x, y) = 1
(3) fXY(x, y) = P(X = x, Y = y)
(5-1)

It is sometimes referred to as the bivariate probability distribution or bivariate distribution of the random variables X and Y. P(X = x and Y = y) is usually written as P(X = x, Y = y).

5-1/156 Mobile Response Time. The response time is the speed of page downloads, and it is critical for a mobile Web site. As the response time increases, customers become more frustrated and potentially abandon the site for a competitive one. Let X denote the number of bars of service, and let Y denote the response time (to the nearest second) for a particular user and site. By specifying the probability of each of the points in Fig. 5-1, we specify the joint probability distribution of X and Y. As for an individual random variable, we define the range of the random variables (X, Y) to be the set of points (x, y) in two-dimensional space for which the probability that X = x and Y = y is positive.

The individual probability distribution of a random variable is referred to as its marginal probability distribution. The marginal probability mass functions of X alone and of Y alone are
fX(x) = Σy fXY(x, y)   and   fY(y) = Σx fXY(x, y)

The marginal probability distribution for X is found by summing the probabilities in each column, whereas the marginal probability distribution for Y is found by summing the probabilities in each row. The results are shown in Fig. 5-6.
Fig. 5-6. Joint probability distribution of X and Y with marginal distributions:

  y \ x      x = 1   x = 2   x = 3   fY(y)
  y = 4      0.15    0.10    0.05    0.30
  y = 3      0.02    0.10    0.05    0.17
  y = 2      0.02    0.03    0.20    0.25
  y = 1      0.01    0.02    0.25    0.28
  fX(x)      0.20    0.25    0.55

5-3/159 Marginal Distribution. The joint probability distribution of X and Y in Fig. 5-1 can be used to find the marginal probability distribution of X. For example,
fX(3) = P(X = 3) = P(X = 3, Y = 1) + P(X = 3, Y = 2) + P(X = 3, Y = 3) + P(X = 3, Y = 4)
      = 0.25 + 0.2 + 0.05 + 0.05 = 0.55
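The column and row sums above are easy to reproduce programmatically. The snippet below is a minimal Python sketch (the slides themselves prescribe no software); the dictionary holds the joint pmf values from Fig. 5-6.

```python
# Sketch: marginal pmfs from the joint pmf of Example 5-1 (values from Fig. 5-6).
from collections import defaultdict

f_XY = {  # (x, y): P(X = x, Y = y)
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
}

f_X, f_Y = defaultdict(float), defaultdict(float)
for (x, y), p in f_XY.items():
    f_X[x] += p   # sum over y -> marginal of X (column sums)
    f_Y[y] += p   # sum over x -> marginal of Y (row sums)

print({x: round(p, 2) for x, p in f_X.items()})  # {1: 0.2, 2: 0.25, 3: 0.55}
print({y: round(p, 2) for y, p in f_Y.items()})  # {4: 0.3, 3: 0.17, 2: 0.25, 1: 0.28}
```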
Guided Learning Activity
3.16/98 Show that the column and row totals of Table 3.1 give the marginal distribution of X alone and of Y alone.

Table 3.1: Joint Probability Distribution for Example 3.14
  f(x, y)        x = 0   x = 1   x = 2   Row Totals
  y = 0          3/28    9/28    3/28    15/28
  y = 1          3/14    3/14    0       3/7
  y = 2          1/28    0       0       1/28
  Column Totals  5/14    15/28   3/28    1

3.50/106 Suppose that X and Y have the following joint probability distribution:
  f(x, y)   x = 2   x = 4
  y = 1     0.10    0.15
  y = 3     0.20    0.30
  y = 5     0.10    0.15

(a) Find the marginal distribution of X.
(b) Find the marginal distribution of Y.
Joint Probability Distribution
3.14/95 Two refills for a ballpoint pen are
selected at random from a box that contains 3
blue refills, 2 red refills, and 3 green refills. If X
is the number of blue refills and Y is the
number of red refills selected, find
(a) the joint probability function f(x, y), and
(b) P[(X, Y) ∈ A], where A is the region
{(x, y)|x + y ≤ 1}.
Joint Probability Density Function
A joint probability density function for the continuous random variables X and Y, denoted as fXY(x, y), satisfies the following properties:
(1) fXY(x, y) ≥ 0 for all x, y
(2) ∫_(−∞)^∞ ∫_(−∞)^∞ fXY(x, y) dx dy = 1
(3) For any region R of two-dimensional space,
    P[(X, Y) ∈ R] = ∫∫_R fXY(x, y) dx dy
(5-2)

5-2/158 Server Access Time. Let the random variable X denote the time until a computer server connects to your machine (in milliseconds), and let Y denote the time until the server authorizes you as a valid user (in milliseconds). Each of these random variables measures the wait from a common starting time, and X < Y. Assume that the joint probability density function for X and Y is
fXY(x, y) = 6 × 10^(−6) e^(−0.001x − 0.002y)  for x < y.

If the joint probability density function of random variables X and Y is fXY(x, y), the marginal probability density functions of X and Y are
fX(x) = ∫_y fXY(x, y) dy   and   fY(y) = ∫_x fXY(x, y) dx
(5-3)
where the first integral is over all points in the range of (X, Y) for which X = x, and the second integral is over all points in the range of (X, Y) for which Y = y.
Joint Density Function
The region with nonzero probability is shaded in Fig. 5-4. The property that this joint probability density function integrates to 1 can be verified by the integral of fXY(x, y) over this region as follows:

∫_(−∞)^∞ ∫_(−∞)^∞ fXY(x, y) dy dx = ∫_0^∞ ∫_x^∞ 6 × 10^(−6) e^(−0.001x − 0.002y) dy dx
  = 6 × 10^(−6) ∫_0^∞ ( ∫_x^∞ e^(−0.002y) dy ) e^(−0.001x) dx
  = 6 × 10^(−6) ∫_0^∞ ( e^(−0.002x) / 0.002 ) e^(−0.001x) dx
  = 0.003 ∫_0^∞ e^(−0.003x) dx
  = 0.003 (1 / 0.003) = 1

The probability that X ≤ 1000 and Y ≤ 2000 is determined as the integral over the darkly shaded region in Fig. 5-5:

P(X ≤ 1000, Y ≤ 2000) = ∫_0^1000 ∫_x^2000 fXY(x, y) dy dx
  = 6 × 10^(−6) ∫_0^1000 ( ∫_x^2000 e^(−0.002y) dy ) e^(−0.001x) dx
  = 6 × 10^(−6) ∫_0^1000 ( (e^(−0.002x) − e^(−4)) / 0.002 ) e^(−0.001x) dx
  = 0.003 ∫_0^1000 ( e^(−0.003x) − e^(−4) e^(−0.001x) ) dx
  = 0.003 [ (1 − e^(−3)) / 0.003 − e^(−4) (1 − e^(−1)) / 0.001 ]
  = 0.003 (316.738 − 11.578)

P(X ≤ 1000, Y ≤ 2000) = 0.915
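Both results (the total probability of 1 and P(X ≤ 1000, Y ≤ 2000) = 0.915) can be checked numerically. Below is a minimal Python sketch using scipy.integrate.dblquad; the choice of SciPy is an assumption, since the slides do not prescribe any software.

```python
# Sketch: numerical check of Example 5-2's density (nonzero only for 0 <= x < y).
import numpy as np
from scipy.integrate import dblquad

def f_xy(y, x):                     # dblquad integrates the inner variable (y) first
    return 6e-6 * np.exp(-0.001 * x - 0.002 * y)

# Total probability: x from 0 to infinity, y from x to infinity.
total, _ = dblquad(f_xy, 0, np.inf, lambda x: x, lambda x: np.inf)

# P(X <= 1000, Y <= 2000): x from 0 to 1000, y from x to 2000.
p, _ = dblquad(f_xy, 0, 1000, lambda x: x, lambda x: 2000)

print(round(total, 4), round(p, 4))   # approximately 1.0 and 0.915
```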
Joint Density Distributions
3.15/96 A privately owned business operates a
drive-in facility and a walk-in facility. On a randomly
selected day, let X and Y, respectively, be the
proportions of the time that the drive-in and the
walk-in facilities are in use, and suppose that the
joint density function of these random variables is
f(x, y) = (2/5)(2x + 3y),  0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
        = 0,               elsewhere.
(a) Verify condition 2.
(b) Find P[(X, Y) ∈ A],
where A = {(x, y)|0 < x < ½, ¼ < y < ½}.
Marginal Probability Density Function
Examples:
5-4/160 Server Access Time. For the random variables that denote times in Example 5-2, calculate the probability that Y exceeds 2000 milliseconds.

This probability is determined as the integral of fXY(x, y) over the darkly shaded region in Fig. 5-7. The region is partitioned into two parts, and different limits of integration are determined for each part:

P(Y > 2000) = ∫_0^2000 ∫_2000^∞ 6 × 10^(−6) e^(−0.001x − 0.002y) dy dx
            + ∫_2000^∞ ∫_x^∞ 6 × 10^(−6) e^(−0.001x − 0.002y) dy dx

The first integral is
6 × 10^(−6) ∫_0^2000 [ e^(−0.002y) / (−0.002) ]_2000^∞ e^(−0.001x) dx
  = (6 × 10^(−6) / 0.002) e^(−4) ∫_0^2000 e^(−0.001x) dx
  = (6 × 10^(−6) / 0.002) e^(−4) (1 − e^(−2)) / 0.001
  = 0.0475

The second integral is
6 × 10^(−6) ∫_2000^∞ [ e^(−0.002y) / (−0.002) ]_x^∞ e^(−0.001x) dx
  = (6 × 10^(−6) / 0.002) ∫_2000^∞ e^(−0.003x) dx
  = (6 × 10^(−6) / 0.002) (e^(−6) / 0.003)
  = 0.0025

Therefore,
P(Y > 2000) = 0.0475 + 0.0025 = 0.05
Marginal Probability Density Function
A probability for only one random variable, say, P(a < X < b), can be found from the marginal probability distribution of X or from the integral of the joint probability distribution of X and Y as

P(a < X < b) = ∫_a^b fX(x) dx = ∫_a^b ∫_(−∞)^∞ fXY(x, y) dy dx

Alternatively, the probability P(Y > 2000) can be calculated from the marginal probability distribution of Y as follows. For y > 0,

fY(y) = ∫_0^y 6 × 10^(−6) e^(−0.001x − 0.002y) dx
      = 6 × 10^(−6) e^(−0.002y) ∫_0^y e^(−0.001x) dx
      = 6 × 10^(−6) e^(−0.002y) [ e^(−0.001x) / (−0.001) ]_0^y
      = 6 × 10^(−6) e^(−0.002y) (1 − e^(−0.001y)) / 0.001

We have obtained the marginal probability density function of Y:
fY(y) = 6 × 10^(−3) e^(−0.002y) (1 − e^(−0.001y))  for y > 0.

Now,
P(Y > 2000) = 6 × 10^(−3) ∫_2000^∞ e^(−0.002y) (1 − e^(−0.001y)) dy
            = 6 × 10^(−3) ( e^(−4) / 0.002 − e^(−6) / 0.003 ) = 0.05
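The marginal density fY(y) and the value P(Y > 2000) = 0.05 can also be reproduced symbolically. A minimal sketch using sympy (an assumption, since the slides use no software); the density and limits are those of Example 5-2.

```python
# Sketch: derive the marginal f_Y(y) and P(Y > 2000) symbolically (Example 5-4).
import sympy as sp

x, y = sp.symbols('x y', positive=True)
c = sp.Rational(6, 10**6)                       # 6 x 10^-6
f_xy = c * sp.exp(-x / 1000 - y / 500)          # 6e-6 * exp(-0.001x - 0.002y)

# Marginal of Y: integrate over x from 0 to y (the support is 0 <= x < y).
f_y = sp.simplify(sp.integrate(f_xy, (x, 0, y)))
print(f_y)          # equivalent to 0.006*exp(-0.002*y)*(1 - exp(-0.001*y))

# P(Y > 2000) from the marginal density.
p = sp.integrate(f_y, (y, 2000, sp.oo))
print(float(p))     # approximately 0.05
```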
Marginal Probability Density Function
3.40/105 A fast-food restaurant operates both a drive-
through facility and a walk-in facility. On a randomly
selected day, let X and Y, respectively, be the proportions
of the time that the drive-through and walk-in facilities
are in use, and suppose that the joint density function of
these random variables is

f(x, y) = (2/3)(x + 2y),  0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
        = 0,              elsewhere.

(a) Find the marginal density of X.


(b) Find the marginal density of Y.
(c) Find the probability that the drive-through facility is
busy less than one-half of the time.
Conditional Distribution
CONDITIONAL PROBABILITY MASS / DENSITY Because the conditional probability mass/density
FUNCTION function fY|x(y) is a probability density function for
Let X and Y be two random variables, discrete or all y in Rx, the following properties are satisfied:
continuous. The conditional distribution of the (1) 𝑓𝑌|𝑥(𝑦) ≥ 0
random variable Y, given that X = x, is
𝑓𝑋𝑌 𝑥, 𝑦 2 σ 𝑓𝑌|𝑥 (𝑦) 𝑑𝑦 = 1 𝑜𝑟 ‫ = 𝑦𝑑 )𝑦(𝑥|𝑌𝑓 ׬‬1
𝑓𝑌 |𝑥 𝑦 = , 𝑓𝑋 (𝑥) > 0.
𝑓𝑋 𝑥
3 𝑃 𝑌 ∈ 𝐵|𝑋 = 𝑥 =

Similarly, the conditional distribution of the random ෍ 𝑓𝑌|𝑥 𝑦 = න 𝑓𝑌|𝑥 𝑦 𝑑𝑦


variable X, given that Y = y, is 𝑎𝑙𝑙 𝑦 𝑖𝑛 𝐵 𝐵
𝑓𝑋𝑌 𝑥, 𝑦
𝑓𝑋|𝑦 𝑥 = , 𝑓𝑌 (𝑦) > 0. for any set B in the range of Y
𝑓𝑌 𝑦
(5-5)
Conditional Probability Mass Function
Examples:
5-5/162 Conditional Probabilities for Mobile Response Time. For Example 5-1, X and Y denote the number of bars of signal strength and the response time, respectively. Then,

P(Y = 1 | X = 3) = P(X = 3, Y = 1) / P(X = 3) = fXY(3, 1) / fX(3) = 0.25 / 0.55 = 0.454

The probability that Y = 2 given that X = 3 is
P(Y = 2 | X = 3) = P(X = 3, Y = 2) / P(X = 3) = fXY(3, 2) / fX(3) = 0.2 / 0.55 = 0.364

  y \ x      x = 1   x = 2   x = 3   fY(y)
  y = 4      0.15    0.10    0.05    0.30
  y = 3      0.02    0.10    0.05    0.17
  y = 2      0.02    0.03    0.20    0.25
  y = 1      0.01    0.02    0.25    0.28
  fX(x)      0.20    0.25    0.55

Additional Conclusion: Further work shows that
P(Y = 3 | X = 3) = 0.091 and P(Y = 4 | X = 3) = 0.091.
Note that
P(Y = 1 | X = 3) + P(Y = 2 | X = 3) + P(Y = 3 | X = 3) + P(Y = 4 | X = 3) = 1.
This set of probabilities defines the conditional probability distribution of Y given that X = 3.
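The entire conditional distribution of Y given X = 3 is simply the x = 3 column of the joint table rescaled by fX(3) = 0.55. A minimal Python sketch (illustrative only, with values taken from Fig. 5-6):

```python
# Sketch: conditional pmf of Y given X = 3, using the x = 3 column of Fig. 5-6.
col_x3 = {1: 0.25, 2: 0.20, 3: 0.05, 4: 0.05}   # y: P(X = 3, Y = y)

f_X3 = sum(col_x3.values())                      # marginal fX(3) = 0.55
f_Y_given_x3 = {y: p / f_X3 for y, p in col_x3.items()}

print({y: round(p, 3) for y, p in f_Y_given_x3.items()})
# {1: 0.455, 2: 0.364, 3: 0.091, 4: 0.091} -- compare 0.454 and 0.364 above
print(round(sum(f_Y_given_x3.values()), 3))      # 1.0: a conditional pmf sums to 1
```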
Conditional Probability Mass Function
3.18/99 Referring to the joint probability distribution below, find the conditional distribution of X, given that Y = 1, and use it to determine P(X = 0 | Y = 1).

  f(x, y)        x = 0   x = 1   x = 2   Row Totals
  y = 0          3/28    9/28    3/28    15/28
  y = 1          3/14    3/14    0       3/7
  y = 2          1/28    0       0       1/28
  Column Totals  5/14    15/28   3/28    1
Conditional Probability Mass Function
3.49/106 Let X denote the number of times a
certain numerical control machine will
malfunction: 1, 2, or 3 times on any given day.
Let Y denote the number of times a technician
is called on an emergency call. Their joint
probability distribution is given as
  f(x, y)   x = 1   x = 2   x = 3
  y = 1     0.05    0.05    0.10
  y = 3     0.05    0.10    0.35
  y = 5     0.00    0.20    0.10
(a) Evaluate the marginal distribution of X.
(b) Evaluate the marginal distribution of Y.
(c) Find P(Y = 3|X = 2).
Conditional Probability Density Function
Examples:
3.19/100 The joint density function for the
random variables (X, Y), where X is the unit
temperature change and Y is the proportion of
spectrum shift that a certain atomic particle
produces is
fXY(x, y) = 10xy²,  0 < x < y < 1,
          = 0,      elsewhere.
a) Find the marginal densities fX(x), fY(y), and
the conditional probability fY|x(y).
b) Find the probability that the spectrum
shifts more than half of the total
observations, given the temperature is
increased to 0.25 units.
Conditional Probability Density Function
• 3.20/100 Given the joint density
function
fXY(x, y) = x(1 + 3y²)/4,  0 < x < 2, 0 < y < 1,
          = 0,             elsewhere,
find
fX(x), fY(y), fX|y(x)

and evaluate
P(¼ < X < ½|Y = ⅓).
Conditional Probability Density Function
3.53/106 Given the joint density function

fXY(x, y) = (6 − x − y)/8,  0 < x < 2, 2 < y < 4,
          = 0,              elsewhere,

find P(1 < Y < 3|X = 1).


Independence
For random variables X and Y, if any one of
the following properties is true, the others are
also true, and X and Y are independent.
(1) fXY(x, y) = fX(x)fY(y) for all x and y
(2) fY|x(y) = fY(y) for all x and y with fX(x) > 0
(3) fX|y(x) = fX(x) for all x and y with fY(y) > 0
(4) P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B) for
any sets A and B in the range of X and Y,
respectively.
(5-7)
Independence
Examples:
5-11/166 Independent Random Variables. Suppose that Example 5-2 is modified so that the joint probability density function of X and Y is
fXY(x, y) = 2 × 10^(−6) e^(−0.001x − 0.002y)  for x ≥ 0 and y ≥ 0.
Show that X and Y are independent, and determine P(X > 1000, Y < 1000).

The marginal probability density function of X is
fX(x) = ∫_0^∞ 2 × 10^(−6) e^(−0.001x − 0.002y) dy = 0.001 e^(−0.001x)  for x > 0.

The marginal probability density function of Y is
fY(y) = ∫_0^∞ 2 × 10^(−6) e^(−0.001x − 0.002y) dx = 0.002 e^(−0.002y)  for y > 0.

Therefore, fXY(x, y) = fX(x) fY(y) for all x and y, and X and Y are independent.

To determine the probability requested, property (4) of Equation 5-7 and the fact that each random variable has an exponential distribution can be applied:
P(X > 1000, Y < 1000) = P(X > 1000) P(Y < 1000) = e^(−1)(1 − e^(−2)) = 0.318
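The factorization fXY(x, y) = fX(x) fY(y) and the probability above can be spot-checked numerically. A minimal sketch, assuming NumPy and SciPy are available:

```python
# Sketch: checking the independence claim of Example 5-11 numerically.
import numpy as np
from scipy.integrate import dblquad

f_xy = lambda y, x: 2e-6 * np.exp(-0.001 * x - 0.002 * y)   # joint density, x, y >= 0
f_x  = lambda x: 0.001 * np.exp(-0.001 * x)                  # marginal of X
f_y  = lambda y: 0.002 * np.exp(-0.002 * y)                  # marginal of Y

# Spot-check the factorization f_XY(x, y) = f_X(x) * f_Y(y) on a grid of points.
xg, yg = np.meshgrid(np.linspace(0, 5000, 50), np.linspace(0, 5000, 50))
assert np.allclose(f_xy(yg, xg), f_x(xg) * f_y(yg))

# P(X > 1000, Y < 1000) by direct double integration vs. the product of marginals.
p_joint, _ = dblquad(f_xy, 1000, np.inf, lambda x: 0, lambda x: 1000)
p_product = np.exp(-1) * (1 - np.exp(-2))
print(round(p_joint, 3), round(p_product, 3))                # both approximately 0.318
```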
Independence
3.21/102 Show that the random variables of Example 3.14 are not statistically independent.

Table 3.1: Joint Probability Distribution for Example 3.14
  f(x, y)        x = 0   x = 1   x = 2   Row Totals
  y = 0          3/28    9/28    3/28    15/28
  y = 1          3/14    3/14    0       3/7
  y = 2          1/28    0       0       1/28
  Column Totals  5/14    15/28   3/28    1
Examples
5-1/170 Show that the following function satisfies the properties of a joint probability mass function.

  x     y     fXY(x, y)
  1     1     1/4
  1.5   2     1/8
  1.5   3     1/4
  2.5   4     1/4
  3     5     1/8

Determine the following:
(a) P(X < 2.5, Y < 3)
(b) P(X < 2.5)
(c) P(Y < 3)
(d) P(X > 1.8, Y > 4.7)
(e) E(X), E(Y), V(X), V(Y)
(f) Marginal probability distribution of the random variable X
(g) Conditional probability distribution of Y given that X = 2.5
(h) Conditional probability distribution of X given that Y = 2
(i) E(Y | X = 1.5)
(j) Are X and Y independent?
Examples
5-2/170 Determine the value of c that makes the function f(x, y) = c(x + y) a joint probability mass function over the nine points with x = 1, 2, 3 and y = 1, 2, 3. Determine the following:
(a) P(X = 1, Y < 4)
(b) P(X = 1)
(c) P(Y = 2)
(d) P(X < 2, Y < 2)
(e) E(X), E(Y), V(X), and V(Y)
(f) Marginal probability distribution of the random variable X
(g) Conditional probability distribution of Y given that X = 1
(h) Conditional probability distribution of X given that Y = 2
(i) E(Y | X = 1)
(j) Are X and Y independent?

5-6/171 A small-business Web site contains 100 pages, and 60%, 30%, and 10% of the pages contain low, moderate, and high graphic content, respectively. A sample of four pages is selected without replacement, and X and Y denote the number of pages with moderate and high graphics output in the sample. Determine:
(a) fXY(x, y)
(b) fX(x)
(c) E(X)
(d) fY|3(y)
(e) E(Y | X = 3)
(f) V(Y | X = 3)
(g) Are X and Y independent?
Expected Value of a Function of Two Random Variables
E[h(X, Y)] = Σx Σy h(x, y) fXY(x, y)          X, Y discrete
E[h(X, Y)] = ∫∫ h(x, y) fXY(x, y) dx dy       X, Y continuous
(5-13)

5-19/174 Expected Value of a Function of Two Random Variables. For the joint probability distribution of the two random variables in Example 5-1, calculate E[(X − μX)(Y − μY)].
Covariance
Covariance is a measure of the linear relationship between random variables. If the relationship between the random variables is nonlinear, the covariance might not be sensitive to the relationship, as illustrated in Fig. 5-12(d), where the only points with nonzero probability are the points on a circle.

cov(X, Y) = σXY = E[(X − μX)(Y − μY)]
(5-14)
Covariance
• 5-20/176 In Example 5-1, the random
variables X and Y are the number of signal
bars and the response time (to the nearest
second), respectively. Interpret the
covariance between X and Y as positive or
negative. As the signal bars increase, the
response time tends to decrease. Therefore,
X and Y have a negative covariance. The
covariance was calculated to be −0.5815 in
Example 5-19.
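The value −0.5815 can be reproduced directly from the joint pmf of Example 5-1, and the same arrays give the correlation defined in Equation 5-15 below. A minimal Python sketch (illustrative only; values from Fig. 5-6):

```python
# Sketch: covariance and correlation for the joint pmf of Example 5-1 (Fig. 5-6).
import numpy as np

f_XY = {  # (x, y): P(X = x, Y = y)
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
}
xs = np.array([x for x, _ in f_XY])
ys = np.array([y for _, y in f_XY])
ps = np.array(list(f_XY.values()))

mu_x, mu_y = (xs * ps).sum(), (ys * ps).sum()
cov = ((xs - mu_x) * (ys - mu_y) * ps).sum()      # E[(X - mu_X)(Y - mu_Y)]
rho = cov / np.sqrt(((xs - mu_x)**2 * ps).sum() * ((ys - mu_y)**2 * ps).sum())

print(round(cov, 4))   # -0.5815, matching Example 5-19
print(round(rho, 4))   # negative, consistent with the interpretation above
```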
Covariance
4.47/127 For the random variables X
and Y whose joint density function is
given in Exercise 3.40 on page 105,
find the covariance.

f(x, y) = (2/3)(x + 2y),  0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
        = 0,              elsewhere.
Covariance
• 5-21/176 Covariance For the discrete random
variables X and Y with the joint distribution
shown in Fig. 5-13, determine σXY and ρXY.
Correlation
The correlation between random variables X and Y, denoted as ρXY, is
ρXY = cov(X, Y) / √(V(X) V(Y)) = σXY / (σX σY)
(5-15)

For any two random variables X and Y,
−1 ≤ ρXY ≤ +1
(5-16)

If X and Y are independent random variables,
σXY = ρXY = 0
(5-17)
Correlation
5-22/177 Correlation. Suppose that the random variable X has the following distribution:
P(X = 1) = 0.2, P(X = 2) = 0.6, P(X = 3) = 0.2.
Let Y = 2X + 5. That is,
P(Y = 7) = 0.2, P(Y = 9) = 0.6, P(Y = 11) = 0.2.
Determine the correlation between X and Y. Refer to Fig. 5-14. Because X and Y are linearly related, ρ = 1. This can be verified by direct calculations.
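Since Y is an exact linear function of X with positive slope, ρXY must equal +1. A short numerical confirmation, offered as a sketch rather than part of the original example:

```python
# Sketch: correlation of X and Y = 2X + 5 for Example 5-22.
import numpy as np

x = np.array([1, 2, 3])
p = np.array([0.2, 0.6, 0.2])
y = 2 * x + 5                                    # values 7, 9, 11

mu_x, mu_y = (x * p).sum(), (y * p).sum()
cov = ((x - mu_x) * (y - mu_y) * p).sum()
rho = cov / np.sqrt(((x - mu_x)**2 * p).sum() * ((y - mu_y)**2 * p).sum())
print(round(rho, 6))   # 1.0, as expected for a positive linear relationship
```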
Examples
5-33/178 Determine the covariance and correlation for the following joint probability distribution:

  x           1     1     2     4
  y           3     4     5     6
  fXY(x, y)   1/8   1/4   1/2   1/8
Examples
5-35/178 Determine the value of c and the covariance and correlation for the joint probability mass function fXY(x, y) = c(x + y) for x = 1, 2, 3 and y = 1, 2, 3.

5-41/179 Determine the covariance and correlation for the joint probability density function fXY(x, y) = e^(−x − y) over the range 0 < x and 0 < y.
Examples
5-43/179 The joint probability distribution is

  x           −1    0     0     1
  y           0     −1    1     0
  fXY(x, y)   1/4   1/4   1/4   1/4

Show that the correlation between X and Y is zero, but X and Y are not independent.
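This exercise illustrates that zero correlation does not imply independence. The minimal sketch below verifies both claims for the given distribution:

```python
# Sketch: zero covariance but dependence, for the distribution in Exercise 5-43.
f_XY = {(-1, 0): 0.25, (0, -1): 0.25, (0, 1): 0.25, (1, 0): 0.25}

mu_x = sum(x * p for (x, _), p in f_XY.items())                  # 0
mu_y = sum(y * p for (_, y), p in f_XY.items())                  # 0
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in f_XY.items())
print(cov)                                                        # 0.0, hence rho = 0

# Independence would require f_XY(0, 0) = f_X(0) * f_Y(0):
f_x0 = sum(p for (x, _), p in f_XY.items() if x == 0)             # 0.5
f_y0 = sum(p for (_, y), p in f_XY.items() if y == 0)             # 0.5
print(f_XY.get((0, 0), 0.0), f_x0 * f_y0)                         # 0.0 vs 0.25 -> not independent
```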
Summary
• A joint probability mass function is a function used to calculate probabilities for two or more discrete random variables.
• A joint probability density function is a function used to calculate probabilities for two or more continuous random variables.
• A marginal probability mass function is the probability mass function of a discrete random variable obtained from the joint probability distribution of two or more random variables.
• A marginal probability density function is the probability density function of a continuous random variable obtained from the joint probability distribution of two or more random variables.
• A conditional probability mass function is the probability mass function of the conditional probability distribution of a discrete random variable.
• A conditional probability density function is the probability density function of the conditional probability distribution of a continuous random variable.
• The covariance is a measure of association between two random variables obtained as the expected value of the product of the two random variables around their means; that is, cov(X, Y) = σXY = E[(X − μX)(Y − μY)].
• In the most general usage, the correlation is a measure of the interdependence among data. The concept may include more than two variables.
References
• Montgomery and Runger. Applied Statistics and Probability for Engineers, 6th Ed. © 2014
• Walpole, et al. Probability and Statistics for Engineers and Scientists 9th Ed. © 2012, 2007, 2002
