
Chapter 7: Jointly distributed random variables
Discrete case
Joint probability mass function
• Let X and Y be two discrete random variables on the sample space S of an experiment. The joint probability mass function p(x,y) is defined for each pair of numbers (x,y) as p(x,y) = P(X = x, Y = y).
• It must be the case that p(x,y) ≥ 0, and Σ_x Σ_y p(x,y) = 1.
• Example: Outcomes for X and Y are 0 and 1, with joint pmf given by the table:

          x = 0   x = 1
  y = 0    1/8     1/4
  y = 1    3/8     1/4
Example continued
• P(x=0,y=0)=1/8
• P(x=0,y=1)=3/8
• P(x=1,y=0)=1/4
• P(x=1,y=1)=1/4
• Clearly p(x,y) is a joint probability mass function since p(x,y) ≥ 0 and Σ_x Σ_y p(x,y) = 1.
Marginal probability mass functions
• The marginal probability mass function of X, denoted p_X(x), is given by p_X(x) = Σ_y p(x,y) for each possible value of x.
• Similarly, the marginal probability mass function of Y, denoted p_Y(y), is given by p_Y(y) = Σ_x p(x,y) for each possible value of y.
• Marginal probability mass function for X from the previous example
• p_X(0) = p(0,0) + p(0,1) = 1/2,  p_X(1) = p(1,0) + p(1,1) = 1/2
• Marginal probability mass function for Y from the previous example
• p_Y(0) = p(0,0) + p(1,0) = 3/8,  p_Y(1) = p(0,1) + p(1,1) = 5/8
Independence of two discrete random variables X and Y
• X and Y are independent if for every pair of x and y values we have p(x,y) = p_X(x) p_Y(y).
• Are X and Y from our example dependent or independent?
• p(0,0) = 1/8,  p_X(0) = 1/2,  p_Y(0) = 3/8
• Note that p(0,0) ≠ p_X(0) p_Y(0).
• Hence, X and Y are dependent.
Expected value and covariance for discrete random variables
• The covariance is given by cov(X,Y) = E(XY) − E(X)E(Y), where the expected value E(XY) = Σ_x Σ_y xy p(x,y).
• E(XY) = p(1,1) = 1/4. (There are four terms involved; all terms containing x = 0 or y = 0 vanish, so the only term which survives corresponds to x = 1 and y = 1.)
• E(X) = Σ_x x p_X(x) = p_X(1) = 1/2,  E(Y) = Σ_y y p_Y(y) = p_Y(1) = 5/8.
• Hence, cov(X,Y) = 1/4 − (1/2)(5/8) = −1/16. (These values are rechecked numerically in the sketch after the proof below.)
• If 𝑋 and 𝑌 are independent then their covariance is equal to 0.

• Proof: E(XY) = Σ_x Σ_y xy p(x,y) = Σ_x Σ_y xy p_X(x) p_Y(y) = (Σ_x x p_X(x)) (Σ_y y p_Y(y)) = E(X) E(Y).
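As a quick check, here is a minimal Python sketch (standard library only; the variable names are just illustrative) that recomputes the marginals, E(XY), the covariance, and the independence test for the 2×2 example above.

```python
# Joint pmf of the 2x2 example, indexed by (x, y)
p = {(0, 0): 1/8, (1, 0): 1/4, (0, 1): 3/8, (1, 1): 1/4}

# Marginals: sum the joint pmf over the other variable
p_X = {x: sum(p[(x, y)] for y in (0, 1)) for x in (0, 1)}
p_Y = {y: sum(p[(x, y)] for x in (0, 1)) for y in (0, 1)}

# Expectations and covariance
E_X = sum(x * px for x, px in p_X.items())
E_Y = sum(y * py for y, py in p_Y.items())
E_XY = sum(x * y * pxy for (x, y), pxy in p.items())
cov = E_XY - E_X * E_Y

# Independence would require p(x, y) = p_X(x) p_Y(y) for every pair
independent = all(abs(pxy - p_X[x] * p_Y[y]) < 1e-12 for (x, y), pxy in p.items())

print(p_X, p_Y)     # {0: 0.5, 1: 0.5} {0: 0.375, 1: 0.625}
print(E_XY, cov)    # 0.25 -0.0625   (i.e. 1/4 and -1/16)
print(independent)  # False -> X and Y are dependent
```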
Example
• Two caplets are selected at random from a bottle containing three
aspirin, two sedative, and four cold caplets. If X and Y are,
respectively, the numbers of aspirin and sedative caplets included
among the two caplets drawn from the bottle, find the probabilities
associated with all possible pairs of values of X and Y.
• Find the marginal probability mass functions of X and Y.
• Are X and Y independent?
Solution
• The possible pairs are (0,0), (0,1),(1,0),(1,1),(0,2), and (2,0).
• For example, the number of ways associated with the pair (1,0) is C(3,1) C(2,0) C(4,1) = 12.
• Clearly, the total number of ways in which two of the nine caplets can be selected is C(9,2) = 36.
• Hence, the probability associated with (1,0), namely p(1,0) is
12/36=1/3
• The joint probability mass function of X and Y is given by the following general formula:
  p(x,y) = C(3,x) C(2,y) C(4, 2−x−y) / 36,   x = 0, 1, 2,  y = 0, 1, 2,  0 ≤ x + y ≤ 2

  y \ x     0      1      2
    0      1/6    1/3    1/12
    1      2/9    1/6    0
    2      1/36   0      0
• Marginal of X:  p_X(0) = 15/36,  p_X(1) = 1/2,  p_X(2) = 1/12

• Marginal of Y:  p_Y(0) = 7/12,  p_Y(1) = 7/18,  p_Y(2) = 1/36

• X and Y are dependent: for example, p_X(2) = 1/12, p_Y(2) = 1/36, and p(2,2) = 0.
• Clearly p_X(2) p_Y(2) ≠ p(2,2).
• E(X) = 0·p_X(0) + 1·p_X(1) + 2·p_X(2) = 2/3
• E(Y) = 4/9
• E(XY) = Σ_{x=0}^{2} Σ_{y=0}^{2} xy p(x,y) = 1/6
• Cov(X,Y) = E(XY) − E(X)E(Y) = 1/6 − (2/3)(4/9) = −7/54
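The same bookkeeping can be done mechanically. A small Python sketch (using only math.comb from the standard library; names illustrative) that rebuilds the caplet table from the formula above and rechecks the marginals and the covariance:

```python
from math import comb

# Joint pmf for the caplet example: p(x, y) = C(3,x) C(2,y) C(4, 2-x-y) / C(9,2)
def p(x, y):
    if 0 <= 2 - x - y <= 4:
        return comb(3, x) * comb(2, y) * comb(4, 2 - x - y) / comb(9, 2)
    return 0.0

pairs = [(x, y) for x in range(3) for y in range(3)]

p_X = {x: sum(p(x, y) for y in range(3)) for x in range(3)}
p_Y = {y: sum(p(x, y) for x in range(3)) for y in range(3)}

E_X = sum(x * px for x, px in p_X.items())        # 2/3
E_Y = sum(y * py for y, py in p_Y.items())        # 4/9
E_XY = sum(x * y * p(x, y) for x, y in pairs)     # 1/6
cov = E_XY - E_X * E_Y                            # 1/6 - 8/27 = -7/54

print(p_X)   # approx {0: 0.4167, 1: 0.5, 2: 0.0833}, i.e. 15/36, 1/2, 1/12
print(p_Y)   # approx {0: 0.5833, 1: 0.3889, 2: 0.0278}, i.e. 7/12, 7/18, 1/36
print(cov)   # approx -0.1296, i.e. -7/54
```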
• 1) Two construction contracts are to be randomly assigned to one or
more of three firms (I, II, and III). A firm may receive more than one
contract. Let
• X: the number of contracts assigned to firm I
• Y: the number assigned to firm II
• A) Find the joint probability distribution for X and Y.
• B) Find the marginal probability distribution for X.
• C) Find p(X=1|Y=1)
• A)

  y \ x    0      1      2
    0     1/9    2/9    1/9
    1     2/9    2/9    0
    2     1/9    0      0
• B)

  x        0      1      2
  p_X(x)   4/9    4/9    1/9


• C) Find p(X=1|Y=1)

• p(X=1 | Y=1) = p(1,1) / p_Y(1) = (2/9) / (4/9) = 1/2
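If it helps, the joint distribution can also be obtained by brute-force enumeration of the 9 equally likely assignments. A short Python sketch (standard library only; names illustrative):

```python
from itertools import product
from fractions import Fraction

# The 3^2 = 9 equally likely ways to assign 2 contracts to firms I, II, III
outcomes = list(product(("I", "II", "III"), repeat=2))
w = Fraction(1, len(outcomes))

# Joint pmf of X = contracts assigned to firm I, Y = contracts assigned to firm II
joint = {}
for a in outcomes:
    key = (a.count("I"), a.count("II"))
    joint[key] = joint.get(key, Fraction(0)) + w

p_X = {x: sum(v for (xx, _), v in joint.items() if xx == x) for x in range(3)}
p_Y1 = sum(v for (_, yy), v in joint.items() if yy == 1)

print(joint)                  # matches the table above
print(p_X)                    # {0: 4/9, 1: 4/9, 2: 1/9}
print(joint[(1, 1)] / p_Y1)   # P(X = 1 | Y = 1) = 1/2
```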
Continuous case
• Let X and Y be continuous random variables. The joint probability density of X and Y is given by a nonnegative function f(x,y), which is such that P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_c^d ∫_a^b f(x,y) dx dy.

• Of course, f(x,y) satisfies ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x,y) dx dy = 1.

• The marginal probability density functions of X and Y respectively are given by f_X(x) = ∫_{−∞}^{∞} f(x,y) dy, and f_Y(y) = ∫_{−∞}^{∞} f(x,y) dx.
Independence and covariance
• X and Y are independent if f(x,y) = f_X(x) f_Y(y). Otherwise X and Y are said to be dependent.

• The covariance is given by cov(X,Y) = E(XY) − E(X)E(Y), where the expected value E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x,y) dx dy.

• The conditional probability density function of X given Y = y is given by f(x | y) = f(x,y) / f_Y(y).
Example 1
• Let the joint density of two random variables X and Y be given by
  f(x,y) = 2y e^(−x) for 0 ≤ y ≤ 1, 0 ≤ x;  f(x,y) = 0 elsewhere.
• Find the marginal densities of X and Y.
• Are X and Y independent?
• Compute P(0 < Y < 0.5, 0 < X < 2).
• Compute P(0 < Y < 0.5 | 0 < X < 2).
• Compute the covariance.
Solution
• f_X(x) = ∫_0^1 2y e^(−x) dy = y² e^(−x) |_0^1 = e^(−x)

• f_Y(y) = ∫_0^∞ 2y e^(−x) dx = −2y e^(−x) |_0^∞ = 2y

• X and Y are independent since f(x,y) = f_X(x) f_Y(y).

• P(0 < Y < 0.5, 0 < X < 2) = ∫_0^{0.5} ∫_0^2 2y e^(−x) dx dy = (∫_0^2 e^(−x) dx)(∫_0^{0.5} 2y dy) = (−e^(−x) |_0^2)(y² |_0^{0.5}) = (1 − e^(−2))(0.5)² ≈ 0.216
continue
• Since X and Y are independent, clearly, the conditional probability P(0 < Y < 0.5 | 0 < X < 2) is equal to P(0 < Y < 0.5, 0 < X < 2) / P(0 < X < 2) = P(0 < Y < 0.5) P(0 < X < 2) / P(0 < X < 2) = P(0 < Y < 0.5) = (0.5)².

• P(0 < Y < 0.5) = ∫_0^{0.5} 2y dy = y² |_0^{0.5} = (0.5)²

• Also, clearly, since X and Y are independent, their covariance is equal to zero.
• If X and Y are independent, E(XY) = E(X)E(Y).

• Proof: E(XY) = ∫∫ xy f(x,y) dx dy = ∫∫ xy f_X(x) f_Y(y) dx dy = (∫ x f_X(x) dx)(∫ y f_Y(y) dy) = E(X) E(Y).
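A numerical cross-check of Example 1 is straightforward. The sketch below assumes SciPy and NumPy are available and uses scipy.integrate.dblquad; it recomputes P(0 < Y < 0.5, 0 < X < 2) directly and via the product of marginal probabilities.

```python
import numpy as np
from scipy.integrate import dblquad

# Joint density from Example 1: f(x, y) = 2y e^(-x) on 0 <= y <= 1, x >= 0
f = lambda y, x: 2 * y * np.exp(-x)      # dblquad integrates func(y, x)

# P(0 < Y < 0.5, 0 < X < 2): x runs over (0, 2), y over (0, 0.5)
p_joint, _ = dblquad(f, 0, 2, 0, 0.5)

# Product form, using the independence of X and Y
p_x = 1 - np.exp(-2)     # P(0 < X < 2) = integral of e^(-x) over (0, 2)
p_y = 0.5 ** 2           # P(0 < Y < 0.5) = integral of 2y over (0, 0.5)

print(p_joint)           # ~0.216
print(p_x * p_y)         # same value: (1 - e^(-2)) * 0.25
```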
Example 2
• Let the joint density of two random variables X and Y be given by
  f(x,y) = k(x + 4y) for 0 < x < 2, 0 < y < 1;  f(x,y) = 0 otherwise.
• Find the value of k.

• Are X and Y independent?

• Find the covariance


Solution
• The double integral ∫_0^1 ∫_0^2 k(x + 4y) dx dy must be equal to 1.
• We get k ∫_0^1 (x²/2 + 4xy) |_0^2 dy = k ∫_0^1 (2 + 8y) dy = k (2y + 4y²) |_0^1 = 6k
• Hence k=1/6
• The marginal probability density of X is f_X(x) = ∫_0^1 (1/6)(x + 4y) dy = (1/6)(xy + 2y²) |_0^1 = (1/6)(x + 2).
• The marginal probability density of Y is f_Y(y) = ∫_0^2 (1/6)(x + 4y) dx = (1/6)(x²/2 + 4xy) |_0^2 = (1/3)(1 + 4y).
• Since f_X(x) f_Y(y) = (1/18)(x + 2)(1 + 4y) ≠ f(x,y), X and Y are dependent.
continue
• The covariance is given by cov(X,Y) = E(XY) − E(X)E(Y).
• The expected value of X is E(X) = ∫_0^2 x f_X(x) dx = 10/9.
• The expected value of Y is E(Y) = ∫_0^1 y f_Y(y) dy = 11/18.
• The expected value of XY is E(XY) = ∫_0^1 ∫_0^2 (1/6) xy (x + 4y) dx dy = 2/3.
• We deduce the covariance: cov(X,Y) = 2/3 − (10/9)(11/18) = −1/81.
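The same numerical approach verifies Example 2. A short sketch (assuming SciPy is available; names illustrative) that checks the normalizing constant and the covariance:

```python
from scipy.integrate import dblquad

# Joint density from Example 2 with k = 1/6: f(x, y) = (x + 4y)/6 on 0<x<2, 0<y<1
f = lambda y, x: (x + 4 * y) / 6             # dblquad integrates func(y, x)

total, _ = dblquad(f, 0, 2, 0, 1)                              # should be 1
E_X, _ = dblquad(lambda y, x: x * f(y, x), 0, 2, 0, 1)         # 10/9
E_Y, _ = dblquad(lambda y, x: y * f(y, x), 0, 2, 0, 1)         # 11/18
E_XY, _ = dblquad(lambda y, x: x * y * f(y, x), 0, 2, 0, 1)    # 2/3

print(total)                # 1.0
print(E_XY - E_X * E_Y)     # ~ -0.0123  (i.e. -1/81)
```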
Mean and variance of a linear combination of random variables
• Let X₁, X₂ be random variables (discrete or continuous) with E(Xᵢ) = μᵢ.
• Define X = a₁X₁ + a₂X₂, where the aᵢ's are constants.
• E(X) = E(a₁X₁ + a₂X₂) = a₁E(X₁) + a₂E(X₂) = a₁μ₁ + a₂μ₂.

• V(X) = E(X²) − (E(X))² = E[(a₁X₁ + a₂X₂)²] − (a₁E(X₁) + a₂E(X₂))²
continue
• V(X) = E(a₁²X₁² + 2a₁a₂X₁X₂ + a₂²X₂²) − [a₁²(E(X₁))² + a₂²(E(X₂))² + 2a₁a₂E(X₁)E(X₂)]

• V(X) = a₁²[E(X₁²) − (E(X₁))²] + a₂²[E(X₂²) − (E(X₂))²] + 2a₁a₂[E(X₁X₂) − E(X₁)E(X₂)]

• V(X) = a₁²V(X₁) + a₂²V(X₂) + 2a₁a₂ cov(X₁, X₂)


General case
• Let X₁, X₂, …, Xₙ be random variables (discrete or continuous) with E(Xᵢ) = μᵢ.
• Define X = a₁X₁ + a₂X₂ + ⋯ + aₙXₙ, where the aᵢ's are constants.

• E(X) = E(a₁X₁ + a₂X₂ + ⋯ + aₙXₙ) = a₁E(X₁) + a₂E(X₂) + ⋯ + aₙE(Xₙ) = a₁μ₁ + a₂μ₂ + ⋯ + aₙμₙ.

• V(X) = Σ_{i=1}^{n} aᵢ² V(Xᵢ) + 2 Σ Σ aᵢaⱼ cov(Xᵢ, Xⱼ), where the double sum is over all pairs (i, j) with i < j.
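A Monte Carlo check of the variance formula can be reassuring. The sketch below assumes NumPy and picks an arbitrary pair of correlated normal variables; the means, variances, covariance, and coefficients are illustrative choices, not values from the notes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative correlated pair: V(X1) = 4, V(X2) = 9, cov(X1, X2) = 1.5
mean = [1.0, 2.0]
cov = [[4.0, 1.5], [1.5, 9.0]]
a1, a2 = 3.0, -2.0

X = rng.multivariate_normal(mean, cov, size=1_000_000)
Y = a1 * X[:, 0] + a2 * X[:, 1]

# V(a1 X1 + a2 X2) = a1^2 V(X1) + a2^2 V(X2) + 2 a1 a2 cov(X1, X2)
formula = a1**2 * 4.0 + a2**2 * 9.0 + 2 * a1 * a2 * 1.5   # 36 + 36 - 18 = 54
print(Y.var(), formula)                                   # sample variance ~ 54
```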
Exercise 1
• Let X₁, X₂, …, Xₙ be independent random variables with E(Xᵢ) = μ and V(Xᵢ) = σ². Find the mean and variance of X̄ = (1/n) Σ_{i=1}^{n} Xᵢ.

• E(X̄) = (1/n) Σ_{i=1}^{n} E(Xᵢ) = (1/n)(nμ) = μ

• V(X̄) = (1/n)² Σ_{i=1}^{n} σ² = (1/n)²(nσ²) = σ²/n   (the covariance terms are all zero by independence)
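The result for the sample mean is easy to see by simulation. A short NumPy sketch (the population here is an illustrative normal with μ = 5 and σ² = 4):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative population: normal with mu = 5, sigma^2 = 4; samples of size n = 25
mu, sigma, n = 5.0, 2.0, 25

# Each row is one sample; averaging over rows gives 200,000 sample means
xbar = rng.normal(mu, sigma, size=(200_000, n)).mean(axis=1)

print(xbar.mean())   # ~5.0   (E(X-bar) = mu)
print(xbar.var())    # ~0.16  (V(X-bar) = sigma^2 / n = 4/25)
```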
Application
• An important case is when 𝑋1 , 𝑋2 , … , 𝑋𝑛 is a random sample of size 𝑛
drawn from a population with mean 𝜇 and variance 𝜎 2 .

• Application: how to estimate the mean 𝜇 of a large population?

• Take a random sample of large size.

• The sample mean x̄ is a good estimator of μ!
Mean and variance of a Binomial random variable
• Y = Σ_{i=1}^{n} Xᵢ, where the Xᵢ's are independent Bernoulli random variables with probability of success p.

• E(Xᵢ) = p,  V(Xᵢ) = p(1 − p)

• E(Y) = np

• V(Y) = np(1 − p)
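A quick simulation check (assuming NumPy; n and p are illustrative values):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 20, 0.3   # illustrative values

# Y as the sum of n independent Bernoulli(p) variables, simulated 500,000 times
Y = rng.binomial(1, p, size=(500_000, n)).sum(axis=1)

print(Y.mean(), n * p)             # ~6.0  vs  np = 6.0
print(Y.var(), n * p * (1 - p))    # ~4.2  vs  np(1 - p) = 4.2
```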
Example
• A firm purchases two types of industrial chemicals. The amount of
type I chemical, 𝑋1 , purchased per week has 𝐸 𝑋1 = 40 gallons with
𝑉 𝑋1 = 4. The amount of type II chemical, 𝑋2 , purchased per week
has 𝐸 𝑋2 = 65 gallons with 𝑉 𝑋2 = 8. The type I chemical costs $3
per gallon, whereas type II costs $5 per gallon. Find the mean and
variance of the total weekly amount spent for these types of
chemicals, assuming that 𝑋1 and 𝑋2 are independent.
Solution
• Y = 3X₁ + 5X₂

• E(Y) = 3E(X₁) + 5E(X₂) = 3(40) + 5(65) = 445

• V(Y) = 3²V(X₁) + 5²V(X₂) = 9(4) + 25(8) = 236
Exercise 4
• An environmental engineer measures the amount (by weight) of
particulate pollution in air samples (of certain volume) collected over
the smokestack of a coal-operated power plant. Let X denote the
amount of pollutant per sample when a certain cleaning device on
the stack is not operating, and let Y denote the amount of pollutant
per sample when the cleaning device is operating, under similar
environmental conditions. It is observed that X is always greater than
2Y, and the relative frequency behavior of (X,Y) can be modeled by
• f(x,y) = k for 0 ≤ x ≤ 2, 0 ≤ y ≤ 1, 2y ≤ x;  f(x,y) = 0 elsewhere.
• That is, (X, Y) is uniformly distributed over the region inside the triangle bounded by x = 2, y = 0, and 2y = x.

• A) Find the value of k that makes this a probability density function.

• B) Find P(X ≥ 3Y). (That is the probability that the cleaning device will reduce the amount of pollutant by one-third or more.)
[Figure: the triangular support region bounded by y = x/2, x = 2, and y = 0]
• A) The double integral of f(x,y) over the triangular region must be
one.
• ∫_0^2 ∫_0^{x/2} k dy dx = 1 → k ∫_0^2 (x/2) dx = 1 → k (x²/4) |_0^2 = 1 → k = 1
• Or,
• ∫_0^1 ∫_{2y}^{2} k dx dy = 1 → k ∫_0^1 (2 − 2y) dy = 1 → k (2y − y²) |_0^1 = 1 → k = 1

• B) P(X ≥ 3Y) = ∫_0^2 ∫_0^{x/3} dy dx = ∫_0^2 (x/3) dx = (x²/6) |_0^2 = 4/6 = 2/3
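As a check on parts A) and B), a short sketch (assuming SciPy) that integrates the constant density over the triangle and over the sub-region x ≥ 3y:

```python
from scipy.integrate import dblquad

# Constant density f(x, y) = 1 on the triangle 0 <= x <= 2, 0 <= y <= x/2 (k = 1)
total, _ = dblquad(lambda y, x: 1.0, 0, 2, 0, lambda x: x / 2)
p_b, _ = dblquad(lambda y, x: 1.0, 0, 2, 0, lambda x: x / 3)   # region x >= 3y

print(total)   # 1.0 -> k = 1 normalizes the density
print(p_b)     # ~0.6667 = 2/3 = P(X >= 3Y)
```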
• Refer to exercise 4)
• A) Find the marginal density functions of Y and X.
• f_Y(y) = ∫_{2y}^{2} f(x,y) dx = ∫_{2y}^{2} dx = 2 − 2y

• f_X(x) = ∫_0^{x/2} dy = x/2
• B) Are X and Y independent?
• No, since f(x,y) is different from f_X(x) f_Y(y).
Exercise 19
• The proportions X and Y of two chemicals found in samples of an
insecticide have the joint probability density function
• f(x,y) = 2 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, 0 ≤ x + y ≤ 1;  f(x,y) = 0 elsewhere.
• The random variable Z=X+Y denotes the proportion of the insecticide
due to both chemicals combined.
• Find E(Z)
[Figure: the triangular support region in the (x, y) plane, bounded above by the line y = 1 − x]
• Marginal densities of X and Y are respectively:
• f_X(x) = ∫_0^{1−x} 2 dy = 2(1 − x),  0 ≤ x ≤ 1
• f_Y(y) = ∫_0^{1−y} 2 dx = 2(1 − y),  0 ≤ y ≤ 1

• E(X) = ∫_0^1 x f_X(x) dx = ∫_0^1 2(x − x²) dx = 2(x²/2 − x³/3) |_0^1 = 1/3

• Likewise, E(Y)=1/3

• E(Z)=E(X+Y)=E(X)+E(Y)=2/3
• Find the variance of Z.
• Z=X+Y; V(Z)=V(X)+V(Y)+2cov(X,Y)
• E(X²) = ∫_0^1 2x²(1 − x) dx = 1/6,  E(Y²) = 1/6
• V(X) = E(X²) − μ² = 1/18,  V(Y) = 1/18
• Cov(X,Y)=E(XY)-E(X)E(Y)
• E(XY) = ∫_0^1 ∫_0^{1−y} 2xy dx dy = 1/12
• Cov(X,Y) = 1/12 − 1/9 = −1/36
• Hence V(Z) = V(X) + V(Y) + 2Cov(X,Y) = 1/18 + 1/18 − 1/18 = 1/18
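Finally, a numerical check of E(Z) and V(Z) for this exercise (assuming SciPy; integration over the triangle x + y ≤ 1):

```python
from scipy.integrate import dblquad

# f(x, y) = 2 on the triangle 0 <= x <= 1, 0 <= y <= 1 - x; Z = X + Y
E_Z, _ = dblquad(lambda y, x: (x + y) * 2.0, 0, 1, 0, lambda x: 1 - x)
E_Z2, _ = dblquad(lambda y, x: (x + y) ** 2 * 2.0, 0, 1, 0, lambda x: 1 - x)

print(E_Z)              # ~0.6667 = 2/3
print(E_Z2 - E_Z ** 2)  # ~0.0556 = 1/18, matching V(X) + V(Y) + 2 Cov(X, Y)
```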
