
Math 3215 Intro. Probability & Statistics, Summer '14

Homework 5: Due 7/3/14

1. Let X and Y be continuous random variables with joint and marginal p.d.f.'s

   f(x, y) = 2,          0 ≤ x ≤ y ≤ 1,
   f1(x) = 2(1 − x),     0 ≤ x ≤ 1,
   f2(y) = 2y,           0 ≤ y ≤ 1.

Find the conditional p.d.f. h(y|x) and the conditional probability
P(1/2 ≤ Y ≤ 3/4 | X = 1/4). What is the expected value of Y when X = 1/4?
Solution: The conditional p.d.f. h(y|x) = f(x, y)/f1(x) is immediately seen to be

   h(y|x) = 2/(2(1 − x)) = 1/(1 − x),   x ≤ y ≤ 1.

To find P(1/2 ≤ Y ≤ 3/4 | X = 1/4) we integrate the conditional p.d.f. h(y|1/4) over
the interval 1/2 ≤ y ≤ 3/4, and we obtain

   P(1/2 ≤ Y ≤ 3/4 | X = 1/4) = ∫_{1/2}^{3/4} 1/(1 − 1/4) dy = (4/3)(1/4) = 1/3.

Since h(y|1/4) = 4/3 is constant in y, Y is conditionally uniform on (1/4, 1), so

   E(Y | X = 1/4) = (1/4 + 1)/2 = 5/8.
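As a quick sanity check (not part of the original solution), the conditional probability and conditional mean can be verified with exact rational arithmetic:

```python
from fractions import Fraction

# h(y | x) = 1/(1 - x) is constant in y, so Y | X = 1/4 is uniform on (1/4, 1).
x = Fraction(1, 4)
h = 1 / (1 - x)                              # conditional density value: 4/3

p = h * (Fraction(3, 4) - Fraction(1, 2))    # P(1/2 <= Y <= 3/4 | X = 1/4)
ey = (x + 1) / 2                             # midpoint of (1/4, 1)

print(p, ey)   # 1/3 5/8
```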

2. Let X and Y be discrete random variables with joint p.m.f.


   f(x, y) = (x + y)/32,   x = 1, 2,   y = 1, 2, 3, 4.
Find the marginal p.m.f.’s of X and Y and the conditional p.m.f.’s g(x|y) and h(y|x). Find
P(1 ≤ Y ≤ 3 | X = 1) and P(Y ≤ 2 | X = 2). Finally, find E(Y | X = 1) and
Var(Y | X = 1).
Solution: The marginal p.m.f.'s of X and Y are immediately seen to be

   f1(x) = Σ_{y=1}^{4} f(x, y) = (4x + 10)/32,
   f2(y) = Σ_{x=1}^{2} f(x, y) = (3 + 2y)/32.

The conditional p.m.f.'s are thus seen to be

   h(y|x) = f(x, y)/f1(x) = ((x + y)/32)/((4x + 10)/32) = (x + y)/(4x + 10),
   g(x|y) = f(x, y)/f2(y) = ((x + y)/32)/((3 + 2y)/32) = (x + y)/(3 + 2y).

Also, we have

   P(1 ≤ Y ≤ 3 | X = 1) = Σ_{y=1}^{3} h(y|1) = 2/14 + 3/14 + 4/14 = 9/14.

Note, this can also be computed as 1 − h(4|1) = 1 − 5/14 = 9/14. Next, we compute

   P(Y ≤ 2 | X = 2) = Σ_{y=1}^{2} h(y|2) = (2 + 1)/18 + (2 + 2)/18 = 7/18.

Finally, we have that

   E(Y | X = 1) = Σ_{y=1}^{4} y · h(y|1) = (1)(2/14) + (2)(3/14) + (3)(4/14) + (4)(5/14)
                = 40/14 = 20/7,

and, since E(Y² | X = 1) = (1)(2/14) + (4)(3/14) + (9)(4/14) + (16)(5/14) = 130/14 = 65/7,

   Var(Y | X = 1) = E(Y² | X = 1) − [E(Y | X = 1)]² = 65/7 − (20/7)² = 55/49 ≈ 1.122.
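These finite computations can be double-checked with exact rational arithmetic; a minimal sketch, building the p.m.f.'s directly from their definitions:

```python
from fractions import Fraction

# Joint p.m.f. f(x, y) = (x + y)/32 on x = 1, 2 and y = 1, 2, 3, 4.
f = {(x, y): Fraction(x + y, 32) for x in (1, 2) for y in (1, 2, 3, 4)}

f1 = {x: sum(f[x, y] for y in (1, 2, 3, 4)) for x in (1, 2)}   # (4x + 10)/32
h1 = {y: f[1, y] / f1[1] for y in (1, 2, 3, 4)}                # h(y|1) = (1 + y)/14

p = sum(h1[y] for y in (1, 2, 3))                # P(1 <= Y <= 3 | X = 1)
ey = sum(y * h1[y] for y in h1)                  # E(Y | X = 1)
var = sum(y * y * h1[y] for y in h1) - ey ** 2   # Var(Y | X = 1)

print(p, ey, var)   # 9/14 20/7 55/49
```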

3. Let W equal the weight of a box of oranges which is supposed to weigh 1 kg. Suppose that
P(W < 1) = .05 and P(W > 1.05) = .1. Call a box of oranges light, good, or heavy depending
on whether W < 1, 1 ≤ W ≤ 1.05, or W > 1.05, respectively. In n = 50 independent observations
of these boxes, let X equal the number of light boxes and Y the number of good boxes.
Find the joint p.m.f. of X and Y . How is Y distributed? Name the distribution and state the
values of the parameters associated to this distribution. Given X = 3, how is Y distributed?
Determine E(Y | X = 3) and find the correlation coefficient ρ of X and Y .
Solution: The random variables are said to come from a trinomial distribution in this case
since there are three exhaustive and mutually exclusive outcomes light, good, or heavy, having
probabilities p1 = .05, p2 = .85, and p3 = .1, respectively. It is easy to see that the trinomial
p.m.f. in this case is
   f(x, y) = [50!/(x! y! (50 − x − y)!)] p1^x p2^y p3^(50−x−y).
That is, f (x, y) is exactly the joint p.m.f. of X and Y , the number of light boxes and good
boxes, where p1 = .05, p2 = .85, and p3 = .1 are the various probabilities of each of the three
events: light, good, and heavy.
The random variable Y is binomially distributed, but the parameters of the distribution
depend on the value of the random variable X. We have that Y is (conditionally) binomially
distributed b(n − x, p2/(1 − p1)), since the marginal distribution of X is b(n, p1) and
the conditional p.m.f. of Y is thus, with n = 50,

   h(y|x) = f(x, y)/f1(x)
          = [n!/(x! y! (n − x − y)!)] p1^x p2^y p3^(n−x−y) · [x!(n − x)!/n!] · 1/[p1^x (1 − p1)^(n−x)]
          = [(n − x)!/(y! (n − x − y)!)] (p2/(1 − p1))^y (p3/(1 − p1))^(n−x−y).
In this case, with the specified values of n, p1, p2, and p3, we have for X = 3

   h(y|3) = [47!/(y! (47 − y)!)] (.8947)^y (.1053)^(47−y),

so that Y is conditionally b(47, .8947) when X = 3. Since µ = np for a binomial
distribution, we have that E(Y | X = 3) = (47)(.8947) ≈ 42.05.
It is not hard to see that in fact E(Y|x) = (n − x) p2/(1 − p1) in general, and that a
similar formula holds for E(X|y). The correlation coefficient is now found using the fact
that since each of the conditional expectations E(Y|x) = (n − x) p2/(1 − p1) and
E(X|y) = (n − y) p1/(1 − p2) is linear, the square of the correlation coefficient ρ is
equal to the product of the respective coefficients of x and y in the conditional
expectations:

   ρ² = [−p2/(1 − p1)] · [−p1/(1 − p2)] = p1 p2/[(1 − p1)(1 − p2)],

from which it follows that

   ρ = −√( p1 p2/[(1 − p1)(1 − p2)] ) = −√(.0425/.1425) ≈ −.546.

The fact that the correlation coefficient is negative follows from the fact that, for example,

   E(Y|x) = µ_Y + ρ (σ_Y/σ_X)(x − µ_X),

and noting that the coefficient of x in E(Y|x) is seen to be negative (and also σ_Y, σ_X > 0).
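The binomial conditional distribution and the trinomial correlation can be checked numerically; a minimal sketch using only the standard library:

```python
import math

n, p1, p2, p3 = 50, 0.05, 0.85, 0.10

# Conditional distribution of Y given X = 3: binomial b(n - x, p2/(1 - p1)).
x = 3
q = p2 / (1 - p1)                                    # ≈ 0.8947
pmf = [math.comb(n - x, y) * q**y * (1 - q)**(n - x - y)
       for y in range(n - x + 1)]

total = sum(pmf)                                     # should be 1
mean = sum(y * p for y, p in enumerate(pmf))         # (n - x) * q ≈ 42.05

# Trinomial correlation: rho = -sqrt(p1 p2 / ((1 - p1)(1 - p2))).
rho = -math.sqrt(p1 * p2 / ((1 - p1) * (1 - p2)))
```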

4. Let X have the uniform distribution U (0, 2) and let the conditional distribution of Y , given
that X = x, be U (0, x). Find the joint p.d.f. f (x, y) of X and Y , and be sure to state the
domain of f (x, y). Find E(Y |x).
Solution: Since f1(x) = 1/2 for 0 < x < 2 and h(y|x) = 1/x for 0 < y < x, we have

   f(x, y) = f1(x) h(y|x) = 1/(2x),   0 < y < x < 2.

Since Y given X = x is uniform on (0, x),

   E(Y|x) = ∫_0^x y · (1/x) dy = [y²/(2x)]_0^x = x/2.
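A quick numerical check (not part of the original solution): with X ~ U(0, 2) and Y | X = x ~ U(0, x), the joint density f(x, y) = 1/(2x) on 0 < y < x < 2 should integrate to 1, and E(Y | x) should equal x/2.

```python
def f(x, y):
    """Joint p.d.f. of (X, Y) derived above."""
    return 1.0 / (2.0 * x) if 0.0 < y < x < 2.0 else 0.0

N = 800                        # midpoint-rule resolution
dx = 2.0 / N
total = 0.0
for i in range(N):
    x = (i + 0.5) * dx         # midpoint in x
    dy = x / N
    total += sum(f(x, (j + 0.5) * dy) for j in range(N)) * dy * dx

# Conditional mean at a sample point x0 = 1.5: should be x0/2 = 0.75.
x0 = 1.5
dy = x0 / N
ey = sum(((j + 0.5) * dy) * (1.0 / x0) * dy for j in range(N))

print(total, ey)   # ≈ 1.0, 0.75
```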


5. The support of a random variable X is the set of x-values such that f(x) ≠ 0. Given that
X has p.d.f. f(x) = x²/3, −1 < x < 2, what is the support of X²? Find the p.d.f. of the
random variable Y = X².
Solution: The p.d.f. g(y) of Y = X² is obtained as follows. We note that the possible
y-values that can be obtained are in the range 0 ≤ y ≤ 4, so the support of g(y) is the
interval [0, 4]. Now, on the interval [1, 4], there is a one-to-one transformation
represented by x = √y. We first find G(y) = P(Y ≤ y) = P(X² ≤ y) = P(X ≤ √y) for X = x
in [1, 2], corresponding to Y = y in [1, 4]. We have

   G(y) = P(X ≤ √y) = ∫_{−1}^{√y} (x²/3) dx,

and in particular

   g(y) = G′(y) = [(√y)²/3] · (√y)′ = (y/3) · 1/(2√y) = √y/6,   1 < y < 4,

from the chain rule and the Fundamental Theorem of Calculus.
In order to find g(y) on 0 < y < 1, we need to work a little harder. For x in the interval
−1 < x < 1 there is a two-to-one transformation given by x = −√y for −1 < x < 0, and
x = √y for 0 < x < 1. We then calculate G(y) for 0 < y < 1 (i.e., −1 < x < 1) as before,
but now using two integrals G(y) = P(Y ≤ y) = P(X² ≤ y) = P(−√y < X < 0) + P(0 < X < √y),
so for y in the interval 0 < y < 1 we have

   G(y) = ∫_{−√y}^{0} f(x) dx + ∫_{0}^{√y} f(x) dx.

Again, from the chain rule and the Fundamental Theorem of Calculus we have

   g(y) = G′(y) = −f(−√y) · (−√y)′ + f(√y) · (√y)′
        = (y/3)(1/(2√y)) + (y/3)(1/(2√y)) = √y/3.

So,

   g(y) = √y/3,   0 < y < 1,
          √y/6,   1 < y < 4.

There is no problem defining g(0) = g(1) = 0, or even just leaving the p.d.f. undefined at the
points y = 0 and y = 1.
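As a numerical check (not from the original solution), the derived density should integrate to 1 over (0, 4), and P(Y ≤ 1) should match P(−1 < X < 1) = ∫_{−1}^{1} x²/3 dx = 2/9.

```python
import math

def g(y):
    """p.d.f. of Y = X^2 derived above."""
    if 0.0 < y < 1.0:
        return math.sqrt(y) / 3.0
    if 1.0 < y < 4.0:
        return math.sqrt(y) / 6.0
    return 0.0

N = 100_000   # midpoint-rule resolution
total = sum(g((k + 0.5) * (4.0 / N)) * (4.0 / N) for k in range(N))
p_low = sum(g((k + 0.5) * (1.0 / N)) * (1.0 / N) for k in range(N))

print(total, p_low)   # ≈ 1.0, ≈ 2/9
```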


6. Let X1 , X2 denote two independent random variables each with the χ2 (2) distribution. Find
the joint p.d.f. of Y1 = X1 and Y2 = X1 + X2 . What is the support of Y1 , Y2 (i.e., what is the
domain of the joint p.d.f., where f(y1, y2) ≠ 0)? Are Y1 and Y2 independent?
Solution: We have that X1, X2 have the same p.d.f.

   h(x) = (1/2) e^(−x/2),   0 ≤ x < ∞,

corresponding to the χ²(r) distribution with r = 2 degrees of freedom. By the way, this is
the same as saying that X1, X2 follow exponential distributions with θ = 2. Since X1, X2
are independent, the joint p.d.f. of X1 and X2 is

   f(x1, x2) = h(x1) h(x2) = (1/4) e^(−(x1 + x2)/2).
The change-of-variables formula is g(y1, y2) = |J| f(v1(y1, y2), v2(y1, y2)), using the
determinant of the Jacobian

   J = det [ ∂v1/∂y1   ∂v1/∂y2 ]
           [ ∂v2/∂y1   ∂v2/∂y2 ]

where vi(y1, y2) is the inverse of ui, Yi = ui(X1, X2), i = 1, 2. In this case,
Y1 = u1(X1, X2) = X1, so X1 = v1(Y1, Y2) = Y1, and Y2 = u2(X1, X2) = X1 + X2, so
X2 = v2(Y1, Y2) = Y2 − Y1. Hence,

   |J| = det [  1   0 ]  = 1,
             [ −1   1 ]
so the joint p.d.f. of Y1 and Y2 is

   g(y1, y2) = |1| f(y1, y2 − y1) = (1/4) e^(−y2/2),   0 ≤ y1 ≤ y2 < ∞.
To determine whether Y1 and Y2 are independent we compute the marginal p.d.f.'s. We have

   g1(y1) = ∫_{y1}^{∞} (1/4) e^(−y2/2) dy2 = (1/2) e^(−y1/2),   0 ≤ y1 < ∞,

and

   g2(y2) = ∫_{0}^{y2} (1/4) e^(−y2/2) dy1 = (y2/4) e^(−y2/2),   0 ≤ y2 < ∞.

Since g(y1, y2) ≠ g1(y1) g2(y2), Y1 and Y2 are dependent.
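The change-of-variables result can also be checked by Monte Carlo (not part of the original solution): draw X1, X2 from the χ²(2) density, i.e. exponential with mean 2, form Y1 = X1 and Y2 = X1 + X2, and compare the empirical value of P(Y1 ≤ 1, Y2 ≤ 3) with the exact integral of g(y1, y2) over that region.

```python
import math
import random

random.seed(0)
trials = 200_000
hits = 0
for _ in range(trials):
    x1 = random.expovariate(0.5)   # rate 1/2, i.e. mean theta = 2
    x2 = random.expovariate(0.5)
    if x1 <= 1.0 and x1 + x2 <= 3.0:
        hits += 1
est = hits / trials

# Exact: ∫_0^1 ∫_{y1}^3 (1/4) e^(-y2/2) dy2 dy1 = 1 - e^(-1/2) - (1/2) e^(-3/2).
exact = 1.0 - math.exp(-0.5) - 0.5 * math.exp(-1.5)
```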
