Probability Final Exam With Solutions

This document contains the instructions and questions for a final exam in MATH 5010/6805(002), Spring 2018. It is a seven-question exam that students have 120 minutes to complete. The exam contains both multiple-choice and free-response questions on probability and statistics concepts, and a table of integrals and sums is provided as a reference. The first question asks students to identify true or false statements; subsequent questions ask students to compute probabilities, probability mass and density functions, and conditional probabilities using the given information and distributions.

MATH 5010/6805(002) Spring 2018 Final Exam

Name: UID:

Question 1 2 3 4 5 6 7

Score

Instructions
1. You will have 120 minutes to complete this exam. There are seven questions. The
point values of each question are marked next to the question.
2. Please write neatly! If I cannot read or understand what you write, I will assume
that it is wrong.
3. You DO NOT need to simplify your answers. Do not spend time simplifying
fractions.
4. You may have one 8.5 inch by 11 inch sheet of paper with notes on it on your desk.
You may have writing on both sides of the sheet of paper.
5. You may not use any electronic devices (including calculators), books, or notes other
than those on your one sheet of paper.
6. The next page includes a table of (potentially) useful integrals and sums.
Table of Integrals and Sums

∫ x^α dx = x^(α+1)/(α+1) + C, if α ≠ −1
∫ x^(−1) dx = ln|x| + C, if x < 0 or x > 0
∫ e^(αx) dx = (1/α) e^(αx) + C, α ≠ 0
∫ x e^(αx) dx = (e^(αx)/α²)(αx − 1) + C, α ≠ 0
∫_z^∞ e^(−αx) dx = (1/α) e^(−αz), α > 0
∫_0^∞ x^(t−1) e^(−x) dx = Γ(t), if t > 0
Γ(n) = (n − 1)!, if n ∈ {1, 2, 3, 4, . . . } = N
∫ 1/(1 + x²) dx = arctan(x) + C
∫ √(1 − x²) dx = (1/2)(x√(1 − x²) + arcsin(x)) + C
∫ cos(x) dx = sin(x) + C
∫ sin(x) dx = −cos(x) + C
∫ tan(x) dx = −ln|cos(x)| + C
∫ cot(x) dx = ln|sin(x)| + C
∫_{−∞}^∞ e^(−x²/2) dx = √(2π)
∑_{k=0}^∞ x^k = 1/(1 − x), if |x| < 1
∑_{k=0}^∞ x^k/k! = e^x
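The table entries are standard; as an illustrative sanity check (not part of the exam), the Gaussian integral and the geometric series can be verified numerically with the Python standard library:

```python
import math

# Midpoint-rule check of the Gaussian integral:
# the integral of exp(-x^2/2) over all of R equals sqrt(2*pi).
# The tails beyond |x| = 10 are negligible (on the order of e^-50).
n = 200_000
a, b = -10.0, 10.0
h = (b - a) / n
gauss = sum(math.exp(-((a + (i + 0.5) * h) ** 2) / 2) for i in range(n)) * h
assert abs(gauss - math.sqrt(2 * math.pi)) < 1e-6

# Geometric series: sum of x^k for k >= 0 equals 1/(1 - x) when |x| < 1.
# The partial sum through k = 99 is accurate to far below 1e-12 at x = 0.3.
x = 0.3
partial = sum(x ** k for k in range(100))
assert abs(partial - 1 / (1 - x)) < 1e-12
```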
1. (15 points) For each of the following questions, answer true or false on the line provided.

1. If P(A ∩ B ∩ C) > 0, then P(A ∩ B|C) = P(A|B ∩ C)P(B|C). T

2. Suppose that the correlation coefficient of X and Y, ρ(X, Y), has
|ρ(X, Y)| = 1. Then P(Y = aX + b) = 1 for some a, b ∈ R. T

3. Suppose that lim_{n→∞} E[X_n] = E[Y]. Then X_n converges in distribution to Y. F

4. If P(B) > 0 and P(C) > 0, then P(A) = P(A|B)P(B) + P(A|C)P(C). F

5. Suppose that X is a random variable with E[e^X] ≤ e^(−1). Then P(X > 1) ≤ e^(−2). T

Solution:

1. True: P(A ∩ B|C) = P(A ∩ B ∩ C)/P(C) = P(A|B ∩ C)P(B ∩ C)/P(C) = P(A|B ∩ C)P(B|C).

2. True: |ρ(X, Y)| = 1 holds exactly when Y is almost surely a linear function of X.

3. False: convergence of the means says nothing about the distributions (e.g. X_n could alternate between two different distributions with the same mean).

4. False: this requires B and C to partition the sample space (B ∪ C = Ω and B ∩ C = ∅), which is not assumed.

5. True, by Markov's inequality: P(X > 1) = P(e^X > e) ≤ E[e^X]/e ≤ e^(−1)/e = e^(−2).
2. (10 points) There are two parts to this question. Answer both of them.

1. Show that if X and Y are independent random variables with E[|X|] < ∞ and E[|Y |] < ∞
then Cov(X,Y) = 0. Explain your steps.

2. Give an example of two random variables X and Y which are uncorrelated, but not inde-
pendent. Show that they are uncorrelated but not independent.

Solution:

1. Independence implies that E[XY] = E[X]E[Y]. Then Cov(X, Y) = E[XY] − E[X]E[Y] = 0.

2. Suppose that P(X = −1) = P(X = 0) = P(X = 1) = 1/3 and let Y = X². Then
XY = X³ = X and E[X] = 0. We have

Cov(X, Y) = E[XY] − E[X]E[Y] = E[X](1 − E[Y]) = 0,

but

P(X = 0|Y = 0) = 1 ≠ 1/3 = P(X = 0)

so X and Y are not independent.
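The discrete example in part 2 can be checked exactly with a short Python sketch (an illustration, not part of the exam):

```python
# X uniform on {-1, 0, 1} and Y = X^2, as in part 2 of the solution.
support = [(-1, 1/3), (0, 1/3), (1, 1/3)]  # (value of X, probability)
EX = sum(x * p for x, p in support)
EY = sum(x ** 2 * p for x, p in support)
EXY = sum(x ** 3 * p for x, p in support)  # XY = X^3
cov = EXY - EX * EY
assert cov == 0.0  # uncorrelated: the +/-1 terms cancel exactly

# Not independent: Y = 0 forces X = 0, so
# P(X = 0, Y = 0) = 1/3, while P(X = 0) P(Y = 0) = (1/3)(1/3) = 1/9.
assert abs((1/3) * (1/3) - 1/3) > 0.1
```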


3. (10 points) Suppose that X is a random variable with cumulative distribution function
given by

FX(x) = 0 for x < 0,
FX(x) = x for 0 ≤ x < 1/2,
FX(x) = 1/2 + (x − 1/2)² for 1/2 ≤ x < 3/4,
FX(x) = 1 for x ≥ 3/4.

1. Are there any values of x for which P (X = x) > 0? If so, for each of these values compute
P (X = x).

2. Does X have a probability density function? If so, compute it. If not, explain why not.

Solution:

1. Yes: the only jump of FX is at x = 3/4, where P(X = 3/4) = FX(3/4) − FX(3/4−) = 1 − 9/16 = 7/16.

2. No, since P (X = 3/4) > 0, X is not continuous and so cannot have a pdf.
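A quick numeric sketch (assuming the piecewise CDF reads F(x) = x on [0, 1/2) and F(x) = 1/2 + (x − 1/2)² on [1/2, 3/4)) confirms the size of the jump at 3/4:

```python
def F(x: float) -> float:
    """Piecewise CDF as reconstructed above (an assumption of this sketch)."""
    if x < 0:
        return 0.0
    if x < 0.5:
        return x
    if x < 0.75:
        return 0.5 + (x - 0.5) ** 2
    return 1.0

# The atom at 3/4 is the jump P(X = 3/4) = F(3/4) - F(3/4-).
left_limit = F(0.75 - 1e-12)  # approximates the left limit at 3/4
jump = F(0.75) - left_limit
assert abs(left_limit - 9/16) < 1e-9
assert abs(jump - 7/16) < 1e-9
```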
4. (10 points) A company that only sells its products through its website is reviewing the
effectiveness of its recent advertising campaign. It purchased a large list of e-mail addresses and
sent an advertisement with a link to the company’s website to each of those addresses. The
company’s data indicates that 10% of people who received the e-mail purchased goods. Among
people who did not receive the e-mail link, 98% did not purchase goods. The company estimates
that approximately 2% of the population of potential customers received the e-mail link.

1. What proportion of the population of potential customers purchased goods from this com-
pany’s website?

2. What proportion of the customers who purchased goods visited the website using the
e-mail link?

Solution: Let B denote the collection of people who buy from this company’s website and let
M denote the people who viewed the e-mail link. We have

P (B|M ) = .1
P (B|M c ) = .02
P (M ) = .02.

Then

1. P (B) = P (B|M )P (M ) + P (B|M c )P (M c ) = (.1)(.02) + (.02)(.98).

2. By Bayes' rule,

P(M|B) = P(B|M)P(M) / [P(B|M)P(M) + P(B|M^c)P(M^c)]
       = (.1)(.02) / [(.1)(.02) + (.02)(.98)]
       = .1/1.08.
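Both answers can be reproduced numerically; this sketch simply encodes the given probabilities and applies total probability and Bayes' rule:

```python
# Probabilities as given in the problem statement.
p_B_given_M = 0.10    # P(buy | received the e-mail link)
p_B_given_Mc = 0.02   # P(buy | did not receive it): 98% did not buy
p_M = 0.02            # P(received the e-mail link)

# 1. Proportion of potential customers who purchased (total probability).
p_B = p_B_given_M * p_M + p_B_given_Mc * (1 - p_M)
assert abs(p_B - 0.0216) < 1e-12

# 2. Proportion of purchasers who came via the link (Bayes' rule).
p_M_given_B = p_B_given_M * p_M / p_B
assert abs(p_M_given_B - 0.1 / 1.08) < 1e-12
```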
5. (20 points) Suppose that we pick a number uniformly at random from the set {1, 2, 3}.
Call the result of this choice N . We have a biased coin, which flips heads with probability 1/4
and tails with probability 3/4. We flip this biased coin until we have observed the Nth heads
(i.e. if N = 2 then we flip the biased coin until we observe the second heads). Call the number of
times we flip the coin X (i.e. X = 10 if the tenth flip is the flip on which we observe the Nth
heads).

1. What is the conditional pmf of X given that N = n, fX|N (x|n)? Be careful about possible
values here.

2. Find the marginal pmf of X, fX(x). Remember your answer should be a number for each
x ∈ R, but you do not need to simplify.

3. Find P (N = 2|X = 10).

Solution:

1. The possible values of N are {1, 2, 3}, so, writing C(m, k) for the binomial coefficient
m!/(k!(m − k)!), we have

fX|N(x|1) = (1/4)(3/4)^(x−1) for x = 1, 2, 3, . . . , and 0 otherwise;

fX|N(x|2) = C(x−1, 1)(1/4)²(3/4)^(x−2) for x = 2, 3, 4, . . . , and 0 otherwise;

fX|N(x|3) = C(x−1, 2)(1/4)³(3/4)^(x−3) for x = 3, 4, . . . , and 0 otherwise.

2. By the law of total probability,

fX(x) = fX|N(x|1)fN(1) + fX|N(x|2)fN(2) + fX|N(x|3)fN(3)

      = (1/3)(1/4) for x = 1,
      = (1/3)[(1/4)(3/4) + (1/4)²] for x = 2,
      = (1/3)[(1/4)(3/4)^(x−1) + C(x−1, 1)(1/4)²(3/4)^(x−2) + C(x−1, 2)(1/4)³(3/4)^(x−3)] for x = 3, 4, 5, . . . ,
      = 0 otherwise.

3.

P(N = 2|X = 10) = P(N = 2, X = 10)/P(X = 10) = fX|N(10|2)fN(2)/fX(10)

                = 9(1/4)^2(3/4)^8 / [(1/4)(3/4)^9 + 9(1/4)^2(3/4)^8 + 36(1/4)^3(3/4)^7],

where the uniform prior fN(n) = 1/3 cancels from the numerator and each term of the denominator.
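The conditional pmfs here are negative binomial, which makes them easy to check numerically. This sketch (helper names are our own, Python stdlib only) verifies that each conditional pmf sums to 1 and computes P(N = 2 | X = 10):

```python
from math import comb

p = 1 / 4  # probability of heads for the biased coin

def f_x_given_n(x: int, n: int) -> float:
    """Conditional pmf of X given N = n (negative binomial):
    the x-th flip is the one showing the n-th head."""
    if x < n:
        return 0.0
    return comb(x - 1, n - 1) * p ** n * (1 - p) ** (x - n)

def f_x(x: int) -> float:
    """Marginal pmf of X, with N uniform on {1, 2, 3}."""
    return sum(f_x_given_n(x, n) / 3 for n in (1, 2, 3))

# Each conditional pmf sums to 1 (the tail beyond x = 500 is negligible).
for n in (1, 2, 3):
    assert abs(sum(f_x_given_n(x, n) for x in range(1, 500)) - 1) < 1e-9

# Bayes' rule: the uniform prior f_N(n) = 1/3 cancels top and bottom.
post = f_x_given_n(10, 2) / sum(f_x_given_n(10, n) for n in (1, 2, 3))
assert abs(post - 3 / 8) < 1e-12  # the unsimplified ratio reduces to 3/8
```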
6. (15 points) Suppose that Xi are i.i.d. random variables with moment generating function

MXi(t) = E[e^(tXi)] = e^(t²/2)

(i.e. the Xi are i.i.d. standard normal random variables). Suppose that X is a random variable
with P(X = 0) = 1.

1. Compute the moment generating function of the random variable

Yn = (1/n) ∑_{i=1}^n Xi,

MYn(t). Justify your steps.

2. Compute the moment generating function of the random variable X and show that

lim_{n→∞} MYn(t) = MX(t).

3. Explain in words what the meaning of the second part of the problem is.

Solution:

1. By independence,

MYn(t) = M_{(1/n)∑Xi}(t) = M_{∑Xi}(t/n) = ∏_{i=1}^n MXi(t/n) = (e^((t/n)²/2))^n = e^(t²/(2n)).

2. MX(t) = E[e^(t·0)] = 1. Since lim_{n→∞} t²/(2n) = 0 for all t,

lim_{n→∞} MYn(t) = 1 = MX(t).

3. This shows that Yn converges in distribution to the random variable X with P(X = 0) = 1;
that is, the sample mean of i.i.d. standard normals concentrates at 0, which is the weak law of
large numbers in this example.
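Assuming the computation above, MYn(t) = e^(t²/(2n)); a small sketch shows these values decreasing toward MX(t) = 1 as n grows:

```python
import math

def mgf_Yn(t: float, n: int) -> float:
    """MGF of the sample mean Y_n of n i.i.d. standard normals,
    using the formula e^(t^2 / (2n)) derived in part 1."""
    return math.exp(t * t / (2 * n))

for t in (-2.0, 0.5, 3.0):
    values = [mgf_Yn(t, n) for n in (1, 10, 100, 10_000)]
    # Strictly decreasing in n toward M_X(t) = 1 for each fixed t != 0.
    assert values == sorted(values, reverse=True)
    assert abs(mgf_Yn(t, 10_000) - 1.0) < 1e-3
```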
7. (20 points) Suppose that the joint density function of (X, Y) is

fX,Y(x, y) = (1/(2π)) e^(−(x² + y²)/2).

Let Z = Y/X.

1. Show that the joint density function of (X, Z) is given by

fX,Z(x, z) = (|x|/(2π)) e^(−x²(1 + z²)/2).
2. Find the marginal density of Z, fZ(z).

Hints:

∫_{−∞}^∞ (|x|/(2π)) e^(−x²(1+z²)/2) dx = 2 ∫_0^∞ (x/(2π)) e^(−x²(1+z²)/2) dx.

This integral can be evaluated with the u-substitution u = (1/2)x²(1 + z²).

3. For each possible value z of Z, compute the conditional density of X given Z = z, fX|Z(x|z).

4. Find the moment generating function of Z, MZ(t) = E[e^(tZ)].

Solution:

1. We invert the transformation:

x(x, z) = x, y(x, z) = zx.

The partial derivatives are

∂x/∂x = 1, ∂x/∂z = 0,
∂y/∂x = z, ∂y/∂z = x,

so the density of (X, Z) is

fX,Z(x, z) = fX,Y(x, xz) |∂x/∂x · ∂y/∂z − ∂x/∂z · ∂y/∂x|
           = (1/(2π)) e^(−(x² + (xz)²)/2) |x| = (|x|/(2π)) e^(−x²(1+z²)/2).

2. The possible values of Z are all real numbers. The density of Z is

fZ(z) = ∫_{−∞}^∞ fX,Z(x, z) dx = ∫_{−∞}^∞ (|x|/(2π)) e^(−x²(1+z²)/2) dx = 2 ∫_0^∞ (x/(2π)) e^(−x²(1+z²)/2) dx.

Substituting u = (1/2)x²(1 + z²), we have x dx = du/(1 + z²). Then

fZ(z) = (1/π) · 1/(1 + z²) ∫_0^∞ e^(−u) du = (1/π) · 1/(1 + z²),

the standard Cauchy density.
3. We have

fX|Z(x|z) = fX,Z(x, z)/fZ(z) = ((1 + z²)/2) |x| e^(−x²(1+z²)/2).

4. We always have MZ(0) = 1. For t > 0,

lim_{x→∞} e^(tx)/(1 + x²) = ∞,

and for t < 0,

lim_{x→−∞} e^(tx)/(1 + x²) = ∞.

It follows that for t ≠ 0,

MZ(t) = (1/π) ∫_{−∞}^∞ e^(tx)/(1 + x²) dx = ∞.

So

MZ(t) = 1 if t = 0, and MZ(t) = ∞ if t ≠ 0.
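As a numeric cross-check of part 2 (an illustrative sketch, not part of the exam), integrating fX,Z(x, z) over x with a midpoint rule should recover the Cauchy density 1/(π(1 + z²)):

```python
import math

def f_xz(x: float, z: float) -> float:
    """Joint density of (X, Z) from part 1 of the solution."""
    return abs(x) / (2 * math.pi) * math.exp(-x * x * (1 + z * z) / 2)

def f_z_numeric(z: float, n: int = 100_000, lim: float = 12.0) -> float:
    """Midpoint-rule integral of f_xz over x in [-lim, lim];
    the Gaussian tails beyond |x| = 12 are negligible."""
    h = 2 * lim / n
    return sum(f_xz(-lim + (i + 0.5) * h, z) for i in range(n)) * h

for z in (0.0, 1.0, -2.5):
    exact = 1 / (math.pi * (1 + z * z))  # standard Cauchy density
    assert abs(f_z_numeric(z) - exact) < 1e-6
```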
