Problem Set 1 With Solutions
Note: If you had trouble answering the questions, or you answered them incorrectly and
the solutions below do not help, you should review the relevant material and go see the TA
to make sure you understand the concepts and how to apply them.
Questions
(a) P(X ≤ −1.96)
Due to the symmetry of the normal distribution, P(X ≤ −1.96) = P(X ≥ 1.96). Moreover,
P(X ≥ 1.96) = 1 − P(X ≤ 1.96). To evaluate P(X ≤ 1.96), you can use the table below for
the values of the cumulative distribution function of the standard normal distribution,
which gives P(X ≤ 1.96) = 0.975; hence P(X ≤ −1.96) = 1 − 0.975 = 0.025.
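If you do not have the table at hand, the same numbers can be reproduced with a short Python sketch (this assumes SciPy is available; norm.cdf is the standard normal CDF):

from scipy.stats import norm

p_upper = norm.cdf(1.96)   # P(X <= 1.96), about 0.975
print(1 - p_upper)         # P(X <= -1.96), about 0.025, by symmetry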
(d) P(X ≤ 1.64)
We already computed this for part (b): P(X ≤ 1.64) = 0.9495.
(e) P(|X| ≤ 1.96)
Notice that P(|X| ≤ 1.96) = P(−1.96 ≤ X ≤ 1.96) = 1 − (P(X ≤ −1.96) + P(X ≥ 1.96)).
From part (a), we know that P(X ≤ −1.96) = P(X ≥ 1.96) = 0.025. Therefore,
P(|X| ≤ 1.96) = 1 − 2 × P(X ≤ −1.96) = 1 − 2 × 0.025 = 0.95.
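A quick numerical confirmation of parts (d) and (e), again a sketch assuming SciPy:

from scipy.stats import norm

print(norm.cdf(1.64))                    # part (d): about 0.9495
print(norm.cdf(1.96) - norm.cdf(-1.96))  # part (e): about 0.95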
2. Consider two random variables X and Z, with E[X] = 2, E[Z] = 1, Var[X] = 1, Var[Z] = 1,
a = 0.5, b = 3, c = 0.2, d = 2.
Var[aX + b] = a²Var[X], because the variance of the constant b is zero. Then
Var[aX + b] = 0.5² × 1 = 0.25.
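You can also check this by simulation. The sketch below assumes NumPy and draws X from a normal distribution with mean 2 and variance 1; the normal choice is only an assumption made to match the stated moments:

import numpy as np

rng = np.random.default_rng(0)
a, b = 0.5, 3.0
x = rng.normal(loc=2.0, scale=1.0, size=1_000_000)  # E[X] = 2, Var[X] = 1
print(np.var(a * x + b))                             # close to 0.5**2 * 1 = 0.25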
(c) Assuming that X and Z are independent, calculate Var[X + Z] and SD[X + Z].
Recall that Var[X + Z] = Var[X] + Var[Z] + 2Cov[X, Z]. Since we are told X and
Z are independent, we know Cov[X, Z] = 0 (recall: the reverse is not true!). Hence
Var[X + Z] = 1 + 1 + 0 = 2.
Moreover, SD[X + Z] = √Var[X + Z] = √2.
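The same simulation idea verifies part (c); independence is imposed by drawing X and Z separately (again only a sketch, with normal draws chosen to match the given means and variances):

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=1_000_000)  # E[X] = 2, Var[X] = 1
z = rng.normal(1.0, 1.0, size=1_000_000)  # E[Z] = 1, Var[Z] = 1, independent of X
s = x + z
print(np.var(s), np.std(s))               # close to 2 and sqrt(2), about 1.414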
(d) Assuming that Cov(X, Z) = 1, calculate Cov(aX + b, cZ + d).
We know that the covariance between a random variable and a constant is zero. Hence
Cov(aX + b, cZ + d) = Cov(aX, cZ) = a × c × Cov(X, Z) = 0.5 × 0.2 × 1 = 0.1.
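To check part (d) numerically we need draws with Cov(X, Z) = 1. Since both variances equal 1, the simplest pair consistent with the stated moments is Z = X − 1; that construction is an assumption made purely for this sketch:

import numpy as np

rng = np.random.default_rng(0)
a, b, c, d = 0.5, 3.0, 0.2, 2.0
x = rng.normal(2.0, 1.0, size=1_000_000)
z = x - 1.0                                # E[Z] = 1, Var[Z] = 1, Cov(X, Z) = 1
print(np.cov(a * x + b, c * z + d)[0, 1])  # close to a * c * 1 = 0.1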
(e) Generalize the previous results for any values of the means and variances E[X], E[Z], Var[X], Var[Z],
and any numbers a, b, c, and d.
i. E[aX + b] = aE[X] + b.
ii. Var[aX + b] = a²Var[X].
iii. Var[X + Z] = Var[X] + Var[Z] + 2 × Cov[X, Z]; SD[X + Z] = √Var[X + Z].
iv. Recall the definition of covariance: for any random variables Y and W,
Cov(Y, W) = E[(Y − E(Y))(W − E(W))],
so that, letting Y = aX + b and W = cZ + d,
Cov(aX + b, cZ + d) = Cov(Y, W)
= E[(Y − E(Y))(W − E(W))]
= E[a(X − E(X)) × c(Z − E(Z))]
= a × c × E[(X − E(X))(Z − E(Z))]
= a × c × Cov(X, Z).
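The first two identities can also be verified symbolically; the sketch below assumes SymPy and uses a normal random variable only because its moments are built in, although the identities hold for any distribution:

from sympy import symbols, simplify
from sympy.stats import Normal, E, variance

a, b, mu = symbols('a b mu', real=True)
sigma = symbols('sigma', positive=True)
X = Normal('X', mu, sigma)
print(simplify(E(a * X + b)))          # a*mu + b
print(simplify(variance(a * X + b)))   # a**2*sigma**2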
(f) Assuming again Cov(X, Z) = 1, what can you say about Corr(X, Z)?
Corr(X, Z) = Cov(X, Z) / (SD(X) × SD(Z)) = 1 / (√1 × √1) = 1.
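Numerically, with the same Z = X − 1 construction used above (an assumption consistent with the given moments), np.corrcoef confirms the perfect correlation:

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=1_000_000)
z = x - 1.0                     # Cov(X, Z) = 1 with SD(X) = SD(Z) = 1
print(np.corrcoef(x, z)[0, 1])  # 1.0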
Corr(a + bX, c + dY) = Cov(a + bX, c + dY) / (SD(a + bX) × SD(c + dY))
= bd × Cov(X, Y) / (bd × SD(X) × SD(Y))
= Cov(X, Y) / (SD(X) × SD(Y))
= Corr(X, Y),
showing that linear transformations do not affect the Pearson correlation between two
variables (here with b, d > 0; if bd < 0, only the sign of the correlation flips).
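A short NumPy sketch illustrating the invariance; the particular correlated pair and the constants a, b, c, d are arbitrary choices (with b, d > 0) made for the example:

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)
y = 0.6 * x + rng.normal(size=1_000_000)        # an arbitrary correlated pair
a, b, c, d = 1.0, 2.0, -3.0, 0.5                # any constants with b, d > 0
print(np.corrcoef(x, y)[0, 1])
print(np.corrcoef(a + b * x, c + d * y)[0, 1])  # same value as above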
(b) What is the unit of measure for Corr(X, Y)?
Correlation is unitless. This is the advantage of looking at correlations rather than
covariances!