Chapter 2
Probability
• The following material has been covered in lecture: sample space, event, probability, conditional probability, Bayes’ formula, independent events, discrete and continuous random variables, probability mass/density function (pmf/pdf), cumulative distribution function (cdf).
• X and Y are independent random variables if and only if, for any functions h and g,
$$E[g(X)h(Y)] = E[g(X)]\,E[h(Y)].$$
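As a quick numerical sanity check (a minimal sketch using numpy; the distributions and the test functions g and h below are arbitrary choices, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X and Y are drawn independently, so E[g(X)h(Y)] should factor.
x = rng.normal(size=n)          # X ~ N(0, 1)
y = rng.exponential(size=n)     # Y ~ Exp(1), independent of X

g = lambda t: t**2              # arbitrary test functions
h = lambda t: np.cos(t)

lhs = np.mean(g(x) * h(y))              # Monte Carlo estimate of E[g(X)h(Y)]
rhs = np.mean(g(x)) * np.mean(h(y))     # E[g(X)] * E[h(Y)]
print(lhs, rhs)                         # agree up to sampling error
```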
• Covariance of X and Y:
$$\operatorname{Cov}(X, Y) = E\bigl[(X - E[X])(Y - E[Y])\bigr] = E[XY] - E[X]\,E[Y].$$
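A short simulation comparing the two equivalent forms of the covariance (a sketch; the linearly dependent pair below is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)   # Y depends on X, so Cov(X, Y) != 0

# Definition: E[(X - E[X])(Y - E[Y])]
cov_def = np.mean((x - x.mean()) * (y - y.mean()))
# Shortcut: E[XY] - E[X]E[Y]
cov_alt = np.mean(x * y) - x.mean() * y.mean()
print(cov_def, cov_alt)   # both approximate the true value 0.5
```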
• Markov’s inequality. If X is a random variable that takes only nonnegative values, then for any value a > 0,
$$P(X \ge a) \le \frac{E[X]}{a}.$$
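The bound is easy to check empirically (a sketch; the Exp(1) distribution, which is nonnegative with E[X] = 1, is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=1.0, size=1_000_000)   # nonnegative, E[X] = 1

for a in (0.5, 1.0, 2.0, 5.0):
    tail = np.mean(x >= a)        # empirical P(X >= a)
    bound = x.mean() / a          # Markov bound E[X]/a
    print(a, tail, bound)         # tail <= bound in every row
```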
• Strong Law of Large Numbers and Central Limit Theorem.
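Both theorems can be illustrated by simulation (a sketch; the Exp(1) population, which has mean 1 and standard deviation 1, is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma = 1.0, 1.0   # mean and standard deviation of Exp(1)

# SLLN: the running sample mean of i.i.d. draws converges to mu.
x = rng.exponential(size=100_000)
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)
print(running_mean[[99, 9_999, 99_999]])   # approaches 1.0

# CLT: sqrt(n) * (sample mean - mu) / sigma is approximately N(0, 1).
n, reps = 500, 20_000
means = rng.exponential(size=(reps, n)).mean(axis=1)
z = np.sqrt(n) * (means - mu) / sigma
print(z.mean(), z.std())    # close to 0 and 1
print(np.mean(z <= 1.0))    # close to Phi(1) ≈ 0.8413
```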
• When X and Y are discrete random variables, the conditional probability mass function of X given that Y = y is
$$p_{X|Y}(x \mid y) = \frac{p(x, y)}{p_Y(y)}.$$
Hence,
$$F_{X|Y}(x \mid y) = \sum_{a \le x} p_{X|Y}(a \mid y), \qquad E[X \mid Y = y] = \sum_x x\, p_{X|Y}(x \mid y).$$
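On a finite joint pmf these formulas become simple array operations (a sketch; the joint pmf table below is made up for illustration):

```python
import numpy as np

# Hypothetical joint pmf p(x, y) for X in {0, 1, 2} (rows), Y in {0, 1} (cols).
p = np.array([[0.10, 0.20],
              [0.30, 0.10],
              [0.05, 0.25]])
xs = np.array([0, 1, 2])

y = 1                                    # condition on Y = 1
p_y = p[:, y].sum()                      # marginal p_Y(y)
p_x_given_y = p[:, y] / p_y              # p_{X|Y}(x|y) = p(x, y) / p_Y(y)

F_x_given_y = np.cumsum(p_x_given_y)     # F_{X|Y}(x|y): sum over a <= x
e_x_given_y = np.sum(xs * p_x_given_y)   # E[X | Y = y]
print(p_x_given_y, F_x_given_y, e_x_given_y)
```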
• If X and Y have a joint probability density function f(x, y), then the conditional probability density function of X, given that Y = y, is defined for all values of y such that $f_Y(y) > 0$ by
$$f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)}.$$
Hence the conditional expectation of g(X) given that Y = y is
$$E[g(X) \mid Y = y] = \int_{-\infty}^{\infty} g(x)\, f_{X|Y}(x \mid y)\, dx.$$
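The same computation can be done numerically for a concrete density (a sketch; the joint density f(x, y) = x + y on the unit square is an arbitrary textbook-style choice, and the Riemann sums stand in for the integrals):

```python
import numpy as np

# Hypothetical joint density f(x, y) = x + y on the unit square [0, 1]^2.
f = lambda x, y: x + y

x = np.linspace(0.0, 1.0, 100_001)
dx = x[1] - x[0]
y = 0.5                                   # condition on Y = 0.5

f_y = np.sum(f(x, y)) * dx                # marginal f_Y(y), Riemann sum over x
f_x_given_y = f(x, y) / f_y               # f_{X|Y}(x|y) = f(x, y) / f_Y(y)

# Conditional expectation E[g(X) | Y = y] with g(x) = x:
e_x_given_y = np.sum(x * f_x_given_y) * dx
print(f_y, e_x_given_y)   # f_Y(0.5) = 1.0, E[X | Y = 0.5] = 7/12 ≈ 0.5833
```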
• Let E[X|Y] denote the function of the random variable Y whose value at Y = y is E[X|Y = y], or equivalently, if g(y) = E[X|Y = y], then E[X|Y] = g(Y). Then we have
$$E[X] = E\bigl[E[X \mid Y]\bigr].$$
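A quick simulation of this identity (a sketch; the two-stage model Y ~ Exp(1), X | Y = y ~ Poisson(y) is an arbitrary choice, convenient because E[X | Y] = Y):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# Two-stage model: first draw Y ~ Exp(1), then X | Y = y ~ Poisson(y).
y = rng.exponential(size=n)
x = rng.poisson(lam=y)

# Here E[X | Y] = Y, so E[E[X|Y]] = E[Y] = 1, which should match E[X].
print(x.mean(), y.mean())   # both estimate E[X] = E[E[X|Y]] = 1
```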