
2015–16

THE UNIVERSITY OF HONG KONG


DEPARTMENT OF STATISTICS AND ACTUARIAL SCIENCE

STAT3603 Probability Modelling

Overview of Probability Theory

Chapter 2

• The following material has been covered in lecture: sample space, event, probability, conditional probability, Bayes’ formula, independent events, discrete and continuous random variables, probability mass/density functions (pmf/pdf), and the cumulative distribution function (cdf).

• Some typical random variables: Bernoulli, binomial, Poisson; uniform, exponential, gamma, normal.
• The expectation: for any function g(x) and r.v. X, E[g(X)] = Σ_{x: p(x)>0} g(x)p(x) when X is a discrete r.v. with probability mass function p(x), and E[g(X)] = ∫_{−∞}^{∞} g(x)f(x) dx when X is a continuous r.v. with pdf f(x).
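
For instance (a worked illustration added here, not in the original handout): if X ∼ Bernoulli(p) and g(x) = x², then E[X²] = 0²·(1 − p) + 1²·p = p; if X ∼ exponential(λ), then E[X] = ∫_0^∞ x·λe^{−λx} dx = 1/λ.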

• Joint distribution. Joint cumulative distribution function of (X, Y): F(a, b) = P(X ≤ a, Y ≤ b). The marginal distribution function of X: F_X(a) = F(a, ∞). In the case that both X and Y are discrete, the joint probability mass function of (X, Y) is p(x, y) = P(X = x, Y = y), and the probability mass function of X is p_X(x) = Σ_{y: p(x,y)>0} p(x, y). If X and Y are jointly continuous with joint probability density function f(x, y), then the marginal pdf of X is f_X(x) = ∫_{−∞}^{∞} f(x, y) dy, and F(a, b) = ∫_{−∞}^{a} ∫_{−∞}^{b} f(x, y) dy dx. Now E[g(X, Y)] = Σ_{x,y} g(x, y)p(x, y) in the discrete case and E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y)f(x, y) dx dy in the continuous case.
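
For example (an added illustration): if f(x, y) = 2 for 0 < x < y < 1 and 0 otherwise, then the marginal pdf of X is f_X(x) = ∫_x^1 2 dy = 2(1 − x) for 0 < x < 1, and similarly f_Y(y) = ∫_0^y 2 dx = 2y for 0 < y < 1. This running example is reused below.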

• X and Y are independent random variables if and only if, for any functions g and h for which the expectations exist, E[g(X)h(Y)] = E[g(X)]E[h(Y)].

• Covariance of X and Y :

Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y].
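
Note (an added example): Cov(X, Y) = 0 does not imply independence. If X ∼ N(0, 1) and Y = X², then Cov(X, Y) = E[X³] − E[X]E[X²] = 0, yet X and Y are clearly dependent.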

• Review the properties of the expectation, covariance, and variance.
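
As a reminder (added here), the key properties include: E[aX + bY] = aE[X] + bE[Y]; Var(aX + b) = a²Var(X); Cov(aX + b, cY + d) = ac·Cov(X, Y); and Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y).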

• φ(t) = E[e^{tX}] is the moment generating function of X, and φ^{(n)}(0) = E[X^n], where φ^{(n)} denotes the n-th derivative. Review the computations of the moment generating functions for the typical random variables above.
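
For instance (a worked illustration added here): if X ∼ Poisson(λ), then φ(t) = Σ_{k≥0} e^{tk} e^{−λ} λ^k / k! = e^{λ(e^t − 1)}, so φ′(0) = λ = E[X].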

• Markov’s inequality. If X is a random variable that takes only nonnegative values, then for any value a > 0,

P(X ≥ a) ≤ E[X]/a.
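
A minimal NumPy sketch (an added sanity check, not part of the handout) comparing empirical tail probabilities with the Markov bound, using an exponential variable with mean 2 as the nonnegative r.v.:

```python
import numpy as np

# X ~ exponential with mean 2 (a nonnegative r.v.), so Markov's
# inequality gives P(X >= a) <= E[X]/a = 2/a for every a > 0.
rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)
for a in (2.0, 5.0, 10.0):
    empirical = (x >= a).mean()   # Monte Carlo estimate of P(X >= a)
    bound = x.mean() / a          # Markov bound E[X]/a
    print(f"a={a}: P(X>=a) ~ {empirical:.4f} <= {bound:.4f}")
```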
• Strong Law of Large Numbers and Central Limit Theorem.
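
For reference (statements added here): if X_1, X_2, … are i.i.d. with mean μ and variance σ² < ∞, then (X_1 + ⋯ + X_n)/n → μ with probability 1 (the SLLN), and √n (X̄_n − μ)/σ converges in distribution to N(0, 1) (the CLT).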

Chapter 3. Read the first five sections in this chapter.

• When X and Y are discrete random variables, the conditional probability mass function of X given that Y = y is

p_{X|Y}(x|y) = p(x, y) / p_Y(y).

Hence,

F_{X|Y}(x|y) = Σ_{a ≤ x} p_{X|Y}(a|y),    E[X|Y = y] = Σ_x x·p_{X|Y}(x|y).
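
For example (added): suppose p(1, 1) = 0.5, p(2, 1) = 0.1, p(1, 2) = 0.1, p(2, 2) = 0.3. Then p_Y(1) = 0.6, so p_{X|Y}(1|1) = 5/6, p_{X|Y}(2|1) = 1/6, and E[X|Y = 1] = 1·(5/6) + 2·(1/6) = 7/6.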

• If X and Y have a joint probability density function f(x, y), then the conditional probability density function of X, given that Y = y, is defined for all values of y such that f_Y(y) > 0 by

f_{X|Y}(x|y) = f(x, y) / f_Y(y).

Hence the conditional expectation of g(X), given that Y = y, is

E[g(X)|Y = y] = ∫_{−∞}^{∞} g(x) f_{X|Y}(x|y) dx.

Note that E[g(X)|Y = y] is a function of y.
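
Continuing the running example (added): with f(x, y) = 2 on 0 < x < y < 1, f_Y(y) = 2y, so f_{X|Y}(x|y) = 1/y for 0 < x < y; that is, X given Y = y is uniform on (0, y), and E[X|Y = y] = y/2.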

• Let E[X|Y] denote the function of the random variable Y whose value at Y = y is E[X|Y = y]; equivalently, if g(y) = E[X|Y = y], then E[X|Y] = g(Y). Then we have the law of total expectation:

E[X] = E[E[X|Y]].
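
In the running example (added): E[X] = E[E[X|Y]] = E[Y/2] = (1/2)∫_0^1 y·2y dy = 1/3, which matches the direct computation E[X] = ∫_0^1 x·2(1 − x) dx = 1/3.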

• The conditional variance of X given that Y = y is

Var(X|Y = y) = E[(X − E[X|Y = y])² | Y = y] = E[X² | Y = y] − (E[X|Y = y])².

We have the proposition known as the law of total variance:

Var(X) = E[Var(X|Y)] + Var(E[X|Y]).
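
As a check (an added illustration using the running example): X given Y = y is uniform on (0, y), so Var(X|Y) = Y²/12 and E[X|Y] = Y/2; hence Var(X) = E[Y²]/12 + Var(Y)/4 = (1/2)/12 + (1/18)/4 = 1/18, matching the direct computation E[X²] − (E[X])² = 1/6 − 1/9. A minimal NumPy sketch of the same identity by simulation:

```python
import numpy as np

# Monte Carlo check of Var(X) = E[Var(X|Y)] + Var(E[X|Y]) for
# f(x, y) = 2 on 0 < x < y < 1 (added illustration, not in the handout).
rng = np.random.default_rng(0)
n = 1_000_000
y = np.sqrt(rng.uniform(size=n))  # inverse-cdf sampling: F_Y(y) = y^2
x = rng.uniform(size=n) * y       # X | Y = y is uniform on (0, y)

lhs = x.var()                             # Var(X), should be near 1/18
rhs = (y**2 / 12).mean() + (y / 2).var()  # E[Var(X|Y)] + Var(E[X|Y])
print(lhs, rhs, 1 / 18)
```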
