STOR 435 Formula Sheet: Basics

This document provides formulas and definitions for probability distributions, probability rules, and statistical concepts. It includes formulas for calculating probabilities and expectations for the binomial, normal, Poisson, exponential, gamma, geometric, hypergeometric, negative binomial, and uniform distributions. It also defines concepts like independence, variance, covariance, correlation, conditional probability, and conditional expectation.

STOR 435 Formula Sheet

Minimum (continued): for X1, ..., Xn iid uniform on {1, ..., K},
1. P(U ≥ k) = P(X1 ≥ k, ..., Xn ≥ k) = P(X1 ≥ k) ··· P(Xn ≥ k) = ((K − k + 1)/K)^n
2. P(U = k) = P(U ≥ k) − P(U ≥ k + 1)

Basics

Event relationships
1. A = B ⟺ A ⊆ B and B ⊆ A
2. A ∩ B = ∅ ⟺ A and B are mutually exclusive

Maximum
Let V = Max{X1, ..., Xn}

Probability definition
1. 0 ≤ P(A) ≤ 1 for every event A
2. P(Ω) = 1
3. P(A) = Σ_{i=1}^n P(Ai) if A1, ..., An partition A

Maximum (continued): for X1, ..., Xn iid uniform on {1, ..., K},
1. P(V ≤ k) = P(X1 ≤ k, ..., Xn ≤ k) = P(X1 ≤ k) ··· P(Xn ≤ k) = (k/K)^n

2. P(V = k) = P(V ≤ k) − P(V ≤ k − 1)
3. E(V) = Σ_{k=1}^K P(V ≥ k) = Σ_{k=1}^K [1 − P(V < k)] = K − Σ_{k=1}^K P(X1 < k) ··· P(Xn < k) = K − Σ_{k=1}^K ((k − 1)/K)^n
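The maximum formulas can be checked exactly by enumeration. A minimal sketch, assuming the Xi are iid uniform on {1, ..., K} (the setting implied by the (k/K)^n form); K = 6 and n = 3 are arbitrary example values:

```python
# Exact check of P(V <= k) = (k/K)^n and E(V) = K - sum((k-1)/K)^n
# for V = max of n iid draws uniform on {1, ..., K}.
from itertools import product

K, n = 6, 3
outcomes = list(product(range(1, K + 1), repeat=n))  # all K^n equally likely draws

def p_max_le(k):
    return sum(1 for o in outcomes if max(o) <= k) / len(outcomes)

# P(V <= k) matches (k/K)^n for every k
assert all(abs(p_max_le(k) - (k / K) ** n) < 1e-12 for k in range(1, K + 1))

# E(V) by enumeration matches the tail-sum formula
e_v = sum(max(o) for o in outcomes) / len(outcomes)
formula = K - sum(((k - 1) / K) ** n for k in range(1, K + 1))
assert abs(e_v - formula) < 1e-12
```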

Probability rules
1. P(A|B) = P(A ∩ B)/P(B) if P(B) > 0, and P(A|B) = 0 if P(B) = 0
2. P(A ∩ B) = P(A) P(B|A)
3. P(B) = Σ_{i=1}^n P(Ai) P(B|Ai) where A1, ..., An partition Ω

Bernoulli trials
P(X = k) = C(n, k) p^k q^{n−k} if X ~ Bin(n, p), where C(n, k) = n!/((n − k)! k!) and q = 1 − p
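A quick numeric check of the binomial pmf (a sketch; n = 10 and p = 0.3 are arbitrary example values):

```python
# Binomial pmf via the formula P(X = k) = C(n,k) p^k q^(n-k).
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 10, 0.3
pmf = [binom_pmf(k, n, p) for k in range(n + 1)]
assert abs(sum(pmf) - 1) < 1e-12              # probabilities sum to 1
mean = sum(k * pk for k, pk in enumerate(pmf))
assert abs(mean - n * p) < 1e-12              # E(X) = np
```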

Calculus
1. Integration by parts: ∫ u dv = uv − ∫ v du
2. Change of variables (x, y) → (u, v): use the Jacobian determinant
   |J| = |∂(x, y)/∂(u, v)| = |(∂x/∂u)(∂y/∂v) − (∂x/∂v)(∂y/∂u)|

Distributions

Independence definition
1. P(A ∩ B) = P(A) P(B)
2. P(A|B) = P(A)
3. P(B|A) = P(B)
Disjoint blocks of independent events are independent.
Mutually exclusive ⇏ independent.
Mutually independent ⇒ pairwise independent (the converse fails).

Density curve (continuous RV)
1. f(x) ≥ 0
2. ∫_{−∞}^{∞} f(x) dx = 1
3. P(a ≤ X ≤ b) = ∫_a^b f(u) du

Multivariate
Joint: P_{(X,Y)}(x, y) = P(X = x, Y = y)
Marginal: P_X(x) = P(X = x) = Σ_y P_{(X,Y)}(x, y) = Σ_y P(X = x, Y = y)
If X = Y, then X and Y have the same distribution.

Maximum and Minimum

Minimum
Let U = Min{X1, ..., Xn}

General facts
1. X and Y have the same distribution if P(X = v) = P(Y = v) for all v; then P(a ≤ X ≤ b) = P(a ≤ Y ≤ b).
2. X and Y are equal if P(X = Y) = 1. Equal implies same distribution.

Binomial: X ~ Bin(n, p)
1. X = I_{A1} + ··· + I_{An}. Var(X) = Var(I_{A1}) + ··· + Var(I_{An}) = n(E(I_{A1}) − [E(I_{A1})]^2) = npq
2. A and B are independent ⟺ I_A and I_B are independent
3. A ⊆ B ⟺ I_A I_B = I_A

Binomial approximation
P(a ≤ X ≤ b) ≈ Φ((b + .5 − μ)/σ) − Φ((a − .5 − μ)/σ), where μ = np and σ = √(npq). Works if σ ≥ 3.
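The continuity-corrected approximation can be compared against the exact binomial probability. A sketch that builds Φ from the standard library's erf; the values n = 100, p = 0.5 (so σ = 5 ≥ 3) are arbitrary choices:

```python
# Normal approximation with continuity correction:
# P(a <= X <= b) ≈ Phi((b + .5 - mu)/sigma) - Phi((a - .5 - mu)/sigma).
from math import comb, erf, sqrt

def phi(z):  # standard normal cdf via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

n, p = 100, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))  # sigma = 5 >= 3, so the rule applies

a, b = 45, 55
exact = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(a, b + 1))
approx = phi((b + 0.5 - mu) / sigma) - phi((a - 0.5 - mu) / sigma)
assert abs(exact - approx) < 0.01
```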

Exponential: X ~ Exp(λ)
1. f(x) = λe^{−λx} for x ≥ 0
2. F(x) = 1 − e^{−λx} for x ≥ 0, F(x) = 0 for x < 0
3. E(X) = 1/λ, Var(X) = 1/λ^2
4. Survival: P(X > x) = 1 − F(x) = e^{−λx}
5. Memoryless property: P(X > x + t | X > x) = P(X > t)
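The memoryless property follows directly from the survival function; a minimal numeric sketch (λ = 0.7 and the grid of x, t values are arbitrary):

```python
# Memoryless property of the exponential: P(X > x + t | X > x) = P(X > t),
# checked via the survival function P(X > x) = exp(-lam * x).
from math import exp

lam = 0.7

def surv(x):
    return exp(-lam * x)

for x in (0.5, 1.0, 3.0):
    for t in (0.2, 1.5):
        lhs = surv(x + t) / surv(x)   # conditional survival P(X > x+t | X > x)
        rhs = surv(t)
        assert abs(lhs - rhs) < 1e-12
```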

Gamma: Sn ~ Gamma(n, λ)
(the waiting time for the nth arrival of a Poisson process)
1. Let Sn = X1 + ··· + Xn where Xi ~ Exp(λ) iid
2. f_{Sn}(x) = λ^n x^{n−1} e^{−λx} / Γ(n), where Γ(n) = (n − 1)!
3. E(Sn) = n/λ, Var(Sn) = n/λ^2

CDF

Continuous and step-wise
1. F(x) = P(X ≤ x), lim_{x→∞} F(x) = 1
2. Step function: F(x) = Σ_{u: u ≤ x} P(X = u); P(X = x) = F(x) − F(x−), where F(x−) is the left limit
   (a) P(a < X < b) = F(b−) − F(a)
   (b) P(a ≤ X < b) = F(b−) − F(a−)
   (c) P(a < X ≤ b) = F(b) − F(a)
   (d) P(a ≤ X ≤ b) = F(b) − F(a−)
3. Continuous: F(x) = ∫_{−∞}^x f(u) du, and f(x) = (d/dx) F(x)

Geometric: X ~ Geom(p)
1. P(X = k) = q^{k−1} p, P(X ≥ k) = q^{k−1}
2. E(X) = 1/p = Σ_{k=1}^∞ P(X ≥ k) = Σ_{k=1}^∞ q^{k−1}
3. Var(X) = q/p^2
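The tail-sum identity E(X) = Σ P(X ≥ k) = 1/p can be checked numerically with a truncated series (a sketch; p = 0.25 and the truncation point are arbitrary):

```python
# Geometric(p) on {1, 2, ...}: P(X = k) = q^(k-1) p, P(X >= k) = q^(k-1),
# and E(X) = sum_k P(X >= k) = 1/p.
p = 0.25
q = 1 - p

N = 500  # truncation; the geometric tail q^N is negligible here
tail_sum = sum(q ** (k - 1) for k in range(1, N + 1))      # sum of P(X >= k)
mean = sum(k * q ** (k - 1) * p for k in range(1, N + 1))  # direct E(X)
assert abs(tail_sum - 1 / p) < 1e-9
assert abs(mean - 1 / p) < 1e-9
```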

Hypergeometric: X ~ HG(n; N, B)
Sample n balls without replacement from N = B + R balls (B black, R red); X is the number of black balls drawn.
1. P(X = b) = C(B, b) C(R, r) / C(N, n), where r = n − b
2. E(X) = Σ P(Ai) = np where p = B/N
3. Var(X) = ((N − n)/(N − 1)) npq where q = R/N
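The hypergeometric mean and variance formulas can be verified exactly from the pmf (a sketch; B = 5, R = 7, n = 4 are arbitrary example counts):

```python
# Hypergeometric HG(n; N, B): draw n balls without replacement from
# B black + R red; P(X = b) = C(B,b) C(R, n-b) / C(N, n).
from math import comb

B, R, n = 5, 7, 4
N = B + R

def hg_pmf(b):
    return comb(B, b) * comb(R, n - b) / comb(N, n)

pmf = [hg_pmf(b) for b in range(0, min(n, B) + 1)]
assert abs(sum(pmf) - 1) < 1e-12
mean = sum(b * pb for b, pb in enumerate(pmf))
assert abs(mean - n * B / N) < 1e-12                 # E(X) = nB/N
var = sum((b - mean) ** 2 * pb for b, pb in enumerate(pmf))
expected_var = ((N - n) / (N - 1)) * n * (B / N) * (R / N)
assert abs(var - expected_var) < 1e-12               # finite-population variance
```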

Independent Normal RVs
1. X = σZ + μ ⟹ X ~ N(μ, σ^2), where Z ~ N(0, 1)
2. X1, ..., Xn independent with Xi ~ N(μi, σi^2) ⟹ Sn := X1 + ··· + Xn ~ N(μ1 + ··· + μn, σ1^2 + ··· + σn^2)

Multinomial
Let X1, ..., Xn be iid with P(X1 = m) = pm.
1. Nm = I_{X1 = m} + ··· + I_{Xn = m}
2. Then P(N1 = n1, ..., Nm = nm) = (n!/(n1! ··· nm!)) p1^{n1} ··· pm^{nm}
3. Cov(Nk, Nm) = −n pk pm for k ≠ m
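The negative sign in Cov(Nk, Nm) = −n pk pm (category counts compete for the same n draws) can be checked by exact enumeration; a sketch with arbitrary example probabilities:

```python
# Exact check of Cov(N_k, N_m) = -n * p_k * p_m for a multinomial,
# by enumerating all outcome sequences of n iid draws.
from itertools import product

probs = [0.2, 0.3, 0.5]   # p_1, p_2, p_3 (arbitrary example)
n = 3

cov = 0.0
e1 = e2 = 0.0
for seq in product(range(3), repeat=n):      # each draw lands in category 0, 1, or 2
    p_seq = 1.0
    for c in seq:
        p_seq *= probs[c]
    n1, n2 = seq.count(0), seq.count(1)      # N_1 and N_2 for this outcome
    e1 += n1 * p_seq
    e2 += n2 * p_seq
    cov += n1 * n2 * p_seq                   # accumulates E(N1 N2)
cov -= (n * probs[0]) * (n * probs[1])       # Cov = E(N1 N2) - E(N1) E(N2)

assert abs(e1 - n * probs[0]) < 1e-12
assert abs(e2 - n * probs[1]) < 1e-12
assert abs(cov - (-n * probs[0] * probs[1])) < 1e-12
```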

Negative Binomial: X ~ NegBin(r, p)
Number of failures until r successes.
1. P(X = k) = C(k + r − 1, r − 1) q^k p^r for k = 0, 1, 2, ...
2. E(X) = μ = rq/p, Var(X) = rq/p^2

Conditional joint and marginal
1. f_Y(y) = ∫ f_Y(y | X = x) f_X(x) dx
2. Multiplication rule: f(x, y) = f_X(x) f_Y(y | X = x)
3. f_Y(y | X = x) = f(x, y) / f_X(x)
4. P(Y ∈ B | X = x) = ∫_B f_Y(y | X = x) dy
5. P(B) = ∫ P(B | X = x) f_X(x) dx
6. E(Y) = ∫ E(Y | X = x) f_X(x) dx

Expectation, Variance, and Correlation

Normal: X ~ N(μ, σ^2)
Continuous analog of the binomial.
1. f(x) = (1/(σ√(2π))) exp(−(x − μ)^2 / (2σ^2))
2. cdf: Φ((x − μ)/σ), where Φ is the standard normal cdf

Poisson: X ~ Poisson(λ)
1. P(X = k) = e^{−λ} λ^k / k!

Joint, marginal, and independence
1. Joint CDF: F(x1, ..., xn) = P(X1 ≤ x1, ..., Xn ≤ xn) = ∫_{−∞}^{x1} ··· ∫_{−∞}^{xn} f(u1, ..., un) du1 ··· dun
2. 1D marginal: f_{Xi}(xi) = ∫_{R^{n−1}} f(u1, ..., u_{i−1}, xi, u_{i+1}, ..., un) du1 ··· du_{i−1} du_{i+1} ··· dun
3. Independence: f(x1, ..., xn) = f_{X1}(x1) ··· f_{Xn}(xn) ⟺ F(x1, ..., xn) = F_{X1}(x1) ··· F_{Xn}(xn). In two dimensions, f(x, y) = f_X(x) f_Y(y)
4. E(g(X1, ..., Xn)) = ∫_{R^n} g(x1, ..., xn) f(x1, ..., xn) dx1 ··· dxn. Also, E(X) = ∫ x f_X(x) dx
5. P(X ∈ dx, Y ∈ dy) = f(x, y) dx dy

2. E(X) = Σ_{k=0}^∞ k e^{−λ} λ^k / k! = λ Σ_{k=1}^∞ e^{−λ} λ^{k−1} / (k − 1)! = λ
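The Poisson pmf and mean can be checked with a truncated series (a sketch; λ = 2.5 and the truncation point are arbitrary):

```python
# Poisson(lam): P(X = k) = exp(-lam) lam^k / k!; the truncated pmf sums
# to ~1 and the truncated mean approaches lam.
from math import exp, factorial

lam = 2.5
N = 60  # truncation; the Poisson tail beyond 60 is negligible for lam = 2.5
pmf = [exp(-lam) * lam**k / factorial(k) for k in range(N)]
assert abs(sum(pmf) - 1) < 1e-12
mean = sum(k * pk for k, pk in enumerate(pmf))
assert abs(mean - lam) < 1e-9
```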

Expectation
Let X be a random variable and c a constant.
1. μ = E(X) = Σ_x x P(X = x)
2. I_A = 0 if A does not occur, I_A = 1 if A occurs
3. E(I_A) = P(A)
4. E(X + Y) = E(X) + E(Y)
5. E(c) = c, E(cX) = c E(X)
6. If X and Y are independent, E(XY) = E(X) E(Y)
7. Tail sum: let pj = P(X = j) for j = 1, ..., n. Then E(X) = p1 + 2p2 + ··· + npn = (p1 + ··· + pn) + (p2 + ··· + pn) + ··· + (pn−1 + pn) + pn = Σ_{j=1}^n P(X ≥ j)
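The tail-sum formula for a positive integer-valued X can be verified exactly on a small pmf (a sketch; the distribution below is an arbitrary example):

```python
# Tail-sum formula: E(X) = sum_{j>=1} P(X >= j) for X taking values in {1,...,n}.
pmf = {1: 0.2, 2: 0.5, 3: 0.1, 4: 0.2}   # arbitrary example distribution

mean = sum(j * p for j, p in pmf.items())
tail_sum = sum(sum(p for j, p in pmf.items() if j >= t) for t in range(1, 5))
assert abs(mean - tail_sum) < 1e-12
```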

Uniform: X ~ Uniform[a, b]
1. f(x) = 1/(b − a) for a ≤ x ≤ b
2. E(X) = (a + b)/2, Var(X) = (b − a)^2 / 12
3. P((X1, ..., Xn) ∈ A) = Vol(A)/Vol(D) for A ⊆ D when (X1, ..., Xn) is uniform on D
4. If D = [a, b] × [c, d] and (X, Y) is uniformly distributed over D, then X ~ Uniform[a, b] and Y ~ Uniform[c, d]

Conditional Expectation
1. E(Y | X = x) = Σ_y y P(Y = y | X = x)
2. E(g(X, Y) | X = x) = Σ_y g(x, y) P(Y = y | X = x)
3. Conditional to unconditional: E(Y) = E[E(Y | X)] = Σ_x E(Y | X = x) P(X = x)
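The identity E(Y) = E[E(Y | X)] can be checked exactly on a small joint pmf (a sketch; the joint distribution below is an arbitrary example):

```python
# Tower property E(Y) = E[E(Y | X)], checked on a small joint pmf.
joint = {                       # P(X = x, Y = y), arbitrary example
    (0, 1): 0.1, (0, 2): 0.3,
    (1, 1): 0.4, (1, 2): 0.2,
}

p_x = {x: sum(p for (x2, _), p in joint.items() if x2 == x) for x in (0, 1)}

def e_y_given_x(x):
    # E(Y | X = x) = sum_y y P(Y = y | X = x)
    return sum(y * p / p_x[x] for (x2, y), p in joint.items() if x2 == x)

direct = sum(y * p for (_, y), p in joint.items())
tower = sum(e_y_given_x(x) * p_x[x] for x in (0, 1))
assert abs(direct - tower) < 1e-12
```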

Variance
1. Var(X) = E((X − μ)^2) = E(X^2) − μ^2
2. Cov(X, Y) = E((X − μX)(Y − μY)) = E(XY) − E(X) E(Y)
3. Cov(X, X) = Var(X)
4. Cov(X, Y) = Cov(Y, X)
5. Cov(I_A, I_B) = P(A ∩ B) − P(A) P(B)
6. Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
7. Cov(Σ_i ai Xi, Σ_j bj Yj) = Σ_i Σ_j ai bj Cov(Xi, Yj)
8. X and Y independent ⟹ Corr(X, Y) = 0 (the converse fails)

Correlation
1. Corr(X, Y) = Cov(X, Y) / (SD(X) SD(Y))
2. −1 ≤ Corr(X, Y) ≤ 1
   (a) Standardize: X* = (X − μX)/σX, Y* = (Y − μY)/σY
   (b) E(X*) = E(Y*) = 0
   (c) Var(X*) = Var(Y*) = 1
3. Corr(X, Y) = ±1 ⟺ Y = aX + b for some a, b ∈ R with a ≠ 0
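The covariance and variance-of-a-sum identities can be verified exactly on a small joint pmf (a sketch; the joint distribution below is an arbitrary example):

```python
# Cov(X, Y) = E(XY) - E(X)E(Y) and Var(X + Y) = Var X + Var Y + 2 Cov(X, Y),
# checked on a small joint pmf.
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}  # arbitrary example

def e(f):
    # expectation of f(X, Y) under the joint pmf
    return sum(f(x, y) * p for (x, y), p in joint.items())

ex, ey = e(lambda x, y: x), e(lambda x, y: y)
cov = e(lambda x, y: x * y) - ex * ey
var_x = e(lambda x, y: x * x) - ex**2
var_y = e(lambda x, y: y * y) - ey**2
var_sum = e(lambda x, y: (x + y) ** 2) - (ex + ey) ** 2
assert abs(var_sum - (var_x + var_y + 2 * cov)) < 1e-12

corr = cov / (var_x**0.5 * var_y**0.5)
assert -1 <= corr <= 1
```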
