St2131-Cheatsheet Otherstudent
examples

Claim: if X is a continuous r.v. with cdf F, then Y = F(X) ∼ Uniform(0, 1).
Proof. let Y = F(X). then the cdf of Y is
F_Y(y) = P(Y ≤ y) = P(F(X) ≤ y) = P(X ≤ F^{-1}(y)) = F(F^{-1}(y)) = y,
hence Y is a uniform r.v.

• N9 - X = F^{-1}(U) ∼ cdf F(x).
• this generates a r.v. with cdf F(x) from a Uniform(0, 1) r.v. U (the inverse transform method, sketched below).
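A minimal Python sketch of the inverse transform method (my addition, not from the notes): for Exp(λ), F(x) = 1 − e^{-λx}, so F^{-1}(u) = −ln(1 − u)/λ; the helper name sample_exponential is made up for illustration.

```python
import math
import random

def sample_exponential(lam: float) -> float:
    """Inverse transform: if U ~ Uniform(0, 1), then X = F^{-1}(U) has cdf F.
    For Exp(lam), F(x) = 1 - e^{-lam*x}, so F^{-1}(u) = -ln(1 - u) / lam."""
    u = random.random()              # U ~ Uniform(0, 1)
    return -math.log(1.0 - u) / lam

samples = [sample_exponential(2.0) for _ in range(100_000)]
print(sum(samples) / len(samples))   # should be close to E(X) = 1/2
```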
Selecting hats problem

n men throw their hats in and each selects a hat at random. Let E_i be the event that the i-th man selects his own hat, and let I_{E_i} be its indicator r.v. Let X be the number of men who select their own hats.

• X = I_{E_1} + I_{E_2} + · · · + I_{E_n}
• P(E_i) = 1/n
• P(E_i | E_j) = 1/(n−1) ≠ P(E_i) for j ≠ i (hence E_i and E_j are not independent)
• but the dependence is weak for large n
• X satisfies the other conditions for a binomial r.v. besides independence (n trials with equal probability of success)
• Poisson approximation of X: X ≈ Poisson(λ)
• λ = n · P(E_i) = n · (1/n) = 1
• P(X = i) = e^{-1} 1^i / i! = e^{-1} / i!
• P(X = 0) = e^{-1} ≈ 0.37 (see the simulation sketch below)
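A quick simulation sketch (my addition, assuming the standard random-shuffle model of the problem) comparing the empirical P(X = 0) against e^{-1}:

```python
import math
import random

def own_hat_matches(n: int) -> int:
    """Shuffle n hats; count how many men get their own hat back."""
    hats = list(range(n))
    random.shuffle(hats)
    return sum(1 for i, h in enumerate(hats) if i == h)

n, trials = 50, 100_000
empirical = sum(own_hat_matches(n) == 0 for _ in range(trials)) / trials
print(empirical, math.exp(-1))   # both approximately 0.37
```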
No 2 people have the same birthday

For each pair of individuals i and j, i ≠ j, let E_ij be the event that they have the same birthday. Let X be the number of pairs with the same birthday.

• X = Σ_{i<j} I_{E_ij}, a sum over all (n choose 2) pairs
• Each E_ij is only pairwise independent. P(E_ij) = 1/365
• Poisson approximation of X: X ≈ Poisson(λ) with λ = (n choose 2) · (1/365), so P(no shared birthday) = P(X = 0) ≈ e^{-λ} (see the check below)
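A small numeric check (my addition): the Poisson estimate e^{-λ} with λ = (n choose 2)/365 against the exact product formula, for n = 23 people:

```python
import math

n = 23                          # number of people
pairs = math.comb(n, 2)         # number of pairs = (n choose 2)
lam = pairs / 365               # lambda = (n choose 2) * P(E_ij)
print(math.exp(-lam))           # Poisson estimate of P(no shared birthday), ~0.50

# exact value for comparison
exact = 1.0
for k in range(n):
    exact *= (365 - k) / 365
print(exact)                    # ~0.493
```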
standard normal distribution → X ∼ N(0, 1)

• F(x) = P(X ≤ x) = (1/√(2π)) ∫_{-∞}^{x} e^{-y²/2} dy = Φ(x)

Normal Approximation to the Binomial Distribution

if S_n ∼ Binomial(n, p), then (S_n − np)/√(np(1−p)) ∼ N(0, 1) for large n.
µ = np, σ² = np(1 − p) (a numerical comparison is sketched below)
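A comparison sketch (my addition) of the exact Binomial(100, 0.5) cdf at k = 55 against the normal approximation with a continuity correction; Φ is computed via math.erf:

```python
import math

def norm_cdf(z: float) -> float:
    """Phi(z) via the error function: Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n, p, k = 100, 0.5, 55
exact = sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
approx = norm_cdf((k + 0.5 - mu) / sigma)   # +0.5 is the continuity correction
print(exact, approx)                        # both approximately 0.864
```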
Expectation & Variance

expectation
N1 - expectation of X, E(X) = ∫_{-∞}^{∞} x · f(x) dx
N2 - if X is a continuous r.v. with pdf f(x), then for any real-valued function g,
E[g(X)] = ∫_{-∞}^{∞} g(x) f(x) dx
N2a - E[aX + b] = ∫_{-∞}^{∞} (ax + b) · f(x) dx = a · E(X) + b
N3 - for a non-negative r.v. Y, E(Y) = ∫_0^∞ P(Y > y) dy (verified numerically below)

Proof. ∫_0^∞ P(Y > y) dy = ∫_0^∞ ∫_y^∞ f_Y(x) dx dy (because f(x) = d/dx F(x))
= ∫_0^∞ ∫_0^x f_Y(x) dy dx (draw a diagram to convert the order of integration)
= ∫_0^∞ x f_Y(x) dx (because ∫_0^x dy = x)
= E(Y)
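A quick Riemann-sum check of N3 (my addition), using Y ∼ Exp(2), for which P(Y > y) = e^{-2y} and E(Y) = 1/2:

```python
import math

lam = 2.0
dy, upper = 1e-4, 50.0
# Riemann sum of the tail probability P(Y > y) = e^{-lam*y}
tail_integral = sum(math.exp(-lam * (k * dy)) * dy for k in range(int(upper / dy)))
print(tail_integral, 1 / lam)   # both ~0.5, matching E(Y) = 1/lam
```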
Exponential Random Variable

a continuous r.v. X is an exponential r.v., X ∼ Exponential(λ) or Exp(λ),
if for some λ > 0 its pdf is given by

f(x) = λe^{-λx} for x ≥ 0, and f(x) = 0 otherwise

E(X) = 1/λ, Var(X) = 1/λ²

P(X < a) = ∫_0^a λe^{-λx} dx = 1 − e^{-λa}

• an exponential r.v. is memoryless.
• a non-negative r.v. X is memoryless if
P(X > s + t | X > t) = P(X > s) for all s, t > 0 (checked by simulation below).
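A simulation sketch of the memoryless property (my addition; the values of s and t are arbitrary):

```python
import math
import random

lam, s, t = 1.0, 0.7, 1.3
xs = [random.expovariate(lam) for _ in range(1_000_000)]
exceed_t = [x for x in xs if x > t]
lhs = sum(x > s + t for x in exceed_t) / len(exceed_t)   # P(X > s+t | X > t)
rhs = sum(x > s for x in xs) / len(xs)                   # P(X > s)
print(lhs, rhs, math.exp(-lam * s))                      # all approximately equal
```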
Gamma Distribution

a r.v. X has a gamma distribution, X ∼ Gamma(α, λ),
with parameters (α, λ), λ > 0 and α > 0, if its pdf is given by

f(x) = λe^{-λx}(λx)^{α−1} / Γ(α) for x ≥ 0, and f(x) = 0 for x < 0

E(X) = α/λ, Var(X) = α/λ²

where the gamma function Γ(α) is defined as Γ(α) = ∫_0^∞ e^{-y} y^{α−1} dy.

N1 - Γ(α) = (α − 1)Γ(α − 1)
Proof. integrate the LHS by parts to obtain the RHS.
N2 - if α is an integer n, then Γ(n) = (n − 1)!
N3 - if X ∼ Gamma(α, λ) and α = 1, then X ∼ Exp(λ).
N4 - for events occurring randomly in time following the 3 assumptions of the poisson
distribution, the amount of time elapsed until a total of n events has occurred is a
gamma r.v. with parameters (n, λ) (see the simulation sketch below).
• time at which event n occurs, T_n ∼ Gamma(n, λ)
• number of events in time period [0, t], N(t) ∼ Poisson(λt)
N5 - Gamma(α = n/2, λ = 1/2) = χ²_n (chi-square distribution with n degrees of freedom)
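A simulation sketch of N4 (my addition): T_n is the sum of n iid Exp(λ) inter-arrival times, so its sample mean and variance should approach α/λ = n/λ and α/λ² = n/λ²:

```python
import random

n, lam, trials = 5, 2.0, 200_000
# T_n = time of the n-th event = sum of n iid Exp(lam) inter-arrival times
waits = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]
mean = sum(waits) / trials
var = sum((w - mean) ** 2 for w in waits) / trials
print(mean, n / lam)      # ~2.5  (E = alpha/lambda)
print(var, n / lam**2)    # ~1.25 (Var = alpha/lambda^2)
```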
06. JOINTLY DISTRIBUTED RANDOM VARIABLES

Joint Distribution Function

the joint cumulative distribution function of the pair of r.v. X and Y is
F(x, y) = P(X ≤ x, Y ≤ y), −∞ < x < ∞, −∞ < y < ∞

N1 - marginal cdf of X, F_X(x) = lim_{y→∞} F(x, y).
N2 - marginal cdf of Y, F_Y(y) = lim_{x→∞} F(x, y).
N3 - P(X > a, Y > b) = 1 − F_X(a) − F_Y(b) + F(a, b)
N4 - P(a_1 < X ≤ a_2, b_1 < Y ≤ b_2)
= F(a_2, b_2) + F(a_1, b_1) − F(a_1, b_2) − F(a_2, b_1)

Joint Probability Mass Function

if X and Y are both discrete r.v., then their joint pmf is defined by
p(i, j) = P(X = i, Y = j)

N1 - marginal pmf of X, P(X = i) = Σ_j P(X = i, Y = j)
N2 - marginal pmf of Y, P(Y = j) = Σ_i P(X = i, Y = j)
(a small worked example follows below)
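A toy example (my addition; the pmf values are made up) of computing a marginal pmf by summing the joint pmf over the other index:

```python
# toy joint pmf p(i, j) = P(X = i, Y = j), stored in a dict (values sum to 1)
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# marginal pmf of X: sum the joint pmf over j
p_X = {}
for (i, j), prob in p.items():
    p_X[i] = p_X.get(i, 0.0) + prob
print(p_X)   # approximately {0: 0.3, 1: 0.7}
```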
Joint Probability Density Function

the r.v. X and Y are said to be jointly continuous if there is a function f(x, y), called
the joint pdf, such that for any two-dimensional set C,

P[(X, Y) ∈ C] = ∬_C f(x, y) dx dy

= the volume under the surface over the region C.

how to do a double integral

e.g. find P(X < Y) where the joint pdf of X and Y is given by

f(x, y) = 2e^{-x}e^{-2y} for 0 < x < ∞, 0 < y < ∞, and f(x, y) = 0 otherwise

1. to get the bounds for dx and dy, plot X < Y
1.1. draw horizontal lines to determine the bounds for x, from x = a to x = b
1.2. draw vertical lines to determine the bounds for y, from y = c to y = d
2. integrate ∫_c^d ∫_a^b f(x, y) dx dy

here P(X < Y) = ∫_0^∞ ∫_0^y 2e^{-x}e^{-2y} dx dy = ∫_0^∞ 2e^{-2y}(1 − e^{-y}) dy = 1 − 2/3 = 1/3 (confirmed by simulation below).

example - given the joint pdf f(x, y) = e^{-(x+y)}, 0 < x, y < ∞, of X and Y, find
the pdf of the r.v. X/Y.

ans. set dummy variable W = X/Y, then
F_W(w) = P(W ≤ w) = P(X/Y ≤ w) = ∫_0^∞ ∫_0^{wy} e^{-x-y} dx dy
= ∫_0^∞ (1 − e^{-wy}) e^{-y} dy = 1 − 1/(w + 1),
and differentiating gives f_W(w) = 1/(w + 1)², w > 0.
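A Monte Carlo check (my addition) that P(X < Y) = 1/3: the pdf 2e^{-x}e^{-2y} factors, so X ∼ Exp(1) and Y ∼ Exp(2) are independent and can be sampled separately:

```python
import random

trials = 1_000_000
# f(x, y) = 2 e^{-x} e^{-2y} factors: X ~ Exp(1), Y ~ Exp(2), independent
hits = sum(random.expovariate(1.0) < random.expovariate(2.0) for _ in range(trials))
print(hits / trials)   # ~0.333, matching P(X < Y) = 1/3
```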
Moment Generating Functions

if X and Y are independent and have mgf's M_X(t) and M_Y(t) respectively,
N10 - the mgf of X + Y is M_{X+Y}(t) = M_X(t) · M_Y(t) (checked numerically below)
N11 - if M_X(t) exists and is finite in some region about t = 0, then the distribution
of X is uniquely determined. M_X(t) = M_Y(t) ⟺ X = Y (in distribution)

Common mgf's

• X ∼ Normal(0, 1), M(t) = e^{t²/2}
• X ∼ Binomial(n, p), M(t) = (pe^t + (1 − p))^n
• X ∼ Poisson(λ), M(t) = exp[λ(e^t − 1)]
• X ∼ Exp(λ), M(t) = λ/(λ − t), t < λ
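A numerical check of N10 (my addition), using two independent N(0, 1) r.v. whose mgf is e^{t²/2}; the empirical mgf of X + Y should match the product e^{t²/2} · e^{t²/2}:

```python
import math
import random

t, trials = 0.5, 1_000_000
xs = [random.gauss(0.0, 1.0) for _ in range(trials)]
ys = [random.gauss(0.0, 1.0) for _ in range(trials)]

mgf_sum = sum(math.exp(t * (x + y)) for x, y in zip(xs, ys)) / trials
product = math.exp(t**2 / 2) * math.exp(t**2 / 2)   # M_X(t) * M_Y(t)
print(mgf_sum, product)   # both ~ e^{t^2} ≈ 1.284
```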
Chebyshev's inequality: if X has mean µ and variance σ², then P(|X − µ| ≥ k) ≤ σ²/k² for all k > 0 (see the numerical check below).

Proof. P[(X − µ)² ≥ k²] ≤ E[(X − µ)²]/k² by Markov's inequality.
Since (X − µ)² ≥ k² ⟺ |X − µ| ≥ k, then P(|X − µ| ≥ k) ≤ σ²/k².

• if Var(X) = 0, then P(|X − µ| ≥ k) = 0 for every k > 0,
so P(X ≠ µ) = 0 ⇒ P(X = µ) = 1
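A simulation sketch (my addition) comparing the empirical tail of X ∼ Exp(1), for which µ = σ² = 1, against Chebyshev's bound σ²/k² = 1/k²:

```python
import random

k, trials = 2.0, 1_000_000
# X ~ Exp(1): mu = 1, sigma^2 = 1
xs = [random.expovariate(1.0) for _ in range(trials)]
empirical = sum(abs(x - 1.0) >= k for x in xs) / trials
print(empirical, 1.0 / k**2)   # empirical tail (~0.05) never exceeds the bound 0.25
```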
commutative: E ∪ F = F ∪ E;  E ∩ F = F ∩ E
associative: (E ∪ F) ∪ G = E ∪ (F ∪ G);  (E ∩ F) ∩ G = E ∩ (F ∩ G)
distributive: (E ∪ F) ∩ G = (E ∩ G) ∪ (F ∩ G);  (E ∩ F) ∪ G = (E ∪ G) ∩ (F ∪ G)
DeMorgan's: (E_1 ∪ · · · ∪ E_n)^c = E_1^c ∩ · · · ∩ E_n^c;  (E_1 ∩ · · · ∩ E_n)^c = E_1^c ∪ · · · ∪ E_n^c
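A sanity check of DeMorgan's laws on finite Python sets (my addition; complements are taken relative to a toy universe U):

```python
# sanity check of DeMorgan's laws on finite sets
U = set(range(10))                  # toy universe
E, F, G = {1, 2, 3}, {3, 4, 5}, {5, 6}

union = E | F | G
inter = E & F & G
assert U - union == (U - E) & (U - F) & (U - G)   # (union E_i)^c = intersection of E_i^c
assert U - inter == (U - E) | (U - F) | (U - G)   # (intersection E_i)^c = union of E_i^c
print("DeMorgan's laws hold")
```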