
Statistics for Data Science - 2

Week 8 Notes
Statistics from samples and Limit theorems

1. Moment generating function (MGF):


Let X be a zero-mean random variable (E[X] = 0). The MGF of X, denoted M_X(λ),
is a function from R to R defined as

   M_X(λ) = E[e^{λX}]

Expanding the exponential inside the expectation,

   M_X(λ) = E[e^{λX}]
          = E[1 + λX + λ^2 X^2 / 2! + λ^3 X^3 / 3! + ...]
          = 1 + λ E[X] + (λ^2 / 2!) E[X^2] + (λ^3 / 3!) E[X^3] + ...

That is, the coefficient of λ^k / k! in the MGF of X gives the kth moment of X, E[X^k].
• If X ∼ Normal(0, σ^2), then M_X(λ) = e^{λ^2 σ^2 / 2}.

• Let X_1, X_2, ..., X_n ∼ i.i.d. X and let S = X_1 + X_2 + ... + X_n. Then

     M_S(λ) = (E[e^{λX}])^n = [M_X(λ)]^n

  More generally, the MGF of a sum of independent random variables is the product of the
  individual MGFs; for i.i.d. summands this product is the nth power of the common MGF.
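A quick numerical check (my addition, not part of the notes; the values λ = 0.3, σ = 2, n = 5 are arbitrary): estimate M_X(λ) = E[e^{λX}] by Monte Carlo for X ∼ Normal(0, σ^2), compare with the closed form e^{λ^2 σ^2 / 2}, and verify that M_S(λ) = [M_X(λ)]^n for the sum of n i.i.d. copies.

import numpy as np

rng = np.random.default_rng(0)
sigma, lam, n = 2.0, 0.3, 5

# Monte Carlo estimate of M_X(lambda) = E[exp(lambda * X)] for X ~ Normal(0, sigma^2)
x = rng.normal(0.0, sigma, size=1_000_000)
mgf_mc = np.mean(np.exp(lam * x))
mgf_closed = np.exp(lam**2 * sigma**2 / 2)        # closed form exp(lambda^2 sigma^2 / 2)
print(mgf_mc, mgf_closed)                         # the two numbers should be close

# MGF of S = X_1 + ... + X_n for i.i.d. X_i equals [M_X(lambda)]^n
s = rng.normal(0.0, sigma, size=(1_000_000, n)).sum(axis=1)
print(np.mean(np.exp(lam * s)), mgf_closed**n)    # should also be close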

2. Central limit theorem: Let X_1, X_2, ..., X_n ∼ i.i.d. X with E[X] = µ and Var(X) = σ^2.
   Define Y = X_1 + X_2 + ... + X_n. Then, for large n,

      (Y − nµ) / (σ√n) ≈ Normal(0, 1).
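A minimal simulation sketch of the CLT (my addition; exponential summands with β = 2, n = 50 and 100000 trials are arbitrary choices): standardise the sum Y and check that it behaves like Normal(0, 1).

import numpy as np

rng = np.random.default_rng(0)
n, trials, beta = 50, 100_000, 2.0
mu, sigma = 1 / beta, 1 / beta                    # mean and std of Exp(beta)

x = rng.exponential(scale=1 / beta, size=(trials, n))
y = x.sum(axis=1)                                 # Y = X_1 + ... + X_n, one value per trial
z = (y - n * mu) / (sigma * np.sqrt(n))           # standardised sum

print(z.mean(), z.var())                          # should be close to 0 and 1
print(np.mean(z <= 1.0))                          # should be close to Phi(1) ~ 0.8413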

3. Gamma distribution:
   X ∼ Gamma(α, β) if its PDF is f_X(x) ∝ x^{α−1} e^{−βx}, x > 0.

   • α > 0 is a shape parameter.
   • β > 0 is a rate parameter.
   • θ = 1/β is a scale parameter.
   • Mean: E[X] = α/β
   • Variance: Var(X) = α/β^2
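As a sanity check (my addition; α = 3, β = 2 are arbitrary example values), note that NumPy's gamma sampler is parameterised by shape α and scale θ = 1/β, so the mean and variance formulas above can be verified by simulation:

import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 3.0, 2.0                            # shape and rate

# NumPy uses shape and scale = 1/rate
samples = rng.gamma(shape=alpha, scale=1 / beta, size=1_000_000)

print(samples.mean(), alpha / beta)               # mean ~ alpha/beta = 1.5
print(samples.var(), alpha / beta**2)             # variance ~ alpha/beta^2 = 0.75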
4. Beta distribution:
   X ∼ Beta(α, β) if its PDF is f_X(x) ∝ x^{α−1} (1 − x)^{β−1}, 0 < x < 1.

   • α > 0 and β > 0 are shape parameters.
   • Mean: E[X] = α / (α + β)
   • Variance: Var(X) = αβ / [(α + β)^2 (α + β + 1)]
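A similar check for the Beta moments (my addition; α = 2, β = 5 are arbitrary):

import numpy as np

rng = np.random.default_rng(0)
a, b = 2.0, 5.0                                   # shape parameters alpha, beta

samples = rng.beta(a, b, size=1_000_000)

print(samples.mean(), a / (a + b))                              # mean ~ 2/7
print(samples.var(), a * b / ((a + b) ** 2 * (a + b + 1)))      # variance ~ 10/392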

5. Cauchy distribution:
   X ∼ Cauchy(θ, α^2) if its PDF is f_X(x) = (1/π) · α / (α^2 + (x − θ)^2), x ∈ R.

• θ is a location parameter.
• α > 0 is a scale parameter.
• Mean and variance are undefined.
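The undefined mean can be seen numerically (my addition, a rough illustration rather than a proof): the running average of standard Cauchy samples keeps jumping around instead of settling down the way it would for a distribution with a finite mean.

import numpy as np

rng = np.random.default_rng(0)

# Standard Cauchy, i.e. theta = 0 and alpha = 1
samples = rng.standard_cauchy(size=1_000_000)
running_mean = np.cumsum(samples) / np.arange(1, samples.size + 1)

# Running mean after 10^3, 10^4, 10^5 and 10^6 samples: no convergence
print(running_mean[[999, 9_999, 99_999, 999_999]])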

6. Some important results (a simulation sketch checking a few of them follows the list):

   • Let X_i ∼ Normal(µ_i, σ_i^2), i = 1, ..., n, be independent and let
     Y = a_1 X_1 + a_2 X_2 + ... + a_n X_n. Then

        Y ∼ Normal(µ, σ^2),

     where µ = a_1 µ_1 + a_2 µ_2 + ... + a_n µ_n and σ^2 = a_1^2 σ_1^2 + a_2^2 σ_2^2 + ... + a_n^2 σ_n^2.
     That is, a linear combination of independent normal random variables is again normally
     distributed.

• Sum of n i.i.d. Exp(β) is Gamma(n, β).


 
   • The square of a Normal(0, σ^2) random variable is Gamma(1/2, 1/(2σ^2)).

   • Suppose X, Y ∼ i.i.d. Normal(0, σ^2). Then X/Y ∼ Cauchy(0, 1).

   • Suppose X ∼ Gamma(α, k) and Y ∼ Gamma(β, k) are independent random variables.
     Then X/(X + Y) ∼ Beta(α, β).

• Sum of n independent Gamma(α, β) is Gamma(nα, β).


 
   • If X_1, X_2, ..., X_n ∼ i.i.d. Normal(0, σ^2), then
     X_1^2 + X_2^2 + ... + X_n^2 ∼ Gamma(n/2, 1/(2σ^2)).

 
   • Gamma(n/2, 1/2) is called the Chi-square distribution with n degrees of freedom,
     denoted χ^2_n.

   • Suppose X_1, X_2, ..., X_n ∼ i.i.d. Normal(µ, σ^2), and let X̄ and S^2 denote the
     sample mean and sample variance, respectively. Then
     (i) (n − 1) S^2 / σ^2 ∼ χ^2_{n−1}
     (ii) X̄ and S^2 are independent.
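The sketch below (my addition; all parameter values are arbitrary illustrations) spot-checks three of the results above by simulation: the sum of exponentials against the Gamma(n, β) moments, the ratio X/(X + Y) against the Beta mean, and (n − 1)S^2/σ^2 against the moments of χ^2_{n−1}.

import numpy as np

rng = np.random.default_rng(0)
N = 500_000                                        # number of simulated replicates

# (a) Sum of n i.i.d. Exp(rate): compare with Gamma(n, rate) moments n/rate and n/rate^2
n, rate = 4, 2.0
s = rng.exponential(scale=1 / rate, size=(N, n)).sum(axis=1)
print(s.mean(), n / rate, s.var(), n / rate**2)

# (b) X ~ Gamma(a, k), Y ~ Gamma(b, k) independent: X/(X+Y) should have Beta(a, b) mean a/(a+b)
a, b, k = 2.0, 3.0, 1.5
x = rng.gamma(shape=a, scale=1 / k, size=N)
y = rng.gamma(shape=b, scale=1 / k, size=N)
print((x / (x + y)).mean(), a / (a + b))

# (c) (m-1) S^2 / sigma^2 from Normal(mu, sigma^2) samples: chi^2_{m-1} has mean m-1, variance 2(m-1)
m, mu, sigma = 10, 5.0, 2.0
data = rng.normal(mu, sigma, size=(N, m))
s2 = data.var(axis=1, ddof=1)                      # sample variance S^2
stat = (m - 1) * s2 / sigma**2
print(stat.mean(), m - 1, stat.var(), 2 * (m - 1))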
