
Normal Distribution

Definition 1 A random variable X with pdf


    f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}, \qquad -\infty < x < \infty,

is called a normal distribution with parameters µ and σ, denoted N(µ, σ²). ¦

A normal distribution with µ = 0 and σ = 1, usually denoted Z ∼ N(0, 1), is called the
standard normal distribution; its pdf is

    g(z) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}, \qquad -\infty < z < \infty

- To check that g is a pdf, we need to compute the integral I = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}\, dx.
  We first consider J = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy; then

    I² = I × J = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \frac{1}{2\pi}\, e^{-(x^2+y^2)/2}\, dx\, dy
               = \int_{0}^{2\pi}\int_{0}^{\infty} \frac{1}{2\pi}\, r\, e^{-r^2/2}\, dr\, d\theta = 1   (in polar coordinates),

  hence I = 1.
Now we can check that f is a pdf:

    \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\, dx = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz = 1   (by the change of variable z = \frac{x-\mu}{\sigma})
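As a quick numerical sanity check (not part of the original notes), the integrals above can be approximated with SciPy; the values of µ and σ below are arbitrary illustrative choices.

import numpy as np
from scipy.integrate import quad

def normal_pdf(x, mu=1.5, sigma=2.0):
    # pdf of N(mu, sigma^2) as defined above; mu and sigma are illustrative values
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

total, _ = quad(normal_pdf, -np.inf, np.inf)                              # integral of f
std_total, _ = quad(lambda z: normal_pdf(z, 0.0, 1.0), -np.inf, np.inf)   # integral of g
print(total, std_total)                                                   # both should be ~ 1.0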
- E(Z) = \int_{-\infty}^{+\infty} \frac{1}{\sqrt{2\pi}}\, z\, e^{-z^2/2}\, dz = \frac{1}{\sqrt{2\pi}} \left[ -e^{-z^2/2} \right]_{-\infty}^{+\infty} = 0

- Var(Z) = \int_{-\infty}^{+\infty} \frac{1}{\sqrt{2\pi}}\, z^2\, e^{-z^2/2}\, dz = \frac{1}{\sqrt{2\pi}} \left[ -z\, e^{-z^2/2} \right]_{-\infty}^{+\infty} + \int_{-\infty}^{+\infty} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz = 0 + 1 = 1
  (integration by parts with u = z and v' = z\, e^{-z^2/2})
- M_Z(t) = \int_{-\infty}^{+\infty} \frac{1}{\sqrt{2\pi}}\, e^{tz}\, e^{-z^2/2}\, dz = \int_{-\infty}^{+\infty} \frac{1}{\sqrt{2\pi}}\, e^{-(z^2 - 2tz)/2}\, dz = e^{t^2/2} \int_{-\infty}^{+\infty} \frac{1}{\sqrt{2\pi}}\, e^{-(z-t)^2/2}\, dz = e^{t^2/2}
  (the last integral is the integral of the pdf of a normal distribution with µ = t and σ = 1, hence equals 1)
Exercise 1 Show that E(X) = µ, Var(X) = σ², and M_X(t) = e^{\mu t + \sigma^2 t^2/2}.
(hint: consider the change of variable z = \frac{x-\mu}{\sigma})
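A rough way to convince yourself of Exercise 1 before proving it is a small simulation; this sketch (not part of the notes) assumes NumPy and arbitrary illustrative values of µ, σ and t.

import numpy as np

rng = np.random.default_rng(0)
mu, sigma, t = 3.0, 2.0, 0.4                      # illustrative values, not from the notes
x = rng.normal(mu, sigma, size=1_000_000)

print(x.mean(), mu)                               # sample mean vs mu
print(x.var(), sigma ** 2)                        # sample variance vs sigma^2
print(np.exp(t * x).mean())                       # empirical M_X(t)
print(np.exp(mu * t + sigma ** 2 * t ** 2 / 2))   # closed form e^{mu t + sigma^2 t^2 / 2}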
Proposition 1 If X ∼ N(µ, σ²), then Z = \frac{X-\mu}{\sigma} ∼ N(0, 1).
(Equivalently, if Z ∼ N(0, 1), then X = σZ + µ ∼ N(µ, σ²).)

Proof: F_Z(z) = P(Z ≤ z) = P(\frac{X-\mu}{\sigma} ≤ z) = P(X ≤ σz + µ) = F_X(σz + µ), hence

    f_Z(z) = F_Z'(z) = σ F_X'(σz + µ) = σ f_X(σz + µ) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}, \qquad -\infty < z < \infty

Proposition 2 If Z1, Z2, . . . , Zn are n independent N(0, 1) random variables, then
X = Z1² + Z2² + . . . + Zn² ∼ χ²(n).

Proof: Z1² ∼ χ²(1) (why?), and Z1² + Z2² + . . . + Zn² ∼ χ²(n) (why?)
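The claim of Proposition 2 can also be illustrated numerically; this is only a sanity-check sketch (n, the number of replications and the use of SciPy's chi-square cdf are all choices made here, not part of the notes).

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, reps = 5, 200_000
z = rng.standard_normal((reps, n))
x = (z ** 2).sum(axis=1)                      # sum of n squared N(0, 1) variables

print(x.mean(), x.var())                      # ~ n and ~ 2n, the chi-square(n) moments
print(stats.kstest(x, stats.chi2(df=n).cdf))  # large p-value: consistent with chi-square(n)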
Sum of independent normal distributions:

Proposition 3 Let Xi ∼ N(µi, σi²), i = 1, 2, . . . , n, be n independent normal random
variables. Then Y = a1 X1 + . . . + an Xn ∼ N(µ, σ²),
where µ = a1 µ1 + . . . + an µn and σ² = a1² σ1² + . . . + an² σn².

Proof: M_Y(t) = M_{X1}(a1 t) × . . . × M_{Xn}(an t)
             = e^{a1 µ1 t + (a1² σ1² t²)/2} × . . . × e^{an µn t + (an² σn² t²)/2}
             = e^{(a1 µ1 + . . . + an µn) t + (a1² σ1² + . . . + an² σn²) t²/2},

hence Y ∼ N(a1 µ1 + . . . + an µn, a1² σ1² + . . . + an² σn²).
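A short simulation illustrating Proposition 3; the coefficients ai and the parameters µi, σi below are arbitrary illustrative values, not taken from the notes.

import numpy as np

rng = np.random.default_rng(2)
a = np.array([2.0, -1.0, 0.5])                 # illustrative coefficients
mu = np.array([1.0, 4.0, -2.0])                # illustrative means
sigma = np.array([1.0, 2.0, 3.0])              # illustrative standard deviations

x = rng.normal(mu, sigma, size=(500_000, 3))   # independent N(mu_i, sigma_i^2) columns
y = x @ a                                      # Y = a1 X1 + a2 X2 + a3 X3

print(y.mean(), (a * mu).sum())                # ~ a1 mu1 + ... + an mun
print(y.var(), (a ** 2 * sigma ** 2).sum())    # ~ a1^2 sigma1^2 + ... + an^2 sigman^2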

Example 1 Suppose that the length of life in hours of a light bulb manufactured by
company A is N (800, 14400) and the length of life in hours of a light bulb manufactured
by company B is N (850, 2500). One bulb is selected from each company and is burned
until death. Find the probability that the length of life of the bulb from company A exceeds
the length of life of the bulb from company B by at least 15 hours.

Solution: P(A > B + 15) = P(A − B > 15), and A − B ∼ N(−50, 16900),
hence P(A − B > 15) = P(Z > \frac{15+50}{130}) = P(Z > 0.5) = 1 − 0.6915 = 0.3085 (from the
standard normal table).
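The table lookup can be reproduced with SciPy's normal distribution (a convenience not used in the notes, which rely on the standard normal table):

from scipy.stats import norm

print(norm.sf(15, loc=-50, scale=130))   # P(A - B > 15) with A - B ~ N(-50, 130^2), ~ 0.3085
print(norm.sf(0.5))                      # the same probability via the standardized Z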

The Central Limit Theorem:

Definition 2 The mean of n random variables X1, . . . , Xn, denoted X̄, is defined by

    X̄ = \frac{\sum_{i=1}^{n} X_i}{n}   ¦
Proposition 4 Let X1, . . . , Xn be n independent random variables with E(Xi) = µ
and Var(Xi) = σ², for i = 1, 2, . . . , n. Then

    E(X̄) = µ   and   Var(X̄) = σ²/n.

The proof is immediate: use linearity of expectation for the mean, and independence for the variance.

Proposition 5 Let X1, . . . , Xn be a random sample of size n from the normal distribution
N(µ, σ²). Then

    X̄ ∼ N(µ, σ²/n)   (or equivalently \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} ∼ N(0, 1))

Proof: Combine propositions 3 and 4.

Proposition 6 (Central limit theorem)
Let X1, . . . , Xn be a random sample of size n with E(Xi) = µ and Var(Xi) = σ². Then
X̄ is approximately N(µ, σ²/n).
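An illustrative simulation of the central limit theorem (the exponential population and the sample size n = 30 are arbitrary choices, not taken from the notes):

import numpy as np

rng = np.random.default_rng(3)
n, reps = 30, 100_000
samples = rng.exponential(scale=2.0, size=(reps, n))   # a skewed population: mean 2, variance 4
xbar = samples.mean(axis=1)                            # one sample mean per row

print(xbar.mean())        # ~ 2, the population mean mu
print(xbar.var())         # ~ 4 / 30, i.e. sigma^2 / n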

Example 2 Let X1, . . . , X10 be a random sample of size 10 from an exponential distribution
with mean 2, i.e.

    f(x) = \frac{1}{2}\, e^{-x/2}, \qquad 0 < x < \infty
a. Find P (12.44 < X1 + . . . + X10 < 28.41)

b. Approximate P (12.44 < X1 + . . . + X10 < 28.41)

Solution: a) X1 + . . . + X10 ∼ χ²(20) (the sum of 10 independent exponentials with mean 2
is a Gamma distribution with shape 10 and scale 2, i.e. the χ²(20) distribution), and
P(12.44 < X1 + . . . + X10 < 28.41) = P(12.44 < χ²(20) < 28.41)
  = P(χ²(20) < 28.41) − P(χ²(20) < 12.44) = 0.9 − 0.1 = 0.8
(from the Chi-square table).

b) By the central limit theorem, X̄ = (X1 + . . . + X10)/10 is approximately N(2, 4/10)
(since Var(Xi) = 4 for an exponential with mean 2). Hence

P(12.44 < X1 + . . . + X10 < 28.41) ≈ P(1.244 < X̄ < 2.841)
  = P(−0.756 < X̄ − 2 < 0.841)
  = P(−1.20 < \frac{\bar{X} - 2}{2/\sqrt{10}} < 1.33)
  = P(−1.20 < Z < 1.33) = 0.793 ≈ 0.8
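Both parts of Example 2 can be checked with SciPy (a convenience the notes do not use; they rely on the chi-square and normal tables):

import numpy as np
from scipy.stats import chi2, norm

exact = chi2(df=20).cdf(28.41) - chi2(df=20).cdf(12.44)   # part (a), ~ 0.80
z_lo = (1.244 - 2) / (2 / np.sqrt(10))
z_hi = (2.841 - 2) / (2 / np.sqrt(10))
approx = norm.cdf(z_hi) - norm.cdf(z_lo)                  # part (b), ~ 0.79
print(exact, approx)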

Example 3 Fifty numbers are rounded off to the nearest integer and then summed. If
the individual round-off errors are uniformly distributed over the interval (−1/2, 1/2),
find the probability that the resultant sum differs from the exact sum by more than 3.

Solution: Let X1, X2, . . . , X50 be the round-off errors for the 50 numbers; Xi ∼ U(−1/2, 1/2),
so E(Xi) = 0 and Var(Xi) = 1/12.

By the central limit theorem, X̄ = (X1 + . . . + X50)/50 is approximately N(0, 1/600). Hence

P(|X1 + X2 + . . . + X50| ≥ 3) = 1 − P(−3 < X1 + X2 + . . . + X50 < 3)
  = 1 − P(−\frac{3\sqrt{600}}{50} < Z < \frac{3\sqrt{600}}{50})
  = 1 − P(−1.47 < Z < 1.47) = 1 − (0.9292 − (1 − 0.9292)) = 1 − 0.86 = 0.14
(from the normal table)
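A numerical check of Example 3, using both the normal approximation and a direct simulation of the 50 round-off errors (illustrative, not part of the notes):

import numpy as np
from scipy.stats import norm

sd_sum = np.sqrt(50 / 12)                       # standard deviation of the sum of the 50 errors
print(2 * norm.sf(3 / sd_sum))                  # normal approximation, ~ 0.14

rng = np.random.default_rng(4)
sums = rng.uniform(-0.5, 0.5, size=(200_000, 50)).sum(axis=1)
print((np.abs(sums) > 3).mean())                # Monte Carlo estimate, also ~ 0.14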

Normal approximation to Binomial:

Let X1, . . . , Xn be a random sample of size n with Bernoulli distribution b(p), and let
X = X1 + . . . + Xn

(we know that X has a Binomial distribution b(n, p))

By the Central limit theorem,


    \frac{\frac{X_1 + ... + X_n}{n} - p}{\sqrt{\frac{p(1-p)}{n}}} ∼ N(0, 1),

or equivalently

    \frac{X_1 + ... + X_n - np}{\sqrt{np(1-p)}} ∼ N(0, 1),

or X1 + . . . + Xn ∼ N(np, np(1 − p)).

Thus, we have the proposition:

Proposition 7 If X ∼ b(n, p), then X can be approximated by a normal distribution
N(np, np(1 − p)).

Example 4 In the casino game roulette, the probability of winning with a bet on red
is 18/38. Let Y equal the number of winning bets out of 1000 independent bets that
are placed. Approximate P (Y > 500).

Solution: Let Xi be 1 if the ith bet wins and 0 otherwise; Xi ∼ b(18/38), hence
E(Xi) = \frac{18}{38} = \frac{9}{19} and Var(Xi) = \frac{18}{38} \cdot \frac{20}{38} = \frac{90}{361}.

By the central limit theorem, Y is approximately N(\frac{9000}{19}, \frac{90000}{361}).

P(Y > 500) = P(\frac{Y - 9000/19}{\sqrt{90000/361}} > \frac{500 - 9000/19}{\sqrt{90000/361}}) = P(Z > 1.67) = 0.0475 (from the
normal table)
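For comparison, the exact binomial tail probability can be computed alongside the normal approximation used above (a SciPy-based check, not part of the notes; no continuity correction is applied, to match the calculation above):

import numpy as np
from scipy.stats import binom, norm

n, p = 1000, 18 / 38
mu, sd = n * p, np.sqrt(n * p * (1 - p))

print(binom(n, p).sf(500))        # exact P(Y > 500)
print(norm.sf((500 - mu) / sd))   # normal approximation, ~ 0.0475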
