
School of Commercial High Studies
Ecole des Hautes Etudes Commerciales (EHEC)

Probability course
2nd year of preparatory classes

Chapter II: Special probability distributions

Prepared by: ALLOUAT Asma

Academic year: 2024/2025

Section 2: Continuous special probability distributions
Contents:

1. Continuous uniform distribution


2. Gamma distribution
3. Exponential distribution
4. Normal distribution
1. Continuous uniform distribution

For a continuous random variable X taking values in a finite interval [a, b], if every subinterval of [a, b] of a given fixed length has the same probability of containing X, we say that X follows a continuous uniform distribution on [a, b], denoted by X ~ U[a, b].

1.1. Probability density function

A random variable X follows the continuous uniform distribution on the range [a, b] if its PDF is given by:

    f_X(x) = 1/(b − a)   if x ∈ [a, b]
    f_X(x) = 0           otherwise

Remark:
In the case where a = 0 and b = 1, we refer to the standard continuous uniform distribution.
We verify that f is a valid PDF:

    f_X(x) ≥ 0   ∀ x ∈ D

    ∫_a^b f_X(x) dx = ∫_a^b 1/(b − a) dx = [x/(b − a)]_a^b = (b − a)/(b − a) = 1

1.2. Cumulative distribution function (CDF)

    F_X(x) = 0                 if x < a
    F_X(x) = (x − a)/(b − a)   if x ∈ [a, b]
    F_X(x) = 1                 if x > b

1.3. Expectation and variance

    E(X) = (a + b)/2        V(X) = (b − a)²/12
1.4. Moment generating function

    M_X(t) = (e^{tb} − e^{ta}) / (t(b − a))   for t ≠ 0, with M_X(0) = 1

1.5. Mode and median

- Mode
Since the density is constant over the support [a, b], the mode can be any value in [a, b].

- Median

    Me = (a + b)/2
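The closed-form mean and variance above can be sanity-checked numerically; the sketch below uses a composite midpoint rule, with a = 2 and b = 5 as illustrative values (not from the course).

```python
# Midpoint-rule check of the Uniform[a, b] moment formulas;
# a = 2 and b = 5 are illustrative values.
a, b = 2.0, 5.0
n = 100_000
dx = (b - a) / n
pdf = 1.0 / (b - a)                              # constant density on [a, b]
xs = [a + (i + 0.5) * dx for i in range(n)]      # midpoints of subintervals
mean = sum(x * pdf * dx for x in xs)             # ≈ E(X)
var = sum((x - mean) ** 2 * pdf * dx for x in xs)  # ≈ V(X)
assert abs(mean - (a + b) / 2) < 1e-6            # E(X) = (a+b)/2
assert abs(var - (b - a) ** 2 / 12) < 1e-4       # V(X) = (b−a)²/12
```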
2. Gamma distribution

The Gamma function
Definition
For a > 0, the Gamma function is defined by:

    Γ(a) = ∫_0^{+∞} x^{a−1} e^{−x} dx

Properties:
    Γ(1) = 1
    Γ(a + 1) = a Γ(a)
    Γ(a + 1) = a!   (for a ∈ ℕ)
    Γ(1/2) = √π
Proof:

    Γ(1) = ∫_0^{+∞} e^{−x} dx = [−e^{−x}]_0^{+∞} = −(0 − 1) = 1

    Γ(a + 1) = ∫_0^{+∞} x^a e^{−x} dx = [−x^a e^{−x}]_0^{+∞} − ∫_0^{+∞} (−a x^{a−1} e^{−x}) dx

             = 0 + a ∫_0^{+∞} x^{a−1} e^{−x} dx = a Γ(a)

For a ∈ ℕ:
    Γ(a + 1) = a Γ(a)
             = a (a − 1) Γ(a − 1)
             = a (a − 1)(a − 2) Γ(a − 2)
             = a (a − 1)(a − 2)(a − 3) ⋯ 1 · Γ(1)
             = a (a − 1)(a − 2)(a − 3) ⋯ 3 × 2 × 1
             = a!
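The four properties above can be checked directly with the standard-library `math.gamma` function; a = 4.7 below is an arbitrary test value.

```python
import math

# Sanity checks of the stated Gamma-function properties.
assert math.isclose(math.gamma(1.0), 1.0)                  # Γ(1) = 1
a = 4.7                                                    # arbitrary test value
assert math.isclose(math.gamma(a + 1), a * math.gamma(a))  # Γ(a+1) = aΓ(a)
assert math.gamma(5) == math.factorial(4)                  # Γ(a+1) = a! for a ∈ ℕ
assert math.isclose(math.gamma(0.5), math.sqrt(math.pi))   # Γ(1/2) = √π
```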
2.1. Probability density function

Definition
A random variable X has the Gamma distribution with parameters a > 0 and p > 0, denoted by G(a, p), if its PDF is defined on the range [0, +∞[ by:

    f_X(x) = (p^a / Γ(a)) x^{a−1} e^{−p x}   if x ≥ 0
    f_X(x) = 0                               otherwise

a is called the shape parameter and p the rate parameter.

We verify that f is a valid PDF:

* Condition 1: f_X(x) ≥ 0   ∀ x ∈ D

  We have a > 0 and p > 0, then ∀ x ≥ 0: (p^a / Γ(a)) x^{a−1} e^{−p x} ≥ 0

* Condition 2: ∫_0^{+∞} (p^a / Γ(a)) x^{a−1} e^{−p x} dx = 1

  We put: y = p x ⟹ x = y/p ⟹ dx = (1/p) dy

    ⟹ ∫_{−∞}^{+∞} f_X(x) dx = (p^a / Γ(a)) ∫_0^{+∞} (y/p)^{a−1} e^{−y} (1/p) dy
                            = (p^a / Γ(a)) (1/p)^{a−1} (1/p) ∫_0^{+∞} y^{a−1} e^{−y} dy
                            = (p^a / Γ(a)) (1/p)^a Γ(a)
                            = 1
Property:

We have: (p^a / Γ(a)) ∫_0^{+∞} x^{a−1} e^{−p x} dx = 1

then: ∫_0^{+∞} x^{a−1} e^{−p x} dx = Γ(a) / p^a

Examples:

    ∫_0^{+∞} x³ e^{−2x} dx = Γ(4)/2⁴ = 3!/2⁴ = 6/16 = 3/8

    ∫_0^{+∞} x² e^{−x/2} dx = Γ(3)/(1/2)³ = 2³ · 2! = 16
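Both example integrals can be confirmed numerically; the sketch below uses a composite midpoint rule, truncating the upper limit where the integrand's tail is negligible.

```python
import math

# Midpoint-rule evaluation of the two example integrals; the upper limits
# 40 and 120 truncate a negligible exponential tail.
def integral(f, lo, hi, n=200_000):
    dx = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * dx) for i in range(n)) * dx

val1 = integral(lambda x: x ** 3 * math.exp(-2 * x), 0.0, 40.0)
val2 = integral(lambda x: x ** 2 * math.exp(-x / 2), 0.0, 120.0)
assert math.isclose(val1, 3 / 8, rel_tol=1e-5)   # Γ(4)/2⁴ = 3/8
assert math.isclose(val2, 16.0, rel_tol=1e-5)    # Γ(3)/(1/2)³ = 16
```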
2.2. Cumulative distribution function

In general, if X follows a Gamma distribution of parameters a > 0 and p > 0 (positive real numbers), there is no closed-form expression for the distribution function.

If a is an integer (a ∈ ℕ), the distribution function of X is given by:

    F_X(x) = 1 − Σ_{j=1}^{a} (p^{j−1} x^{j−1} / Γ(j)) e^{−p x},   x ≥ 0
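For integer shape, this finite-sum CDF can be cross-checked against a direct numerical integration of the density; a = 3, p = 2, and x₀ = 1.5 below are illustrative values.

```python
import math

# Finite-sum CDF for integer shape vs. midpoint integration of the
# Gamma(a, p) density; a = 3, p = 2, x0 = 1.5 are illustrative values.
a, p, x0 = 3, 2.0, 1.5

def pdf(x):
    return p ** a / math.gamma(a) * x ** (a - 1) * math.exp(-p * x)

def cdf_sum(x):
    # F(x) = 1 − Σ_{j=1}^{a} (px)^{j−1}/Γ(j) · e^{−px}
    return 1 - sum(
        (p * x) ** (j - 1) / math.gamma(j) * math.exp(-p * x)
        for j in range(1, a + 1)
    )

n = 100_000
dx = x0 / n
numeric = sum(pdf((i + 0.5) * dx) for i in range(n)) * dx
assert math.isclose(numeric, cdf_sum(x0), rel_tol=1e-6)
```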
2.3. Expectation and variance

Expectation
    E(X) = a/p

Proof:
    E(X) = ∫_D x f_X(x) dx = ∫_0^{+∞} x (p^a / Γ(a)) x^{a−1} e^{−p x} dx

         = (p^a / Γ(a)) ∫_0^{+∞} x^a e^{−p x} dx

         = (p^a / Γ(a)) ∫_0^{+∞} x^{(a+1)−1} e^{−p x} dx

         = (p^a / Γ(a)) × Γ(a + 1)/p^{a+1} = (p^a / Γ(a)) × a Γ(a)/(p^a p) = a/p
Variance
    V(X) = a/p²

Proof:
    E(X²) = ∫_{−∞}^{+∞} x² f_X(x) dx = ∫_0^{+∞} x² (p^a / Γ(a)) x^{a−1} e^{−p x} dx

          = (p^a / Γ(a)) ∫_0^{+∞} x^{(a+2)−1} e^{−p x} dx

          = (p^a / Γ(a)) × Γ(a + 2)/p^{a+2} = (p^a / Γ(a)) × (a + 1) Γ(a + 1)/(p^a p²)

          = (p^a / Γ(a)) × (a + 1) a Γ(a)/(p^a p²) = (a + 1) a / p²

    V(X) = (a + 1) a / p² − (a/p)² = a/p²
2.4. Moment generating function

The moment generating function of a continuous random variable X that follows a Gamma distribution of parameters a > 0 and p > 0 is given by:

    M_X(t) = (1 − t/p)^{−a},   p > t
Proof:
    M_X(t) = E(e^{tX}) = ∫_0^{+∞} e^{tx} (p^a / Γ(a)) x^{a−1} e^{−p x} dx

           = (p^a / Γ(a)) ∫_0^{+∞} x^{a−1} e^{−(p−t) x} dx

For p − t > 0, we have: ∫_0^{+∞} x^{a−1} e^{−(p−t) x} dx = Γ(a)/(p − t)^a

Then: M_X(t) = (p^a / Γ(a)) × Γ(a)/(p − t)^a = p^a/(p − t)^a = (p/(p − t))^a,   with p > t

Or: M_X(t) = ((p − t)/p)^{−a} = (1 − t/p)^{−a},   with p > t
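The closed form can be checked against a numerical evaluation of E(e^{tX}); a = 2.5, p = 3, t = 1 below are illustrative values satisfying t < p.

```python
import math

# Midpoint-rule evaluation of E[e^{tX}] for X ~ Gamma(a, p), compared with
# the closed form (1 − t/p)^(−a); a = 2.5, p = 3, t = 1 are illustrative.
a, p, t = 2.5, 3.0, 1.0

def f(x):
    return p ** a / math.gamma(a) * x ** (a - 1) * math.exp(-p * x)

n, hi = 400_000, 60.0            # truncate where e^{(t−p)x} is negligible
dx = hi / n
mgf_numeric = sum(
    math.exp(t * ((i + 0.5) * dx)) * f((i + 0.5) * dx) for i in range(n)
) * dx
mgf_closed = (1 - t / p) ** (-a)
assert math.isclose(mgf_numeric, mgf_closed, rel_tol=1e-4)
```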
2.5. Mode and median

The mode
The mode of a continuous random variable X that follows a Gamma distribution of parameters a > 0 and p > 0 is given by:

    Mo = (a − 1)/p,   a ≥ 1
Proof:
    f_X(x) = (p^a / Γ(a)) x^{a−1} e^{−p x}

    f′_X(x) = (p^a / Γ(a)) [(a − 1) x^{a−2} e^{−p x} + x^{a−1} (−p) e^{−p x}]

            = (p^a / Γ(a)) x^{a−2} e^{−p x} (a − 1 − p x)

    f′_X(x) = 0 ⇒ a − 1 − p x = 0

            ⇒ Mo = (a − 1)/p,   a ≥ 1
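A quick grid search over the density confirms the mode formula; a = 4 and p = 2 below (so Mo = 1.5) are illustrative values.

```python
import math

# Grid check that the Gamma(a, p) density peaks at Mo = (a − 1)/p;
# a = 4, p = 2 (hence Mo = 1.5) are illustrative values.
a, p = 4.0, 2.0

def f(x):
    return p ** a / math.gamma(a) * x ** (a - 1) * math.exp(-p * x)

grid = [i / 1000 for i in range(1, 10_000)]   # 0.001 .. 9.999
x_max = max(grid, key=f)                      # grid point with largest density
assert abs(x_max - (a - 1) / p) < 2e-3
```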
The median:
There is no closed form for the median of a continuous random variable following a Gamma distribution.
3. Exponential distribution

This distribution is generally used to model the lifetime of certain phenomena, known as memoryless, with an average lifetime of 1/λ.

3.1. Probability density function

Definition
A random variable X has the exponential distribution with parameter λ > 0, denoted by Ɛ(λ), if its PDF is defined on the range [0, +∞[ by:

    f_X(x) = λ e^{−λ x}   if x ≥ 0
    f_X(x) = 0            otherwise
Relation with the Gamma distribution

If X follows the exponential distribution with parameter λ > 0, the distribution of X is just a Gamma(1, λ) distribution.

Indeed, we have:
    f_X(x) = (p^a / Γ(a)) x^{a−1} e^{−p x}   if x ≥ 0
    f_X(x) = 0                               otherwise

For a = 1 and p = λ:
    f_X(x) = (λ¹/Γ(1)) x^{1−1} e^{−λ x}   if x ≥ 0   ⟹   f_X(x) = λ e^{−λ x}   if x ≥ 0
    f_X(x) = 0 otherwise                                   f_X(x) = 0 otherwise

On the other hand, if the X_i ~ Exponential(λ) are independent, then:

    Y = Σ_{i=1}^{n} X_i ~ Gamma(n, λ)
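The closure property above can be illustrated by simulation: the sample moments of sums of exponential draws should match the Gamma(n, λ) formulas E(Y) = n/λ and V(Y) = n/λ². A minimal sketch, with n = 5 and λ = 2 as illustrative values:

```python
import math
import random

# Simulation sketch: sums of n i.i.d. Exponential(λ) draws should have
# Gamma(n, λ) moments; n = 5, λ = 2 are illustrative values.
random.seed(0)
n, lam, trials = 5, 2.0, 200_000
sums = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]
mean = sum(sums) / trials
var = sum((s - mean) ** 2 for s in sums) / trials
assert abs(mean - n / lam) < 0.02       # E(Y) = n/λ = 2.5
assert abs(var - n / lam ** 2) < 0.05   # V(Y) = n/λ² = 1.25
```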
3.2. Cumulative distribution function (CDF)

If X follows the exponential distribution with parameter λ > 0, its CDF is given by:

    F_X(x) = 0              if x < 0
    F_X(x) = 1 − e^{−λ x}   if x ≥ 0

3.3. Expectation and variance

    E(X) = 1/λ        V(X) = 1/λ²
Proof:
    E(X) = ∫_D x f_X(x) dx = ∫_0^{+∞} x λ e^{−λx} dx

         = λ ∫_0^{+∞} x e^{−λx} dx = λ [ x (−1/λ) e^{−λx} − ∫ (−1/λ) e^{−λx} dx ]_0^{+∞}

         = λ [ x (−1/λ) e^{−λx} + (1/λ)(−1/λ) e^{−λx} ]_0^{+∞}

         = [ −x e^{−λx} − (1/λ) e^{−λx} ]_0^{+∞} = 0 − (−1/λ) = 1/λ
    E(X²) = ∫_D x² f_X(x) dx = ∫_0^{+∞} x² λ e^{−λx} dx

          = λ ∫_0^{+∞} x² e^{−λx} dx = λ [ x² (−1/λ) e^{−λx} − ∫ 2x (−1/λ) e^{−λx} dx ]_0^{+∞}

          = λ [ (−x²/λ) e^{−λx} + (2/λ) ∫ x e^{−λx} dx ]_0^{+∞}

          = λ [ (−x²/λ) e^{−λx} + (2/λ²)( −x e^{−λx} − (1/λ) e^{−λx} ) ]_0^{+∞}

          = [ −x² e^{−λx} − (2/λ) x e^{−λx} − (2/λ²) e^{−λx} ]_0^{+∞}

          = (−1/λ²) [ (λ² x² + 2λx + 2) e^{−λx} ]_0^{+∞}

          = (−1/λ²)(0 − 2) = 2/λ²

    ⟹ V(X) = 2/λ² − (1/λ)² = 1/λ²
3.4. Moment generating function

    M_X(t) = λ/(λ − t),   t < λ

Proof:
    M_X(t) = E(e^{tX}) = ∫_D e^{tx} f_X(x) dx = ∫_0^{+∞} e^{tx} λ e^{−λx} dx

           = λ ∫_0^{+∞} e^{(t−λ)x} dx

           = [ λ/(t − λ) e^{(t−λ)x} ]_0^{+∞}

           = λ/(t − λ) (0 − 1)    for t − λ < 0

           = λ/(λ − t)            for t < λ
3.5. Mode and median

Mode

The mode of a continuous random variable X following the exponential distribution with parameter λ > 0 is Mo = 0, because its density function f_X(x) = λ e^{−λx} is a decreasing function on the domain [0, +∞[ (its derivative f′_X(x) = −λ² e^{−λx} is negative).
Median:

    Me = ln 2 / λ

Proof:
    P(X ≤ Me) = P(X > Me) = 1/2

    F_X(Me) = P(X ≤ Me) = 1/2

    We have: F_X(x) = 1 − e^{−λx}

    ⟹ F_X(Me) = 1 − e^{−λ Me} = 1/2 ⇒ −λ Me = ln(1/2)

    ⟹ Me = ln 2 / λ
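The median formula can be verified in one line: the CDF evaluated at ln 2 / λ must equal 1/2. A quick check, with λ = 3 as an illustrative value:

```python
import math

# The exponential CDF F(x) = 1 − e^{−λx} should equal 1/2 at Me = ln 2 / λ;
# λ = 3 is an illustrative value.
lam = 3.0
me = math.log(2) / lam
assert math.isclose(1 - math.exp(-lam * me), 0.5)
```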
3.6. Memoryless property

The exponential distribution has the memoryless property (also called the Markov property), meaning that the probability of an event occurring in the future is independent of the past. Mathematically, this property is stated as:

    P(X > s + t | X > s) = P(X > t)

where:
- X is an exponential random variable representing the time until the event (e.g., the lifetime of an object);
- s and t are non-negative real numbers (representing time periods).

The equation means that if we already know that the event has not occurred up to time s, then the probability that the event occurs after an additional time period t is the same as the probability that it would occur after time t from the beginning.
Proof:

    P(X > s + t | X > s) = P({X > s + t} ∩ {X > s}) / P(X > s) = P(X > s + t) / P(X > s)

    = (1 − P(X ≤ s + t)) / (1 − P(X ≤ s)) = (1 − F_X(s + t)) / (1 − F_X(s))

    = (1 − (1 − e^{−λ(s+t)})) / (1 − (1 − e^{−λs})) = e^{−λ(s+t)} / e^{−λs}

    = e^{−λt} = 1 − (1 − e^{−λt}) = 1 − F_X(t) = 1 − P(X ≤ t) = P(X > t)

NB:
The memoryless property can also be stated as:

    P(X ≤ s + t | X > s) = P(X ≤ t)
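The chain of equalities in the proof reduces to one identity on the survival function S(x) = e^{−λx}, which is easy to check numerically; λ = 1.5, s = 0.7, t = 2.0 below are illustrative values.

```python
import math

# Memorylessness via the survival function S(x) = e^{−λx}:
# P(X > s+t | X > s) = S(s+t)/S(s) must equal P(X > t) = S(t).
# λ = 1.5, s = 0.7, t = 2.0 are illustrative values.
lam, s, t = 1.5, 0.7, 2.0

def S(x):
    return math.exp(-lam * x)

assert math.isclose(S(s + t) / S(s), S(t))
```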


4. Normal distribution

4.1. Probability density function

Definition 1
A random variable X has the normal distribution with parameters μ and σ², denoted by N(μ, σ²), if its PDF is defined on ℝ by:

    f_X(x) = (1/(σ √(2π))) e^{−(x−μ)²/(2σ²)}

Where: σ > 0 and μ ∈ ℝ
Property:

We have:
    ∫_{−∞}^{+∞} (1/(σ √(2π))) e^{−(1/2)((x−μ)/σ)²} dx = 1

Then:
    ∫_{−∞}^{+∞} e^{−(x−μ)²/(2σ²)} dx = σ √(2π)

Examples:

    ∫_{−∞}^{+∞} e^{−(x−3)²/8} dx = 2 √(2π)

    ∫_{−∞}^{+∞} e^{−x²/2} dx = √(2π)
Definition 2: Standard normal distribution

A random variable X has the standard normal distribution with parameters μ = 0 and σ² = 1, denoted by N(0, 1), if its PDF, denoted by φ(x), is defined on ℝ by:

    φ(x) = (1/√(2π)) e^{−x²/2}
Remark:

The probability density function of a normal distribution can always be expressed in terms of the density of the standard normal distribution. This is referred to as the standardization of a normal distribution.

Indeed:
    f_X(x) = (1/σ) φ((x − μ)/σ)

It means that if X ~ N(μ, σ²), the variable Z = (X − μ)/σ ~ N(0, 1).
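The standardization identity is a direct algebraic fact and can be checked pointwise; μ = 10, σ = 2, x = 12.3 below are illustrative values.

```python
import math

# Pointwise check of f_X(x) = (1/σ) φ((x − μ)/σ);
# μ = 10, σ = 2, x = 12.3 are illustrative values.
mu, sigma, x = 10.0, 2.0, 12.3

def phi(z):
    # standard normal density
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

f = math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
assert math.isclose(f, phi((x - mu) / sigma) / sigma)
```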
4.2. Cumulative distribution function

If a random variable X has the normal distribution with parameters μ and σ², denoted by N(μ, σ²), its CDF is defined by:

    F_X(x) = ∫_{−∞}^{x} (1/(σ √(2π))) e^{−(t−μ)²/(2σ²)} dt

If a random variable X has the standard normal distribution N(0, 1), its CDF, denoted by Φ(x), is defined by:

    Φ(x) = ∫_{−∞}^{x} φ(t) dt = ∫_{−∞}^{x} (1/√(2π)) e^{−t²/2} dt

However, these functions do not have an analytical expression.
4.3. Probability calculations

To calculate the cumulative probabilities of a general normal distribution N(μ, σ²), it is always possible to express them in terms of those of the standard normal distribution N(0, 1), which are provided by the statistical table of the standard normal distribution, as follows:

If X ~ N(μ, σ²), the variable Z = (X − μ)/σ ~ N(0, 1), and:

    F_X(x₀) = P(X < x₀) = P((X − μ)/σ ≤ (x₀ − μ)/σ) = P(Z ≤ (x₀ − μ)/σ) = Φ((x₀ − μ)/σ)
Example
X ~ N(μ = 10, σ² = 4)

    F_X(12) = P(X < 12) = P((X − 10)/2 ≤ (12 − 10)/2) = P(Z ≤ 1) = 0.8413
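The table value can be reproduced without a table through the error-function representation Φ(z) = (1 + erf(z/√2))/2:

```python
import math

# Recomputing the example via Φ(z) = (1 + erf(z/√2))/2:
# F_X(12) for X ~ N(10, 4) is Φ((12 − 10)/2) = Φ(1) ≈ 0.8413.
def Phi(z):
    return (1 + math.erf(z / math.sqrt(2))) / 2

val = Phi((12 - 10) / 2)
assert abs(val - 0.8413) < 1e-4
```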
Remark:

It can be observed that the values z listed in the statistical table are all positive.

Due to the symmetry of the standard normal distribution, we have:

    Φ(−z) = 1 − Φ(z)
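The symmetry relation follows from the oddness of erf in the representation Φ(z) = (1 + erf(z/√2))/2, and can be checked for a few sample values:

```python
import math

# Checking Φ(−z) = 1 − Φ(z) through the erf representation of the
# standard normal CDF, for a few sample z values.
def Phi(z):
    return (1 + math.erf(z / math.sqrt(2))) / 2

for z in (0.5, 1.0, 2.33):
    assert math.isclose(Phi(-z), 1 - Phi(z))
```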
4.4. Expectation and variance

Expectation

The mean (expectation) of a continuous random variable X that follows a normal distribution with parameters μ and σ² is:

    E(X) = μ

The mean (expectation) of a continuous random variable Z that follows a standard normal distribution (of parameters 0 and 1) is:

    E(Z) = E((X − μ)/σ) = (1/σ)(E(X) − E(μ)) = (1/σ)(μ − μ) = 0
Variance

The variance of a continuous random variable X that follows a normal distribution with parameters μ and σ² is:

    V(X) = σ²

The variance of a continuous random variable Z that follows a standard normal distribution (of parameters 0 and 1) is:

    V(Z) = V((X − μ)/σ) = (1/σ²) V(X) = (1/σ²) σ² = 1
4.5. Mode and median

Mode

    Mo = μ

Proof:
    f_X(x) = (1/(σ √(2π))) e^{−(x−μ)²/(2σ²)}

    f′_X(x) = (1/(σ √(2π))) × (−(x − μ)/σ²) e^{−(x−μ)²/(2σ²)}

    f′_X(x) = −((x − μ)/(σ³ √(2π))) e^{−(x−μ)²/(2σ²)}

    f′_X(x) = 0 ⇒ x − μ = 0 ⇒ x = μ = Mo
Median

    Me = μ

Proof:
    P(X ≤ Me) = 1/2 ⇒ P((X − μ)/σ ≤ (Me − μ)/σ) = 1/2

    ⇒ P(Z ≤ (Me − μ)/σ) = 1/2 ⇒ Φ((Me − μ)/σ) = 1/2

Using the table of the standard normal distribution, we find that:

    Φ(0) = 1/2

    ⇒ (Me − μ)/σ = 0 ⇒ Me = μ
