Chapter 3 & 4
$$X : \Omega \longrightarrow X(\Omega) \subset \mathbb{R}, \qquad \omega \longmapsto X(\omega)$$
Definition 3.3 (Domain of a random variable). The set $X(\Omega)$ of possible values (realizations) that the random variable X can take is called the domain of X.
The probability distribution of a discrete random variable X can be presented in a table:

x_i | x_1   x_2   ...   x_k
p_i | p_1   p_2   ...   p_k

where
$$0 \leq p_i \leq 1 \quad \text{and} \quad \sum_{i=1}^{k} p_i = 1.$$
The cumulative distribution function of X is the function
$$F_X : \mathbb{R} \longrightarrow [0, 1], \qquad x \longmapsto F_X(x) = P(X \leq x) = \sum_{x_i \leq x} P(X = x_i),$$
which yields
$$F_X(x) = \begin{cases}
0 & \text{if } x < x_1; \\
p_1 & \text{if } x_1 \leq x < x_2; \\
p_1 + p_2 & \text{if } x_2 \leq x < x_3; \\
\quad \vdots \\
p_1 + p_2 + \cdots + p_i & \text{if } x_i \leq x < x_{i+1}; \\
\quad \vdots \\
1 & \text{if } x \geq x_k.
\end{cases}$$
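To make the construction concrete, here is a minimal Python sketch that builds $F_X$ from a distribution table (the helper name `make_cdf` and the example table are illustrative choices, not taken from the course):

```python
# Sketch: build the CDF F_X of a discrete random variable from its table.
def make_cdf(xs, ps):
    """Return F_X with F_X(x) = P(X <= x), for support points xs
    (sorted) and probabilities ps."""
    assert abs(sum(ps) - 1.0) < 1e-9, "probabilities must sum to 1"

    def F(x):
        # Sum the point masses of all support points not exceeding x.
        return sum(p for xi, p in zip(xs, ps) if xi <= x)

    return F

F = make_cdf([0, 1, 2], [0.1, 0.6, 0.3])   # illustrative table
print(F(-1), F(0), F(1.5), F(2))           # -> 0 0.1 0.7 1.0 (up to rounding)
```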
Remark 3.1. An analogy can be made with descriptive statistics: we recover the notion of cumulative frequencies, $f_i^{\text{cum}} = f_1 + f_2 + \cdots + f_i$.
In particular, the cumulative distribution function satisfies $0 \leq F_X(x) \leq 1$ for all $x \in \mathbb{R}$.
Furthermore, and in general, we can show that for any discrete random variable X and
a, b ∈ ℝ, we have:
• $P(a < X \leq b) = P(X \leq b) - P(X \leq a) = F_X(b) - F_X(a)$;
• $P(a \leq X \leq b) = P(X \leq b) - P(X < a) = F_X(b) - F_X(a) + P(X = a)$;
• $P(a \leq X < b) = P(X < b) - P(X < a) = F_X(b) - F_X(a) - P(X = b) + P(X = a)$;
• $P(a < X < b) = P(X < b) - P(X \leq a) = F_X(b) - F_X(a) - P(X = b)$.
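These four identities are easy to verify numerically; the following sketch checks them on an arbitrary illustrative table (all names here are hypothetical):

```python
# Numerical check of the interval-probability identities for a discrete X.
xs, ps = [0, 1, 2, 3], [0.2, 0.3, 0.4, 0.1]   # illustrative distribution

def P(pred):                      # P(pred(X)) for the table above
    return sum(p for x, p in zip(xs, ps) if pred(x))

def F(t):                         # F_X(t) = P(X <= t)
    return P(lambda x: x <= t)

a, b = 1, 3
pm = dict(zip(xs, ps))            # point masses P(X = x)
assert abs(P(lambda x: a < x <= b)  - (F(b) - F(a)))                 < 1e-12
assert abs(P(lambda x: a <= x <= b) - (F(b) - F(a) + pm[a]))         < 1e-12
assert abs(P(lambda x: a <= x < b)  - (F(b) - F(a) - pm[b] + pm[a])) < 1e-12
assert abs(P(lambda x: a < x < b)   - (F(b) - F(a) - pm[b]))         < 1e-12
print("all four identities hold")
```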
Remark 3.2.
• The expectation E(X) is the (theoretical) mean of the random variable X;
• An analogy can be made with descriptive statistics: $\bar{X} = \sum_{i=1}^{k} x_i f_i$ is the
observed mean of X, calculated on a sample.
Properties 3.2. For any random variables X and Y and any real numbers a and b, we
have
• E(a) = a ;
• E(aX) = aE(X) ;
• E(aX + b) = aE(X) + b ;
• E(aX + bY ) = aE(X) + bE(Y ).
Definition 3.9 (Standard Deviation). The standard deviation of the random variable X,
denoted σ_X (or σ(X)), is given by
$$\sigma_X = \sqrt{V(X)},$$
where $V(X) = E(X^2) - E(X)^2$ and
$$E(X^2) = \sum_{i=1}^{k} x_i^2 \, P(X = x_i).$$
Properties 3.3. For any random variables X and Y and any real numbers a and b, we
have
• $V(a) = 0$;
• $V(aX) = a^2 V(X)$;
• $V(aX + b) = a^2 V(X) \implies \sigma(aX + b) = |a| \, \sigma(X)$;
• $V(aX + bY) = a^2 V(X) + b^2 V(Y)$, if X and Y are independent.
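As a quick sanity check, the following sketch confirms the affine-transformation rules for expectation and variance on a small illustrative table:

```python
# Check E(aX + b) = aE(X) + b and V(aX + b) = a^2 V(X) on a small table.
xs, ps = [0, 1, 2], [0.1, 0.6, 0.3]   # illustrative distribution

def E(values):                        # expectation of a transformed variable
    return sum(v * p for v, p in zip(values, ps))

def V(values):                        # variance via E(X^2) - E(X)^2
    return E([v * v for v in values]) - E(values) ** 2

a, b = 5, -3
Y = [a * x + b for x in xs]           # realizations of Y = aX + b
assert abs(E(Y) - (a * E(xs) + b)) < 1e-12
assert abs(V(Y) - a ** 2 * V(xs)) < 1e-12
print(E(xs), V(xs), E(Y), V(Y))       # -> 1.2 0.36 3.0 9.0 (up to rounding)
```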
Definition 3.11 (Independence of two real random variables). Two real random variables
X and Y are said to be independent if and only if the probability distribution function of
the pair (X, Y) is equal to the product of the probability distribution functions of the
marginal variables X and Y.
Definition 3.12. Two discrete random variables X and Y are said to be independent if
and only if
$$P(X = x_i, Y = y_j) = P(X = x_i) \, P(Y = y_j), \quad \text{for all } x_i \in X(\Omega), \; y_j \in Y(\Omega).$$
3.5 Example
A random experiment consists of drawing (simultaneously) 2 balls from an urn containing
2 red balls (R) and 3 white balls (W). Let X be the random variable that associates with
each draw the number of white balls obtained.
1. What is the number of possible draws?
2. Determine the probability distribution function of X.
3. Calculate the cumulative distribution function.
4. Calculate the expectation, variance and standard deviation of X.
5. Assume Y = 5X − 3. Calculate E(Y), V(Y), and σ(Y).
Solution:
1. The number of possible draws is $C_5^2 = 10$.
2. For every $k \in \{0, 1, 2\}$,
$$P(X = k) = P(\text{"obtain } 2-k \text{ R balls and } k \text{ W balls"}) = \frac{C_2^{2-k} \, C_3^k}{C_5^2},$$
which gives the probability distribution table:

x_i | 0     1     2
p_i | 1/10  6/10  3/10

3. The cumulative distribution function is
$$F_X(x) = \begin{cases}
0 & \text{if } x < 0; \\
1/10 & \text{if } 0 \leq x < 1; \\
7/10 & \text{if } 1 \leq x < 2; \\
1 & \text{if } x \geq 2.
\end{cases}$$
4. Calculation of the expectation, the variance and the standard deviation of the
random variable X using the statistical table:

x_i       | 0 | 1    | 2     | Total
p_i       | 1/10 | 6/10 | 3/10 | 1
x_i p_i   | 0 | 6/10 | 6/10  | E(X) = 12/10
x_i² p_i  | 0 | 6/10 | 12/10 | E(X²) = 18/10

(a) $E(X) = \sum_{i=1}^{3} x_i p_i = \frac{12}{10} = 1.2$
(b) $V(X) = E(X^2) - E(X)^2 = \sum_{i=1}^{3} x_i^2 p_i - E(X)^2 = \frac{18}{10} - 1.2^2 = 0.36$
(c) $\sigma(X) = \sqrt{V(X)} = \sqrt{0.36} = 0.6$
5. Calculation of the expectation, the variance and the standard deviation of the
random variable Y = 5X − 3:
(a) $E(Y) = E(5X - 3) = 5E(X) - 3 = 5 \times 1.2 - 3 = 3$
(b) $V(Y) = V(5X - 3) = 5^2 V(X) = 25 \times 0.36 = 9$
(c) $\sigma(Y) = \sqrt{9} = 3$
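As a cross-check of the whole solution, a short enumeration of all $C_5^2 = 10$ equally likely draws (standard library only, written for this example) reproduces every number above:

```python
# Brute-force enumeration of the 10 possible draws of 2 balls
# from the urn {R, R, W, W, W}.
from itertools import combinations
from math import sqrt

urn = ["R", "R", "W", "W", "W"]
draws = list(combinations(range(5), 2))        # C_5^2 = 10 draws
counts = [sum(urn[i] == "W" for i in d) for d in draws]

p = {k: counts.count(k) / len(draws) for k in (0, 1, 2)}
print(p)                                        # {0: 0.1, 1: 0.6, 2: 0.3}

EX = sum(k * pk for k, pk in p.items())         # expectation
VX = sum(k * k * pk for k, pk in p.items()) - EX ** 2
print(EX, VX, sqrt(VX))                         # -> 1.2 0.36 0.6
print(5 * EX - 3, 25 * VX)                      # E(Y) = 3.0, V(Y) = 9.0
```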
Chapter 4
In the previous chapter, we saw that a random variable X defined on a domain X(Ω)
is associated with a function P(X = x), called the probability distribution function of
X. The probability distributions encountered in practice are extremely numerous, but
some of them have received particular attention because of their usefulness in practical
applications, and have therefore been given special names.
A random variable X is said to follow a discrete uniform distribution if:
1. $X(\Omega) = \{x_1, x_2, \ldots, x_k\}$;
2. $P(X = x_i) = \dfrac{1}{|X(\Omega)|}$ for $i \in \{1, 2, \ldots, k\}$.
In this case, we write
$$X \sim U_{\{x_1, x_2, \ldots, x_k\}}.$$
Properties 4.1. The random variable $X \sim U_{\{x_1, x_2, \ldots, x_k\}}$ has an expectation and a variance
which are given by
$$E(X) = \frac{\sum_{i=1}^{k} x_i}{|X(\Omega)|}, \quad \text{and} \quad V(X) = \frac{\sum_{i=1}^{k} x_i^2}{|X(\Omega)|} - E(X)^2.$$
Remark 4.1. In the special case of a random variable X with a discrete uniform
distribution on the set {1, . . . , n}, we write:
$$X \sim U_{\{1, 2, \ldots, n\}}.$$
Furthermore, we have:
$$E(X) = \frac{n+1}{2}, \quad \text{and} \quad V(X) = \frac{n^2 - 1}{12}.$$
Example 4.1. Consider the experiment of rolling a fair six-sided die. Let X be the random
variable corresponding to the number shown on the revealed face. We have
$$X(\Omega) = \{1, 2, 3, 4, 5, 6\} \quad \text{and} \quad P(X = i) = \frac{1}{6} \text{ for } i \in X(\Omega).$$
Moreover, we have:
$$E(X) = \frac{7}{2}, \quad \text{and} \quad V(X) = \frac{35}{12}.$$
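A quick simulation illustrates these values empirically (sample size and seed are arbitrary choices for this sketch):

```python
# Empirical check of E(X) = 7/2 and V(X) = 35/12 for a fair die.
import random

random.seed(0)
rolls = [random.randint(1, 6) for _ in range(200_000)]
mean = sum(rolls) / len(rolls)
var = sum((r - mean) ** 2 for r in rolls) / len(rolls)
print(mean, 7 / 2)     # empirical mean     vs. 3.5
print(var, 35 / 12)    # empirical variance vs. 2.9166...
```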
Properties 4.2. The random variable X ∼ B(p) has an expectation and a variance which
are given by
E(X) = p, and V (X) = p(1 − p).
Example 4.2. In a fair coin toss, there are two distinct outcomes, "Heads" and "Tails".
We can represent "Heads" as 0 and "Tails" as 1. We have
$$P(X = 0) = P(\text{"Heads"}) = \frac{1}{2} \quad \text{and} \quad P(X = 1) = P(\text{"Tails"}) = \frac{1}{2}.$$
Properties 4.3. The random variable X ∼ B(n, p) has an expectation and a variance
which are given by
E(X) = np, and V (X) = np(1 − p).
Remark 4.2.
• The Bernoulli distribution is the particular binomial distribution for which n = 1.
• X can be seen as the sum of n independent random variables Y_1, Y_2, . . . , Y_n, each
following a Bernoulli distribution with parameter p:
$$X = \sum_{i=1}^{n} Y_i, \quad \text{where } Y_i \sim B(p) \text{ for } i = 1, 2, \ldots, n.$$
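The following sketch illustrates this remark empirically: summing 10 independent Bernoulli(1/2) variables reproduces the Binomial(10, 1/2) frequencies (parameters and sample size are arbitrary choices):

```python
# Summing n independent Bernoulli(p) variables gives Binomial(n, p).
import random
from math import comb

random.seed(1)
n, p, trials = 10, 0.5, 100_000
# Each sample is a sum of n Bernoulli(p) indicator variables.
samples = [sum(random.random() < p for _ in range(n)) for _ in range(trials)]

for k in range(n + 1):
    empirical = samples.count(k) / trials
    exact = comb(n, k) * p**k * (1 - p) ** (n - k)
    print(k, round(empirical, 4), round(exact, 4))
```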
Example 4.3. A fair coin is tossed 10 times in succession and the results are recorded.
Let X be the random variable that counts the number of Heads obtained in the sequence
of 10 tosses. Then
$$X \sim B\left(10, \frac{1}{2}\right).$$
Example 4.4. 8 balls are drawn successively (with replacement) from an urn containing
3 white balls and 7 black balls. Let X be the random variable that counts the number of
white balls drawn. Then
$$X \sim B\left(8, \frac{3}{10}\right).$$
Thus, the probability that, for example, exactly three white balls are drawn is:
$$P(X = 3) = C_8^3 \left(\frac{3}{10}\right)^3 \left(\frac{7}{10}\right)^5 \approx 0.254.$$
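This value can be checked directly with the standard library:

```python
# Exact value of P(X = 3) for X ~ B(8, 3/10), as in the example.
from math import comb

p = 3 / 10
prob = comb(8, 3) * p**3 * (1 - p) ** 5
print(prob)   # 0.25412184, i.e. ~0.254
```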
Description: The geometric distribution models the rank of the first success when a
Bernoulli trial (whose probability of success is p) is repeated identically and independently,
in principle indefinitely.
Properties 4.4. The random variable X ∼ G(p) has an expectation and a variance which
are given by
$$E(X) = \frac{1}{p}, \quad \text{and} \quad V(X) = \frac{1-p}{p^2}.$$
Example 4.5. A fair die is rolled repeatedly until the number "six" is obtained. Let X be
the random variable representing the number of throws required to obtain "six". In this
case, the random variable X follows a geometric distribution with parameter 1/6, and we
write
$$X \sim G\left(\frac{1}{6}\right).$$
The probability that the first "six" occurs on the 10th throw is:
$$P(X = 10) = \frac{1}{6}\left(1 - \frac{1}{6}\right)^{9}.$$
Example 4.6. An urn contains 4 white balls and 6 black balls. We make random draws
with replacement, and let X be the random variable that counts the number of draws
required to obtain a white ball for the first time. We then have:
$$X \sim G\left(\frac{2}{5}\right).$$
The probability that the first white ball is obtained on the 20th draw is:
$$P(X = 20) = \frac{2}{5}\left(1 - \frac{2}{5}\right)^{19}.$$
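Both geometric probabilities can be evaluated with a one-line probability mass function (a sketch; `geom_pmf` is an illustrative helper, not a library function):

```python
# Geometric probabilities P(X = k) = (1 - p)^(k-1) * p for both examples.
def geom_pmf(k, p):
    return (1 - p) ** (k - 1) * p

print(geom_pmf(10, 1 / 6))   # first "six" on the 10th throw,   ~0.0323
print(geom_pmf(20, 2 / 5))   # first white ball on the 20th draw, ~2.4e-05
```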
A random variable X is said to follow a Poisson distribution with parameter λ > 0 if:
1. $X(\Omega) = \mathbb{N}$;
2. $P(X = x) = e^{-\lambda} \dfrac{\lambda^x}{x!}$ for $x \in X(\Omega)$.
In this case, we write
$$X \sim \mathcal{P}(\lambda).$$
Properties 4.5. The random variable X ∼ P(λ) has an expectation and a variance which
are given by
E(X) = V (X) = λ.
The binomial distribution depends on two parameters, n and p. When n becomes large,
calculating its probabilities becomes very tedious. Provided that
$$(n > 30 \text{ and } np < 5) \quad \text{or} \quad (n > 50 \text{ and } p < 0.1),$$
the binomial distribution B(n, p) can be approximated by the Poisson distribution with
parameter λ = np, i.e. $\mathcal{P}(np)$.
Example 4.7. Let X be a random variable having a binomial distribution with parameters
n = 51 and p = 0.09 (B(51, 0.09)). Let Y be a random variable having a Poisson
distribution with parameter λ = np = 4.59 (P(4.59)). The first few probabilities compare
as follows:

k        | 0     | 1     | 2     | 3     | 4     | 5
P(X = k) | 0.008 | 0.041 | 0.102 | 0.164 | 0.195 | 0.181
P(Y = k) | 0.010 | 0.047 | 0.107 | 0.164 | 0.188 | 0.172
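The table can be reproduced with a few lines of standard-library Python (a sketch):

```python
# Reproduce the comparison table: B(51, 0.09) vs. Poisson(4.59).
from math import comb, exp, factorial

n, p = 51, 0.09
lam = n * p   # 4.59

for k in range(6):
    binom = comb(n, k) * p**k * (1 - p) ** (n - k)
    poisson = exp(-lam) * lam**k / factorial(k)
    print(k, round(binom, 3), round(poisson, 3))
```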