Independence of Events, Sequence of Events, Classes of Events, and Random Variables
Md Aktar Ul Karim
Probability - Theory and Applications
Symbiosis Statistical Institute, Pune
Contents
1 Independence of Two Events
1.1 Examples
2 Independence of n Events
2.1 Examples
3 Independence of a Sequence of Events
3.1 Examples
4 Independence of Classes of Events
4.1 Examples
5 Independence of Random Variables
5.3 Examples
5.4 Properties
6 Exercises
1. Independence of Two Events
The concept of independence is fundamental in probability theory. Two events are said to be independent if the occurrence of one does not affect the probability of the other.
Two events A and B are independent if:
P (A ∩ B) = P (A)P (B)
This means that knowing the outcome of one event gives no information about the outcome of the other.
1.1. Examples
1. Let A and B represent two events from rolling a die:
A: Rolling an even number {2, 4, 6}
B: Rolling a number less than 4 {1, 2, 3}
Find P (A ∩ B) and check for independence.
Sol.
P(A) = 3/6 = 0.5,  P(B) = 3/6 = 0.5
A ∩ B corresponds to the event of rolling {2}. Therefore,
P(A ∩ B) = 1/6
Now we check whether P(A ∩ B) = P(A)P(B):
P(A)P(B) = 0.5 × 0.5 = 0.25 ≠ 1/6
Since P(A ∩ B) ≠ P(A)P(B), the events A and B are not independent.
2. A card is drawn from a standard deck of 52 cards. Let A be the event of drawing a red card and B the event of drawing a face card. Check whether A and B are independent.
Sol.:
The probability of drawing a red card is P(A) = 26/52 = 0.5. The probability of drawing a face card is P(B) = 12/52 = 3/13. Now find P(A ∩ B), the probability of drawing a red face card:
P(A ∩ B) = 6/52 = 3/26
Check if P (A ∩ B) = P (A)P (B):
P(A)P(B) = 0.5 × 3/13 = 3/26
Since P (A ∩ B) = P (A)P (B), the events are independent.
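Both checks can be reproduced by exact enumeration over the finite sample spaces. The sketch below is not part of the original notes; the helper names (prob, is_independent) and the card encoding are illustrative choices, and exact rational arithmetic avoids floating-point comparison issues.

```python
from fractions import Fraction

def prob(event, space):
    """Probability of an event (a subset) in a finite space of equally likely outcomes."""
    return Fraction(len(event & space), len(space))

def is_independent(A, B, space):
    """Check P(A ∩ B) == P(A) * P(B) exactly."""
    return prob(A & B, space) == prob(A, space) * prob(B, space)

# Example 1: rolling a die
die = set(range(1, 7))
A = {2, 4, 6}   # even number
B = {1, 2, 3}   # number less than 4
print(is_independent(A, B, die))        # False: 1/6 != 1/4

# Example 2: drawing a card, encoded as (rank, suit) pairs
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = {(r, s) for r in ranks for s in suits}
red = {(r, s) for (r, s) in deck if s in ("hearts", "diamonds")}
face = {(r, s) for (r, s) in deck if r in ("J", "Q", "K")}
print(is_independent(red, face, deck))  # True: 3/26 == 1/2 * 3/13
```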
2. Independence of n Events
Independence of events can be generalized when we deal with more than 2 events.
A sequence of events A1 , A2 , . . . , An is said to be independent if for any subset I ⊆ {1, 2, . . . , n}, we have:
P(⋂_{i∈I} Ai) = ∏_{i∈I} P(Ai)
This means that any group of events from the sequence is independent.
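To make the subset condition concrete, the following hypothetical checker (function names and sample-space encoding are assumptions, not from the notes) tests the product rule over every subset of size at least two:

```python
from fractions import Fraction
from itertools import combinations

def prob(event, space):
    return Fraction(len(event & space), len(space))

def mutually_independent(events, space):
    """Test the product rule for every subset of the events with >= 2 members."""
    for k in range(2, len(events) + 1):
        for subset in combinations(events, k):
            inter = space
            product = Fraction(1)
            for ev in subset:
                inter = inter & ev
                product *= prob(ev, space)
            if prob(inter, space) != product:
                return False
    return True

# Two events on two fair coin tosses, encoded as (first, second)
space = {("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")}
A1 = {("H", "H"), ("H", "T")}   # first toss is heads
A2 = {("H", "H"), ("T", "H")}   # second toss is heads
print(mutually_independent([A1, A2], space))   # True
```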
2.1. Examples
1. Let A1 , A2 , A3 be events such that P (A1 ) = 0.3, P (A2 ) = 0.7, and P (A3 ) = 0.5. If the events are
independent, find P (A1 ∩ A2 ∩ A3 ).
Sol.: By the definition of independence for three events,
P(A1 ∩ A2 ∩ A3) = P(A1)P(A2)P(A3) = 0.3 × 0.7 × 0.5 = 0.105
3. Independence of a Sequence of Events
An infinite sequence of events A1, A2, A3, ... is independent if for every finite subset {Ai1, Ai2, ..., Aik}, the events Ai1, Ai2, ..., Aik are independent.
3.1. Examples
1. Consider an infinite sequence of coin flips. Each flip corresponds to an event Ai , where Ai = {flip is heads}.
Each flip is independent of the others. Thus, for any finite collection of flips, the probability of all flips
resulting in heads is:
P(⋂_{i=1}^{k} Ai) = ∏_{i=1}^{k} P(Ai) = (1/2)^k
2. Consider 5 independent rolls of a die. What is the probability that all five rolls result in a 6?
Sol. The probability of rolling a 6 on a single roll is 1/6. Let Ai denote the event that the i-th roll is a 6. For 5 independent rolls, the probability of getting a 6 on all rolls is:
P(⋂_{i=1}^{5} Ai) = (1/6)^5 = 1/7776
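As a sanity check, a rough Monte Carlo sketch (trial count and seed are arbitrary) should land near 1/7776 ≈ 0.000129:

```python
import random

random.seed(1)
trials = 2_000_000
hits = 0
for _ in range(trials):
    # Count the trials in which all five rolls come up 6.
    if all(random.randint(1, 6) == 6 for _ in range(5)):
        hits += 1

print(hits / trials)   # close to 1/7776 ≈ 0.000129
```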
4. Independence of Classes of Events
A class of events A is independent if for any finite collection of events A1, A2, ..., An from A, we have:
P(⋂_{i=1}^{n} Ai) = ∏_{i=1}^{n} P(Ai)
4.1. Examples
1. Let A1 be the class of events related to rolling a die and A2 the class of events related to drawing a card from a deck. The two classes are independent if:
P(A ∩ B) = P(A)P(B) for every event A ∈ A1 and every event B ∈ A2
2. Let A1 be the class of events related to flipping a coin and A2 the class of events related to rolling a die.
Verify the independence of the classes by calculating P (A1 ∩ A2 ).
Sol. For A1 being the event of heads on the coin and A2 being the event of rolling a 6 on the die:
P(A1) = 0.5,  P(A2) = 1/6,  P(A1 ∩ A2) = 1/12
Since P (A1 ∩ A2 ) = P (A1 )P (A2 ), the classes are independent.
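The numbers above can be verified by enumerating the product sample space of the combined coin-and-die experiment; the encoding below is an illustrative sketch, not from the notes.

```python
from fractions import Fraction

# Product sample space: (coin outcome, die outcome), 12 equally likely pairs
space = {(c, d) for c in ("H", "T") for d in range(1, 7)}

A1 = {(c, d) for (c, d) in space if c == "H"}   # heads on the coin
A2 = {(c, d) for (c, d) in space if d == 6}     # six on the die

p = lambda ev: Fraction(len(ev), len(space))
print(p(A1 & A2), p(A1) * p(A2))   # 1/12  1/12
```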
5. Independence of Random Variables
Two random variables X and Y are independent if for any measurable sets A and B,
P (X ∈ A, Y ∈ B) = P (X ∈ A) × P (Y ∈ B)
This means the occurrence of X ∈ A does not affect the probability of Y ∈ B, and vice versa.
Independence implies that knowing the value of one variable provides no information about the other.
Example: The outcome of one die roll does not influence the outcome of another, independent roll.
For discrete random variables, independence is equivalent to factorization of the joint probability mass function:
P(X = x, Y = y) = P(X = x) · P(Y = y) for all x, y (discrete case)
• Marginal independence: X and Y are independent without considering other variables.
• Conditional independence: X and Y are independent given another variable Z, i.e., P (X, Y |Z) =
P (X|Z)P (Y |Z).
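To see why the two notions differ, consider a hypothetical mixture model (the names and numbers below are chosen purely for illustration): Z is a fair coin and, given Z, X and Y are i.i.d. Bernoulli with a Z-dependent success probability. Conditionally on Z the joint law factors by construction, yet marginally X and Y are dependent:

```python
# Z ~ Bernoulli(1/2); given Z = z, X and Y are i.i.d. Bernoulli(p_z)
p_given_z = {0: 0.1, 1: 0.9}

# Marginals obtained by summing over Z
p_x1 = sum(0.5 * p_given_z[z] for z in (0, 1))            # P(X = 1) = 0.5
p_joint = sum(0.5 * p_given_z[z] ** 2 for z in (0, 1))    # P(X = 1, Y = 1)

print(p_joint)       # 0.41
print(p_x1 * p_x1)   # 0.25 -> X and Y are marginally dependent
# Given Z = z, P(X = 1, Y = 1 | Z = z) = p_z * p_z, so they are
# conditionally independent by construction.
```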
5.3. Examples
1. Let X1 and X2 represent the outcome of two independent coin tosses. The joint probability distribution
is:
P(X1 = H, X2 = H) = P(X1 = H)P(X2 = H) = 1/2 × 1/2 = 1/4
2. Suppose X and Y are independent uniform random variables on [0, 1]. Then the joint density function is:
f(x, y) = fX(x) fY(y) = 1 for (x, y) ∈ [0, 1] × [0, 1], and 0 otherwise.
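Under independence, rectangle probabilities factor into products of interval lengths. A quick simulation sketch (the rectangle, seed, and trial count are arbitrary choices) checks one instance:

```python
import random

random.seed(0)
n = 1_000_000
hits = 0
for _ in range(n):
    x, y = random.random(), random.random()   # independent uniforms on [0, 1]
    if 0.2 <= x <= 0.5 and 0.4 <= y <= 0.9:
        hits += 1
print(hits / n, 0.3 * 0.5)   # estimate vs exact product of side lengths, 0.15
```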
5.4. Properties
• Pairwise Independence:
Random variables X1, X2, ..., Xn are pairwise independent if Xi and Xj are independent for all i ≠ j.
This does not necessarily imply mutual independence.
• Mutual Independence:
Random variables are mutually independent if for every subset S ⊆ {X1, X2, ..., Xn}, the joint probability distribution factors as:
P(Xi ∈ Ai for all Xi ∈ S) = ∏_{Xi∈S} P(Xi ∈ Ai) for all measurable sets Ai
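The classical counterexample behind this distinction lives on two fair coin tosses. The sketch below (event encoding assumed for illustration) exhibits three events that pass every pairwise product check but fail the triple one; Exercise 1 below analyzes exactly this situation.

```python
from fractions import Fraction

space = {("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")}
p = lambda ev: Fraction(len(ev), len(space))

A1 = {s for s in space if s[0] == "H"}    # first toss is heads
A2 = {s for s in space if s[1] == "H"}    # second toss is heads
A3 = {s for s in space if s[0] == s[1]}   # both tosses agree

print(p(A1 & A2) == p(A1) * p(A2))   # True
print(p(A1 & A3) == p(A1) * p(A3))   # True
print(p(A2 & A3) == p(A2) * p(A3))   # True
print(p(A1 & A2 & A3) == p(A1) * p(A2) * p(A3))   # False: 1/4 != 1/8
```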
6. Exercises
1. Prove or disprove: The events A1, A2, A3 are pairwise independent but not mutually independent.
Sol. We are given three events A1, A2, and A3 with the following probabilities:
P(A1) = P(A2) = P(A3) = 0.5,
P(A1 ∩ A2) = P(A2 ∩ A3) = P(A1 ∩ A3) = 0.25,
P(A1 ∩ A2 ∩ A3) = 0.25
We need to determine whether A1, A2, and A3 are pairwise independent but not mutually independent.
Check Pairwise Independence: Two events A and B are pairwise independent if:
P (A ∩ B) = P (A)P (B)
Checking A1 and A2: P(A1 ∩ A2) = 0.25 = 0.5 × 0.5 = P(A1)P(A2)
Checking A2 and A3: P(A2 ∩ A3) = 0.25 = 0.5 × 0.5 = P(A2)P(A3)
Checking A1 and A3: P(A1 ∩ A3) = 0.25 = 0.5 × 0.5 = P(A1)P(A3)
In each case P(Ai ∩ Aj) = P(Ai)P(Aj), so the events A1, A2, and A3 are pairwise independent.
Check Mutual Independence: Three events A1, A2, and A3 are mutually independent if, in addition to pairwise independence,
P(A1 ∩ A2 ∩ A3) = P(A1)P(A2)P(A3)
We already know that:
P (A1 )P (A2 )P (A3 ) = 0.5 × 0.5 × 0.5 = 0.125
However, the given triple intersection is:
P(A1 ∩ A2 ∩ A3) = 0.25
Thus P(A1 ∩ A2 ∩ A3) = 0.25 ≠ 0.125 = P(A1)P(A2)P(A3). Hence, the events A1, A2, and A3 are pairwise independent but not mutually independent.
2. Let X and Y be independent random variables with expectations E[X] = 2 and E[Y] = 3. Find E[XY].
Sol. By independence, E[XY] = E[X]E[Y] = 2 × 3 = 6.
3. Let X and Y be independent random variables. Prove that f (X) and g(Y ) are independent for any
measurable functions f and g.
Sol.
We need to prove that f(X) and g(Y) are independent, i.e., for any measurable sets A and B,
P(f(X) ∈ A, g(Y) ∈ B) = P(f(X) ∈ A)P(g(Y) ∈ B)
Since f and g are measurable, the preimages f⁻¹(A) and g⁻¹(B) are measurable sets, and {f(X) ∈ A} = {X ∈ f⁻¹(A)}, {g(Y) ∈ B} = {Y ∈ g⁻¹(B)}. By the independence of X and Y,
P(f(X) ∈ A, g(Y) ∈ B) = P(X ∈ f⁻¹(A), Y ∈ g⁻¹(B)) = P(X ∈ f⁻¹(A)) P(Y ∈ g⁻¹(B)) = P(f(X) ∈ A)P(g(Y) ∈ B)
Thus, f(X) and g(Y) are independent.
4. Let X and Y be independent random variables. Prove that their covariance is zero, i.e., Cov(X, Y ) = 0.
Sol.
The covariance of two random variables X and Y is given by:
Cov(X, Y) = E[XY] − E[X]E[Y]
Since X and Y are independent, E[XY] = E[X]E[Y], so
Cov(X, Y) = E[X]E[Y] − E[X]E[Y] = 0
Thus, Cov(X, Y) = 0, meaning that independent random variables have zero covariance.
5. Let X1 ∼ Poisson(λ1) and X2 ∼ Poisson(λ2) be independent Poisson random variables. Find the distribution of X1 + X2.
Sol.
The moment generating function (MGF) of a Poisson random variable X ∼ Poisson(λ) is given by:
M_X(t) = exp(λ(e^t − 1))
For independent random variables X1 and X2 , the MGF of the sum is the product of the individual
MGFs:
M_{X1+X2}(t) = M_{X1}(t) · M_{X2}(t) = exp(λ1(e^t − 1)) · exp(λ2(e^t − 1)) = exp((λ1 + λ2)(e^t − 1))
This is the MGF of a Poisson(λ1 + λ2) random variable, so by the uniqueness of MGFs:
X1 + X2 ∼ Poisson(λ1 + λ2)
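If NumPy is available, a quick simulation corroborates the conclusion: a Poisson distribution has equal mean and variance, so both sample statistics of X1 + X2 should be close to λ1 + λ2. The parameter values and sample size below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2 = 2.0, 3.5
n = 1_000_000

s = rng.poisson(lam1, n) + rng.poisson(lam2, n)   # samples of X1 + X2
print(s.mean(), s.var())   # both close to lam1 + lam2 = 5.5
```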
6. Let X1, X2 be independent random variables, both uniformly distributed on [0, 1]. Find the joint probability density function (pdf) of X1 and X2.
Sol. Since X1 and X2 are independent and each is uniformly distributed on [0, 1], their individual pdfs
are:
fX1(x1) = 1 for 0 ≤ x1 ≤ 1,  fX2(x2) = 1 for 0 ≤ x2 ≤ 1
The joint pdf of independent random variables is the product of their individual pdfs:
f(x1, x2) = fX1(x1) · fX2(x2) = 1 for (x1, x2) ∈ [0, 1] × [0, 1]
Therefore, the joint distribution of X1 and X2 is uniform over the unit square [0, 1] × [0, 1].
7. Let X ∼ Exp(λ1 ) and Y ∼ Exp(λ2 ) be independent exponential random variables. Compute E[XY ].
Sol.
Since X and Y are independent, we can write:
E[XY ] = E[X]E[Y ]
The mean of an exponential random variable with rate λ is 1/λ, so E[X] = 1/λ1 and E[Y] = 1/λ2. Thus:
E[XY] = (1/λ1) × (1/λ2) = 1/(λ1 λ2)
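A short standard-library simulation (rates, seed, and sample size chosen arbitrarily) checks this value empirically:

```python
import random

random.seed(0)
lam1, lam2 = 2.0, 5.0
n = 1_000_000

# random.expovariate(lambd) draws from Exp(lambd); X and Y are drawn independently
est = sum(random.expovariate(lam1) * random.expovariate(lam2) for _ in range(n)) / n
print(est, 1 / (lam1 * lam2))   # estimate vs exact value 0.1
```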
8. Let X1 , X2 , . . . , Xn be independent random variables uniformly distributed on the interval [0, 1]. Define
the event Ai = {Xi > 0.5} for each i = 1, . . . , n.
(a) Find the probability that at least one of the events Ai occurs.
Sol.
We are given independent random variables X1, X2, ..., Xn, uniformly distributed on the interval [0, 1]. For each i = 1, ..., n, the event Ai is defined as:
Ai = {Xi > 0.5}
(a) Since each Xi is uniformly distributed on [0, 1], the probability that Xi > 0.5 is the length of the
interval (0.5, 1], which is:
P (Ai ) = P (Xi > 0.5) = 1 − 0.5 = 0.5
We have to find the probability that at least one of the events Ai occurs.
To find this probability, we use the complement rule.
The complement of "at least one Ai occurs" is the event that none of the Ai occur, i.e., ⋂_{i=1}^{n} Ai^c.
Since the events Ai^c = {Xi ≤ 0.5} are independent, we can compute:
P(⋂_{i=1}^{n} Ai^c) = ∏_{i=1}^{n} P(Ai^c)
Each P(Ai^c) = P(Xi ≤ 0.5) = 1 − P(Xi > 0.5) = 1 − 0.5 = 0.5, so:
P(⋂_{i=1}^{n} Ai^c) = 0.5^n
Therefore, the probability that at least one of the events Ai occurs is:
P(⋃_{i=1}^{n} Ai) = 1 − 0.5^n
(b) Now we want the probability that at least one of the events Ai occurs as the number of variables goes to infinity. The complement of "at least one Ai occurs" is again the event that none of the Ai occur, which means Xi ≤ 0.5 for all i. By independence,
P(⋂_{i=1}^{∞} Ai^c) = lim_{n→∞} 0.5^n = 0
Therefore, the probability that at least one of the events Ai occurs is:
P(⋃_{i=1}^{∞} Ai) = 1 − P(⋂_{i=1}^{∞} Ai^c) = 1 − 0 = 1
Hence, for n independent random variables X1, X2, ..., Xn, the probability that at least one of the events Ai occurs is:
P(at least one Ai occurs) = 1 − 0.5^n
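The closed form is easy to sanity-check by simulation; here is a rough sketch for n = 4 (seed and trial count arbitrary):

```python
import random

random.seed(0)
n, trials = 4, 200_000
hits = sum(1 for _ in range(trials)
           if any(random.random() > 0.5 for _ in range(n)))
print(hits / trials, 1 - 0.5 ** n)   # estimate vs exact value 0.9375
```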
For infinitely many independent random variables X1, X2, ..., the probability that at least one of the events Ai occurs is:
P(⋃_{i=1}^{∞} Ai) = 1