
Independence of Events, Sequence of Events, Classes of Events, and Random Variables

Md Aktar Ul Karim
Probability - Theory and Applications
Symbiosis Statistical Institute, Pune

Contents

1 Independence of Two Events
  1.1 Examples
2 Independence of n Events
  2.1 Examples
3 Sequence of Independent Events
  3.1 Examples
4 Independent Classes of Events
  4.1 Examples
5 Independence of Random Variables
  5.1 Independence vs. Uncorrelated
  5.2 Joint Distribution and Factorization
  5.3 Examples
  5.4 Properties
  5.5 Pairwise vs. Mutual Independence
6 Exercises

1. Independence of Two Events

The concept of independence is fundamental in probability theory. Two events are said to be independent
if the occurrence of one does not affect the probability of the other.
Two events A and B are independent if:

P (A ∩ B) = P (A)P (B)

This means that knowing the outcome of one event gives no information about the outcome of the other.
1.1. Examples
1. Let A and B represent two events from rolling a die:
A: Rolling an even number {2, 4, 6}
B: Rolling a number less than 4 {1, 2, 3}
Find P (A ∩ B) and check for independence.

Sol.

P(A) = 3/6 = 0.5,  P(B) = 3/6 = 0.5

A ∩ B corresponds to the event of rolling {2}. Therefore,

P(A ∩ B) = 1/6

Now we check whether P(A ∩ B) = P(A)P(B):

P(A)P(B) = 0.5 × 0.5 = 0.25 ≠ 1/6

Since P(A ∩ B) ≠ P(A)P(B), the events A and B are not independent.
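
This check can also be reproduced by exact enumeration. Below is a minimal Python sketch, assuming a fair die so that all six outcomes are equally likely (the helper `p` is ours, for illustration only):

```python
from fractions import Fraction

omega = set(range(1, 7))                 # sample space of one fair die roll
A = {w for w in omega if w % 2 == 0}     # A: rolling an even number
B = {w for w in omega if w < 4}          # B: rolling a number less than 4

def p(event):
    # probability of an event under equally likely outcomes
    return Fraction(len(event), len(omega))

print(p(A & B))                  # 1/6
print(p(A) * p(B))               # 1/4
print(p(A & B) == p(A) * p(B))   # False: A and B are not independent
```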

2. Let A and B represent the following events from a deck of cards:


A: Drawing a red card, B: Drawing a face card.
Find if A and B are independent events.

Sol.

The probability of drawing a red card is P(A) = 26/52 = 0.5. The probability of drawing a face card is P(B) = 12/52 = 3/13. Now find P(A ∩ B), the probability of drawing a red face card:

P(A ∩ B) = 6/52 = 3/26

Check if P(A ∩ B) = P(A)P(B):

P(A)P(B) = 0.5 × 3/13 = 3/26

Since P(A ∩ B) = P(A)P(B), the events are independent.

2. Independence of n Events

Independence of events can be generalized when we deal with more than two events.
A sequence of events A1, A2, . . . , An is said to be independent if for any subset I ⊆ {1, 2, . . . , n}, we have:

P(⋂_{i∈I} Ai) = ∏_{i∈I} P(Ai)

This means that any group of events from the sequence is independent.
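
As a computational illustration, this subset condition can be checked exhaustively for events over a finite, equally likely sample space. The helper below is a sketch under that assumption; the function name `mutually_independent` is ours, not a standard one:

```python
from fractions import Fraction
from functools import reduce
from itertools import combinations
from math import prod

def mutually_independent(omega, events):
    """Check the product rule on every subcollection of two or more events.

    Assumes `omega` is a finite set of equally likely outcomes and each
    element of `events` is a subset of `omega` given as a Python set.
    """
    def p(e):
        return Fraction(len(e), len(omega))
    for k in range(2, len(events) + 1):
        for group in combinations(events, k):
            intersection = reduce(set.intersection, group)
            if p(intersection) != prod(p(e) for e in group):
                return False
    return True
```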

2.1. Examples

1. Let A1 , A2 , A3 be events such that P (A1 ) = 0.3, P (A2 ) = 0.7, and P (A3 ) = 0.5. If the events are
independent, find P (A1 ∩ A2 ∩ A3 ).
Sol.: By the definition of independence for three events:

P (A1 ∩ A2 ∩ A3 ) = P (A1 )P (A2 )P (A3 ) = 0.3 × 0.7 × 0.5 = 0.105

3. Sequence of Independent Events

A sequence of events A1 , A2 , A3 , . . . is independent if for every finite subset {Ai1 , Ai2 , . . . , Aik }, the events
Ai1 , Ai2 , . . . , Aik are independent.

3.1. Examples

1. Consider an infinite sequence of coin flips. Each flip corresponds to an event Ai , where Ai = {flip is heads}.
Each flip is independent of the others. Thus, for any finite collection of flips, the probability of all flips
resulting in heads is:
P(⋂_{i=1}^k Ai) = ∏_{i=1}^k P(Ai)

If P (Ai ) = 0.5 for all i, then:


P(⋂_{i=1}^k Ai) = (0.5)^k

2. Consider 5 independent rolls of a die. What is the probability that all five rolls result in a 6?

Sol. The probability of rolling a 6 on a single die roll is 1/6. For 5 independent rolls, the probability of getting a 6 on all rolls is:

P(⋂_{i=1}^5 Ai) = (1/6)^5 = 1/7776

4. Independent Classes of Events

A class of events 𝒜 is independent if for any finite collection of events A1, A2, . . . , An from 𝒜, we have:

P(⋂_{i=1}^n Ai) = ∏_{i=1}^n P(Ai)

This extends the concept of independence to a broader collection of events.

4.1. Examples
1. Let 𝒜1 be the class of events related to rolling a die and 𝒜2 the class of events related to drawing a card from a deck. The two classes are independent if:

P(A1 ∩ A2) = P(A1)P(A2)

for any events A1 ∈ 𝒜1 and A2 ∈ 𝒜2.

2. Let 𝒜1 be the class of events related to flipping a coin and 𝒜2 the class of events related to rolling a die. Verify the independence of the classes by calculating P(A1 ∩ A2) for a representative pair of events.

Sol. Take A1 to be the event of heads on the coin and A2 the event of rolling a 6 on the die:

P(A1) = 0.5,  P(A2) = 1/6,  P(A1 ∩ A2) = 1/12

Since P(A1 ∩ A2) = P(A1)P(A2), this pair of events is independent; the same check holds for every pair of events from the two classes, so the classes are independent.
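
A quick sanity check of this computation, assuming a fair coin and a fair die and enumerating the 12 equally likely joint outcomes:

```python
from fractions import Fraction

omega = {(c, d) for c in "HT" for d in range(1, 7)}   # 12 joint outcomes
A1 = {(c, d) for (c, d) in omega if c == "H"}         # heads on the coin
A2 = {(c, d) for (c, d) in omega if d == 6}           # six on the die

def p(event):
    return Fraction(len(event), len(omega))

print(p(A1), p(A2), p(A1 & A2))          # 1/2 1/6 1/12
print(p(A1 & A2) == p(A1) * p(A2))       # True
```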

5. Independence of Random Variables

Two random variables X and Y are independent if for any measurable sets A and B,

P (X ∈ A, Y ∈ B) = P (X ∈ A) × P (Y ∈ B)

This means the occurrence of X ∈ A does not affect the probability of Y ∈ B, and vice versa.
Independence implies that knowing the value of one variable provides no information about the other. For example, the outcome of one die roll does not influence the outcome of another independent roll.

5.1. Independence vs. Uncorrelated

Independence: P(X ∈ A, Y ∈ B) = P(X ∈ A)P(Y ∈ B) for all measurable sets A and B.
Uncorrelated: Cov(X, Y) = 0. Independent variables are always uncorrelated, but uncorrelated variables may not be independent (e.g., nonlinearly related random variables).
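
A classic instance of uncorrelated-but-dependent variables takes X uniform on [−1, 1] and Y = X²; then Cov(X, Y) = E[X³] − E[X]E[X²] = 0, although Y is completely determined by X. A small simulation sketch (numpy, with arbitrary seed and sample size):

```python
import numpy as np

rng = np.random.default_rng(0)           # arbitrary seed
x = rng.uniform(-1.0, 1.0, 500_000)      # X ~ Uniform(-1, 1)
y = x ** 2                               # Y is fully determined by X
print(np.cov(x, y)[0, 1])                # sample covariance near 0
```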

5.2. Joint Distribution and Factorization


The joint distribution of two random variables X and Y gives the probability of X taking a value within a
set A and Y taking a value within a set B.
So, we can say that the random variables X and Y are independent if their joint probability distribution
can be factored as the product of their marginal distributions:

P (X = x, Y = y) = P (X = x) · P (Y = y) (discrete)

fX,Y (x, y) = fX (x) · fY (y) (continuous)
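
As an illustration, the discrete factorization condition can be tested numerically: recover the marginals from a joint pmf and compare each joint probability with the product of the marginals. The helper `factorizes` below is a sketch of ours, not a standard library function:

```python
def factorizes(joint, tol=1e-12):
    # Recover the marginal pmfs from the joint pmf, then test the product rule.
    pX, pY = {}, {}
    for (x, y), p in joint.items():
        pX[x] = pX.get(x, 0.0) + p
        pY[y] = pY.get(y, 0.0) + p
    return all(abs(p - pX[x] * pY[y]) <= tol for (x, y), p in joint.items())

# Two independent fair coin tosses factorize:
print(factorizes({(a, b): 0.25 for a in "HT" for b in "HT"}))    # True

# Perfectly correlated tosses do not:
print(factorizes({("H", "H"): 0.5, ("H", "T"): 0.0,
                  ("T", "H"): 0.0, ("T", "T"): 0.5}))            # False
```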

Difference between Marginal and Conditional Independence:

• Marginal independence: X and Y are independent without considering other variables.

• Conditional independence: X and Y are independent given another variable Z, i.e., P (X, Y |Z) =
P (X|Z)P (Y |Z).

5.3. Examples

1. Let X1 and X2 represent the outcome of two independent coin tosses. The joint probability distribution
is:
P(X1 = H, X2 = H) = P(X1 = H)P(X2 = H) = 1/2 × 1/2 = 1/4

2. Suppose X and Y are independent uniform random variables on [0, 1]. Then the joint density function is:

fX,Y (x, y) = fX (x)fY (y) = 1 × 1 = 1 for 0 ≤ x, y ≤ 1

5.4. Properties

• Expectation and Variance:
For any random variables X and Y, expectation is additive (linearity):

E[X + Y] = E[X] + E[Y]

If X and Y are independent, variance is additive as well:

Var(X + Y) = Var(X) + Var(Y)

• Covariance of Independent Variables:


If X and Y are independent, their covariance is zero:

Cov(X, Y ) = E[XY ] − E[X]E[Y ] = 0
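
A simulation sketch of these properties (numpy, arbitrary parameters and seed): for independent samples, E[XY] should be close to E[X]E[Y], and Var(X + Y) close to Var(X) + Var(Y).

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(1.0, 2.0, 500_000)    # X: mean 1, standard deviation 2
y = rng.normal(3.0, 1.0, 500_000)    # Y: independent of X
print((x * y).mean(), x.mean() * y.mean())   # both near 3.0
print((x + y).var(), x.var() + y.var())      # both near 5.0
```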

5.5. Pairwise vs. Mutual Independence

• Pairwise Independence:
Random variables X1, X2, . . . , Xn are pairwise independent if Xi and Xj are independent for all i ≠ j.
This does not necessarily imply mutual independence.

• Mutual Independence:
Random variables X1, X2, . . . , Xn are mutually independent if the joint distribution of every subcollection factors into marginals; in particular, for all measurable sets A1, . . . , An:

P(X1 ∈ A1, X2 ∈ A2, . . . , Xn ∈ An) = P(X1 ∈ A1) · P(X2 ∈ A2) · · · P(Xn ∈ An)

6. Exercises

1. Consider three events A1 , A2 , and A3 with probabilities:

P (A1 ) = P (A2 ) = P (A3 ) = 0.5, P (A1 ∩ A2 ) = P (A2 ∩ A3 ) = P (A1 ∩ A3 ) = 0.25

Prove or disprove: The events A1 , A2 , A3 are pairwise independent but not mutually independent.

Sol. We are given three events A1 , A2 , and A3 with the following probabilities:

P (A1 ) = P (A2 ) = P (A3 ) = 0.5

P (A1 ∩ A2 ) = P (A2 ∩ A3 ) = P (A1 ∩ A3 ) = 0.25

We need to prove or disprove whether A1 , A2 , and A3 are pairwise independent but not mutually inde-
pendent.

Check Pairwise Independence: Two events A and B are pairwise independent if:

P (A ∩ B) = P (A)P (B)

Checking A1 and A2 :

P (A1 ∩ A2 ) = 0.25, P (A1 )P (A2 ) = 0.5 × 0.5 = 0.25

Since P (A1 ∩ A2 ) = P (A1 )P (A2 ), the events A1 and A2 are independent.

Checking A2 and A3 :

P (A2 ∩ A3 ) = 0.25, P (A2 )P (A3 ) = 0.5 × 0.5 = 0.25

Since P (A2 ∩ A3 ) = P (A2 )P (A3 ), the events A2 and A3 are independent.

Checking A1 and A3 :

P (A1 ∩ A3 ) = 0.25, P (A1 )P (A3 ) = 0.5 × 0.5 = 0.25

Since P (A1 ∩ A3 ) = P (A1 )P (A3 ), the events A1 and A3 are independent. Thus, A1 , A2 , and A3 are
pairwise independent.

Check Mutual Independence: Three events A1 , A2 , and A3 are mutually independent if, in addition to pairwise independence:

P (A1 ∩ A2 ∩ A3 ) = P (A1 )P (A2 )P (A3 )

We already know that:
P (A1 )P (A2 )P (A3 ) = 0.5 × 0.5 × 0.5 = 0.125

Now we need to check whether P(A1 ∩ A2 ∩ A3) = 0.125. The pairwise probabilities alone do not determine P(A1 ∩ A2 ∩ A3), so we exhibit a concrete model consistent with the given values. Toss two fair coins and let:

A1 = {first coin shows heads},  A2 = {second coin shows heads},  A3 = {both coins show the same face}

Each event has probability 0.5, and each pairwise intersection is a single one of the four equally likely outcomes (for example, A1 ∩ A2 = {HH}), so each has probability 0.25, matching the given values. However,

A1 ∩ A2 ∩ A3 = {HH},  so  P(A1 ∩ A2 ∩ A3) = 0.25 ≠ 0.125 = P(A1)P(A2)P(A3)

Hence the events A1, A2, and A3 are pairwise independent but not mutually independent.
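
The two-coin construction can be verified by exact enumeration; a brief sketch, assuming fair, independent coins:

```python
from fractions import Fraction

omega = [(a, b) for a in "HT" for b in "HT"]   # 4 equally likely outcomes
A1 = {w for w in omega if w[0] == "H"}         # first coin heads
A2 = {w for w in omega if w[1] == "H"}         # second coin heads
A3 = {w for w in omega if w[0] == w[1]}        # both coins show the same face

def p(event):
    return Fraction(len(event), len(omega))

assert p(A1 & A2) == p(A1) * p(A2)               # pairwise independent
assert p(A2 & A3) == p(A2) * p(A3)
assert p(A1 & A3) == p(A1) * p(A3)
assert p(A1 & A2 & A3) != p(A1) * p(A2) * p(A3)  # 1/4 vs 1/8: not mutual
```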

2. Let X and Y be independent random variables with expectations E[X] = 2 and E[Y ] = 3. Find E[XY ].

Sol. For independent random variables, E[XY ] = E[X]E[Y ]. Thus, E[XY ] = 2 × 3 = 6.

3. Let X and Y be independent random variables. Prove that f (X) and g(Y ) are independent for any
measurable functions f and g.

Sol.
We need to prove that f (X) and g(Y ) are independent. This means that for any measurable sets A and
B, we must show:
P (f (X) ∈ A, g(Y ) ∈ B) = P (f (X) ∈ A)P (g(Y ) ∈ B)

Since X and Y are independent, we know that:

P(X ∈ f⁻¹(A), Y ∈ g⁻¹(B)) = P(X ∈ f⁻¹(A))P(Y ∈ g⁻¹(B))

But by the definition of the preimage, {f(X) ∈ A} = {X ∈ f⁻¹(A)}, so:

P(f(X) ∈ A, g(Y) ∈ B) = P(X ∈ f⁻¹(A), Y ∈ g⁻¹(B))

Thus,

P(f(X) ∈ A, g(Y) ∈ B) = P(X ∈ f⁻¹(A))P(Y ∈ g⁻¹(B)) = P(f(X) ∈ A)P(g(Y) ∈ B)

Hence, f(X) and g(Y) are independent.

4. Let X and Y be independent random variables. Prove that their covariance is zero, i.e., Cov(X, Y ) = 0.

Sol.
The covariance of two random variables X and Y is given by:

Cov(X, Y ) = E[XY ] − E[X]E[Y ]

Since X and Y are independent, we know that:

E[XY ] = E[X]E[Y ]

Substituting this into the covariance formula, we get:

Cov(X, Y ) = E[X]E[Y ] − E[X]E[Y ] = 0

Thus, Cov(X, Y ) = 0, meaning that independent random variables have zero covariance.

5. Let X1 ∼ Poisson(λ1 ) and X2 ∼ Poisson(λ2 ) be independent Poisson random variables. Find the distri-
bution of X1 + X2 .

Sol.
The moment generating function (MGF) of a Poisson random variable X ∼ Poisson(λ) is given by:

M_X(t) = E[e^{tX}] = exp(λ(e^t − 1))

For independent random variables X1 and X2, the MGF of the sum is the product of the individual MGFs:

M_{X1+X2}(t) = M_{X1}(t) · M_{X2}(t) = exp(λ1(e^t − 1)) · exp(λ2(e^t − 1)) = exp((λ1 + λ2)(e^t − 1))

This is the MGF of a Poisson distribution with parameter λ1 + λ2, and the MGF determines the distribution. Hence:

X1 + X2 ∼ Poisson(λ1 + λ2)

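
A simulation consistency check (numpy, arbitrary rates and seed): since a Poisson(λ) variable has mean and variance both equal to λ, the sample mean and variance of the sum should both be near λ1 + λ2.

```python
import numpy as np

rng = np.random.default_rng(2)
lam1, lam2 = 2.0, 3.0                                        # arbitrary rates
s = rng.poisson(lam1, 200_000) + rng.poisson(lam2, 200_000)  # samples of X1 + X2
print(s.mean(), s.var())                                     # both approximately 5.0
```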

6. Let X1 , X2 be independent random variables, both uniformly distributed on [0, 1]. Find the joint proba-
bility density function (pdf) of X1 and X2 .

Sol. Since X1 and X2 are independent and each is uniformly distributed on [0, 1], their individual pdfs
are:
fX1 (x1 ) = 1 for 0 ≤ x1 ≤ 1

fX2 (x2 ) = 1 for 0 ≤ x2 ≤ 1

The joint pdf of independent random variables is the product of their individual pdfs:

fX1 ,X2 (x1 , x2 ) = fX1 (x1 )fX2 (x2 )

Thus, the joint pdf is:

fX1 ,X2 (x1 , x2 ) = 1 × 1 = 1 for 0 ≤ x1 ≤ 1 and 0 ≤ x2 ≤ 1

Therefore, the joint distribution of X1 and X2 is uniform over the unit square [0, 1] × [0, 1].

7. Let X ∼ Exp(λ1 ) and Y ∼ Exp(λ2 ) be independent exponential random variables. Compute E[XY ].

Sol.
Since X and Y are independent, we can write:

E[XY ] = E[X]E[Y ]

For an exponential random variable X ∼ Exp(λ), the expectation is:

E[X] = 1/λ

Thus:

E[XY] = (1/λ1) × (1/λ2) = 1/(λ1 λ2)
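
A simulation sketch of this result (numpy, with the arbitrary rates λ1 = 2 and λ2 = 4, so the exact value is 1/8):

```python
import numpy as np

rng = np.random.default_rng(3)
lam1, lam2 = 2.0, 4.0
# numpy parameterizes the exponential by its scale (mean), which is 1/lambda:
x = rng.exponential(1 / lam1, 500_000)
y = rng.exponential(1 / lam2, 500_000)
print((x * y).mean(), 1 / (lam1 * lam2))   # both approximately 0.125
```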

8. Let X1 , X2 , . . . , Xn be independent random variables uniformly distributed on the interval [0, 1]. Define
the event Ai = {Xi > 0.5} for each i = 1, . . . , n.

(a) Find the probability that at least one of the events Ai occurs.

(b) Generalize this for infinitely many independent random variables X1 , X2 , . . ..

Sol.
We are given independent random variables X1 , X2 , . . . , Xn that are uniformly distributed on the interval
[0, 1]. For each i = 1, . . . , n, the event Ai is defined as:

Ai = {Xi > 0.5}

(a) Since each Xi is uniformly distributed on [0, 1], the probability that Xi > 0.5 is the length of the
interval (0.5, 1], which is:
P (Ai ) = P (Xi > 0.5) = 1 − 0.5 = 0.5

We have to find the probability that at least one of the events Ai occurs.
To find this probability, we use the complement rule.
The complement of “at least one Ai occurs” is the event that none of the Ai ’s occur, i.e.,

Xi ≤ 0.5 for all i.

The probability that none of the events Ai occur is:

P(none of the Ai occurs) = P(A1^c ∩ A2^c ∩ · · · ∩ An^c)

Since the events Ai^c = {Xi ≤ 0.5} are independent, we can compute:

P(A1^c ∩ A2^c ∩ · · · ∩ An^c) = P(A1^c)P(A2^c) · · · P(An^c)

Each P(Ai^c) = P(Xi ≤ 0.5) = 1 − P(Xi > 0.5) = 1 − 0.5 = 0.5, so:

P(none of the Ai occurs) = 0.5^n

Therefore, the probability that at least one of the events Ai occurs is:

P(at least one Ai occurs) = 1 − P(none of the Ai occurs) = 1 − 0.5^n
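
A quick Monte Carlo check of part (a) (numpy, with the arbitrary choice n = 5, so the exact answer is 1 − 0.5^5 = 0.96875):

```python
import numpy as np

rng = np.random.default_rng(4)
n, trials = 5, 200_000
x = rng.uniform(size=(trials, n))          # each row: one realization of X1..Xn
estimate = (x > 0.5).any(axis=1).mean()    # fraction of rows with some Xi > 0.5
print(estimate, 1 - 0.5 ** n)
```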

(b) Now we want the probability that at least one of the events Ai occurs among infinitely many events. This can be phrased as:

P(⋃_{i=1}^∞ Ai)

The complement of "at least one Ai occurs" is the event that none of the Ai's occur, which means that Xi ≤ 0.5 for all i.

The probability of this event is:

P(⋂_{i=1}^∞ Ai^c) = lim_{n→∞} P(⋂_{i=1}^n Ai^c) = lim_{n→∞} 0.5^n = 0

where the first equality uses the continuity of probability from above.

Thus, the probability that none of the Ai's occur is 0. Therefore, the probability that at least one of the events Ai occurs is:

P(⋃_{i=1}^∞ Ai) = 1 − P(⋂_{i=1}^∞ Ai^c) = 1 − 0 = 1

Hence, for n independent random variables X1, X2, . . . , Xn, the probability that at least one of the events Ai occurs is:

P(at least one Ai occurs) = 1 − 0.5^n

For infinitely many independent random variables X1, X2, . . ., the probability that at least one of the events Ai occurs is:

P(⋃_{i=1}^∞ Ai) = 1
