
Mathematical Statistics (MA212M)

Lecture Slides
Lecture 4
Independence

Def: A countable collection of events E1, E2, . . . are said to be pairwise independent if Ei and Ej are independent for i ≠ j.
Def: A finite collection of events E1, E2, . . . , En are said to be independent (or mutually independent) if for any sub-collection En1, . . . , Enk of E1, E2, . . . , En,

P(En1 ∩ En2 ∩ · · · ∩ Enk) = P(En1) P(En2) · · · P(Enk).

Def: A countable collection of events E1, E2, . . . are said to be independent if every finite sub-collection is independent.
Remarks
◮ To verify the independence of E1, E2, . . . , En we must check 2^n − n − 1 conditions. For example, for n = 3, the conditions that need to be checked are
P(E1 ∩ E2) = P(E1)P(E2), P(E1 ∩ E3) = P(E1)P(E3), P(E2 ∩ E3) = P(E2)P(E3), and P(E1 ∩ E2 ∩ E3) = P(E1)P(E2)P(E3).
◮ Independence implies pairwise independence.
◮ Pairwise independence does not imply independence in general.
Example 19: Let S = {HH, HT, TH, TT}. Suppose all elementary events are equally likely. Let E1 = {HH, HT}, E2 = {HH, TH} and E3 = {HH, TT}. Then E1, E2, E3 are pairwise independent but not independent.
◮ The single condition P(E1 ∩ E2 ∩ E3) = P(E1)P(E2)P(E3) is also not sufficient for independence.
Example 20: Let S = {(i, j) : i = 1, . . . , 6, j = 1, . . . , 6}. Suppose all elementary events are equally likely. Define E1 = {1st roll is 1, 2 or 3}, E2 = {1st roll is 3, 4 or 5} and E3 = {Sum of the rolls is 9}. Here P(E1 ∩ E2 ∩ E3) = 1/36 = P(E1)P(E2)P(E3), yet E1 and E2 are not independent; see the enumeration check below.
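The following enumeration sketch is not part of the original slides; it simply checks Examples 19 and 20 exhaustively under the stated equally-likely assumption (the variable names are chosen here for illustration).

```python
# Hypothetical verification of Examples 19 and 20 by exhaustive enumeration.
from fractions import Fraction
from itertools import product

def prob(event, space):
    """Probability of an event under the uniform measure on a finite sample space."""
    return Fraction(len(event), len(space))

# Example 19: two independent tosses of a fair coin.
S = set(product("HT", repeat=2))
E1 = {("H", "H"), ("H", "T")}   # first toss is H
E2 = {("H", "H"), ("T", "H")}   # second toss is H
E3 = {("H", "H"), ("T", "T")}   # the two tosses agree
# Every pair multiplies correctly ...
assert prob(E1 & E2, S) == prob(E1, S) * prob(E2, S)
assert prob(E1 & E3, S) == prob(E1, S) * prob(E3, S)
assert prob(E2 & E3, S) == prob(E2, S) * prob(E3, S)
# ... but the triple-intersection condition fails: pairwise, not mutual, independence.
assert prob(E1 & E2 & E3, S) != prob(E1, S) * prob(E2, S) * prob(E3, S)

# Example 20: two independent throws of a fair die.
T = set(product(range(1, 7), repeat=2))
F1 = {w for w in T if w[0] in (1, 2, 3)}   # 1st roll is 1, 2 or 3
F2 = {w for w in T if w[0] in (3, 4, 5)}   # 1st roll is 3, 4 or 5
F3 = {w for w in T if sum(w) == 9}         # sum of the rolls is 9
# The triple product holds ...
assert prob(F1 & F2 & F3, T) == prob(F1, T) * prob(F2, T) * prob(F3, T)
# ... yet F1 and F2 are not independent, so the triple condition alone is not enough.
assert prob(F1 & F2, T) != prob(F1, T) * prob(F2, T)
print("Examples 19 and 20 check out")
```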
Conditional Independence

Def: Given an event C, two events A and B are said to be conditionally independent given C if P(A ∩ B | C) = P(A | C) P(B | C).
Example 21: A box contains two coins: a fair coin and a fake two-headed coin (for which P(H) = 1). You choose a coin at random and toss it twice. Define the following events.

A = First coin toss results in a H.
B = Second coin toss results in a H.
C = Coin 1 (the regular coin) has been selected.

Then A and B are conditionally independent given C. Are A and B independent?
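A small computational sketch, not from the slides, assuming each of the two coins is selected with probability 1/2: it tabulates the joint distribution of (coin, toss 1, toss 2) and prints the quantities needed to compare conditional and unconditional independence.

```python
# Hypothetical computation for Example 21 (names chosen for illustration).
from fractions import Fraction
from itertools import product

half = Fraction(1, 2)
weights = {}  # (coin, toss1, toss2) -> probability
for coin in ("regular", "two-headed"):
    p_head = half if coin == "regular" else Fraction(1)
    for t1, t2 in product("HT", repeat=2):
        p = half  # each coin is selected with probability 1/2 (assumption)
        for t in (t1, t2):
            p *= p_head if t == "H" else 1 - p_head
        if p > 0:
            weights[(coin, t1, t2)] = p

def P(pred):
    """Probability of the event described by the predicate pred."""
    return sum(p for w, p in weights.items() if pred(w))

def A(w): return w[1] == "H"          # first toss is H
def B(w): return w[2] == "H"          # second toss is H
def C(w): return w[0] == "regular"    # Coin 1 (regular) was selected

pC = P(C)
# Conditional independence given C: both sides come out equal (to 1/4).
print(P(lambda w: A(w) and B(w) and C(w)) / pC,
      (P(lambda w: A(w) and C(w)) / pC) * (P(lambda w: B(w) and C(w)) / pC))
# Unconditionally: compare P(A ∩ B) with P(A) P(B) to answer the question.
print(P(lambda w: A(w) and B(w)), P(A) * P(B))
```

Comparing the two values on the last printed line answers the question posed above.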
Random Variables

Def: A function X : S → R is called a random variable.

Example 1: Tossing a fair coin n times. Assume that the tosses are independent. Let X : S → R be defined as the number of heads.

Example 2: Throwing a fair die twice. Assume the throws are independent. Let X : S → R be defined as the sum of the outcomes.

Example 3: Suppose we are testing the reliability of a battery. Define X1 : S → R by X1(ω) = ω, the lifetime of the battery. Now suppose we are mainly interested in whether the battery lasts more than 2 years or not. Then define X2 = 1_(2,∞), i.e., X2(ω) = 1 if ω > 2 and X2(ω) = 0 otherwise.
Example 4: In Example 1, take n = 2. Then P(X = 0) = P(X = 2) = 1/4 and P(X = 1) = 1/2.
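Not part of the slides: a quick enumeration, assuming two independent fair tosses as in Example 1, that reproduces this PMF.

```python
# Hypothetical check: enumerate the 4 equally likely outcomes of n = 2 fair
# coin tosses and tabulate X = number of heads.
from collections import Counter
from fractions import Fraction
from itertools import product

space = list(product("HT", repeat=2))
counts = Counter(outcome.count("H") for outcome in space)
pmf = {x: Fraction(c, len(space)) for x, c in sorted(counts.items())}
print(pmf)  # {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
```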

Example 5: For X in Example 2, P(X = 2) = 1/36, P(X = 3) = 2/36, P(X = 4) = 3/36, P(X = 5) = 4/36, P(X = 6) = 5/36, P(X = 7) = 6/36, P(X = 8) = 5/36, P(X = 9) = 4/36, P(X = 10) = 3/36, P(X = 11) = 2/36, P(X = 12) = 1/36.
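Similarly, a sketch (not from the slides) recovering the dice-sum PMF of Example 2 by enumerating all 36 equally likely pairs.

```python
# Hypothetical check: tabulate X = sum of two independent fair die throws.
from collections import Counter
from fractions import Fraction
from itertools import product

space = list(product(range(1, 7), repeat=2))
counts = Counter(i + j for i, j in space)
pmf = {x: Fraction(c, len(space)) for x, c in sorted(counts.items())}
print(pmf)  # e.g. pmf[7] == Fraction(1, 6), pmf[2] == pmf[12] == Fraction(1, 36)
```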
Example 6: P(I) = ∫_I e^{−t} dt defines a probability on B(0, ∞). For X2 in Example 3, P(X2 = 1) = e^{−2} and P(X2 = 0) = 1 − e^{−2}.
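Not from the slides: a numerical check of these two values, assuming the density e^{−t} on (0, ∞) and X2 = 1_(2,∞) from Example 3; scipy's quad is used here purely for illustration.

```python
# Hypothetical numerical check of Example 6: P(X2 = 1) = P(X1 > 2) = e^{-2}.
import math
from scipy.integrate import quad

p_one, _ = quad(lambda t: math.exp(-t), 2, math.inf)  # integral of e^{-t} over (2, ∞)
print(p_one, math.exp(-2))          # both ≈ 0.1353
print(1 - p_one, 1 - math.exp(-2))  # P(X2 = 0) ≈ 0.8647
```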
