Subsection 3.8

This document discusses independent events and provides examples. It defines two events E and F as independent if P(EF) = P(E)P(F). If E is independent of F, then E is also independent of the complement of F (Fc). The document also discusses how independence extends to more than two events, with the three events E, F, and G being independent if P(EFG) = P(E)P(F)P(G) and all pairs are independent. Independence can also apply to sequences of experiments or components of a system functioning independently.


CHAPTER 3: Elements of probability

For j = 2, 3,

P(Rj | E) = P(E | Rj)P(Rj) / P(E)
          = (1)(1/3) / [(α1)(1/3) + 1/3 + 1/3]
          = 1/(α1 + 2),   j = 2, 3

Thus, for instance, if α1 = .4, then the conditional probability that the plane is
in region 1 given that a search of that region did not uncover it is 1/6, whereas
the conditional probabilities that it is in region 2 and that it is in region 3 are
both equal to 1/2.4 = 5/12. ∎

3.8 Independent events


The previous examples in this chapter show that P (E|F ), the conditional prob-
ability of E given F, is not generally equal to P (E), the unconditional proba-
bility of E. In other words, knowing that F has occurred generally changes the
chances of E ’s occurrence. In the special cases where P (E|F ) does in fact equal
P (E), we say that E is independent of F. That is, E is independent of F if knowl-
edge that F has occurred does not change the probability that E occurs.
Since P (E|F ) = P (EF)/P (F ), we see that E is independent of F if

P (EF) = P (E)P (F ) (3.8.1)

Since this equation is symmetric in E and F, it shows that whenever E is
independent of F, so is F of E. We thus have the following.
Definition. Two events E and F are said to be independent if Equation (3.8.1)
holds. Two events E and F that are not independent are said to be dependent.
Example 3.8.a. A card is selected at random from an ordinary deck of 52 playing
cards. If A is the event that the selected card is an ace and H is the event that
it is a heart, then A and H are independent, since P(AH) = 1/52, while
P(A) = 4/52 and P(H) = 13/52. ∎
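The independence in Example 3.8.a can be checked mechanically by enumerating the deck. The following Python sketch (the names `deck`, `A`, `H`, and the helper `p` are ours, not the text's) verifies Equation (3.8.1) with exact rational arithmetic:

```python
from fractions import Fraction

# Enumerate an ordinary deck of 52 cards as (rank, suit) pairs.
ranks = ["A"] + [str(n) for n in range(2, 11)] + ["J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = [(r, s) for r in ranks for s in suits]

A = {c for c in deck if c[0] == "A"}        # the selected card is an ace
H = {c for c in deck if c[1] == "hearts"}   # the selected card is a heart

def p(event):
    """Probability of an event when all 52 cards are equally likely."""
    return Fraction(len(event), len(deck))

print(p(A & H))        # 1/52
print(p(A) * p(H))     # (4/52)(13/52) = 1/52, so Equation (3.8.1) holds
```

Because `Fraction` is exact, the equality P(AH) = P(A)P(H) holds with no rounding caveats.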

Example 3.8.b. If we let E denote the event that the next president is a Repub-
lican and F the event that there will be a major earthquake within the next year,
then most people would probably be willing to assume that E and F are inde-
pendent. However, there would probably be some controversy over whether it
is reasonable to assume that E is independent of G, where G is the event that
there will be a recession within the next two years. ∎

We now show that if E is independent of F then E is also independent of F^c.



Proposition 3.8.1. If E and F are independent, then so are E and F^c.

Proof. Assume that E and F are independent. Since E = EF ∪ EF^c, and EF and
EF^c are obviously mutually exclusive, we have that

P(E) = P(EF) + P(EF^c)
     = P(E)P(F) + P(EF^c)   by the independence of E and F

or equivalently,

P(EF^c) = P(E)(1 − P(F))
        = P(E)P(F^c)

and the result is proven. ∎

Thus if E is independent of F, then the probability of E's occurrence is
unchanged by information as to whether or not F has occurred.
Suppose now that E is independent of F and is also independent of G. Is E
then necessarily independent of FG ? The answer, somewhat surprisingly, is
no. Consider the following example.
Example 3.8.c. Two fair dice are thrown. Let E7 denote the event that the sum
of the dice is 7. Let F denote the event that the first die equals 4 and let T be
the event that the second die equals 3. Now it can be shown (see Problem 36)
that E7 is independent of F and that E7 is also independent of T; but clearly
E7 is not independent of FT [since P(E7 | FT) = 1]. ∎
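Example 3.8.c can likewise be verified by enumerating the 36 equally likely outcomes of the two dice; this sketch (the event names follow the example, the helper `p` is ours) confirms both pairwise independences and the failure for FT:

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely rolls

def p(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(outcomes))

E7 = {o for o in outcomes if o[0] + o[1] == 7}   # sum of the dice is 7
F = {o for o in outcomes if o[0] == 4}           # first die equals 4
T = {o for o in outcomes if o[1] == 3}           # second die equals 3

assert p(E7 & F) == p(E7) * p(F)          # E7 is independent of F
assert p(E7 & T) == p(E7) * p(T)          # E7 is independent of T
assert p(E7 & F & T) == p(F & T)          # i.e., P(E7 | FT) = 1
assert p(E7 & F & T) != p(E7) * p(F & T)  # so E7 is dependent on FT
```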

It would appear to follow from the foregoing example that an appropriate
definition of the independence of three events E, F, and G would have to go
further than merely assuming that all of the three pairs of events are
independent. We are thus led to the following definition.
Definition. The three events E, F, and G are said to be independent if

P(EFG) = P(E)P(F)P(G)
P(EF) = P(E)P(F)
P(EG) = P(E)P(G)
P(FG) = P(F)P(G)

It should be noted that if the events E, F, G are independent, then E will be
independent of any event formed from F and G. For instance, E is independent
of F ∪ G since

P(E(F ∪ G)) = P(EF ∪ EG)
            = P(EF) + P(EG) − P(EFG)
            = P(E)P(F) + P(E)P(G) − P(E)P(FG)
            = P(E)[P(F) + P(G) − P(FG)]
            = P(E)P(F ∪ G)

Of course, we may also extend the definition of independence to more than
three events. The events E1, E2, . . . , En are said to be independent if for every
subset Ei1, Ei2, . . . , Eir, r ≤ n, of these events

P(Ei1 Ei2 · · · Eir) = P(Ei1)P(Ei2) · · · P(Eir)
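The every-subset requirement can be spelled out as a small checker. The coin-toss events below (first toss heads, second toss heads, tosses match) are a standard illustration of pairwise but not mutual independence; they are not an example from this text, and the function name `independent` is ours:

```python
from fractions import Fraction
from itertools import combinations, product

def independent(events, outcomes):
    """True if every subset of two or more events satisfies the product
    rule, assuming all outcomes are equally likely."""
    def p(ev):
        return Fraction(len(ev), len(outcomes))
    for r in range(2, len(events) + 1):
        for subset in combinations(events, r):
            inter = set(outcomes)       # running intersection of the subset
            prod = Fraction(1)          # running product of probabilities
            for ev in subset:
                inter &= ev
                prod *= p(ev)
            if p(inter) != prod:
                return False
    return True

outcomes = list(product("HT", repeat=2))     # two fair coin tosses
E = {o for o in outcomes if o[0] == "H"}     # first toss is heads
F = {o for o in outcomes if o[1] == "H"}     # second toss is heads
G = {o for o in outcomes if o[0] == o[1]}    # the tosses match

assert independent([E, F], outcomes)         # every pair passes...
assert independent([E, G], outcomes)
assert independent([F, G], outcomes)
assert not independent([E, F, G], outcomes)  # ...but the triple fails
```

Here P(EFG) = 1/4 while P(E)P(F)P(G) = 1/8, so pairwise independence alone does not suffice.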

It is sometimes the case that the probability experiment under consideration
consists of performing a sequence of subexperiments. For instance, if the
experiment consists of continually tossing a coin, then we may think of each
toss as being a subexperiment. In many cases it is reasonable to assume that
the outcomes of any group of the subexperiments have no effect on the
probabilities of the outcomes of the other subexperiments. If such is the case,
then we say that the subexperiments are independent.

Example 3.8.d. A system composed of n separate components is said to be a
parallel system if it functions when at least one of the components functions.
(See Figure 3.7.) For such a system, if component i, independent of other
components, functions with probability pi, i = 1, . . . , n, what is the probability
the system functions?

Solution. Let Ai denote the event that component i functions. Then

P{system functions} = 1 − P{system does not function}
                    = 1 − P{all components do not function}
                    = 1 − P(A1^c A2^c · · · An^c)
                    = 1 − (1 − p1)(1 − p2) · · · (1 − pn)   by independence ∎
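The closed form in Example 3.8.d is easy to compute, and a Monte Carlo run under the same independence assumption agrees with it. The function names below are our own illustration, not from the text:

```python
import random

def parallel_reliability(probs):
    """Exact P{parallel system functions} = 1 - prod(1 - p_i)."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p        # probability that every component fails
    return 1.0 - q

def simulate(probs, trials=200_000, seed=1):
    """Monte Carlo estimate: the system works if any component works."""
    rng = random.Random(seed)
    hits = sum(any(rng.random() < p for p in probs) for _ in range(trials))
    return hits / trials

probs = [0.9, 0.8, 0.7]
print(parallel_reliability(probs))   # 1 - (0.1)(0.2)(0.3) ≈ 0.994
print(simulate(probs))               # should be close to the exact value
```

Each trial draws the n components independently, mirroring the assumption that component i functions with probability pi regardless of the others.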

Example 3.8.e. A set of k coupons, each of which is independently a type j
coupon with probability pj, where p1 + · · · + pn = 1, is collected. Find the
probability that the set contains a type j coupon given that it contains a
type i, i ≠ j.

FIGURE 3.7
Parallel system: functions if current flows from A to B.

Solution. Let Ar be the event that the set contains a type r coupon. Then
P(Aj | Ai) = P(Aj Ai) / P(Ai)

To compute P(Ai) and P(Aj Ai), consider the probabilities of their complements:

P(Ai) = 1 − P(Ai^c)
      = 1 − P{no coupon is type i}
      = 1 − (1 − pi)^k

P(Ai Aj) = 1 − P(Ai^c ∪ Aj^c)
         = 1 − [P(Ai^c) + P(Aj^c) − P(Ai^c Aj^c)]
         = 1 − (1 − pi)^k − (1 − pj)^k + P{no coupon is type i or type j}
         = 1 − (1 − pi)^k − (1 − pj)^k + (1 − pi − pj)^k

where the final equality follows because each of the k coupons is, independently,
neither of type i nor of type j with probability 1 − pi − pj. Consequently,

P(Aj | Ai) = [1 − (1 − pi)^k − (1 − pj)^k + (1 − pi − pj)^k] / [1 − (1 − pi)^k]   ∎
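As a check on the algebra, the closed form for P(Aj | Ai) can be compared against a brute-force enumeration of all k-coupon sequences. This sketch uses exact rational arithmetic; the function names and the sample probabilities are ours:

```python
from fractions import Fraction
from itertools import product

def closed_form(k, probs, i, j):
    """P(Aj | Ai) from the formula derived above (requires i != j)."""
    num = (1 - (1 - probs[i])**k - (1 - probs[j])**k
             + (1 - probs[i] - probs[j])**k)
    den = 1 - (1 - probs[i])**k
    return num / den

def brute_force(k, probs, i, j):
    """Sum the weights of all k-coupon sequences containing type i
    (denominator) and containing both types i and j (numerator)."""
    num = den = Fraction(0)
    for seq in product(range(len(probs)), repeat=k):
        w = Fraction(1)
        for t in seq:
            w *= probs[t]       # coupons are drawn independently
        if i in seq:
            den += w
            if j in seq:
                num += w
    return num / den

probs = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]
assert closed_form(3, probs, 0, 1) == brute_force(3, probs, 0, 1)
```

Because the comparison is over `Fraction`s, agreement is exact, not approximate.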

Problems
1. A box contains three marbles — one red, one green, and one blue. Con-
sider an experiment that consists of taking one marble from the box, then
replacing it in the box and drawing a second marble from the box. De-
scribe the sample space. Repeat for the case in which the second marble
is drawn without first replacing the first marble.
2. An experiment consists of tossing a coin three times. What is the sample
space of this experiment? Which event corresponds to the experiment
resulting in more heads than tails?
3. Let S = {1, 2, 3, 4, 5, 6, 7}, E = {1, 3, 5, 7}, F = {7, 4, 6}, G = {1, 4}. Find
a. EF;           b. E ∪ FG;        c. EG^c;
d. EF^c ∪ G;     e. E^c(F ∪ G);    f. EG ∪ FG.

4. Two dice are thrown. Let E be the event that the sum of the dice is odd,
let F be the event that the first die lands on 1, and let G be the event that
the sum is 5. Describe the events EF, E ∪ F, FG, EF c , EFG.
5. A system is composed of four components, each of which is either work-
ing or failed. Consider an experiment that consists of observing the status
of each component, and let the outcome of the experiment be given by
the vector (x1 , x2 , x3 , x4 ) where xi is equal to 1 if component i is working
and is equal to 0 if component i is failed.
