Lecture 03: Conditional Probability

The content of this lecture roughly corresponds to Sec. 3.6–3.7 of Sheldon Ross's Introduction to Probability and Statistics for Engineers and Scientists.
In Lecture 2 we were concerned with (unconditional) probability, i.e. assigning a probability to an event with no knowledge of what occurred. Sometimes, when the experiment is performed, partial information about what occurred becomes available (the condition).
Example 1. Suppose we roll two four-sided dice (one red, one blue) and observe the "down face" on each.
We learned there are 16 possible sample points in Ω. Let’s assume the dice are “fair” so that all 16 possible
outcomes are equally likely.
Consider the event

F = sum of faces is 4 = {(1, 3), (2, 2), (3, 1)}   notice |F| = 3

Notation: the cardinality or size of a set is denoted by absolute-value bars.

Then P(F) = 3/16. ← an (unconditional) probability

We can think of F as the mutually exclusive union of {(1, 3)}, {(2, 2)}, and {(3, 1)}, each of which has probability 1/16 of occurring, so

P(F) = P({(1, 3)}) + P({(2, 2)}) + P({(3, 1)}) = 1/16 + 1/16 + 1/16 = 3/16.
Now, suppose that when the experiment is performed, we saw that a 4 occurred on the red die. Then we know F did not occur, and under this condition (a 4 on the red die), the probability of F is ZERO.
A picture of what is happening in the last example: since we are told a 4 occurs on the red die, this event is, in effect, our "new" sample space, and none of the outcomes in it belong to F.
Now condition instead on the event that a 3 was rolled (on either die). This red region is our new sample space – it still has equally likely outcomes, but only 7 of them now. Only 2 of these 7 (namely (1, 3) and (3, 1)) belong to F. So the conditional probability is 2/7 that F occurs given a 3 was rolled.
Question: what happens if the (different) conditioning event is that a 3 occurred on the red die? In this case, the probability of F given a 3 on the red die is 1/4.
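All of the probabilities above can be checked by brute-force enumeration of the 16 equally likely outcomes. A minimal sketch (the function and event names below are mine, not from the notes):

```python
from fractions import Fraction
from itertools import product

# All 16 equally likely (red, blue) outcomes for two four-sided dice.
omega = list(product(range(1, 5), repeat=2))

def prob(event, given=None):
    """P(event | given) by counting equally likely outcomes (given must have P > 0)."""
    space = omega if given is None else [w for w in omega if given(w)]
    return Fraction(sum(event(w) for w in space), len(space))

def F(w):
    """Event F: sum of the two faces is 4."""
    return w[0] + w[1] == 4

print(prob(F))                              # 3/16  (unconditional)
print(prob(F, given=lambda w: w[0] == 4))   # 0     (a 4 on the red die rules F out)
print(prob(F, given=lambda w: 3 in w))      # 2/7   (a 3 was rolled on either die)
print(prob(F, given=lambda w: w[0] == 3))   # 1/4   (a 3 on the red die)
```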
A formula for conditional probability
Notation: when we wish to compute the probability of an event A given the condition that event B occurred, we write P(A|B).

Formula:

P(A|B) = P(A ∩ B) / P(B)    (3.1)
This formula implicitly assumes P (B) > 0 since we should NOT divide by ZERO!
This formula is useful for computing a conditional probability when the unconditional probabilities P (A∩B)
and P (B) are “easier” to compute.
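Formula (3.1) can be sketched directly in code for a finite, equally likely sample space (the function name `conditional` is my own); note the guard for P(B) = 0, matching the caveat above:

```python
from fractions import Fraction

def conditional(sample_space, A, B):
    """P(A|B) = P(A ∩ B) / P(B) over a finite, equally likely sample space."""
    n_B = sum(1 for w in sample_space if B(w))
    if n_B == 0:
        raise ValueError("P(B) = 0: conditional probability undefined")
    n_AB = sum(1 for w in sample_space if A(w) and B(w))
    # The 1/|Ω| factors in P(A ∩ B) and P(B) cancel, leaving a ratio of counts.
    return Fraction(n_AB, n_B)

# Example 1 revisited: P(sum is 4 | red die shows 3) = 1/4.
dice = [(r, b) for r in range(1, 5) for b in range(1, 5)]
print(conditional(dice, lambda w: w[0] + w[1] == 4, lambda w: w[0] == 3))
```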
Example 2. Toss a fair coin 3 times and observe the sequence of heads and tails; there are 8 equally likely outcomes. Let A be the event "exactly two heads", so P(A) = 3/8, and let B be the event "the first toss is a head", so P(B) = 4/8. Then A ∩ B = {HHT, HTH}, so P(A ∩ B) = 2/8.

Then

P(A|B) = P(A ∩ B) / P(B) = (2/8) / (4/8) = 1/2

Notice

P(B|A) = P(B ∩ A) / P(A) = (2/8) / (3/8) = 2/3
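Under event choices consistent with the counts in this example (A = "exactly two heads", B = "first toss is a head" – my labels, matching the probabilities 3/8, 4/8, and 2/8), enumeration confirms both conditional probabilities:

```python
from fractions import Fraction
from itertools import product

# The 8 equally likely sequences of three fair-coin tosses.
seqs = list(product("HT", repeat=3))

A = [s for s in seqs if s.count("H") == 2]   # exactly two heads
B = [s for s in seqs if s[0] == "H"]         # first toss is a head
AB = [s for s in seqs if s in A and s in B]  # both events occur

print(Fraction(len(AB), len(B)))  # P(A|B) = 2/4 = 1/2
print(Fraction(len(AB), len(A)))  # P(B|A) = 2/3
```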
Rearranging (3.1) gives the multiplication rule:

P(A ∩ B) = P(A|B)P(B)

Example 3. Suppose one of two boxes of marbles is chosen at random (each with probability 1/2), and a marble is then drawn from the chosen box. Let R be the event a red marble is selected. If we let Bi be the event that Box i is selected (i = 1, 2), then we are told P(B1) = 1/2, P(B2) = 1/2. But, more importantly, the conditional probabilities P(R|B1) = 1/2 and P(R|B2) = 2/3 are "easy".
Now,

P(R ∩ B1) = P(R|B1)P(B1) = (1/2) · (1/2) = 1/4

and

P(R ∩ B2) = P(R|B2)P(B2) = (2/3) · (1/2) = 1/3
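The two multiplications can be checked with exact fraction arithmetic (the variable names are mine):

```python
from fractions import Fraction

# Given in the notes:
P_B1, P_B2 = Fraction(1, 2), Fraction(1, 2)
P_R_given_B1, P_R_given_B2 = Fraction(1, 2), Fraction(2, 3)

# Multiplication rule: P(R ∩ Bi) = P(R|Bi) P(Bi)
P_R_and_B1 = P_R_given_B1 * P_B1
P_R_and_B2 = P_R_given_B2 * P_B2

print(P_R_and_B1)  # 1/4
print(P_R_and_B2)  # 1/3
```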
Example 4. Suppose we have a population for which we have the following knowledge:

P(E) = 0.6
P(S) = 0.3
P(S|E) = 0.1 ⇒ by the multiplication rule, P(E ∩ S) = P(S|E)P(E) = 0.1 × 0.6 = 0.06
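As a quick check of this arithmetic (variable names are mine; exact fractions avoid float rounding of 0.1 × 0.6):

```python
from fractions import Fraction

P_E = Fraction(6, 10)          # P(E) = 0.6
P_S_given_E = Fraction(1, 10)  # P(S|E) = 0.1

# Multiplication rule: P(E ∩ S) = P(S|E) P(E)
P_E_and_S = P_S_given_E * P_E
print(P_E_and_S)  # 3/50, i.e. 0.06
```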