Chapter 04

Slides by John Loucks
St. Edward’s University
Increasing Likelihood of Occurrence

  Probability:   0 ............ .5 ............ 1

  0:  The event is very unlikely to occur.
  .5: The occurrence of the event is just as likely as it is unlikely.
  1:  The event is almost certain to occur.
Statistical Experiments

  Experiment              Experiment Outcomes
  Toss a coin             Head, tail
  Inspect a part          Defective, non-defective
  Conduct a sales call    Purchase, no purchase
  Roll a die              1, 2, 3, 4, 5, 6
  Play a football game    Win, lose, tie
Investment Gain or Loss in 3 Months (in $000)

  Markley Oil    Collins Mining
       10              8
        5             −2
        0
      −20
A Counting Rule for Multiple-Step Experiments

If an experiment consists of a sequence of k steps in which there are n1 possible results for the first step, n2 possible results for the second step, and so on, then the total number of experimental outcomes is given by (n1)(n2) . . . (nk).

A helpful graphical representation of a multiple-step experiment is a tree diagram.
Markley Oil: n1 = 4
Collins Mining: n2 = 2
Total Number of Experimental Outcomes: n1n2 = (4)(2) = 8
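A minimal Python sketch of the counting rule, using the Markley Oil and Collins Mining payoffs from the table above; itertools.product forms the Cartesian product of the step results:

```python
from itertools import product

# One list of possible results per step of the two-step experiment
markley_oil = [10, 5, 0, -20]   # n1 = 4 possible gains/losses (in $000)
collins_mining = [8, -2]        # n2 = 2 possible gains/losses (in $000)

# Counting rule: n1 * n2 = (4)(2) = 8 experimental outcomes
outcomes = list(product(markley_oil, collins_mining))
print(len(outcomes))            # 8
for m, c in outcomes:
    print(f"Markley: {m:>3}, Collins: {c:>2}, combined: {m + c:>3}")
```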
Number of Combinations of N Objects Taken n at a Time

A second useful counting rule enables us to count the number of experimental outcomes when n objects are to be selected from a set of N objects.

  C_n^N = \binom{N}{n} = \frac{N!}{n!\,(N - n)!}
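As a quick check, Python’s standard library exposes this rule directly as math.comb (Python 3.8+); the values N = 5 and n = 2 below are arbitrary illustrative choices:

```python
from math import comb, factorial

N, n = 5, 2   # illustrative values only

# math.comb computes C(N, n); the second line evaluates the formula directly
print(comb(N, n))                                          # 10
print(factorial(N) // (factorial(n) * factorial(N - n)))   # 10, same result
```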
Number of Permutations of N Objects Taken n at a Time

A third useful counting rule enables us to count the number of experimental outcomes when n objects are to be selected from a set of N objects, where the order of selection is important.

  P_n^N = n!\binom{N}{n} = \frac{N!}{(N - n)!}
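Likewise, math.perm (Python 3.8+) evaluates the permutation rule, again with arbitrary illustrative values:

```python
from math import perm, comb, factorial

N, n = 5, 2   # illustrative values only

# math.perm computes P(N, n) = N! / (N - n)!
print(perm(N, n))                  # 20
print(factorial(n) * comb(N, n))   # 20, via P = n! * C(N, n)
```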
The probabilities assigned to the experimental outcomes must sum to 1:

  P(E1) + P(E2) + . . . + P(En) = 1

where n is the number of experimental outcomes.
Classical Method
  Assigning probabilities based on the assumption of equally likely outcomes

Relative Frequency Method
  Assigning probabilities based on experimentation or historical data

Subjective Method
  Assigning probabilities based on judgment
Experiment: Rolling a die

  Sample Space: S = {1, 2, 3, 4, 5, 6}
  Probabilities: Each sample point has a 1/6 chance of occurring.
  Number of          Number
  Polishers Rented   of Days   Probability
         0              4         .10  (= 4/40)
         1              6         .15
         2             18         .45
         3             10         .25
         4              2         .05
                       40        1.00
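A minimal sketch of the relative frequency method in Python, using the rental counts above; each probability is the observed frequency divided by the total of 40 days:

```python
# Number of days each rental count was observed (40 days of data in total)
days_by_polishers = {0: 4, 1: 6, 2: 18, 3: 10, 4: 2}

total_days = sum(days_by_polishers.values())   # 40

# Relative frequency method: probability = observed frequency / total
probabilities = {k: d / total_days for k, d in days_by_polishers.items()}
print(probabilities)                 # {0: 0.1, 1: 0.15, 2: 0.45, 3: 0.25, 4: 0.05}
print(sum(probabilities.values()))   # 1.0, satisfying the basic requirement
```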
Subjective Method

When economic conditions and a company’s circumstances change rapidly, it might be inappropriate to assign probabilities based solely on historical data.

We can use any data available as well as our experience and intuition, but ultimately a probability value should express our degree of belief that the experimental outcome will occur.

The best probability estimates often are obtained by combining the estimates from the classical or relative frequency approach with the subjective estimate.
Some Basic Relationships of Probability
  Complement of an Event
  Union of Two Events
  Intersection of Two Events
  Mutually Exclusive Events
Complement of an Event

  Venn diagram: event A and its complement Ac within the sample space S.
Union of Two Events

  Venn diagram: events A and B within sample space S; the union A ∪ B contains all sample points belonging to A, to B, or to both.

Intersection of Two Events

  Venn diagram: events A and B within sample space S; the intersection A ∩ B contains the sample points belonging to both A and B.

Mutually Exclusive Events

  Mutually exclusive events have no sample points in common, so the addition law reduces to P(A ∪ B) = P(A) + P(B); there is no need to include “− P(A ∩ B)”.
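These relationships map directly onto Python’s set operations; the sketch below uses a die-roll sample space and made-up events A and B purely for illustration:

```python
# Illustrative sample space and events (made-up values)
S = {1, 2, 3, 4, 5, 6}   # sample space for rolling a die
A = {1, 2, 3}
B = {3, 4}

print(S - A)    # complement Ac = {4, 5, 6}
print(A | B)    # union A ∪ B = {1, 2, 3, 4}
print(A & B)    # intersection A ∩ B = {3}

# Mutually exclusive events share no sample points
C, D = {1, 2}, {5, 6}
print(C & D == set())   # True

def p(event):
    """Probability under equally likely outcomes."""
    return len(event) / len(S)

# Addition law: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
print(abs(p(A | B) - (p(A) + p(B) - p(A & B))) < 1e-12)   # True
```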
                        Collins Mining
  Markley Oil           Profitable (C)   Not Profitable (Cc)   Total
  Profitable (M)             .36               .34              .70
  Not Profitable (Mc)        .12               .18              .30
  Total                      .48               .52             1.00

Joint probabilities appear in the body of the table; marginal probabilities appear in the margins of the table.
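A short sketch showing how the marginal probabilities fall out of the joint probabilities in this table:

```python
# Joint probabilities from the table: (Markley, Collins) -> probability
joint = {("M", "C"): .36, ("M", "Cc"): .34,
         ("Mc", "C"): .12, ("Mc", "Cc"): .18}

# Marginal probabilities: sum the joint probabilities over the other event
p_m = sum(p for (m, c), p in joint.items() if m == "M")   # P(M) = 0.70
p_c = sum(p for (m, c), p in joint.items() if c == "C")   # P(C) = 0.48

print(p_m, p_c)
print(sum(joint.values()))   # sums to 1.00 (up to float rounding)
```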
Example: L. S. Clothiers
A proposed shopping center will provide strong
competition for downtown businesses like L. S.
Clothiers. If the shopping center is built, the owner
of L. S. Clothiers feels it would be best to relocate to
the shopping center.
Example: L. S. Clothiers

Let:
  A1 = town council approves the zoning change
  A2 = town council disapproves the change

Using subjective judgment: P(A1) = .7, P(A2) = .3
Example: L. S. Clothiers
The planning board has recommended against
the zoning change. Let B denote the event of a
negative recommendation by the planning board.
Example: L. S. Clothiers
Past history with the planning board and the town council indicates the following:

  P(B|A1) = .2    P(B|A2) = .9

Hence:
  P(Bc|A1) = .8   P(Bc|A2) = .1
Example: L. S. Clothiers
Tree diagram (Town Council → Planning Board → Experimental Outcomes):

  P(A1) = .7   P(B|A1) = .2    →   P(A1 ∩ B) = .14
               P(Bc|A1) = .8   →   P(A1 ∩ Bc) = .56
  P(A2) = .3   P(B|A2) = .9    →   P(A2 ∩ B) = .27
               P(Bc|A2) = .1   →   P(A2 ∩ Bc) = .03
Bayes’ Theorem

To find the posterior probability that event Ai will occur given that event B has occurred, we apply Bayes’ theorem:

  P(A_i \mid B) = \frac{P(A_i)\,P(B \mid A_i)}{P(A_1)\,P(B \mid A_1) + P(A_2)\,P(B \mid A_2) + \ldots + P(A_n)\,P(B \mid A_n)}
Example: L. S. Clothiers

Given the planning board’s recommendation not to approve the zoning change, we revise the prior probabilities as follows:

  P(A_1 \mid B) = \frac{P(A_1)\,P(B \mid A_1)}{P(A_1)\,P(B \mid A_1) + P(A_2)\,P(B \mid A_2)} = \frac{(.7)(.2)}{(.7)(.2) + (.3)(.9)} = .34
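The same revision in a few lines of Python, a minimal sketch using the priors and conditional probabilities given above:

```python
priors = {"A1": 0.7, "A2": 0.3}        # P(Ai): council approves / disapproves
likelihoods = {"A1": 0.2, "A2": 0.9}   # P(B|Ai): negative board recommendation

# Numerators of Bayes' theorem: joint probabilities P(Ai ∩ B)
joint = {a: priors[a] * likelihoods[a] for a in priors}
p_b = sum(joint.values())              # denominator P(B) = .41

posteriors = {a: j / p_b for a, j in joint.items()}
print(round(posteriors["A1"], 2))      # 0.34
print(round(posteriors["A2"], 2))      # 0.66
```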
Example: L. S. Clothiers

The planning board’s recommendation is good news for L. S. Clothiers. The posterior probability of the town council approving the zoning change is .34, compared to a prior probability of .70.
Example: L. S. Clothiers
Step 1
  Prepare the following three columns:
  Column 1 − The mutually exclusive events for which posterior probabilities are desired.
  Column 2 − The prior probabilities for the events.
  Column 3 − The conditional probabilities of the new information given each event.
Example: L. S. Clothiers

Step 1

  (1)      (2)             (3)             (4)   (5)
  Events   Prior           Conditional
  Ai       Probabilities   Probabilities
           P(Ai)           P(B|Ai)
  A1        .7              .2
  A2        .3              .9
           1.0
Bayes’ Theorem: Tabular Approach
Example: L. S. Clothiers
Step 2
  Prepare the fourth column:
  Column 4 − Compute the joint probabilities for each event and the new information B by using the multiplication law: multiply the prior probabilities in column 2 by the corresponding conditional probabilities in column 3. That is, P(Ai ∩ B) = P(Ai) P(B|Ai).
Example: L. S. Clothiers

Step 2

  (1)      (2)             (3)             (4)              (5)
  Events   Prior           Conditional     Joint
  Ai       Probabilities   Probabilities   Probabilities
           P(Ai)           P(B|Ai)         P(Ai ∩ B)
  A1        .7              .2              .14  (= .7 × .2)
  A2        .3              .9              .27
           1.0
Bayes’ Theorem: Tabular Approach
Example: L. S. Clothiers

Step 2 (continued)
  We see that there is a .14 probability of the town council approving the zoning change and a negative recommendation by the planning board.
  There is a .27 probability of the town council disapproving the zoning change and a negative recommendation by the planning board.
Example: L. S. Clothiers
Step 3
  Sum the joint probabilities in column 4. The sum is the probability of the new information, P(B). The sum .14 + .27 shows an overall probability of .41 of a negative recommendation by the planning board.
Example: L. S. Clothiers

Step 3

  (1)      (2)             (3)             (4)              (5)
  Events   Prior           Conditional     Joint
  Ai       Probabilities   Probabilities   Probabilities
           P(Ai)           P(B|Ai)         P(Ai ∩ B)
  A1        .7              .2              .14
  A2        .3              .9              .27
           1.0                             P(B) = .41
Bayes’ Theorem: Tabular Approach
Example: L. S. Clothiers
Step 4
  Prepare the fifth column:
  Column 5 − Compute the posterior probabilities using the basic relationship of conditional probability:

    P(A_i \mid B) = \frac{P(A_i \cap B)}{P(B)}

  The joint probabilities P(Ai ∩ B) are in column 4 and the probability P(B) is the sum of column 4.
Bayes’ Theorem: Tabular Approach
Example: L. S. Clothiers
Step 4

  (1)      (2)             (3)             (4)              (5)
  Events   Prior           Conditional     Joint            Posterior
  Ai       Probabilities   Probabilities   Probabilities    Probabilities
           P(Ai)           P(B|Ai)         P(Ai ∩ B)        P(Ai|B)
  A1        .7              .2              .14              .3415  (= .14/.41)
  A2        .3              .9              .27              .6585
           1.0                             P(B) = .41       1.0000
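To close the tabular approach, here is a compact Python sketch that reproduces columns (2) through (5) for the L. S. Clothiers numbers:

```python
events = ["A1", "A2"]
prior = {"A1": 0.7, "A2": 0.3}    # column (2)
cond = {"A1": 0.2, "A2": 0.9}     # column (3): P(B|Ai)

# Column (4): joint probabilities via the multiplication law
joint = {a: prior[a] * cond[a] for a in events}
p_b = sum(joint.values())         # P(B), the sum of column (4)

# Column (5): posterior probabilities P(Ai|B) = P(Ai ∩ B) / P(B)
posterior = {a: joint[a] / p_b for a in events}

print(" Ai   P(Ai)  P(B|Ai)  P(Ai∩B)  P(Ai|B)")
for a in events:
    print(f" {a}   {prior[a]:.2f}   {cond[a]:.2f}     {joint[a]:.2f}    {posterior[a]:.4f}")
print(f"P(B) = {p_b:.2f}")        # .41
```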
End of Chapter 4