
SLIDES BY
John Loucks
St. Edward’s University

© 2014 Cengage Learning. All Rights Reserved.
Chapter 4  Introduction to Probability

Experiments, Counting Rules, and Assigning Probabilities
Events and Their Probabilities
Some Basic Relationships of Probability
Conditional Probability
Bayes’ Theorem

Uncertainties

Managers often base their decisions on an analysis of uncertainties such as the following:
What are the chances that sales will decrease if we increase prices?
What is the likelihood a new assembly method will increase productivity?
What are the odds that a new investment will be profitable?

Probability

Probability is a numerical measure of the likelihood that an event will occur.
Probability values are always assigned on a scale from 0 to 1.
A probability near zero indicates an event is quite unlikely to occur.
A probability near one indicates an event is almost certain to occur.

Probability as a Numerical Measure of the Likelihood of Occurrence

[Figure: probability scale from 0 to 1, increasing likelihood of occurrence]
Probability near 0:  the event is very unlikely to occur.
Probability of .5:   the occurrence of the event is just as likely as it is unlikely.
Probability near 1:  the event is almost certain to occur.
Statistical Experiments

In statistics, the notion of an experiment differs somewhat from that of an experiment in the physical sciences.
In statistical experiments, probability determines outcomes.
Even though the experiment is repeated in exactly the same way, an entirely different outcome may occur.
For this reason, statistical experiments are sometimes called random experiments.
An Experiment and Its Sample Space

An experiment is any process that generates well-defined outcomes.
The sample space for an experiment is the set of all experimental outcomes.
An experimental outcome is also called a sample point.

An Experiment and Its Sample Space

Experiment                Experiment Outcomes
Toss a coin               Head, tail
Inspect a part            Defective, non-defective
Conduct a sales call      Purchase, no purchase
Roll a die                1, 2, 3, 4, 5, 6
Play a football game      Win, lose, tie

An Experiment and Its Sample Space

Example: Bradley Investments
Bradley has invested in two stocks, Markley Oil and Collins Mining. Bradley has determined that the possible outcomes of these investments three months from now are as follows.

Investment Gain or Loss in 3 Months (in $000)
Markley Oil:      10, 5, 0, −20
Collins Mining:    8, −2
A Counting Rule for Multiple-Step Experiments

If an experiment consists of a sequence of k steps in which there are n1 possible results for the first step, n2 possible results for the second step, and so on, then the total number of experimental outcomes is given by (n1)(n2) . . . (nk).

A helpful graphical representation of a multiple-step experiment is a tree diagram.

A Counting Rule for Multiple-Step Experiments

Example: Bradley Investments
Bradley Investments can be viewed as a two-step experiment. It involves two stocks, each with a set of experimental outcomes.

Markley Oil:      n1 = 4
Collins Mining:   n2 = 2
Total Number of Experimental Outcomes:  n1n2 = (4)(2) = 8
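To make the counting rule concrete, here is a minimal Python sketch (the gain/loss values come from the Bradley example above) that enumerates the outcomes and confirms the (n1)(n2) count.

    # Enumerate the two-step Bradley Investments experiment.
    from itertools import product

    markley_oil = [10, 5, 0, -20]   # n1 = 4 possible results (gain/loss in $000)
    collins_mining = [8, -2]        # n2 = 2 possible results (gain/loss in $000)

    outcomes = list(product(markley_oil, collins_mining))
    print(len(outcomes))              # 8, matching (n1)(n2) = (4)(2)
    print(outcomes[0], outcomes[-1])  # (10, 8) and (-20, -2)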

Tree Diagram

Example: Bradley Investments

Markley Oil (Stage 1)   Collins Mining (Stage 2)   Experimental Outcomes
Gain 10                 Gain 8                     (10, 8)    Gain $18,000
                        Lose 2                     (10, −2)   Gain $8,000
Gain 5                  Gain 8                     (5, 8)     Gain $13,000
                        Lose 2                     (5, −2)    Gain $3,000
Even (0)                Gain 8                     (0, 8)     Gain $8,000
                        Lose 2                     (0, −2)    Lose $2,000
Lose 20                 Gain 8                     (−20, 8)   Lose $12,000
                        Lose 2                     (−20, −2)  Lose $22,000
Counting Rule for Combinations

Number of Combinations of N Objects Taken n at a Time
A second useful counting rule enables us to count the number of experimental outcomes when n objects are to be selected from a set of N objects.

    C(N, n) = N! / (n!(N − n)!)

where:  N! = N(N − 1)(N − 2) . . . (2)(1)
        n! = n(n − 1)(n − 2) . . . (2)(1)
        0! = 1
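As an illustrative sketch (the values N = 5 and n = 2 are chosen here for illustration, not taken from the slides), Python's math.comb evaluates this formula directly.

    # "N choose n": math.comb(N, n) equals N! / (n!(N - n)!).
    import math

    N, n = 5, 2
    print(math.comb(N, n))                                                   # 10
    print(math.factorial(N) // (math.factorial(n) * math.factorial(N - n)))  # 10, same value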
Counting Rule for Permutations

Number of Permutations of N Objects Taken n at a Time
A third useful counting rule enables us to count the number of experimental outcomes when n objects are to be selected from a set of N objects, where the order of selection is important.

    P(N, n) = n! C(N, n) = N! / (N − n)!

where:  N! = N(N − 1)(N − 2) . . . (2)(1)
        n! = n(n − 1)(n − 2) . . . (2)(1)
        0! = 1
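A companion sketch for permutations, again with the illustrative values N = 5 and n = 2: math.perm counts ordered selections, and each combination contributes n! orderings.

    # Ordered selections: math.perm(N, n) equals N! / (N - n)!.
    import math

    N, n = 5, 2
    print(math.perm(N, n))                      # 20
    print(math.comb(N, n) * math.factorial(n))  # 20: n! orderings per combination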
Assigning Probabilities

Basic Requirements for Assigning Probabilities
1. The probability assigned to each experimental outcome must be between 0 and 1, inclusively.

    0 ≤ P(Ei) ≤ 1 for all i

where:  Ei is the ith experimental outcome
        and P(Ei) is its probability

Assigning Probabilities

Basic Requirements for Assigning Probabilities
2. The sum of the probabilities for all experimental outcomes must equal 1.

    P(E1) + P(E2) + . . . + P(En) = 1

where:  n is the number of experimental outcomes
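Both requirements can be checked mechanically; the helper below is a hypothetical sketch, not something from the text.

    # Check that each P(Ei) lies in [0, 1] and that the probabilities sum to 1.
    def is_valid_assignment(probs, tol=1e-9):
        in_range = all(0.0 <= p <= 1.0 for p in probs)
        sums_to_one = abs(sum(probs) - 1.0) <= tol
        return in_range and sums_to_one

    print(is_valid_assignment([1/6] * 6))   # True: the classical assignment for a die
    print(is_valid_assignment([0.5, 0.6]))  # False: these sum to 1.1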

Assigning Probabilities

Classical Method
Assigning probabilities based on the assumption of equally likely outcomes

Relative Frequency Method
Assigning probabilities based on experimentation or historical data

Subjective Method
Assigning probabilities based on judgment

Classical Method

Example: Rolling a Die
If an experiment has n possible outcomes, the classical method would assign a probability of 1/n to each outcome.

Experiment:     Rolling a die
Sample Space:   S = {1, 2, 3, 4, 5, 6}
Probabilities:  Each sample point has a 1/6 chance of occurring
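A one-line sketch of the classical assignment for the die experiment (Fraction is used only to keep the 1/6 values exact).

    # Classical method: with n equally likely outcomes, each outcome gets probability 1/n.
    from fractions import Fraction

    sample_space = [1, 2, 3, 4, 5, 6]
    probs = {outcome: Fraction(1, len(sample_space)) for outcome in sample_space}
    print(probs[3])             # 1/6
    print(sum(probs.values()))  # 1, so the assignment is valid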

Relative Frequency Method

Example: Lucas Tool Rental
Lucas Tool Rental would like to assign probabilities to the number of car polishers it rents each day. Office records show the following frequencies of daily rentals for the last 40 days.

Number of Polishers Rented    Number of Days
            0                        4
            1                        6
            2                       18
            3                       10
            4                        2
Relative Frequency Method

Example: Lucas Tool Rental
Each probability assignment is given by dividing the frequency (number of days) by the total frequency (total number of days).

Number of Polishers Rented    Number of Days    Probability
            0                        4              .10  (= 4/40)
            1                        6              .15
            2                       18              .45
            3                       10              .25
            4                        2              .05
                                    40             1.00
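The same arithmetic as a short Python sketch, using the rental frequencies from the table above.

    # Relative frequency method: divide each frequency by the total number of days.
    rentals = {0: 4, 1: 6, 2: 18, 3: 10, 4: 2}   # polishers rented -> number of days

    total_days = sum(rentals.values())            # 40
    probabilities = {k: days / total_days for k, days in rentals.items()}
    print(probabilities)                           # {0: 0.1, 1: 0.15, 2: 0.45, 3: 0.25, 4: 0.05}
    print(round(sum(probabilities.values()), 10))  # 1.0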
Subjective Method

When economic conditions and a company’s circumstances change rapidly, it might be inappropriate to assign probabilities based solely on historical data.
We can use any data available as well as our experience and intuition, but ultimately a probability value should express our degree of belief that the experimental outcome will occur.
The best probability estimates often are obtained by combining the estimates from the classical or relative frequency approach with the subjective estimate.
Subjective Method

Example: Bradley Investments
An analyst made the following probability estimates.

Exper. Outcome    Net Gain or Loss    Probability
  (10, 8)          $18,000 Gain           .20
  (10, −2)          $8,000 Gain           .08
  (5, 8)           $13,000 Gain           .16
  (5, −2)           $3,000 Gain           .26
  (0, 8)            $8,000 Gain           .10
  (0, −2)           $2,000 Loss           .12
  (−20, 8)         $12,000 Loss           .02
  (−20, −2)        $22,000 Loss           .06
Events and Their Probabilities

An event is a collection of sample points.
The probability of any event is equal to the sum of the probabilities of the sample points in the event.
If we can identify all the sample points of an experiment and assign a probability to each, we can compute the probability of an event.

Events and Their Probabilities

Example: Bradley Investments
Event M = Markley Oil Profitable
M = {(10, 8), (10, −2), (5, 8), (5, −2)}
P(M) = P(10, 8) + P(10, −2) + P(5, 8) + P(5, −2)
     = .20 + .08 + .16 + .26
     = .70

Events and Their Probabilities

Example: Bradley Investments
Event C = Collins Mining Profitable
C = {(10, 8), (5, 8), (0, 8), (−20, 8)}
P(C) = P(10, 8) + P(5, 8) + P(0, 8) + P(−20, 8)
     = .20 + .16 + .10 + .02
     = .48
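A short sketch that recomputes P(M) and P(C) by summing sample-point probabilities; the eight point probabilities are the analyst's subjective estimates listed earlier.

    # Probability of an event = sum of the probabilities of its sample points.
    point_probs = {(10, 8): .20, (10, -2): .08, (5, 8): .16, (5, -2): .26,
                   (0, 8): .10, (0, -2): .12, (-20, 8): .02, (-20, -2): .06}

    M = [pt for pt in point_probs if pt[0] > 0]   # Markley Oil profitable
    C = [pt for pt in point_probs if pt[1] > 0]   # Collins Mining profitable

    def P(event):
        return sum(point_probs[pt] for pt in event)

    print(round(P(M), 2), round(P(C), 2))         # 0.7 0.48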

Some Basic Relationships of Probability

There are some basic probability relationships that can be used to compute the probability of an event without knowledge of all the sample point probabilities.

Complement of an Event
Union of Two Events
Intersection of Two Events
Mutually Exclusive Events

Complement of an Event

The complement of event A is defined to be the event consisting of all sample points that are not in A.
The complement of A is denoted by Ac.

[Venn diagram: event A and its complement Ac within sample space S]
Union of Two Events

The union of events A and B is the event containing all sample points that are in A or B or both.
The union of events A and B is denoted by A ∪ B.

[Venn diagram: events A and B within sample space S]

Union of Two Events

Example: Bradley Investments
Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
M ∪ C = Markley Oil Profitable or Collins Mining Profitable (or both)
M ∪ C = {(10, 8), (10, −2), (5, 8), (5, −2), (0, 8), (−20, 8)}
P(M ∪ C) = P(10, 8) + P(10, −2) + P(5, 8) + P(5, −2) + P(0, 8) + P(−20, 8)
         = .20 + .08 + .16 + .26 + .10 + .02
         = .82
Intersection of Two Events

The intersection of events A and B is the set of all sample points that are in both A and B.
The intersection of events A and B is denoted by A ∩ B.

[Venn diagram: events A and B within sample space S, with the intersection of A and B shaded]
Intersection of Two Events

Example: Bradley Investments
Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
M ∩ C = Markley Oil Profitable and Collins Mining Profitable
M ∩ C = {(10, 8), (5, 8)}
P(M ∩ C) = P(10, 8) + P(5, 8)
         = .20 + .16
         = .36

Addition Law

The addition law provides a way to compute the probability of event A, or B, or both A and B occurring.
The law is written as:

    P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Addition Law

Example: Bradley Investments
Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
M ∪ C = Markley Oil Profitable or Collins Mining Profitable
We know:  P(M) = .70, P(C) = .48, P(M ∩ C) = .36
Thus:  P(M ∪ C) = P(M) + P(C) − P(M ∩ C)
                = .70 + .48 − .36
                = .82
(This result is the same as that obtained earlier using the definition of the probability of an event.)
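The same check as a two-line Python sketch.

    # Addition law: P(M ∪ C) = P(M) + P(C) − P(M ∩ C).
    P_M, P_C, P_M_and_C = .70, .48, .36
    print(round(P_M + P_C - P_M_and_C, 2))   # 0.82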
Mutually Exclusive Events

Two events are said to be mutually exclusive if the events have no sample points in common.
Two events are mutually exclusive if, when one event occurs, the other cannot occur.

[Venn diagram: disjoint events A and B within sample space S]

Mutually Exclusive Events

If events A and B are mutually exclusive, P(A ∩ B) = 0.
The addition law for mutually exclusive events is:

    P(A ∪ B) = P(A) + P(B)

There is no need to include "− P(A ∩ B)".

Conditional Probability

The probability of an event given that another event has occurred is called a conditional probability.
The conditional probability of A given B is denoted by P(A|B).
A conditional probability is computed as follows:

    P(A|B) = P(A ∩ B) / P(B)

Conditional Probability

Example: Bradley Investments
Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
P(C|M) = Collins Mining Profitable given Markley Oil Profitable
We know:  P(M ∩ C) = .36, P(M) = .70
Thus:  P(C|M) = P(M ∩ C) / P(M) = .36 / .70 = .5143
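In a quick sketch:

    # Conditional probability: P(C|M) = P(M ∩ C) / P(M).
    P_M, P_M_and_C = .70, .36
    print(round(P_M_and_C / P_M, 4))   # 0.5143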

Multiplication Law

The multiplication law provides a way to compute the probability of the intersection of two events.
The law is written as:

    P(A ∩ B) = P(B)P(A|B)

Multiplication Law

Example: Bradley Investments
Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
M ∩ C = Markley Oil Profitable and Collins Mining Profitable
We know:  P(M) = .70, P(C|M) = .5143
Thus:  P(M ∩ C) = P(M)P(C|M)
                = (.70)(.5143)
                = .36
(This result is the same as that obtained earlier using the definition of the probability of an event.)
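And the reverse direction, again as a sketch:

    # Multiplication law: P(M ∩ C) = P(M)P(C|M).
    P_M, P_C_given_M = .70, .5143
    print(round(P_M * P_C_given_M, 2))   # 0.36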
Joint Probability Table

                         Collins Mining
Markley Oil           Profitable (C)   Not Profitable (Cc)   Total
Profitable (M)             .36                .34             .70
Not Profitable (Mc)        .12                .18             .30
Total                      .48                .52            1.00

Joint probabilities appear in the body of the table.
Marginal probabilities appear in the margins of the table.
Independent Events

If the probability of event A is not changed by the existence of event B, we would say that events A and B are independent.
Two events A and B are independent if:

    P(A|B) = P(A)   or   P(B|A) = P(B)

Multiplication Law for Independent Events

The multiplication law also can be used as a test to see if two events are independent.
The law is written as:

    P(A ∩ B) = P(A)P(B)

Multiplication Law for Independent Events

Example: Bradley Investments
Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
Are events M and C independent?
Does P(M ∩ C) = P(M)P(C)?
We know:  P(M ∩ C) = .36, P(M) = .70, P(C) = .48
But:  P(M)P(C) = (.70)(.48) = .34, not .36
Hence:  M and C are not independent.
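The same test as a sketch; the tolerance guards against floating-point rounding.

    # Independence test: compare P(M ∩ C) with P(M)P(C).
    P_M, P_C, P_M_and_C = .70, .48, .36
    print(round(P_M * P_C, 3))                # 0.336, which is not 0.36
    print(abs(P_M * P_C - P_M_and_C) < 1e-9)  # False: M and C are not independent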

Mutual Exclusiveness and Independence

Do not confuse the notion of mutually exclusive events with that of independent events.
Two events with nonzero probabilities cannot be both mutually exclusive and independent.
If one mutually exclusive event is known to occur, the other cannot occur; thus, the probability of the other event occurring is reduced to zero (and they are therefore dependent).
Two events that are not mutually exclusive might or might not be independent.
Bayes’ Theorem

Often we begin probability analysis with initial or prior probabilities.
Then, from a sample, special report, or a product test, we obtain some additional information.
Given this information, we calculate revised or posterior probabilities.
Bayes’ theorem provides the means for revising the prior probabilities.

Prior Probabilities → New Information → Application of Bayes’ Theorem → Posterior Probabilities

Bayes’ Theorem

Example: L. S. Clothiers
A proposed shopping center will provide strong competition for downtown businesses like L. S. Clothiers. If the shopping center is built, the owner of L. S. Clothiers feels it would be best to relocate to the shopping center.

The shopping center cannot be built unless a zoning change is approved by the town council.
The planning board must first make a recommendation, for or against the zoning change, to the council.
Prior Probabilities

Example: L. S. Clothiers
Let:
A1 = town council approves the zoning change
A2 = town council disapproves the change

Using subjective judgment:
P(A1) = .7, P(A2) = .3

New Information

Example: L. S. Clothiers
The planning board has recommended against the zoning change. Let B denote the event of a negative recommendation by the planning board.

Given that B has occurred, should L. S. Clothiers revise the probabilities that the town council will approve or disapprove the zoning change?

Conditional Probabilities

Example: L. S. Clothiers
Past history with the planning board and the town council indicates the following:

    P(B|A1) = .2      P(B|A2) = .9

Hence:
    P(Bc|A1) = .8     P(Bc|A2) = .1

Tree Diagram

Example: L. S. Clothiers

Town Council    Planning Board    Experimental Outcomes
P(A1) = .7      P(B|A1) = .2      P(A1 ∩ B) = .14
                P(Bc|A1) = .8     P(A1 ∩ Bc) = .56
P(A2) = .3      P(B|A2) = .9      P(A2 ∩ B) = .27
                P(Bc|A2) = .1     P(A2 ∩ Bc) = .03
Bayes’ Theorem

To find the posterior probability that event Ai will occur given that event B has occurred, we apply Bayes’ theorem:

    P(Ai|B) = P(Ai)P(B|Ai) / [P(A1)P(B|A1) + P(A2)P(B|A2) + . . . + P(An)P(B|An)]

Bayes’ theorem is applicable when the events for which we want to compute posterior probabilities are mutually exclusive and their union is the entire sample space.
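A minimal Python sketch of this computation (the function name posterior is mine, not from the text), using the L. S. Clothiers numbers introduced above as a test case.

    # Bayes' theorem: posterior P(Ai|B) from priors P(Ai) and likelihoods P(B|Ai).
    def posterior(priors, likelihoods):
        joints = [p * l for p, l in zip(priors, likelihoods)]   # P(Ai)P(B|Ai)
        p_b = sum(joints)                                       # P(B), the denominator
        return [j / p_b for j in joints]

    # L. S. Clothiers: P(A1) = .7, P(A2) = .3; P(B|A1) = .2, P(B|A2) = .9
    print([round(p, 4) for p in posterior([.7, .3], [.2, .9])])  # [0.3415, 0.6585]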
Posterior Probabilities

Example: L. S. Clothiers
Given the planning board’s recommendation not to approve the zoning change, we revise the prior probabilities as follows:

    P(A1|B) = P(A1)P(B|A1) / [P(A1)P(B|A1) + P(A2)P(B|A2)]
            = (.7)(.2) / [(.7)(.2) + (.3)(.9)]
            = .34

Posterior Probabilities

Example: L. S. Clothiers
The planning board’s recommendation is good news for L. S. Clothiers. The posterior probability of the town council approving the zoning change is .34, compared to a prior probability of .70.

Bayes’ Theorem: Tabular Approach

Example: L. S. Clothiers
Step 1
Prepare the following three columns:
Column 1 − The mutually exclusive events for which posterior probabilities are desired.
Column 2 − The prior probabilities for the events.
Column 3 − The conditional probabilities of the new information given each event.

Bayes’ Theorem: Tabular Approach

Example: L. S. Clothiers
Step 1

  (1)        (2)             (3)
 Events     Prior           Conditional
   Ai       Probabilities   Probabilities
            P(Ai)           P(B|Ai)
   A1         .7              .2
   A2         .3              .9
             1.0
Bayes’ Theorem: Tabular Approach

Example: L. S. Clothiers
Step 2
Prepare the fourth column:
Column 4
Compute the joint probabilities for each event and the new information B by using the multiplication law. Multiply the prior probabilities in column 2 by the corresponding conditional probabilities in column 3. That is, P(Ai ∩ B) = P(Ai)P(B|Ai).

Bayes’ Theorem: Tabular Approach

Example: L. S. Clothiers
Step 2

  (1)        (2)             (3)              (4)
 Events     Prior           Conditional      Joint
   Ai       Probabilities   Probabilities    Probabilities
            P(Ai)           P(B|Ai)          P(Ai ∩ B)
   A1         .7              .2              .14  (= .7 x .2)
   A2         .3              .9              .27
             1.0
Bayes’ Theorem: Tabular Approach

Example: L. S. Clothiers
Step 2 (continued)
We see that there is a .14 probability of the town council approving the zoning change and a negative recommendation by the planning board.
There is a .27 probability of the town council disapproving the zoning change and a negative recommendation by the planning board.

Bayes’ Theorem: Tabular Approach

Example: L. S. Clothiers
Step 3
Sum the joint probabilities in column 4. The sum is the probability of the new information, P(B). The sum .14 + .27 shows an overall probability of .41 of a negative recommendation by the planning board.

Bayes’ Theorem: Tabular Approach

Example: L. S. Clothiers
Step 3

  (1)        (2)             (3)              (4)
 Events     Prior           Conditional      Joint
   Ai       Probabilities   Probabilities    Probabilities
            P(Ai)           P(B|Ai)          P(Ai ∩ B)
   A1         .7              .2              .14
   A2         .3              .9              .27
             1.0                              P(B) = .41
Bayes’ Theorem: Tabular Approach

Example: L. S. Clothiers
Step 4
Prepare the fifth column:
Column 5
Compute the posterior probabilities using the basic relationship of conditional probability:

    P(Ai|B) = P(Ai ∩ B) / P(B)

The joint probabilities P(Ai ∩ B) are in column 4 and the probability P(B) is the sum of column 4.
Bayes’ Theorem: Tabular Approach

Example: L. S. Clothiers
Step 4

  (1)        (2)             (3)              (4)              (5)
 Events     Prior           Conditional      Joint            Posterior
   Ai       Probabilities   Probabilities    Probabilities    Probabilities
            P(Ai)           P(B|Ai)          P(Ai ∩ B)        P(Ai|B)
   A1         .7              .2              .14              .3415  (= .14/.41)
   A2         .3              .9              .27              .6585
             1.0                              P(B) = .41      1.0000
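The full tabular computation fits in a few lines of Python; this sketch mirrors columns (2) through (5) for the L. S. Clothiers example.

    # Bayes' theorem, tabular approach.
    priors      = {"A1": .7, "A2": .3}                               # column (2)
    likelihoods = {"A1": .2, "A2": .9}                               # column (3): P(B|Ai)
    joints      = {a: priors[a] * likelihoods[a] for a in priors}    # column (4): P(Ai ∩ B)
    p_b         = sum(joints.values())                               # P(B) = .41
    posteriors  = {a: joints[a] / p_b for a in joints}               # column (5): P(Ai|B)

    for a in priors:
        print(a, priors[a], likelihoods[a], round(joints[a], 2), round(posteriors[a], 4))
    # A1 0.7 0.2 0.14 0.3415
    # A2 0.3 0.9 0.27 0.6585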
End of Chapter 4
