QM Session 4

1) The document discusses probability theory concepts including the multiplication rule, independence of events, Bayes' theorem, sample spaces, counting rules, and methods for assigning probabilities.
2) It provides examples using an investment in two stocks to illustrate concepts such as determining the sample space, using a tree diagram and counting rules, and calculating probabilities of events.
3) Probability can be assigned using the classical, relative frequency, and subjective methods, and examples are given of calculating probabilities and applying rules such as the addition law and conditional probability.


Probability theory - Multiplication rule - Examples

Possible results for one die: 1, 2, 3, 4, 5, 6
A first result of 5 paired with a different second result: (5,1) (5,2) (5,3) (5,4) (5,6)
Probability theory - Independence of events
Bayes’ theorem
An Experiment and Its Sample Space
• Example: Bradley Investments
Bradley has invested in two stocks, Markley Oil and Collins
Mining. Bradley has determined that the possible outcomes of
these investments three months from now are as follows.
Investment Gain or Loss in 3 Months (in $1000s)
Markley Oil:      10,   5,   0,  -20
Collins Mining:    8,  -2

6
A Counting Rule for Multiple-Step
Experiments
• If an experiment consists of a sequence of k steps in which there are n1
possible results for the first step, n2 possible results for the second step, and
so on, then the total number of experimental outcomes is given by (n1)(n2) . . . (nk).
• A helpful graphical representation of a multiple-step experiment is a tree
diagram.

7
A Counting Rule for Multiple-Step
Experiments
• Example: Bradley Investments
• Bradley Investments can be viewed as a two-step experiment. It
involves two stocks, each with a set of experimental outcomes.
Markley Oil: n1 = 4
Collins Mining: n2 = 2
Total Number of
Experimental Outcomes: n1n2 = (4)(2) = 8
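As a quick sketch, the same count can be checked in Python by enumerating the two-step outcomes with itertools.product (the variable names below are illustrative, not from the slides):

from itertools import product

# Possible 3-month gains/losses (in $1000s) for each stock
markley_oil = [10, 5, 0, -20]    # n1 = 4 possible results
collins_mining = [8, -2]         # n2 = 2 possible results

# Each experimental outcome pairs one result from each step
outcomes = list(product(markley_oil, collins_mining))

print(len(outcomes))   # 8, matching n1 * n2 = (4)(2)
print(outcomes)        # [(10, 8), (10, -2), (5, 8), (5, -2), (0, 8), (0, -2), (-20, 8), (-20, -2)]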

8
Tree Diagram
• Example: Bradley Investments
Markley Oil (Stage 1)   Collins Mining (Stage 2)   Experimental Outcomes
Gain 10                 Gain 8                     (10, 8)    Gain $18,000
Gain 10                 Lose 2                     (10, -2)   Gain $8,000
Gain 5                  Gain 8                     (5, 8)     Gain $13,000
Gain 5                  Lose 2                     (5, -2)    Gain $3,000
Even                    Gain 8                     (0, 8)     Gain $8,000
Even                    Lose 2                     (0, -2)    Lose $2,000
Lose 20                 Gain 8                     (-20, 8)   Lose $12,000
Lose 20                 Lose 2                     (-20, -2)  Lose $22,000
9
Counting Rule for Combinations
• Number of Combinations of N Objects Taken n at a Time
• A second useful counting rule enables us to count the number
of experimental outcomes when n objects are to be selected
from a set of N objects.

C_n^N = \binom{N}{n} = \frac{N!}{n!\,(N-n)!}

where: N! = N(N - 1)(N - 2) . . . (2)(1)


n! = n(n - 1)(n - 2) . . . (2)(1)
0! = 1
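A quick numerical check, assuming Python 3.8+ where math.comb is available in the standard library:

import math

N, n = 5, 2   # choose n objects from a set of N, order not important

print(math.comb(N, n))   # 10

# Same value from the factorial formula N! / (n! (N - n)!)
print(math.factorial(N) // (math.factorial(n) * math.factorial(N - n)))   # 10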

10
Counting Rule for Permutations
• Number of Permutations of N Objects Taken n at a Time
• A third useful counting rule enables us to count the number of
experimental outcomes when n objects are to be selected from a set of
N objects, where the order of selection is important.

P_n^N = n!\binom{N}{n} = \frac{N!}{(N-n)!}

where: N! = N(N - 1)(N - 2) . . . (2)(1)


n! = n(n - 1)(n - 2) . . . (2)(1)
0! = 1
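Similarly, a short check with math.perm (Python 3.8+); a permutation count is n! times the corresponding combination count:

import math

N, n = 5, 2   # select n objects from a set of N, order important

print(math.perm(N, n))                              # 20
print(math.factorial(n) * math.comb(N, n))          # 20, i.e. n! * C(N, n)
print(math.factorial(N) // math.factorial(N - n))   # 20, i.e. N! / (N - n)!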

11
Classical Method
• Example: Rolling a Die
If an experiment has n possible outcomes, the classical
method would assign a probability of 1/n to each outcome.
Experiment: Rolling a die
Sample Space: S = {1, 2, 3, 4, 5, 6}
Probabilities: Each sample point has a 1/6 chance of occurring
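A minimal sketch of the classical assignment for the die experiment, where each of the n = 6 equally likely outcomes receives probability 1/n:

sample_space = [1, 2, 3, 4, 5, 6]
n = len(sample_space)

# Classical method: every outcome is assigned the same probability 1/n
probabilities = {outcome: 1 / n for outcome in sample_space}

print(probabilities[3])             # 0.1666... (= 1/6)
print(sum(probabilities.values()))  # 1.0 (up to floating-point rounding)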

12
Relative Frequency Method
• Example: Lucas Tool Rental
Lucas Tool Rental would like to assign probabilities to the number of car
polishers it rents each day. Office records show the following frequencies of
daily rentals for the last 40 days.
Number of Polishers Rented    Number of Days
0                              4
1                              6
2                             18
3                             10
4                              2

13
Relative Frequency Method
• Example: Lucas Tool Rental
Each probability assignment is given by dividing the frequency
(number of days) by the total frequency (total number of days).

Number of Polishers Rented    Number of Days    Probability
0                              4                .10 (= 4/40)
1                              6                .15
2                             18                .45
3                             10                .25
4                              2                .05
Total                         40                1.00
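The same relative-frequency assignment, sketched in Python from the rental counts above (the dictionary names are illustrative):

# Daily rental counts from the last 40 days of office records
days_by_rentals = {0: 4, 1: 6, 2: 18, 3: 10, 4: 2}
total_days = sum(days_by_rentals.values())   # 40

# Relative frequency method: probability = frequency / total frequency
probabilities = {rented: days / total_days for rented, days in days_by_rentals.items()}

print(probabilities)   # {0: 0.1, 1: 0.15, 2: 0.45, 3: 0.25, 4: 0.05}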

14
Subjective Method
• When economic conditions or a company’s circumstances change rapidly, it
might be inappropriate to assign probabilities based solely on historical data.
• We can use any data available as well as our experience and intuition, but
ultimately a probability value should express our degree of belief that the
experimental outcome will occur.
• The best probability estimates often are obtained by combining the estimates
from the classical or relative frequency approach with the subjective estimate.

15
Subjective Method
• Example: Bradley Investments
An analyst made the following probability estimates.
Experimental Outcome    Net Gain or Loss    Probability
(10, 8)                 $18,000 Gain        .20
(10, -2)                $8,000 Gain         .08
(5, 8)                  $13,000 Gain        .16
(5, -2)                 $3,000 Gain         .26
(0, 8)                  $8,000 Gain         .10
(0, -2)                 $2,000 Loss         .12
(-20, 8)                $12,000 Loss        .02
(-20, -2)               $22,000 Loss        .06
Total                                       1.00
16
Events and Their Probabilities
• Example: Bradley Investments
Event M = Markley Oil Profitable
M = {(10, 8), (10, -2), (5, 8), (5, -2)}
P(M) = P(10, 8) + P(10, -2) + P(5, 8) + P(5, -2)
= .20 + .08 + .16 + .26
= .70
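A short Python sketch combining the analyst's subjective assignments from the previous slide with this event calculation (the dictionary p is illustrative):

# Subjective probabilities for each (Markley Oil, Collins Mining) outcome
p = {(10, 8): .20, (10, -2): .08, (5, 8): .16, (5, -2): .26,
     (0, 8): .10, (0, -2): .12, (-20, 8): .02, (-20, -2): .06}

# Event M: Markley Oil profitable (positive first component)
M = [outcome for outcome in p if outcome[0] > 0]

p_M = sum(p[outcome] for outcome in M)
print(M)               # [(10, 8), (10, -2), (5, 8), (5, -2)]
print(round(p_M, 2))   # 0.7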

17
Events and Their Probabilities
• Example: Bradley Investments
Event C = Collins Mining Profitable
C = {(10, 8), (5, 8), (0, 8), (-20, 8)}
P(C) = P(10, 8) + P(5, 8) + P(0, 8) + P(-20, 8)
= .20 + .16 + .10 + .02
= .48

18
Union of Two Events
• Example: Bradley Investments
Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
M ∪ C = Markley Oil Profitable or Collins Mining Profitable (or both)
M ∪ C = {(10, 8), (10, -2), (5, 8), (5, -2), (0, 8), (-20, 8)}
P(M ∪ C) = P(10, 8) + P(10, -2) + P(5, 8) + P(5, -2) + P(0, 8) + P(-20, 8)
= .20 + .08 + .16 + .26 + .10 + .02
= .82

19
Intersection of Two Events
• Example: Bradley Investments
Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
M ∩ C = Markley Oil Profitable and Collins Mining Profitable
M ∩ C = {(10, 8), (5, 8)}
P(M ∩ C) = P(10, 8) + P(5, 8)
= .20 + .16
= .36

20
Addition Law
• Example: Bradley Investments
Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
M ∪ C = Markley Oil Profitable or Collins Mining Profitable
We know: P(M) = .70, P(C) = .48, P(M ∩ C) = .36
Thus: P(M ∪ C) = P(M) + P(C) - P(M ∩ C)
= .70 + .48 - .36
= .82
(This result is the same as that obtained earlier
using the definition of the probability of an event.)
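Continuing the same sketch, the union probability can be computed both directly and via the addition law (p is the illustrative outcome-probability dictionary from the earlier snippet, repeated so the example runs on its own):

p = {(10, 8): .20, (10, -2): .08, (5, 8): .16, (5, -2): .26,
     (0, 8): .10, (0, -2): .12, (-20, 8): .02, (-20, -2): .06}

M = {o for o in p if o[0] > 0}   # Markley Oil profitable
C = {o for o in p if o[1] > 0}   # Collins Mining profitable

p_M = sum(p[o] for o in M)               # .70
p_C = sum(p[o] for o in C)               # .48
p_M_and_C = sum(p[o] for o in M & C)     # .36, intersection
p_M_or_C = sum(p[o] for o in M | C)      # .82, union computed directly

# Addition law: P(M ∪ C) = P(M) + P(C) - P(M ∩ C)
print(round(p_M + p_C - p_M_and_C, 2))   # 0.82
print(round(p_M_or_C, 2))                # 0.82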

21
Conditional Probability
• The probability of an event given that another event has occurred is called a
conditional probability.
• The conditional probability of A given B is denoted by P(A|B).
• A conditional probability is computed as follows:

P(A \mid B) = \frac{P(A \cap B)}{P(B)}

22
Conditional Probability
• Example: Bradley Investments
Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
P(C|M) = Collins Mining Profitable given Markley Oil Profitable
We know: P(M ∩ C) = .36, P(M) = .70
Thus: P(C \mid M) = \frac{P(C \cap M)}{P(M)} = \frac{.36}{.70} = .5143
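The same ratio as a two-line Python check (values taken from the slides above):

p_M_and_C = 0.36   # P(M ∩ C), from the intersection slide
p_M = 0.70         # P(M)

print(round(p_M_and_C / p_M, 4))   # 0.5143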

23
Multiplication Law
• The multiplication law provides a way to compute the probability of the
intersection of two events.
• The law is written as:

P(A ∩ B) = P(B)P(A|B)

24
Multiplication Law
• Example: Bradley Investments
Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
M ∩ C = Markley Oil Profitable and Collins Mining Profitable
We know: P(M) = .70, P(C|M) = .5143
Thus: P(M ∩ C) = P(M)P(C|M)
= (.70)(.5143)
= .36
(This result is the same as that obtained earlier
using the definition of the probability of an event.)

25
Multiplication Law for Independent Events
• Example: Bradley Investments
Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
Are events M and C independent?
Does P(M ∩ C) = P(M)P(C)?
We know: P(M ∩ C) = .36, P(M) = .70, P(C) = .48
But: P(M)P(C) = (.70)(.48) = .336, which is not equal to .36
Hence: M and C are not independent.
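The independence check, sketched in Python with the values above:

p_M, p_C = 0.70, 0.48
p_M_and_C = 0.36

# Independence would require P(M ∩ C) = P(M) P(C)
print(round(p_M * p_C, 3))                 # 0.336, which differs from 0.36
print(round(p_M * p_C, 3) == p_M_and_C)    # False, so M and C are not independent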

26
Bayes’ Theorem
• Often we begin probability analysis with initial or prior probabilities.
• Then, from a sample, special report, or a product test we obtain some
additional information.
• Given this information, we calculate revised or posterior probabilities.
• Bayes’ theorem provides the means for revising the prior probabilities.

Prior Probabilities → New Information → Application of Bayes’ Theorem → Posterior Probabilities

27
Bayes’ Theorem
• Example: L. S. Clothiers
A proposed shopping center will provide strong competition for downtown
businesses like L. S. Clothiers. If the shopping center is built, the owner of L. S.
Clothiers feels it would be best to relocate to the shopping center.

The shopping center cannot be built unless a zoning change is approved by the
town council. The planning board must first make a recommendation, for or
against the zoning change, to the council.

28
Prior Probabilities
• Example: L. S. Clothiers
Let:
A1 = town council approves the zoning change
A2 = town council disapproves the change
Using subjective judgment:
P(A1) = .7, P(A2) = .3

29
New Information
• Example: L. S. Clothiers
The planning board has recommended against the zoning change. Let B
denote the event of a negative recommendation by the planning board.
Given that B has occurred, should L. S. Clothiers revise the probabilities
that the town council will approve or disapprove the zoning change?

30
Conditional Probabilities
• Example: L. S. Clothiers
Past history with the planning board and the town council indicates
the following:
P(B|A1) = .2 and P(B|A2) = .9
Hence: P(Bᶜ|A1) = .8 and P(Bᶜ|A2) = .1

P(B|A1) = the proportion of cases with a negative board recommendation among the cases the council approved

31
Tree Diagram
• Example: L. S. Clothiers
Town Council        Planning Board        Experimental Outcomes
P(A1) = .7          P(B|A1)  = .2         P(A1 ∩ B)  = .14
                    P(Bᶜ|A1) = .8         P(A1 ∩ Bᶜ) = .56
P(A2) = .3          P(B|A2)  = .9         P(A2 ∩ B)  = .27
                    P(Bᶜ|A2) = .1         P(A2 ∩ Bᶜ) = .03
                                          Total      = 1.00

32
Bayes’ Theorem
• To find the posterior probability that event Ai will occur given that event B
has occurred, we apply Bayes’ theorem.

P(A_i \mid B) = \frac{P(A_i)\,P(B \mid A_i)}{P(A_1)P(B \mid A_1) + P(A_2)P(B \mid A_2) + \cdots + P(A_n)P(B \mid A_n)}

• Bayes’ theorem is applicable when the events for which we want to compute
posterior probabilities are mutually exclusive and their union is the entire
sample space.
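A minimal sketch of the theorem as a Python function; the function name and list-based interface are illustrative, and the numbers used are the L. S. Clothiers values from the following slides:

def posteriors(priors, likelihoods):
    # priors[i] = P(Ai); likelihoods[i] = P(B | Ai)
    # Events A1..An are assumed mutually exclusive and collectively exhaustive.
    joint = [p * l for p, l in zip(priors, likelihoods)]   # P(Ai) P(B | Ai)
    p_b = sum(joint)                                       # denominator: P(B)
    return [j / p_b for j in joint]                        # P(Ai | B)

# A1 = council approves, A2 = council disapproves; B = negative board recommendation
post = posteriors([0.7, 0.3], [0.2, 0.9])
print([round(x, 2) for x in post])   # [0.34, 0.66]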

33
Posterior Probabilities
• Example: L. S. Clothiers
Given the planning board’s recommendation not to approve the
zoning change, we revise the prior probabilities as follows:

P(A_1 \mid B) = \frac{P(A_1)\,P(B \mid A_1)}{P(A_1)P(B \mid A_1) + P(A_2)P(B \mid A_2)} = \frac{(.7)(.2)}{(.7)(.2) + (.3)(.9)} = \frac{.14}{.41} = .34

34
Posterior Probabilities
• Example: L. S. Clothiers
The planning board’s recommendation is good news for L. S. Clothiers.
The posterior probability of the town council approving the zoning
change is .34 compared to a prior probability of .70.

35
