Module 2 - Probability Concepts and Applications

This document outlines the key concepts and formulas for probability covered in Module 2 of the Quantitative Analysis course, including fundamental probability concepts, rules, and types of probability. Conditional probability and Bayes' theorem are introduced, along with how to calculate probabilities of unions, intersections, and complements of events. Examples demonstrate independent and dependent events and how to apply the conditional probability formula in different scenarios.

Uploaded by

nkrumah prince
Copyright
© © All Rights Reserved
Available Formats
Download as PPTX, PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
43 views

Module 2 - Probability Concepts and Applications

1) This document outlines the key concepts and formulas for probability that will be covered in Module 2 of the Quantitative Analysis course, including fundamental probability concepts, rules, and types of probability. 2) Conditional probability and Bayes' theorem are introduced, as well as how to calculate probabilities of unions, intersections, and complements of events. 3) Examples are provided to demonstrate independent and dependent events, and how to apply the conditional probability formula in different scenarios.

Uploaded by

nkrumah prince
Copyright
© © All Rights Reserved
Available Formats
Download as PPTX, PDF, TXT or read online on Scribd
You are on page 1/ 67

QUANTITATIVE ANALYSIS (ISD 551)
Lecturer: Dr. Emmanuel Quansah
Department: Supply Chain and Information Systems - KSB
Office: SF 25, KSB Undergraduate Block

Module 2: Probability Concepts and Applications

Module Outline
1. Fundamental Concepts
2. Revision of Probability Rules
3. Conditional Probability
4. Bayes Theorem
5. Decision Trees
6. Normal Distribution
Introduction
• Life is uncertain; we are not sure what the future will bring.
• Probability is the likelihood that an outcome occurs. Probabilities are expressed as values between 0 and 1.
• An experiment is the process that results in an outcome.
• The outcome of an experiment is a result that we observe.
• An event is a collection of one or more outcomes from a sample space.
• The sample space is the collection of all possible outcomes of an experiment.
Management and Uncertainties
Managers often base their decisions on an analysis of uncertainties such as the following:
• What are the chances that sales will decrease if we increase prices?
• What is the likelihood a new assembly method will increase productivity?
• What are the odds that a new investment will be profitable?
Two Basic Rules of Probability
Rule 1:
• The probability, P, of any event or state of nature occurring is greater than or equal to 0 and less than or equal to 1. That is,
0 ≤ P(event) ≤ 1
A probability of 0 indicates that an event is never expected to occur. A probability of 1 means that an event is always expected to occur.

Rule 2:
• The sum of the simple probabilities for all possible outcomes of an activity must equal 1.
Types of Probability (1 of 3)
• Relative Frequency Approach
• Probabilities are based on empirical data:

P(event) = Number of occurrences of the event ÷ Total number of trials or outcomes
Diversey Paint Example
• Historical demand for white latex paint = 0, 1, 2, 3, or 4 gallons per day
• Observed frequencies over the past 200 days

TABLE 2.2 Relative Frequency Approach to Probability for Paint Sales
QUANTITY DEMANDED (GALLONS)   NUMBER OF DAYS   PROBABILITY
0                             40               0.20 (= 40 ÷ 200)
1                             80               0.40 (= 80 ÷ 200)
2                             50               0.25 (= 50 ÷ 200)
3                             20               0.10 (= 20 ÷ 200)
4                             10               0.05 (= 10 ÷ 200)
Total                         200              1.00 (= 200 ÷ 200)
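The relative-frequency calculation in Table 2.2 can be sketched in a few lines of Python. This is an illustrative sketch only; the dictionary of observed days is taken directly from the table.

```python
# Relative-frequency probabilities for the Diversey Paint demand data (Table 2.2).
# Keys are gallons demanded per day; values are observed days out of 200.
observed = {0: 40, 1: 80, 2: 50, 3: 20, 4: 10}

total_days = sum(observed.values())  # 200 days of history
prob = {q: days / total_days for q, days in observed.items()}

print(prob[0])                         # 0.2
print(prob[1])                         # 0.4
print(round(sum(prob.values()), 10))   # 1.0  (Rule 2: probabilities sum to 1)
```

Note that the probabilities sum to 1, as Rule 2 requires.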


Types of Probability (2 of 3)
• Classical Approach
– Probabilities can be deduced from theoretical arguments
• Perform a series of trials

P(head) = 1/2 = Number of ways of getting a head ÷ Number of possible outcomes (head or tail)

P(spade) = 13/52 = Number of chances of drawing a spade ÷ Number of possible outcomes
= 1/4 = 0.25 = 25%
Types of Probability (3 of 3)
• Subjective Approach
• Probabilities are based on the experience and judgment of the person making the estimate.
Examples:
• Opinion polls
• Judgment of experts
• Delphi method
Probability Rules and Formulas
Complement of an Event
• If A is any event, the complement of A, denoted A′, consists of all outcomes in the sample space not in A.

• Rule: The probability of the complement of any event A is P(A′) = 1 − P(A)
Unions and Intersections of Events (1 of 2)
• The probability of the intersection of two events is called a joint probability.
• Intersection – the set of all outcomes that are common to both events
– Intersection of event A and event B = A and B = A ∩ B = AB
– Probability notation:
P(Intersection of event A and event B) = P(A and B) = P(A ∩ B) = P(AB)
Unions and Intersections of Events (2 of 2)
• Union – the set of all outcomes that are contained in either of two events
– Union of event A and event B = A or B
– Probability notation:
P(Union of event A and event B) = P(A or B) = P(A ∪ B)
Probability Rules and Formulas
Union of Events
• The union of two events contains all outcomes that belong to either of the two events.
– If A and B are two events, the probability that some outcome in either A or B (that is, the union of A and B) occurs is denoted P(A or B) = P(A ∪ B).
• Two events are mutually exclusive if they have no outcomes in common.
– Tossing a coin will result in either a head or a tail
– Rolling a die will result in only one of six possible outcomes

• Rule: If events A and B are mutually exclusive, then P(A or B) = P(A) + P(B)
Probability Rules and Formulas
Non-Mutually Exclusive Events
• The notation (A and B) represents the intersection of events A and B – that is, all outcomes belonging to both A and B.

• Rule: If two events A and B are not mutually exclusive, then P(A or B) = P(A) + P(B) − P(A and B)
Unions and Intersections of Events (4 of 4)
The addition law provides a way to compute the probability of event A, or B, or both A and B occurring.
• General rule for the union of two events (additive rule):

P(A or B) = P(A) + P(B) − P(A and B)

• Union of two events: drawing a 7 or a heart from a deck of cards

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
= 4/52 + 13/52 − 1/52
= 16/52
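The addition law for the "7 or heart" example can be verified by brute-force enumeration of a 52-card deck. The rank and suit labels below are illustrative choices, not from the slides.

```python
# Verify P(7 or heart) = 4/52 + 13/52 - 1/52 = 16/52 by enumerating a deck.
from fractions import Fraction

ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
suits = ['hearts', 'diamonds', 'clubs', 'spades']
deck = [(r, s) for r in ranks for s in suits]  # all 52 cards

# Count cards that are a 7 OR a heart (the single 7-of-hearts is counted once,
# which is exactly what subtracting the intersection accomplishes in the formula).
favourable = [card for card in deck if card[0] == '7' or card[1] == 'hearts']
p = Fraction(len(favourable), len(deck))
print(p)  # 4/13, i.e. 16/52 in lowest terms
```

Enumeration counts the 7-of-hearts only once, which is precisely why the intersection term P(A ∩ B) = 1/52 must be subtracted in the formula.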
FIGURE 2.1 Venn Diagram for Events That Are Mutually Exclusive
FIGURE 2.2 Venn Diagram for Events That Are Not Mutually Exclusive
Conditional Probability
• Conditional probability is the probability of occurrence of one event A, given that another event B is known to be true or has already occurred.

P(A | B) = P(AB) ÷ P(B)
P(AB) = P(A | B) P(B)

• Example: Probability of a 7 given a heart has been drawn:

P(A | B) = P(AB) ÷ P(B) = (1/52) ÷ (13/52) = 1/13
Example: Using the Conditional Probability Formula
Joint Probability Table

              Brand 1   Brand 2   Brand 3   Grand Total
Female        0.09      0.06      0.22      0.37
Male          0.25      0.17      0.21      0.63
Grand Total   0.34      0.23      0.43      1.00

• Summary of conditional probabilities, e.g. P(Brand 3 | Female) = 0.22 ÷ 0.37 ≈ 0.595
• Applications in marketing and advertising.


Variations of the Conditional Probability Formula

– Note: P(B | A) = P(AB) ÷ P(A)

• Multiplication law of probability: P(A and B) = P(A | B) P(B) = P(B | A) P(A)
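Conditional probabilities from the brand-preference joint table can be computed mechanically: divide a joint probability by the appropriate marginal. This is a sketch; the dictionary keys mirror the table above.

```python
# Conditional probabilities from the brand-preference joint probability table.
joint = {
    ('Female', 'Brand 1'): 0.09, ('Female', 'Brand 2'): 0.06, ('Female', 'Brand 3'): 0.22,
    ('Male',   'Brand 1'): 0.25, ('Male',   'Brand 2'): 0.17, ('Male',   'Brand 3'): 0.21,
}

def marginal_gender(g):
    """Marginal probability of a gender: sum its row of joint probabilities."""
    return sum(p for (gender, _), p in joint.items() if gender == g)

# P(Brand 3 | Female) = P(Female and Brand 3) / P(Female)
p_b3_given_f = joint[('Female', 'Brand 3')] / marginal_gender('Female')
print(round(p_b3_given_f, 3))  # 0.595  (= 0.22 / 0.37)
```

The same pattern gives any entry of the conditional table, e.g. conditioning on brand instead of gender by summing columns.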


Independent Events (1 of 3)
• Independent events are events that have no effect on each other.
• Two events A and B are independent if
P(A | B) = P(A)
P(A and B) = P(A)P(B)

• For a fair coin tossed twice:
A = event that a head is the result of the first toss
B = event that a head is the result of the second toss
P(A) = 0.5 and P(B) = 0.5
P(AB) = P(A)P(B) = 0.5(0.5) = 0.25
Independent Events (2 of 3)
• A bucket contains 3 black balls and 7 green balls
• Draw a ball from the bucket, replace it, and draw a second ball
1. The probability of a black ball drawn on the first draw is:
P(B) = 0.30
2. The probability of two green balls drawn is:
P(GG) = P(G) × P(G) = 0.7 × 0.7 = 0.49


Independent Events (3 of 3)
• A bucket contains 3 black balls and 7 green balls
• Draw a ball from the bucket, replace it, and draw a second ball
3. The probability of a black ball drawn on the second draw if the first draw is green is:
P(B|G) = P(B) = 0.30
4. The probability of a green ball drawn on the second draw if the first draw is green is:
P(G|G) = P(G) = 0.70
Dependent Events (1 of 3)
• A bowl contains the following 10 balls:
– 4 are white (W) and lettered (L)
– 2 are white (W) and numbered (N)
– 3 are yellow (Y) and lettered (L)
– 1 is yellow (Y) and numbered (N)

P(WL) = 4/10 = 0.4    P(YL) = 3/10 = 0.3
P(WN) = 2/10 = 0.2    P(YN) = 1/10 = 0.1
P(W) = 6/10 = 0.6     P(L) = 7/10 = 0.7
P(Y) = 4/10 = 0.4     P(N) = 3/10 = 0.3
Dependent Events (2 of 3)
The bowl contains 10 balls:
• 4 balls White (W) and Lettered (L): P(WL) = 4/10
• 2 balls White (W) and Numbered (N): P(WN) = 2/10
• 3 balls Yellow (Y) and Lettered (L): P(YL) = 3/10
• 1 ball Yellow (Y) and Numbered (N): P(YN) = 1/10
Dependent Events (3 of 3)
• The conditional probability that the ball drawn is lettered, given that it is yellow:

P(L | Y) = P(YL) ÷ P(Y) = 0.3 ÷ 0.4 = 0.75

• We can verify P(YL) using the joint probability formula:

P(YL) = P(L | Y) × P(Y) = (0.75)(0.4) = 0.3
Bayes' Theorem
Bayes' Theorem (1 of 3)
• Bayes' theorem shows the relationship between a conditional probability and its inverse.
• Bayes' theorem is used to incorporate additional information as it becomes available and helps create revised or posterior probabilities.
Bayes' Theorem (2 of 3)
• Often we begin probability analysis with initial or prior probabilities.
• Then, from a sample, special report, or a product test, we obtain some additional information.
• Given this information, we calculate revised or posterior probabilities.
• Bayes' theorem provides the means for revising the prior probabilities.

Application of Bayes' Theorem: Prior Probabilities → New Information → Posterior Probabilities
Bayes' Theorem (3 of 3)
• Bayes' theorem for n mutually exclusive events is given by:

P(Ai | B) = P(Ai) P(B|Ai) ÷ [P(A1) P(B|A1) + P(A2) P(B|A2) + ... + P(An) P(B|An)]
Bayes' Theorem Examples (1 of 4)
Example 1:
• Three machines A, B and C produce respectively 50%, 30% and 20% of the total number of items of a factory. The percentages of defective output of these machines are 3%, 4% and 5%. If an item is selected at random, find the probability that the item is defective.
Bayes' Theorem Examples (2 of 4)
Solution:
• Let X be the event that an item is defective. Therefore:

P(X) = P(A) P(X|A) + P(B) P(X|B) + P(C) P(X|C)
= (0.50)(0.03) + (0.30)(0.04) + (0.20)(0.05)
= 0.037
Tree diagram for Example 1:
A (0.50) → D (0.03), N (0.97)
B (0.30) → D (0.04), N (0.96)
C (0.20) → D (0.05), N (0.95)
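The total-probability calculation in Example 1 is just a weighted sum over the machines, and can be checked with a short Python sketch (the dictionary layout is an illustrative choice):

```python
# Total probability of a defective item (Example 1).
# Each machine maps to (share of production, defect rate).
machines = {'A': (0.50, 0.03), 'B': (0.30, 0.04), 'C': (0.20, 0.05)}

# P(X) = sum over machines of P(machine) * P(defective | machine)
p_defective = sum(share * rate for share, rate in machines.values())
print(round(p_defective, 3))  # 0.037
```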
Bayes' Theorem Examples (3 of 4)
• Example 2:
• Consider the factory in the preceding example. Suppose an item is selected at random and is found to be defective. Find the probability that the item was produced by machine A; that is, find P(A|X).

• Solution:
By Bayes' theorem,

P(A | X) = P(A) P(X|A) ÷ [P(A) P(X|A) + P(B) P(X|B) + P(C) P(X|C)]
Bayes' Theorem Examples (4 of 4)

P(A|X) = P(A) P(X|A) ÷ [P(A) P(X|A) + P(B) P(X|B) + P(C) P(X|C)]
= (0.50)(0.03) ÷ [(0.50)(0.03) + (0.30)(0.04) + (0.20)(0.05)]
= 0.015 ÷ 0.037
= 0.405
Bayes' Theorem Examples (4 of 4)
• Example 3:
• Three girls Aileen, Barbara, and Cathy pack biscuits in a factory. From the batch allotted to them Aileen packs 55%, Barbara 30% and Cathy 15%. The probability that Aileen breaks some biscuits in a packet is 0.7, and the respective probabilities for Barbara and Cathy are 0.2 and 0.1. What is the probability that a packet with broken biscuits found by the checker was broken by Aileen?
Bayes' Theorem Examples (4 of 4)
Solution:
• Let A be the event 'the packet was packed by Aileen',
• B the event 'the packet was packed by Barbara',
• C the event 'the packet was packed by Cathy',
• D the event 'the packet contains broken biscuits'.
Bayes' Theorem Examples (4 of 4)
We are given:
• P(A) = 0.55; P(B) = 0.30; P(C) = 0.15; and
• P(D|A) = 0.70; P(D|B) = 0.20; P(D|C) = 0.10.
• We require P(A|D).
• So we use Bayes' theorem to reverse the conditions:

P(A | D) = P(D | A) P(A) ÷ P(D) = (0.70)(0.55) ÷ 0.46 = 0.837

P(D) is the 'total probability' of D, i.e. the probability that a packet contains broken biscuits:

P(D) = P(D|A) ∙ P(A) + P(D|B) ∙ P(B) + P(D|C) ∙ P(C)
= (0.70)(0.55) + (0.2)(0.30) + (0.1)(0.15)
= 0.46
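Example 3 can be verified with the same prior-times-likelihood pattern, normalized by the total probability. This is a sketch; the dictionaries simply restate the given values.

```python
# Posterior P(packer | broken biscuits) via Bayes' theorem (Example 3).
prior = {'Aileen': 0.55, 'Barbara': 0.30, 'Cathy': 0.15}          # P(packer)
p_broken_given = {'Aileen': 0.70, 'Barbara': 0.20, 'Cathy': 0.10}  # P(D | packer)

# Total probability of a broken packet, P(D).
p_broken = sum(prior[g] * p_broken_given[g] for g in prior)

# Posterior for each packer: prior * likelihood / P(D).
posterior = {g: prior[g] * p_broken_given[g] / p_broken for g in prior}

print(round(p_broken, 2))             # 0.46
print(round(posterior['Aileen'], 3))  # 0.837
```

The posteriors sum to 1, since the three packers are mutually exclusive and exhaustive.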
Bayes' Theorem Examples (4 of 4)
• Tree diagram for Example 3 (Packer of biscuits → State of biscuits):

P(A) = 0.55: P(D|A) = 0.70 → P(D|A) ∙ P(A) = (0.70)(0.55); P(D′|A) = 0.30
P(B) = 0.30: P(D|B) = 0.20 → P(D|B) ∙ P(B) = (0.20)(0.30); P(D′|B) = 0.80
P(C) = 0.15: P(D|C) = 0.10 → P(D|C) ∙ P(C) = (0.10)(0.15); P(D′|C) = 0.90
RANDOM VARIABLES
Random Variables
• A random variable represents a possible numerical value from an uncertain event, e.g.
X = number of refrigerators sold during the day
• Discrete random variables produce outcomes that come from a counting process (e.g. the number of classes you are taking).
– Can assume only a finite or limited set of values
• Continuous random variables produce outcomes that come from a measurement (e.g. your annual salary, or your weight).
– Can assume any one of an infinite set of values
Probability Distribution Overview

Probability Distributions
• Discrete Probability Distributions: Binomial, Poisson, Hypergeometric
• Continuous Probability Distributions: Normal, Uniform, Exponential
Continuous Probability Distributions
• A continuous random variable is a variable that can assume any value on a continuum (can assume an uncountable number of values)
– thickness of an item
– time required to complete a task
– temperature of a solution
– height

• These can potentially take on any value, depending only on the ability to measure precisely and accurately.
The Normal Distribution Properties
The Normal Distribution (1 of 4)

• The formula for the normal probability density function is

f(X) = (1 ÷ (σ√(2π))) e^(−(X − μ)² ÷ (2σ²))

where
e = the mathematical constant approximated by 2.71828
π = the mathematical constant approximated by 3.14159
μ = the population mean
σ = the population standard deviation
X = any value of the continuous variable
The Normal Distribution (2 of 4)
FIGURE 2.7 Normal Distribution with Different Values for μ
The Normal Distribution (3 of 4)
FIGURE 2.8 Normal Distribution with Different Values for σ
The Normal Distribution (4 of 4)
The total area under the curve is 1.0, and the curve is symmetric, so half is above the mean, half is below:

P(−∞ < X ≤ μ) = 0.5
P(μ ≤ X < ∞) = 0.5
P(−∞ < X < ∞) = 1.0
Using the Standard Normal Table (1 of 4)
Step 1
• Convert the normal distribution into a standard normal distribution
– Mean of 0 and a standard deviation of 1
– The new standard random variable is Z

Z = (X − μ) ÷ σ

where
X = value of the random variable we want to measure
μ = mean of the distribution
σ = standard deviation of the distribution
Z = number of standard deviations from X to the mean, μ
Using the Standard Normal Table (2 of 4)
• Example 1:
• For μ = 100, σ = 15, find the probability that X is less than 130:

Z = (X − μ) ÷ σ = (130 − 100) ÷ 15 = 30 ÷ 15 = 2 std dev

FIGURE 2.9 Normal Distribution Showing the Relationship Between Z Values and X Values
Using the Standard Normal Table (3 of 4)
Step 2
• Look up the probability from a table of normal curve areas
• Normally, the column on the left is the Z value
• The row at the top has the second decimal place for Z values
Using the Standard Normal Table (4 of 4)
TABLE 2.10 (partial) Standardized Normal Distribution Function
AREA UNDER THE NORMAL CURVE
Z    0.00     0.01     0.02     0.03
1.8  0.96407  0.96485  0.96562  0.96638
1.9  0.97128  0.97193  0.97257  0.97320
2.0  0.97725  0.97784  0.97831  0.97882
2.1  0.98214  0.98257  0.98300  0.98341
2.2  0.98610  0.98645  0.98679  0.98713

For Z = 2.00
P(X < 130) = P(Z < 2.00) = 0.97725
P(X > 130) = 1 − P(X ≤ 130) = 1 − P(Z ≤ 2) = 1 − 0.97725 = 0.02275
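The table lookup can be cross-checked in Python, assuming Python 3.8+ where the standard library's statistics.NormalDist provides the normal CDF directly:

```python
# P(X < 130) for X ~ N(mu=100, sigma=15), without a printed Z table.
from statistics import NormalDist

x_dist = NormalDist(mu=100, sigma=15)

z = (130 - 100) / 15          # standardize: Z = (X - mu) / sigma
print(z)                      # 2.0

print(round(x_dist.cdf(130), 5))      # 0.97725  -> P(X < 130)
print(round(1 - x_dist.cdf(130), 5))  # 0.02275  -> P(X > 130)
```

Working on the X scale with NormalDist(100, 15) gives the same answer as standardizing to Z and using the table, since the CDF is invariant under that transformation.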
Standard Normal Distribution
TABLE 2.10 (partial) Standardized Normal Distribution Function

AREA UNDER THE NORMAL CURVE


Z 0.00 0.01 0.02 0.03 0.04 0.05 0.06 0.07 0.08 0.09

0.5 .69146 .69497 .69847 .70194 .70540 .70884 .71226 .71566 .71904 .72240

0.6 .72575 .72907 .73237 .73536 .73891 .74215 .74537 .74857 .75175 .75490

0.7 .75804 .76115 .76424 .76730 .77035 .77337 .77637 .77935 .78230 .78524

0.8 .78814 .79103 .79389 .79673 .79955 .80234 .80511 .80785 .81057 .81327

0.9 .81594 .81859 .82121 .82381 .82639 .82894 .83147 .83398 .83646 .83891

1.0 .84134 .84375 .84614 .84849 .85083 .85314 .85543 .85769 .85993 .86214

1.1 .86433 .86650 .86864 .87076 .87286 .87493 .87698 .87900 .88100 .88298

1.2 .88493 .88686 .88877 .89065 .89251 .89435 .89617 .89796 .89973 .90147

1.3 .90320 .90490 .90658 .90824 .90988 .91149 .91309 .91466 .91621 .91774

1.4 .91924 .92073 .92220 .92364 .92507 .92647 .92785 .92922 .93056 .93189

1.5 .93319 .93448 .93574 .93699 .93822 .93943 .94062 .94179 .94295 .94408
Finding Normal Probability (1 of 6)
Example 2:
• Let X represent the time it takes (in seconds) to download an image file from the internet.
• Suppose X is normal with mean 8.0 and standard deviation 5.0
• Find P(X < 8.6)
Finding Normal Probability (2 of 6)
Example 2:
• Suppose X is normal with mean 8.0 and standard deviation 5.0. Find P(X < 8.6).

Z = (X − μ) ÷ σ = (8.6 − 8.0) ÷ 5.0 = 0.12

P(X < 8.6) = P(Z < 0.12)
Finding Normal Probability (3 of 6)

P(X < 8.6) = P(Z < 0.12) = .5478

Standardized Normal Probability Table (Portion)
Z    .00    .01    .02
0.0  .5000  .5040  .5080
0.1  .5398  .5438  .5478
0.2  .5793  .5832  .5871
0.3  .6179  .6217  .6255
Finding Normal Probability (4 of 6)
• Find P(X > 8.6):

P(X > 8.6) = P(Z > 0.12) = 1.0 − P(Z ≤ 0.12) = 1.0 − .5478 = .4522
Finding Normal Probability (6 of 6)
• Probability is measured by the area under the curve:

P(a ≤ X ≤ b)
(Note that the probability of any individual value is zero)
Finding Normal Probability Procedure
To find P(a < X < b) when X is distributed normally:
• Draw the normal curve for the problem in terms of X.
• Translate X-values to Z-values.
• Use the Standardized Normal Table.


Finding Normal Probability Between Two Values
• Suppose X is normal with mean 8.0 and standard deviation 5.0. Find P(8 < X < 8.6)

Calculate Z-values:

Z = (8 − 8) ÷ 5 = 0
Z = (8.6 − 8) ÷ 5 = 0.12

P(8 < X < 8.6) = P(0 < Z < 0.12)
Finding Normal Probability Between Two Values

P(8 < X < 8.6) = P(0 < Z < 0.12) = P(Z < 0.12) − P(Z ≤ 0) = .5478 − .5000 = .0478

Standardized Normal Probability Table (Portion)
Z    .00    .01    .02
0.0  .5000  .5040  .5080
0.1  .5398  .5438  .5478
0.2  .5793  .5832  .5871
0.3  .6179  .6217  .6255
Given Normal Probability, Find the X Value (1 of 3)
• Let X represent the time it takes (in seconds) to download an image file from the internet.
• Suppose X is normal with mean 8.0 and standard deviation 5.0
• Find X such that 20% of download times are less than X.
Given Normal Probability, Find the X Value (2 of 3)
• First, find the Z value that corresponds to the known probability using the table.

Z     ….  .03    .04    .05
-0.9  ….  .1762  .1736  .1711
-0.8  ….  .2033  .2005  .1977
-0.7  ….  .2327  .2296  .2266

The area .2005 corresponds to Z = −0.84.
Given Normal Probability, Find the X Value (3 of 3)
• Second, convert the Z value to X units using the following formula:

X = μ + Zσ = 8.0 + (−0.84)(5.0) = 3.80

So 20% of the download times from the distribution with mean 8.0 and standard deviation 5.0 are less than 3.80 seconds.
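The reverse lookup (probability → X value) can be sketched with the inverse CDF, again assuming Python 3.8+ for statistics.NormalDist:

```python
# Find the 20th-percentile download time for X ~ N(mu=8.0, sigma=5.0).
from statistics import NormalDist

times = NormalDist(mu=8.0, sigma=5.0)

# inv_cdf inverts the CDF: it returns the X with P(X' < X) = 0.20,
# i.e. it performs both the Z lookup and the X = mu + Z*sigma conversion.
x_20 = times.inv_cdf(0.20)
print(round(x_20, 2))  # 3.79 (the slide's two-decimal table lookup gives 3.80)
```

The small difference from the slide's 3.80 comes from rounding Z to two decimals in the printed table; inv_cdf uses the exact Z ≈ −0.8416.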
The Empirical Rule (1 of 2)
• For a normally distributed random variable with mean μ and standard deviation σ:
– Approximately 68% of values will be within ±1σ of the mean
– Approximately 95% of values will be within ±2σ of the mean
– Almost all (99.7%) of values will be within ±3σ of the mean
The Empirical Rule (2 of 2)
FIGURE 2.13 Approximate Probabilities from the Empirical Rule
END OF SESSION
Questions???
