Introduction To Probability
Notes on probability

Slide

6-1

Probability Theory
Probability: Understanding Random Situations
Slide
6-2 Introduction
• The study of Uncertainty
– Changes “I’m not sure …”
• to “I’m positive we’ll succeed … with probability 0.8”
– Can’t predict “for sure” what will happen next
• But can quantify the likelihood of what might happen
• And can predict percentages well over the long run
• e.g., a 60% chance of rain
• e.g., success/failure of a new business venture
• New terminology (words and concepts)
– Keep as much as possible certain (not random)
– Put the randomness in only at the last minute
Slide
6-3 Terminology
• Random Experiment
– A procedure that produces an outcome
• Not perfectly predictable in advance
– There are many random experiments (situations)
• We will study them one at a time
– Example: Record the income of a random family
• Random telephone dialing in a target marketing area, repeat
until success (income obtained), round to nearest $thousand
• Sample Space
– A list of all possible outcomes
• Each random experiment has one (i.e., one list)
– Example: {0, 1,000, 2,000, 3,000, 4,000, …}
Slide
6-4 Terminology (continued)
• Event
– Happens or not, each time random experiment is run
• Formally: a collection of outcomes from sample space
• A “yes or no” situation: if the outcome is in the list, the event
“happens”
• Each random experiment has many different events of interest
– Example: the event “Low Income” ($15,000 or less)
• The list of outcomes is {0, 1, 2, …, 14,999, 15,000}
– Example: the event “Six Figures”
• The list of outcomes is {100,000, 100,001, 100,002, …, 999,999}
– Example: the event “Ten to Forty Thousand”
• The list of outcomes is {10,000, 10,001, …, 39,999, 40,000}
Slide
6-5 Terminology (continued)
• Probability of an Event

– A number between 0 and 1


• The likelihood of occurrence of an event
– Each random experiment has many probability numbers
• One probability number for each event
– Example: Probability of event “Low Income” is 0.17
• Occurs about 17% of the time over the long run, but is unpredictable each time
– Example: Probability of event “Six Figures” is 0.08
• Not very likely, but reasonably possible
– Example: Probability of “10 to 40 thousand” is 0.55
• A little more likely to occur than not
Slide
6-6 Sources of Probabilities
• Relative Frequency
– From data
– What percent of the time the event happened in the past

• Theoretical Probability
– From mathematical theory
– Make assumptions, draw conclusions

• Subjective Probability
– Anyone’s opinion, perhaps even without data or theory
– Bayesian analysis uses subjective probability with data
Slide
6-7 Relative Frequency
• From data. Run random experiment n times
– See how often an event happened
• (Relative Frequency of A) = (# of times A happened)/n
– e.g., of 12 flights, 9 were on time.
• Relative frequency of the event “on time” is 9/12 = 0.75
• Law of Large Numbers
– If n is large, then the relative frequency will be close to
the probability of an event
• Probability is FIXED. Relative frequency is RANDOM
– e.g., toss coin 20 times. Probability of “heads” is 0.5
• Relative frequency might be 12/20 = 0.6, or 9/20 = 0.45, depending on the outcomes
Slide
6-8 Relative Frequency (continued)
• Suppose event has probability 0.25
• In n = 5 runs of random experiment
– Event happens: no, yes, no, no, yes
– Relative frequency is 2/5 = 0.4
• Graph of relative frequencies for n = 1 to 5
[Figure: relative frequency (vertical axis, 0.0 to 0.5) plotted against the number n of times the random experiment was run (horizontal axis, n = 1 to 5)]
Slide
6-9 Relative Frequency (continued)
• As n gets larger
– Relative frequency gets closer to probability
• Graph of relative frequencies for n = 1 to 200
– Relative frequency approaches the probability
[Figure: relative frequency (vertical axis, 0 to 0.5) plotted against the number n of times the random experiment was run (horizontal axis, 0 to 200); the curve settles near the horizontal line Probability = 0.25]
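As an illustration, here is a minimal simulation sketch in Python (standard random module only; the seed and the reporting points are arbitrary choices, not from the slides). Simulated relative frequencies for an event with probability 0.25 wander at first and then settle near 0.25 as n grows:

```python
import random

random.seed(1)   # fixed seed so the run is reproducible
p = 0.25         # the fixed (true) probability of the event

successes = 0
for n in range(1, 201):
    # Run the random experiment once; the event happens with probability p
    if random.random() < p:
        successes += 1
    if n in (5, 20, 50, 200):
        print(f"n = {n:3d}: relative frequency = {successes / n:.3f}")
# The relative frequency is RANDOM, but for large n it settles
# near the fixed probability 0.25 (Law of Large Numbers).
```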
Slide
6-10 Relative Frequency (continued) (Table 6.3.1)
• About how far from the probability will the
relative frequency be?
– The random relative frequency will be about one of its
standard deviations away from the (fixed) probability
– Depends upon the probability and n
• Farther apart when more uncertainty (probability near 0.5)
              Probability 0.50   Probability 0.25 or 0.75   Probability 0.10 or 0.90
n =      10        0.16                 0.14                       0.09
         25        0.10                 0.09                       0.06
         50        0.07                 0.06                       0.04
        100        0.05                 0.04                       0.03
      1,000        0.02                 0.01                       0.01
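The table values agree with the usual formula for the standard deviation of a relative frequency, sqrt(p(1-p)/n); a short sketch under that assumption reproduces them:

```python
from math import sqrt

def sd_of_relative_frequency(p, n):
    # Assumed formula: the standard deviation of the relative frequency
    # around the fixed probability p, after n runs, is sqrt(p*(1-p)/n)
    return sqrt(p * (1 - p) / n)

for n in (10, 25, 50, 100, 1000):
    row = [round(sd_of_relative_frequency(p, n), 2) for p in (0.50, 0.25, 0.10)]
    print(f"n = {n:5d}: {row}")
# n = 10 gives [0.16, 0.14, 0.09], matching the first row of the table
```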
Slide
6-11 Theoretical Probability
• From mathematical theory
• One example: The Equally Likely Rule
– If all N possible outcomes in the sample space are
equally likely, then the probability of any event A is
• Prob(A) = (# of outcomes in A) / N
– Note: this probability is not a random number. The
probability is based on the entire sample space
• e.g., Suppose there are 35 defects in a production lot of 400.
Choose item at random. Prob(defective) = 35/400 = 0.0875
• e.g., Toss coin. Prob(heads) = 1/2
• But: Tomorrow it may snow or not. Prob(snow) ≠ 1/2
– because “snow” and “not snow” are not equally likely
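A tiny sketch of the Equally Likely Rule applied to the two examples above (the function name is mine, for illustration only):

```python
def equally_likely_prob(outcomes_in_event, n_outcomes):
    # Prob(A) = (# of outcomes in A) / N, valid only when all N outcomes
    # in the sample space are equally likely
    return outcomes_in_event / n_outcomes

print(equally_likely_prob(35, 400))   # defective item: 0.0875
print(equally_likely_prob(1, 2))      # coin lands heads: 0.5
```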
Slide
6-12 Subjective Probability
• Anyone’s opinion
– What do you think the chances are that the U.S.
economy will have steady expansion in the near future?
– An economist’s answer
• Bayesian analysis
– Combines subjective probability with data to get results
– Non-Bayesian “Frequentist” analysis computes using
only the data
• But subjective opinions (prior beliefs) can still play a
background role, even when they are not introduced as
numbers into a calculation, when they influence the choice of
data and the methodology (model) used
Slide
6-13 Bayesian and Non-Bayesian Analysis (Fig 6.3.3)

• Bayesian Analysis
– Data, Prior Probabilities, and the Model all feed into the Bayesian Analysis, which produces the Results
• Frequentist (non-Bayesian) Analysis
– Data and the Model feed into the Frequentist Analysis, which produces the Results; Prior Beliefs remain in the background
Slide
6-14 Combining Events
• Complement of the event A
– Happens whenever A does not happen
• Union of events A and B
– Happens whenever either A or B or both events happen
• Intersection of A and B
– Happens whenever both A and B happen
• Conditional Probability of A Given B
– The updated probability of A, possibly changed to
reflect the fact that B happens
Slide
6-15 Complement of an Event
• The event “not A” happens whenever A does not
• Venn diagram: A (in circle), “not A” (shaded)

• Prob(not A) = 1 – Prob(A)
– If Prob(Succeed) = 0.7, then Prob(Fail) = 1–0.7 = 0.3
Slide
6-16 Union of Two Events
• Union happens whenever either (or both) happen
• Venn diagram: two overlapping circles A and B; the union ("A or B") is shaded
– e.g., A = “get Intel job offer”, B = “get GM job offer”


• Did the union happen? Congratulations! You have a job
– e.g., Did I have eggs or cereal for breakfast? “Yes”
Slide
6-17 Intersection of Two Events
• Intersection happens whenever both events happen
• Venn diagram: two overlapping circles A and B; the intersection ("A and B") is shaded
– e.g., A = “sign contract”, B = “get financing”


• Did the intersection happen? Great! Project has been launched!
– e.g., Did I have eggs and cereal for breakfast? “No”
Slide
6-18 Relationship Between and and or
• Prob(A or B) = Prob(A)+Prob(B)–Prob(A and B)


• Prob(A and B) = Prob(A)+Prob(B)–Prob(A or B)


• Example: Customer purchases at appliance store
• Prob(Washer) = 0.20
• Prob(Dryer) = 0.25
• Prob(Washer and Dryer) = 0.15
– Then we must have
• Prob(Washer or Dryer) = 0.20+0.25–0.15 = 0.30
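A quick check of these relationships with the appliance-store numbers (plain Python; the variable names are illustrative):

```python
p_washer = 0.20
p_dryer = 0.25
p_washer_and_dryer = 0.15

# Prob(A or B) = Prob(A) + Prob(B) - Prob(A and B)
p_washer_or_dryer = p_washer + p_dryer - p_washer_and_dryer
print(round(p_washer_or_dryer, 2))                       # 0.3

# Rearranged: Prob(A and B) = Prob(A) + Prob(B) - Prob(A or B)
print(round(p_washer + p_dryer - p_washer_or_dryer, 2))  # 0.15
```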
Slide
6-19 Conditional Probability
• Examples
– Prob (Win given Ahead at halftime)
• Higher than Prob (Win) evaluated before the game began

– Prob (Succeed given Good results in test market)


• Higher than Prob (Succeed) evaluated before marketing study

– Prob (Get job given Poor interview)


• Lower than Prob (Get job given Good interview)

– Prob (Have AIDS given Test positive)


• Higher than Prob (Have AIDS) for the population-at-large

– Prob (Sale of umbrellas given Rain)
• Higher than Prob (Sale of umbrellas) evaluated without knowing the weather
Slide
6-20 Conditional Probability (continued)
• Given the extra information that B happens for
sure, how must you change the probability for A to
correctly reflect this new knowledge?
Prob (A given B) = Prob (A and B) / Prob (B)
– This is a (conditional) probability about A
– The event B gives information
• Unconditional
– The probability of A, within the whole sample space
• Conditional
– A new universe, since B must happen; only the part of A that lies within B counts
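A minimal helper implementing this definition (a sketch; the guard against Prob(B) = 0 is an addition of mine, not something on the slide):

```python
def conditional_prob(p_a_and_b, p_b):
    # Prob(A given B) = Prob(A and B) / Prob(B)
    if p_b == 0:
        raise ValueError("cannot condition on an event with probability 0")
    return p_a_and_b / p_b

# Appliance-store example from the following slides:
print(round(conditional_prob(0.15, 0.20), 3))   # Prob(Dryer given Washer) = 0.75
print(round(conditional_prob(0.15, 0.25), 3))   # Prob(Washer given Dryer) = 0.6
```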
Slide
6-21 Conditional Probability (continued)
• Key words that may suggest conditional
probability
– By restricting attention to a particular situation where
some condition holds (the given information)
• Given …
• Of those …
• If …
• When …
• Within (this group) …
• …
Slide
6-22 Conditional Probability (continued)
• Example: appliance store purchases
• Prob(Washer) = 0.20
• Prob(Dryer) = 0.25
• Prob(Washer and Dryer) = 0.15
– Conditional probability of buying a Dryer given that
they bought a Washer
• Prob(Dryer given Washer)
• = Prob(Washer and Dryer)/Prob(Washer) = 0.15/0.20 = 0.75
• 75% of those buying a washer also bought a dryer
– Conditional probability of Washer given Dryer
• = Prob(Washer and Dryer)/Prob(Dryer) = 0.15/0.25 = 0.60
• 60% of those buying a dryer also bought a washer
Slide
6-23 Independent Events
• Two events are Independent if information about
one does not change the likelihood of the other
– Three equivalent ways to check independence
• Prob (A given B) = Prob (A)
• Prob (B given A) = Prob (B)
• Prob (A and B) = Prob (A) × Prob (B)

• Two events are Dependent if not independent


– e.g., Prob(Washer and Dryer) = 0.15
• Prob (Washer) × Prob (Dryer) = 0.20 × 0.25 = 0.05
– Washer and Dryer are not independent
• They are dependent
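A small sketch of the product check for independence, applied to the washer/dryer numbers (the tolerance argument only absorbs floating-point rounding and is not part of the definition):

```python
def looks_independent(p_a, p_b, p_a_and_b, tol=1e-9):
    # Independent exactly when Prob(A and B) = Prob(A) * Prob(B)
    return abs(p_a_and_b - p_a * p_b) < tol

# Washer and Dryer: 0.15 is not 0.20 * 0.25 = 0.05, so they are dependent
print(looks_independent(0.20, 0.25, 0.15))   # False
```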
Slide
6-24 Mutually Exclusive Events
• Two events are Mutually Exclusive if they cannot
both happen, that is, if
Prob(A and B) = 0
• No overlap in Venn diagram (two separate circles A and B)

• Examples
– Profit and Loss (for a selected business division)
– Green and Purple (for a manufactured product)
– Country Squire and Urban Poor (marketing segments)
• Mutually exclusive events (with nonzero probabilities) are dependent events: knowing that one happened tells you the other did not
Slide
6-25 Probability Trees
• A method for solving probability problems
– Given probabilities for some events (perhaps union,
intersection, or conditional)
• Find probabilities for other events
– Record the basic information on the tree
• Usually three probability numbers are given
– Perhaps two probability numbers if events are independent
• The tree helps guide your calculations
– Each column of circled probabilities adds up to 1
– Circled prob times conditional prob gives next probability
– For each group of branches
• Conditional probabilities add up to 1
• Circled probabilities at end add up to probability at start
Slide
6-26 Probability Tree (continued)
• Shows probabilities and conditional probabilities
Start
├─ Event A happens: P(A)
│   ├─ Event B happens:       P(A and B)
│   └─ Event B does not:      P(A and “not B”)
└─ Event A does not: P(not A)
    ├─ Event B happens:       P(“not A” and B)
    └─ Event B does not:      P(“not A” and “not B”)
Slide
6-27 Example: Appliance Purchases
• First, record the basic information
• Prob(Washer) = 0.20, Prob(Dryer) = 0.25
• Prob(Washer and Dryer) = 0.15

[Tree: P(Washer) = 0.20 circled on the Washer branch; P(Washer and Dryer) = 0.15 circled at the end of the Washer-then-Dryer path; the other values are still blank]
Slide
6-28 Example (continued)
• Next, subtract: 1–0.20 = 0.80, 0.25–0.15 = 0.10

[Tree: P(not Washer) = 0.80 now circled on the “not Washer” branch, and P(“not Washer” and Dryer) = 0.10 circled at its Dryer endpoint]
Slide
6-29 Example (continued)
• Now subtract: 0.20–0.15 = 0.05, 0.80–0.10 = 0.70

[Tree: P(Washer and “not Dryer”) = 0.05 and P(“not Washer” and “not Dryer”) = 0.70 now circled at the remaining endpoints]
Slide
6-30 Example (completed tree)
• Now divide to find conditional probabilities
0.15/0.20 = 0.75, 0.05/0.20 = 0.25
0.10/0.80 = 0.125, 0.70/0.80 = 0.875
[Completed tree: 0.20 and 0.80 circled at the Washer branching; conditional probabilities 0.75, 0.25, 0.125, 0.875 on the Dryer branches; 0.15, 0.05, 0.10, 0.70 circled at the endpoints]
Slide
6-31 Example (finding probabilities)
• Finding probabilities from the completed tree
P(Washer) = 0.20
P(Dryer) = 0.15 + 0.10 = 0.25
P(Washer and Dryer) = 0.15
P(Washer or Dryer) = 0.15 + 0.05 + 0.10 = 0.30
P(Washer and not Dryer) = 0.05
P(Dryer given Washer) = 0.75
P(Dryer given not Washer) = 0.125
P(Washer given Dryer) = 0.15/0.25 = 0.60
(using the conditional probability formula)
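All of these numbers follow from the three given probabilities; a short sketch (plain Python, the variable names are mine) reproduces the subtractions and divisions from slides 6-27 through 6-31:

```python
# Given information
p_w = 0.20         # P(Washer)
p_d = 0.25         # P(Dryer)
p_w_and_d = 0.15   # P(Washer and Dryer)

# Subtract to fill in the remaining circled probabilities (slides 6-28, 6-29)
p_not_w = 1 - p_w                             # 0.80
p_not_w_and_d = p_d - p_w_and_d               # 0.10
p_w_and_not_d = p_w - p_w_and_d               # 0.05
p_not_w_and_not_d = p_not_w - p_not_w_and_d   # 0.70

# Divide to get the conditional probabilities (slide 6-30)
p_d_given_w = p_w_and_d / p_w                 # 0.75
p_d_given_not_w = p_not_w_and_d / p_not_w     # 0.125

# Read off the combined events (slide 6-31)
p_w_or_d = p_w_and_d + p_w_and_not_d + p_not_w_and_d   # 0.30
p_w_given_d = p_w_and_d / p_d                          # 0.60

for name, value in [("P(Washer or Dryer)", p_w_or_d),
                    ("P(Dryer given Washer)", p_d_given_w),
                    ("P(Dryer given not Washer)", p_d_given_not_w),
                    ("P(Washer given Dryer)", p_w_given_d)]:
    print(name, "=", round(value, 3))
```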
Slide
6-32 Example: Venn Diagram
• Venn diagram probabilities correspond to right-
hand endpoints of probability tree
[Venn diagram: overlapping circles Washer and Dryer
P(Washer and “not Dryer”) = 0.05 (Washer circle only)
P(Washer and Dryer) = 0.15 (overlap)
P(“not Washer” and Dryer) = 0.10 (Dryer circle only)
P(“not Washer” and “not Dryer”) = 0.70 (outside both circles)]
Slide
6-33 Example: Joint Probability Table
• Shows probabilities for each event, their
complements, and combinations using and
– Note: rows add up, and columns add up
                       Washer
                   Yes        No
 Dryer   Yes       0.15       0.10       0.25  = P(Dryer)
         No        0.05       0.70       0.75  = P(not Dryer)
                   0.20       0.80       1
              = P(Washer)  = P(not Washer)

(each inner cell is an “and” probability, e.g., 0.15 = P(Washer and Dryer) and 0.10 = P(“not Washer” and Dryer); the margins are the probabilities of the single events and their complements)
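A small sketch (illustrative names, not from the slides) builds the joint probability table from the four “and” probabilities and confirms that the rows and columns add up:

```python
# Joint ("and") probabilities, read from the completed probability tree
joint = {
    ("Dryer yes", "Washer yes"): 0.15,
    ("Dryer yes", "Washer no"):  0.10,
    ("Dryer no",  "Washer yes"): 0.05,
    ("Dryer no",  "Washer no"):  0.70,
}

# Row totals (Dryer margin) and column totals (Washer margin)
p_dryer = joint[("Dryer yes", "Washer yes")] + joint[("Dryer yes", "Washer no")]
p_washer = joint[("Dryer yes", "Washer yes")] + joint[("Dryer no", "Washer yes")]
grand_total = sum(joint.values())

print(round(p_dryer, 2), round(p_washer, 2), round(grand_total, 2))   # 0.25 0.2 1.0
```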
