
MODULE TWO: MEASURING UNCERTAINTY; AND DRAWING CONCLUSIONS ABOUT POPULATIONS BASED ON SAMPLE DATA
TOPIC 4: PROBABILITY AND DISCRETE DISTRIBUTIONS

Deakin University CRICOS Provider Code: 00113B


Learning Objectives
At the completion of this topic, you should be able to:
• recognise basic probability concepts
• calculate probabilities of simple, marginal and joint events
• calculate conditional probabilities and determine whether events are
independent or not
• revise probabilities using Bayes’ theorem
• use counting rules to calculate the number of possible outcomes
• recognise and use the properties of a probability distribution
• calculate the expected value and variance of a probability distribution
• identify situations that can be modelled by Binomial and Poisson
distributions and calculate their probabilities
4.1 Basic Probability Concepts

A probability is a numerical value that represents the chance, likelihood or possibility that a particular event will occur; it always lies between 0 and 1.
There are 3 approaches to assigning a probability to an event:
1. a priori classical probability
• based on prior knowledge
2. empirical classical probability
• based on observed data
3. subjective probability
• based on individual judgment or opinion about the probability of occurrence
Events and Sample Spaces

Events
Simple event (denoted A)
• An outcome from a sample space with one characteristic
e.g. planned to purchase TV
Complement of an event A (denoted A’)
• All outcomes that are not part of event A
e.g. did not plan to purchase TV
Joint event (denoted A∩B)
• Involves two or more characteristics simultaneously
e.g. planned to purchase a TV and actually purchased a TV
Events and Sample Spaces (cont)

Sample Space
• The sample space is the collection of ALL possible events
e.g. all 6 faces of a die
all 52 playing cards
Mutually Exclusive and Collectively Exhaustive Events

Mutually Exclusive Events
• Events that cannot occur together
e.g. Event A = Male
Event B = A' = Other (the complement of A)
• Events A and B are mutually exclusive
Collectively Exhaustive Events
• One of the events must occur
• The set of events covers the entire sample space
e.g. member of loyalty program or not member of loyalty program
Visualising Events

Contingency Tables
• Event A = Order > $50
• Event B = Member of loyalty program
Visualising Events (cont)

Venn Diagrams
Probability and Events

The probability of any event must be between 0 and 1, inclusive:
0 ≤ P(A) ≤ 1 for any event A
The sum of the probabilities of all mutually exclusive and collectively exhaustive events is 1
If A and B are mutually exclusive and collectively exhaustive, then
P(A) + P(B) = 1
Computing Joint and Marginal Probabilities

The probability of a joint event, A and B:
P(A and B) = (number of outcomes satisfying A and B) / (total number of elementary outcomes)
Computing Joint Probability

Example:
Computing Marginal (or Simple) Probability

P(A) = P(A and B1) + P(A and B2) + ... + P(A and Bk)
where: B1, B2, ..., Bk are k mutually exclusive and collectively exhaustive events

Example:
Joint and Marginal Probabilities Using Contingency Tables
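The contingency table shown on this slide is not reproduced in this text version. As a minimal sketch using purely hypothetical counts (not taken from the slides), the Python snippet below shows how joint and marginal probabilities are read from such a table for Event A (order > $50) and Event B (member of loyalty program):

# Hypothetical 2x2 contingency table counts (illustrative only, not from the slides)
counts = {
    ("A", "B"): 30,         # order > $50 and loyalty member
    ("A", "not B"): 20,     # order > $50 and not a member
    ("not A", "B"): 10,     # order <= $50 and loyalty member
    ("not A", "not B"): 40, # order <= $50 and not a member
}
total = sum(counts.values())  # total number of elementary outcomes (100)

# Joint probability: P(A and B) = outcomes satisfying A and B / total outcomes
p_a_and_b = counts[("A", "B")] / total          # 0.3

# Marginal probability: P(A) = P(A and B) + P(A and not B)
p_a = (counts[("A", "B")] + counts[("A", "not B")]) / total  # 0.5

print(p_a_and_b, p_a)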


General Addition Rule

P(A or B) = P(A) + P(B) − P(A and B)
If A and B are mutually exclusive, then P(A and B) = 0, so the rule simplifies to:
P(A or B) = P(A) + P(B)
General Addition Rule (cont)

Example:
Note: P(A ∩ B) is counted in both P(A) and P(B), so we subtract it to avoid double counting.
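The numerical example on this slide is not shown in this text version. As a hedged illustration, applying the general addition rule to the hypothetical probabilities from the sketch above:

# P(A) = 0.5, P(B) = 0.4 and P(A and B) = 0.3 from the hypothetical table above
p_a, p_b, p_a_and_b = 0.5, 0.4, 0.3

# Subtract P(A and B) once because it is counted in both P(A) and P(B)
p_a_or_b = p_a + p_b - p_a_and_b
print(p_a_or_b)  # 0.6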
Conditional Probability

A conditional probability is the probability of one event, given that another event has occurred.
• The conditional probability of A given that B has occurred:
P(A|B) = P(A and B) / P(B)
• The conditional probability of B given that A has occurred:
P(B|A) = P(A and B) / P(A)
where: P(A and B) = joint probability of A and B
P(A) = marginal probability of A
P(B) = marginal probability of B
Calculating Conditional Probabilities

Example:
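The worked example on this slide is not reproduced here; as a small sketch with the same hypothetical probabilities as above:

# Conditional probability: P(A | B) = P(A and B) / P(B)
p_a_and_b = 0.3
p_b = 0.4
p_a_given_b = p_a_and_b / p_b
print(p_a_given_b)  # 0.75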
Decision Trees

A Decision Tree:
• is an alternative to contingency tables
• allows sequential events to be graphed
• allows calculation of joint probabilities by multiplying respective branch
probabilities
Statistical Independence

Two events are independent if and only if:
P(A|B) = P(A)
or P(B|A) = P(B)
Events A and B are independent when the probability of one
event is not affected by the other event
Multiplication Rules

Multiplication rule for two events A and B:
P(A and B) = P(A|B) P(B)
Note: If A and B are independent then:
P(A|B) = P(A)
and the multiplication rule simplifies to:
P(A and B) = P(A)P(B)
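A brief sketch, again using the hypothetical probabilities from the earlier contingency-table example, showing the independence check and the multiplication rule:

p_a, p_b, p_a_and_b = 0.5, 0.4, 0.3

# Independence check: are P(A | B) and P(A) equal?
p_a_given_b = p_a_and_b / p_b           # 0.75
print(abs(p_a_given_b - p_a) < 1e-9)    # False -> A and B are not independent

# General multiplication rule recovers the joint probability
print(p_a_given_b * p_b)                # 0.3 = P(A and B)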
Marginal Probability Using the General Multiplication Rule

Marginal probability for event A:
P(A) = P(A|B1)P(B1) + P(A|B2)P(B2) + ... + P(A|Bk)P(Bk)
where: B1, B2, ..., Bk are k mutually exclusive and collectively exhaustive events
Bayes' Theorem

A technique used to revise previously calculated probabilities with the addition of new information.
Need to identify:
• Prior probabilities P(Si), where i = 1, ..., k
• Conditional probabilities P(F|Si), where i = 1, ..., k
Then we can calculate:
• Joint probabilities P(F ∩ Si)
• Revised probabilities P(Si|F), where i = 1, ..., k
Bayes' Theorem (cont)

Example:
Suppose a Consumer Electronics Company is considering marketing a new model of
television. In the past, 40% of the televisions introduced by the company have been
successful and 60% have been unsuccessful.
Before introducing a television to the marketplace, the marketing research
department always conducts an extensive study and releases a report, either
favourable or unfavourable. In the past, 80% of the successful televisions had
received a favourable market research report and 30% of the unsuccessful
televisions had received a favourable report.
For the new model of television under consideration, the marketing research
department has issued a favourable report. What is the probability that the
television will be successful, given this favourable report?
Bayes' Theorem (cont)

where: S = successful television
S' = unsuccessful television (i.e. the complement of S)
F = favourable report
F’ = unfavourable report (i.e. the complement of F)
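The calculation table on this slide is not reproduced, so here is a minimal sketch of the computation using only the figures given in the example (40% prior success rate, 80% favourable reports among successes, 30% among failures):

# Bayes' theorem for the television example
p_s = 0.40                 # prior: P(S), successful television
p_s_prime = 0.60           # prior: P(S'), unsuccessful television
p_f_given_s = 0.80         # P(F | S), favourable report given success
p_f_given_s_prime = 0.30   # P(F | S'), favourable report given failure

# Marginal probability of a favourable report (general multiplication rule)
p_f = p_f_given_s * p_s + p_f_given_s_prime * p_s_prime   # 0.32 + 0.18 = 0.50

# Revised (posterior) probability of success given a favourable report
p_s_given_f = (p_f_given_s * p_s) / p_f
print(p_s_given_f)  # 0.64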
Counting Rules

Counting Rule 1:
• If any one of k different mutually exclusive and collectively exhaustive events can occur on each of n trials, the number of possible outcomes is equal to:
k^n
Example:
• Suppose you toss a coin 5 times. What is the number of different possible outcomes (i.e. the sequence of heads and tails)?
Answer:
• 2^5 = (2)(2)(2)(2)(2) = 32 possible outcomes
Counting Rules

Counting Rule 2:
• If there are k1 events on the first trial, k2 events on the second trial, ... and kn events on the nth trial, the number of possible outcomes is:
(k1)(k2)...(kn)
Example:
• Standard New South Wales vehicle registration plates previously consisted of 3 letters followed by 3 digits. How many possible combinations were there?
Answer:
• 26 x 26 x 26 x 10 x 10 x 10 = 26^3 x 10^3 = 17,576,000 possible outcomes
Counting Rules

Counting Rule 3:
• The number of ways that n items can be arranged in order is:
n! = (n)(n − 1)...(1)
Example:
• If a set of 6 textbooks is to be placed on a shelf, in how many ways can the 6 books be arranged?
Answer:
• 6! = (6)(5)(4)(3)(2)(1) = 720 possible arrangements
Counting Rules

Counting Rule 4 – Permutations:
• The number of ways of arranging X objects selected from n objects in order is:
nPX = n! / (n − X)!
Example:
• If there are 6 textbooks but room for only 4 books on a shelf, in how many ways can these books be arranged on the shelf?
6P4 = 6! / (6 − 4)! = 6! / 2! = 720 / 2 = 360
Answer:
• 360 different permutations
Counting Rules

Counting Rule 5 – Combinations:
• The number of ways of arranging X objects selected from n objects, irrespective of order, is:
nCX = n! / (X!(n − X)!)
Example:
• How many ways can you choose 4 textbooks out of the 6 to place on a shelf?
6C4 = 6! / (4!(6 − 4)!) = 720 / ((24)(2)) = 15
Answer:
• 15 different combinations
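As a quick check, the five counting-rule examples above can be reproduced with Python's standard library (math.perm and math.comb require Python 3.8 or later):

import math

print(2 ** 5)              # Rule 1: 5 coin tosses, k = 2 -> 32 outcomes
print(26 ** 3 * 10 ** 3)   # Rule 2: 3 letters then 3 digits -> 17,576,000
print(math.factorial(6))   # Rule 3: arrangements of 6 textbooks -> 720
print(math.perm(6, 4))     # Rule 4: ordered selections of 4 from 6 -> 360
print(math.comb(6, 4))     # Rule 5: unordered selections of 4 from 6 -> 15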
Introduction to Probability Distributions

Random variable
• Represents a possible numerical value from an uncertain event
Probability Distribution for a Discrete Random Variable

A probability distribution for a discrete random variable is a mutually exclusive list of all possible numerical outcomes of the random variable, with the probability of occurrence associated with each outcome.
Discrete Random Variable

Can only assume a countable number of values
Examples:
• Roll a die twice. Let X be the number of times 4 comes up; thus X could be 0, 1 or 2 times
• Toss a coin five times. Let X be the number of heads; thus X could be 0, 1, 2, 3, 4 or 5
Discrete Probability Distribution

Experiment: Toss 2 coins. Let X = # heads
There are 4 possible outcomes (TT, TH, HT, HH), giving the probability distribution: P(X = 0) = 0.25, P(X = 1) = 0.50, P(X = 2) = 0.25
Expected Value of a Discrete Random Variable

Expected value (or mean) of a discrete random variable (weighted average):
μ = E(X) = Σ Xi P(Xi), summed over i = 1 to N
Toss 2 coins, X = # of heads; calculate the expected value of X:
E(X) = (0 x 0.25) + (1 x 0.50) + (2 x 0.25) = 1.0

X    P(X)
0    0.25
1    0.50
2    0.25
Variance and Standard Deviation of a Discrete Random Variable

Example:
• Toss two coins, X = # heads; calculate the variance, σ^2, and standard deviation, σ (from the previous slide, E(X) = 1)
σ^2 = Σ Xi^2 P(Xi) − [E(X)]^2, summed over i = 1 to N
σ^2 = (0^2 x 0.25 + 1^2 x 0.5 + 2^2 x 0.25) − (1^2) = 0.5
Standard deviation: σ = √σ^2 = √0.5 = 0.707
where: E(X) = expected value of the discrete random variable X
Xi = the ith outcome of the discrete random variable X
P(Xi) = probability of the ith occurrence of X
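A short sketch reproducing the expected value, variance and standard deviation of the two-coin distribution above:

# X = number of heads in two coin tosses
x_values = [0, 1, 2]
probs = [0.25, 0.50, 0.25]

e_x = sum(x * p for x, p in zip(x_values, probs))                 # E(X) = 1.0
var_x = sum(x**2 * p for x, p in zip(x_values, probs)) - e_x**2   # 0.5
std_x = var_x ** 0.5                                              # ~0.707

print(e_x, var_x, round(std_x, 3))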
Binomial Distribution

A binomial distribution can be thought of as simply the probability of a SUCCESS or FAILURE outcome in an experiment or survey that is repeated multiple times.
The binomial distribution is a mathematical model.
Possible binomial scenarios:
• A manufacturing plant labels items as either defective or acceptable
• A firm bidding for contracts will either get a contract or not
• A marketing research firm receives survey responses of 'yes, I will buy' or 'no, I will not'
• A new job applicant either accepts the offer or rejects it
Binomial Distribution

There are 4 essential properties of the binomial distribution:
A fixed number of observations, or trials, n
• e.g. 15 tosses of a coin; 10 light bulbs taken from a warehouse
Two mutually exclusive and collectively exhaustive categories
• e.g. head or tail in each toss of a coin; defective or not defective light bulb
• generally called ‘success’ and ‘failure’
• probability of success is p, probability of failure is 1–p
Binomial Distribution (cont)

Constant probability for each observation
• e.g. probability of getting a tail is the same each time we toss the coin
Observations are independent
• the outcome of one observation does not affect the outcome of the other
• two sampling methods can be used to ensure independence; either:
- selected from infinite population without replacement; or
- selected from finite population with replacement
Binomial Distribution

The Binomial Distribution Formula:
P(X) = [n! / (X!(n − X)!)] p^X (1 − p)^(n − X)
where:
P(X) = probability of X successes in n trials, with the probability of success p on each trial
X = number of 'successes' in sample (X = 0, 1, 2, ..., n)
n = sample size (number of trials or observations)
p = probability of 'success'
1 − p = probability of 'failure'
Binomial Distribution

Example:
A customer has a 35% probability of making a purchase. Ten customers enter the shop. What is the probability of exactly three customers making a purchase?
Let X = number of customer purchases, where n = 10, p = 0.35, 1 − p = 0.65 and X = 3:
P(X = 3) = [n! / (X!(n − X)!)] p^X (1 − p)^(n − X)
= [10! / (3!(10 − 3)!)] (0.35)^3 (1 − 0.35)^(10 − 3)
= (120)(0.35)^3 (0.65)^7
= (120)(0.042875)(0.04902227890625)
= 0.2522
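The same binomial probability can be checked with a short Python calculation:

from math import comb

n, p, x = 10, 0.35, 3
p_x = comb(n, x) * p**x * (1 - p)**(n - x)  # binomial formula
print(round(p_x, 4))  # 0.2522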
Binomial Distribution

Characteristics of the Binomial Distribution
Mean:
μ = E(X) = np
Variance and standard deviation:
σ^2 = np(1 − p)
σ = √(np(1 − p))
where: n = sample size
p = probability of success
(1 − p) = probability of failure
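Applying these formulas to the earlier shop example (n = 10, p = 0.35), a brief sketch:

n, p = 10, 0.35
mean = n * p                 # mu = np = 3.5 expected purchases
variance = n * p * (1 - p)   # sigma^2 = np(1 - p) = 2.275
std_dev = variance ** 0.5    # sigma ~= 1.508

print(mean, variance, round(std_dev, 3))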
Poisson Distribution

We can apply the Poisson distribution to calculate probabilities when counting the number of times a particular event occurs in an interval of time or space if:
• the probability that an event occurs in any interval is the same for all intervals of the same size
• the number of occurrences of the event in one interval is independent of the number in any other interval
• the probability of two or more occurrences of the event in an interval approaches zero as the interval becomes smaller
Poisson Distribution (cont)

Mean:
μ = λ
Variance and standard deviation:
σ^2 = λ
σ = √λ
where: λ = expected number of events
Poisson Distribution (cont)

The Poisson distribution has one parameter, λ (lambda), which is the mean or expected number of events per interval:
P(X) = (e^(−λ) λ^X) / X!
where:
P(X) = the probability of X events in a given interval
λ = expected number of events in the given interval
e = base of the natural logarithm system (2.71828...)
Poisson Distribution (cont)

Example (λ = 0.50, X = 2):
P(X = 2) = (e^(−λ) λ^X) / X! = (e^(−0.50) (0.50)^2) / 2! = 0.0758
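The Poisson example above can be verified directly:

from math import exp, factorial

lam, x = 0.50, 2
p_x = exp(-lam) * lam**x / factorial(x)  # Poisson formula
print(round(p_x, 4))  # 0.0758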