
Applied Statistics and Probability for Engineers, Seventh Edition
Douglas C. Montgomery, George C. Runger

Chapter 3
Discrete Random Variables & Probability Distributions

Copyright © 2019 John Wiley & Sons, Inc. All Rights Reserved
3 Discrete Random Variables and Probability Distributions

CHAPTER OUTLINE
3.1 Probability Distributions and Probability Mass Functions
3.2 Cumulative Distribution Functions
3.3 Mean and Variance of a Discrete Random Variable
3.4 Discrete Uniform Distribution
3.5 Binomial Distribution
3.6 Geometric and Negative Binomial Distributions
3.7 Hypergeometric Distribution
3.8 Poisson Distribution

Learning Objectives for Chapter 3
After careful study of this chapter, you should be able to do the following:
1. Determine probabilities from probability mass functions and the reverse.
2. Determine probabilities and probability mass functions from cumulative
distribution functions and the reverse.
3. Calculate means and variances for discrete random variables.
4. Understand the assumptions for discrete probability distributions.
5. Select an appropriate discrete probability distribution to calculate
probabilities.
6. Calculate probabilities and determine means and variances for some
common discrete probability distributions.

Example 3.1 | Flash Recharge Time
• The time to recharge the flash is tested in three cellphone cameras.
• The probability that a camera passes the test is 0.8, and the cameras perform independently.
• Table 3.1 shows the sample space for the experiment and the associated probabilities.
• For example, because the cameras are independent, the probability that the first and second cameras pass the test and the third one fails, denoted as ppf, is
  P(ppf) = (0.8)(0.8)(0.2) = 0.128
• The random variable X denotes the number of cameras that pass the test. The last column shows the values of X assigned to each outcome of the experiment.

Table 3.1 Camera Flash Tests
Camera 1  Camera 2  Camera 3  Probability  X
Pass      Pass      Pass      0.512        3
Fail      Pass      Pass      0.128        2
Pass      Fail      Pass      0.128        2
Fail      Fail      Pass      0.032        1
Pass      Pass      Fail      0.128        2
Fail      Pass      Fail      0.032        1
Pass      Fail      Fail      0.032        1
Fail      Fail      Fail      0.008        0
                         Sum  1.000
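The table can be reproduced with a short enumeration. The following Python sketch is not part of the original slides; it assumes only the stated 0.8 pass probability and the independence of the cameras:

from itertools import product

p_pass = 0.8                                  # probability a single camera passes the test
pmf = {}                                      # pmf of X = number of cameras that pass
for outcome in product(["Pass", "Fail"], repeat=3):
    prob = 1.0
    for result in outcome:                    # cameras are independent, so probabilities multiply
        prob *= p_pass if result == "Pass" else (1 - p_pass)
    x = outcome.count("Pass")
    pmf[x] = pmf.get(x, 0.0) + prob

print(pmf)                                    # approximately {3: 0.512, 2: 0.384, 1: 0.096, 0: 0.008}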
Probability Distributions
• A random variable is a function that assigns a real number to each outcome
in the sample space of a random experiment.

• The probability distribution of a random variable 𝑋 is a description of the


probabilities associated with the possible values of 𝑋.

• A discrete random variable has a probability distribution that specifies the


list of possible values of 𝑋 along with the probability of each, or it can be
expressed in terms of a function or formula.

Discrete Random Variables

This means that if the possible values are arranged in order, there is a gap between each value and the next one.

The set of possible values may be infinite; for example, the set of all integers is a discrete set.
Example
‘‘Tossing a coin five times’’ is a random experiment and the sample
space can be written as:
S = {TTTTT, TTTTH,...,HHHHH}
Note that here the sample space S has 2^5 = 32 outcomes.

Suppose that in this experiment, we are interested in the number of heads (H). We can define a random variable X whose value is the number of observed heads (H) in each outcome.
The value of X will be one of 0, 1, 2, 3, 4 or 5 depending on the
outcome of the random experiment.
Example (Continued)
In the random experiment of ‘‘Tossing a coin five times’’, the random variable X
(whose value is the number of observed heads) assigns:
• the value 0 to the outcome TTTTT,
• the value 1 to the outcome HTTTT, and so on till the value 5.

TTTTT                              → X = 0
HTTTT, THTTT, TTHTT, TTTHT, TTTTH  → X = 1
Hence, the random variable X is a function that assigns a real number value to an outcome (for
this particular random variable, the values are always integers between 0 and 5).
Range of Random Variables
‘‘A random variable is actually a real-valued function that assigns a
numerical value to each possible outcome of the random experiment.’’

Since a random variable is a function, we can talk about its range.

The range of a random variable X, shown by Range(X) or RX, is the set of possible values for X.
Example
Find the range for each of the following random variables.

• I toss a coin 100 times. Let X be the number of heads (H) I observe.

• I toss a coin until the first heads (H) appears. Let Y be the total number of coin tosses.
Examples
Example
The number of flaws in a 1-inch length of copper wire manufactured
by a certain process varies from wire to wire. Overall, 48% of the
wires produced have no flaws, 39% have one flaw, 12% have two
flaws, and 1% have three flaws.
Let X be the number of flaws in a randomly selected piece of wire. List
the possible values of the random variable X and find the probabilities
of each of them.
Example 3.3 | Digital Channel
• There is a chance that a bit transmitted through a digital transmission channel is received in error.
• Let X equal the number of bits received in error in the next 4 bits transmitted.
• The probability distribution of X is given by the possible values along with their probabilities, shown in the table:

P(X = 0) = 0.6561
P(X = 1) = 0.2916
P(X = 2) = 0.0486
P(X = 3) = 0.0036
P(X = 4) = 0.0001
Sum        1.0000

Figure 3.1 Probability distribution for bits in error.
Probability Mass Function
For a discrete random variable X with possible values x1, x2, …, xn, a probability mass function is a function such that:

(1) f(xi) ≥ 0

(2) Σ_{i=1..n} f(xi) = 1

(3) f(xi) = P(X = xi)
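As an illustrative check (a minimal sketch, not from the slides), the bits-in-error distribution of Example 3.3 satisfies all three conditions:

f = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}   # pmf from Example 3.3

assert all(p >= 0 for p in f.values())         # condition (1): f(xi) >= 0
assert abs(sum(f.values()) - 1.0) < 1e-9       # condition (2): the probabilities sum to 1
print(f[2])                                    # condition (3): f(2) = P(X = 2) = 0.0486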

Example
The number of patients seen in the Emergency Room (ER) in any
given hour is a random variable represented by X. The probability
distribution for X is
X 10 11 12 13 14
P(X = x) 0.4 0.2 0.2 0.1 a

Find the probabilities of the following:

a. Exactly 14 patients arrive


b. At least 12 patients arrive
c. At least 16 patients arrive
Example 3.4 | Wafer Contamination
• Let the random variable X denote the number of wafers that need to be analyzed to detect a large particle of contamination. Assume that the probability that a wafer contains a large particle is 0.01 and that the wafers are independent. Determine the probability distribution of X.
• Let p denote a wafer in which a large particle is present and let a denote a wafer in which it is absent.
• The sample space is S = {p, ap, aap, aaap, …}.
• The range of the values of X is x = 1, 2, 3, 4, …

Probability Distribution
P(X = 1) = 0.01               = 0.01
P(X = 2) = (0.99)(0.01)       = 0.0099
P(X = 3) = (0.99)^2 (0.01)    = 0.009801
P(X = 4) = (0.99)^3 (0.01)    = 0.009703
   :             :                 :
General formula: P(X = x) = P(aa…ap) = (0.99)^(x−1) (0.01), x = 1, 2, 3, …

Example/Actual Lengths of Stay at a Hospital’s
Emergency Department
Noting that some longer stays are approximated as 15 hours, calculate
the probability mass function (pmf) of the wait time for service.

Hours Count
1 19
2 51
3 86
4 102
5 87
6 62
7 40
8 18
9 14
10 11
15 10
(Solve by yourself)
Cumulative Distribution Function and Properties
The cumulative distribution function is the probability that a random variable X, with a given probability distribution, will be found at a value less than or equal to x. Symbolically,

  F(x) = P(X ≤ x) = Σ_{xi ≤ x} f(xi)

For a discrete random variable X, F(x) satisfies the following properties:

(1) F(x) = P(X ≤ x) = Σ_{xi ≤ x} f(xi)

(2) 0 ≤ F(x) ≤ 1

(3) If x ≤ y, then F(x) ≤ F(y)
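A cumulative distribution function for a discrete random variable can be built by accumulating the pmf values. A minimal Python sketch (not from the slides), using the Example 3.3 pmf that is tabulated on the next slide:

pmf = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}

cdf = {}
running_total = 0.0
for x in sorted(pmf):          # accumulate f(xi) for all xi <= x
    running_total += pmf[x]
    cdf[x] = running_total

print(cdf[3])                  # P(X <= 3) = 0.9999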


Cumulative Distribution Functions
Example 3.5 | Consider the probability distribution for the digital channel example. The last
column shows the cumulative probabilities of getting an error 𝑥 or less.
X   P(X = x)   P(X ≤ x)
0   0.6561     P(X ≤ 0) = P(X = 0) = 0.6561
1   0.2916     P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.6561 + 0.2916 = 0.9477
2   0.0486     P(X ≤ 2) = P(X ≤ 1) + P(X = 2) = 0.9477 + 0.0486 = 0.9963
3   0.0036     P(X ≤ 3) = P(X ≤ 2) + P(X = 3) = 0.9963 + 0.0036 = 0.9999
4   0.0001     P(X ≤ 4) = P(X ≤ 3) + P(X = 4) = 0.9999 + 0.0001 = 1.0000

Find the probability of three or fewer bits in error.


• The event (𝑋 ≤ 3) is the total of the events: (𝑋 = 0), (𝑋 = 1), (𝑋 = 2), and (𝑋 = 3).
• For the probability calculation, see the shaded row in the table.
Example 3.6 | Cumulative Distribution Function
Determine the probability mass function
of 𝑋 from the following cumulative
distribution function:

         0      x < −2
F(x) =   0.2    −2 ≤ x < 0
         0.7    0 ≤ x < 2
         1      2 ≤ x

Figure 3.3 Cumulative Distribution Function

f(−2) = 0.2 − 0 = 0.2
f(0) = 0.7 − 0.2 = 0.5
f(2) = 1.0 − 0.7 = 0.3

Note: Even if the random variable X can assume only integer values, the cumulative distribution function is defined at non-integer values.

PMF and CDF of a Discrete Random Variable

[Figure: side-by-side plots of a pmf p(x) and the corresponding cdf F(x) for a random variable taking the values 1, 2, and 4 with probabilities 1/6, 2/6, and 3/6; the cdf steps up to 1/6, 3/6, and 1.]
Example
Suppose a day’s production of 850 parts contains 50 defective parts. Two
parts are selected at random, without replacement, from the batch. Let the
random variable X equal the number of defective parts. What is the
cumulative distribution function (cdf) of X?

• First find the possible values of X

• Then find probability mass function of p(x)=P(X=x)

• Finally find the cumulative distribution function of X F(x)=P(X ≤ x)


(Solve by yourself)

Given the cumulative distribution function F(x), obtain the probabilities in parts (a)-(f).
Mean and Variance of a Discrete Random Variable
• The mean and variance are used to summarize a probability distribution.
• Mean (or expected value): a measure of the center or middle of the probability distribution. For a discrete random variable, it is a weighted average of the possible values, with weights equal to the probabilities:

  μ = E(X) = Σx x f(x)

• Variance: a measure of the dispersion, or variability, in the distribution. For a discrete random variable, it is a weighted average of the squared deviations, with weights equal to the probabilities:

  σ² = V(X) = E[(X − μ)²] = Σx (x − μ)² f(x) = Σx x² f(x) − μ²

• Standard deviation: σ = √σ²
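These definitions translate directly into weighted sums. A minimal Python sketch (not from the slides), using the bits-in-error pmf that appears in Example 3.7 below:

import math

pmf = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}

mean = sum(x * p for x, p in pmf.items())                    # mu = E(X)
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())  # sigma^2 = V(X)
std_dev = math.sqrt(variance)                                # sigma

print(mean, variance, std_dev)   # approximately 0.4, 0.36, 0.6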
Figure: Distributions with equal means and unequal dispersions.


Example 3.7 | Digital Channel
• In Example 3.3, there is a chance that a bit transmitted through a digital transmission channel is received in error. X is the number of bits received in error of the next 4 transmitted.
• The probabilities are shown in the f(x) column of the table.
• Use the table to calculate the mean and variance.

x    x − 0.4   (x − 0.4)²   f(x)     f(x)(x − 0.4)²
0    −0.4      0.160        0.6561   0.1050
1     0.6      0.360        0.2916   0.1050
2     1.6      2.560        0.0486   0.1244
3     2.6      6.760        0.0036   0.0243
4     3.6      12.960       0.0001   0.0013

Mean:
μ = E(X) = 0·f(0) + 1·f(1) + 2·f(2) + 3·f(3) + 4·f(4)
         = 0(0.6561) + 1(0.2916) + 2(0.0486) + 3(0.0036) + 4(0.0001) = 0.4

Variance:
σ² = V(X) = Σ_{i=1..5} f(xi)(xi − 0.4)² = 0.36

Example
A certain industrial process is brought down for recalibration
whenever the quality of the items produced falls below
specifications. Let X represent the number of times the process is
recalibrated during a week and assume that X has the following
probability mass function.
x 0 1 2 3 4
p(x) 0.35 0.25 0.20 0.15 0.05
Find the mean and variance of X.

Example (Homework)

A salesperson for a medical device company has two appointments on a


given day. At the first appointment, he believes that he has a 70% chance
to make the deal, from which he can earn $1000 commission if
successful. On the other hand, he thinks he only has a 40% chance to
make the deal at the second appointment, from which, if successful, he
can make $1500. What is his expected commission based on his own
probability belief? Assume that the appointment results are independent
of each other.
Expected Value of a Function of a Discrete
Random Variable
• If X is a discrete random variable with probability mass function f(x), then for a function h,

  E[h(X)] = Σx h(x) f(x)

• The variance can be considered as the expected value of a specific function of X, namely, h(X) = (X − μ)².

Example 3.9 | Digital Channel
• In Example 3.7, 𝑋 is the number of bits received in error of the next 4 transmitted.
• The probabilities are shown in the table in the 𝑓 𝑥 column.
• What is the expected value of the square of the number of bits in error?
• ℎ 𝑋 = 𝑋2
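The numeric answer is not reproduced on the slide; a minimal sketch of the calculation, which follows directly from the definition above:

pmf = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}

expected_square = sum(x ** 2 * p for x, p in pmf.items())   # E[h(X)] with h(x) = x^2
print(expected_square)                                      # approximately 0.52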

Discrete Uniform Distribution
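For reference (standard definition, stated here since the slide's equation box is not reproduced): a random variable X has a discrete uniform distribution if each of the n values in its range, say x1, x2, …, xn, has equal probability, so that f(xi) = 1/n.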

Mean and Variance of Discrete Uniform
Distribution
• Let 𝑋 be a discrete random variable ranging from 𝑎, 𝑎 + 1, 𝑎 +
2, … , 𝑏, 𝑓𝑜𝑟 𝑎 ≤ 𝑏.
• There are 𝑏 – (𝑎 − 1) values in the inclusive interval.
• Therefore 𝑓(𝑥) = 1/(𝑏 − 𝑎 + 1)
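For reference, the standard mean and variance results for this distribution are:

  μ = E(X) = (b + a)/2
  σ² = V(X) = [(b − a + 1)² − 1]/12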

Example 3.11 | Number of Voice Lines
Let the random variable 𝑋 denote the number of the 48 voice lines that are in use at a particular
time. Assume that 𝑋 is a discrete uniform random variable with a range of 0 to 48.
Find 𝐸(𝑋) & 𝜎.
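A worked sketch using the discrete uniform formulas with a = 0 and b = 48 (consistent with the practical interpretation below):

  E(X) = (48 + 0)/2 = 24
  σ = √{[(48 − 0 + 1)² − 1]/12} = √200 ≈ 14.14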

Practical Interpretation
The average number of lines in use is 24, but the dispersion
(as measured by 𝜎) is large. Therefore, at many times far
more or fewer than 24 lines are used.

The Bernoulli Distribution
We use the Bernoulli distribution when we have an experiment
which can result in one of two outcomes. One outcome is
labeled “success,” and the other outcome is labeled “failure.”

The probability of a success is denoted by p. The probability of a


failure is then 1 – p.

Such a trial is called a Bernoulli trial with success probability p.


Examples
1. The simplest Bernoulli trial is the toss of a coin. The two
outcomes are heads and tails. If we define heads to be the
success outcome, then p is the probability that the coin
comes up heads. For a fair coin, p = 0.5.
2. Another Bernoulli trial is a selection of a component from a
population of components, some of which are defective. If
we define “success” to be a defective component, then p is
the proportion of defective components in the population.
The Bernoulli Distribution
• There is a single trial.
• The trial can result in one of two possible outcomes.
• P(success) = p
• P(failure) = 1 − p
X ~ Bernoulli(p)
For any Bernoulli trial, we define a random variable X as follows:
✓ If the experiment results in a success, then X = 1.
✓ Otherwise, X = 0.
✓ It follows that X is a discrete random variable, with probability mass function p(x) defined by:
  p(0) = P(X = 0) = 1 − p (probability of failure)
  p(1) = P(X = 1) = p (probability of success)
  p(x) = 0 for any value of x other than 0 or 1
Mean and Variance
If X ~ Bernoulli(p), then

➢ μX = 0(1 − p) + 1(p) = p

➢ σX² = (0 − p)²(1 − p) + (1 − p)²(p) = p(1 − p)
Example

Ten percent of components manufactured by a certain process


are defective. A component is chosen at random. Let X = 1 if the
component is defective, and
X = 0 otherwise.

1. What is the distribution of X?

2. Find the mean and variance of X.


The Binomial Distribution (Motivation)
If a total of n Bernoulli trials are conducted, and

➢ The trials are independent.


➢ Each trial has the same success probability p
➢ X is the number of successes in the n trials

then X has the binomial distribution with parameters n and p,


denoted X ~ Bin(n, p).
Independence of trials
• When selecting items from a box, you have to replace the item
you selected, to make your selections independent! (with
replacement)
• There are some cases where your selections are (approximately) independent even though the selections are made without replacement:
  • Selections from an infinite population
  • Selections from a finite population, when the sample is only a small fraction of the population (see the next slide)
Another Use of the Binomial
❑Assume that:
➢ a finite population contains items of two types, successes and failures,
➢and that a simple random sample is drawn from the population.
❑ Then if the sample size is no more than 5% of the population,
the binomial distribution may be used to model the number of
successes.
❑Sample size < 0.05*(population size)
Example

A lot contains several thousand components, 10% of which are


defective. Seven components are sampled from the lot. Let X
represent the number of defective components in the sample. What
is the distribution of X?

• Since the sample size is small compared to the population (i.e.,


less than 5%), the number of successes in the sample
approximately follows a binomial distribution.
• Therefore we model X with the Bin(7, 0.1) distribution.
Binomial Distribution
The random variable 𝑋 that equals the number of trials that result in a
success is a binomial random variable with parameters
0 < 𝑝 < 1 and 𝑛 = 1, 2, … .

The probability mass function is:

  f(x) = C(n, x) p^x (1 − p)^(n−x),   for x = 0, 1, …, n        (3.7)

where C(n, x) = n!/[x!(n − x)!] is the binomial coefficient.

For constants a and b, the binomial expansion is:

  (a + b)^n = Σ_{k=0..n} C(n, k) a^k b^(n−k)
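Equation 3.7 can be evaluated directly; a minimal Python sketch (not from the slides), using math.comb for the binomial coefficient:

from math import comb

def binomial_pmf(x, n, p):
    # probability of exactly x successes in n independent trials with success probability p
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

print(binomial_pmf(2, 18, 0.1))   # approximately 0.2835, as in Example 3.15a below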
Example 3.14: Binomial Coefficient
Exercises in binomial coefficient calculation:

C(10, 3) = 10!/(3! 7!) = (10 ⋅ 9 ⋅ 8 ⋅ 7!)/(3 ⋅ 2 ⋅ 1 ⋅ 7!) = 120

C(15, 10) = 15!/(10! 5!) = (15 ⋅ 14 ⋅ 13 ⋅ 12 ⋅ 11 ⋅ 10!)/(5 ⋅ 4 ⋅ 3 ⋅ 2 ⋅ 1 ⋅ 10!) = 3,003

C(100, 4) = 100!/(96! 4!) = (100 ⋅ 99 ⋅ 98 ⋅ 97 ⋅ 96!)/(4 ⋅ 3 ⋅ 2 ⋅ 1 ⋅ 96!) = 3,921,225

Recall: 0! = 1
Example 3.15a: Organic Pollution
Each sample of water has a 10% chance of containing a particular organic pollutant.
Assume that the samples are independent with regard to the presence of the pollutant. Find
the probability that, in the next 18 samples, exactly 2 contain the pollutant.

Answer: Let 𝑋 denote the number of samples that contain the pollutant in the next 18
samples analyzed. Then 𝑋 is a binomial random variable with 𝑝 = 0.1 and 𝑛 = 18.

P(X = 2) = C(18, 2) (0.1)^2 (1 − 0.1)^(18−2) = 153 (0.1)^2 (0.9)^16 = 0.2835
In Microsoft Excel®, use the BINOMDIST function to calculate 𝑃 𝑋 = 2 by:

= 𝐁𝐈𝐍𝐎𝐌𝐃𝐈𝐒𝐓(𝟐, 𝟏𝟖, 𝟎. 𝟏, 𝐅𝐀𝐋𝐒𝐄)


Example 3.15b: Organic Pollution
Determine the probability that at least 4 samples contain the pollutant.

Answer: The problem calls for calculating 𝑃 𝑋 ≥ 4 but is easier to calculate the
complementary event, 𝑃 𝑋 ≤ 3 , so that:

P(X ≥ 4) = 1 − Σ_{x=0..3} C(18, x) (0.1)^x (0.9)^(18−x) = 1 − [0.150 + 0.300 + 0.284 + 0.168] = 0.098

In Microsoft Excel®, use the BINOMDIST function to calculate 𝑃 𝑋 ≥ 4 by:

= 𝟏 − 𝐁𝐈𝐍𝐎𝐌𝐃𝐈𝐒𝐓(𝟑, 𝟏𝟖, 𝟎. 𝟏, 𝐓𝐑𝐔𝐄)


Example 3.15c: Organic Pollution
Determine the probability that 3 ≤ 𝑋 < 7.

Answer:
P(3 ≤ X < 7) = Σ_{x=3..6} C(18, x) (0.1)^x (0.9)^(18−x) = 0.168 + 0.070 + 0.022 + 0.005 = 0.265
In Microsoft Excel®, use the BINOMDIST function to calculate 𝑃 3 ≤ 𝑋 < 7 by:

= 𝐁𝐈𝐍𝐎𝐌𝐃𝐈𝐒𝐓 𝟔, 𝟏𝟖, 𝟎. 𝟏, 𝐓𝐑𝐔𝐄 − 𝐁𝐈𝐍𝐎𝐌𝐃𝐈𝐒𝐓(𝟐, 𝟏𝟖, 𝟎. 𝟏, 𝐓𝐑𝐔𝐄)

Appendix A, Table II (pp. A-5 to A-7) presents cumulative binomial


tables (for selected values of 𝑝 and 𝑛) that will simplify calculations.
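For readers who prefer Python to Excel or tables, the three probabilities above can be reproduced with scipy.stats.binom (a sketch, assuming SciPy is available):

from scipy.stats import binom

n, p = 18, 0.1
print(binom.pmf(2, n, p))                       # P(X = 2),      approximately 0.2835
print(1 - binom.cdf(3, n, p))                   # P(X >= 4),     approximately 0.098
print(binom.cdf(6, n, p) - binom.cdf(2, n, p))  # P(3 <= X < 7), approximately 0.265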

Binomial Mean and Variance
If 𝑋 is a binomial random variable with parameters 𝑝 and 𝑛,
• The mean of 𝑋 is:
𝜇 = 𝐸(𝑋) = 𝑛𝑝

• The variance of 𝑋 is:


𝜎2 = 𝑉(𝑋) = 𝑛𝑝(1 − 𝑝)
• These quantities are derived by summing Bernoulli random variables and
using the definitions of the mean and variance of discrete random
variables.

Example 3.16 | Binomial Mean and Variance

For the number of transmitted bits received in error in Example 3.13, 𝑛 = 4 and
𝑝 = 0.1. Find the mean and variance of the binomial random variable.

Answer:
μ = E(X) = np = 4(0.1) = 0.4

σ² = V(X) = np(1 − p) = 4(0.1)(0.9) = 0.36

Example

A large industrial firm allows a discount on any invoice that is


paid within 30 days. Of all invoices, 10% receive the discount. In
a company audit, 12 invoices are sampled at random.
a) What is the probability that fewer than 4 of the 12 sampled
invoices receive the discount?
b) What are the mean and variance of the number of invoices that receive the discount?
Example

A large chain retailer purchases a certain kind of electronic device from a


manufacturer. The manufacturer indicates that the defective rate of the
device is 3%.
a) The inspector randomly picks 20 items from a shipment. What is the
probability that there will be at least one defective item among these 20?
b) Suppose that the retailer receives 10 shipments in a month and the
inspector randomly tests 20 devices per shipment. What is the probability
that there will be exactly 3 shipments each containing at least one defective
device among the 20 that are selected and tested from the shipment?
Geometric Distribution
• Binomial distribution has
• Fixed number of trials
• Random number of successes
• Geometric distribution has reversed roles
• Random number of trials
• Fixed number of successes, in this case 1
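For reference, the standard pmf (consistent with Example 3.18 below): in a series of independent Bernoulli trials with constant success probability p, let X denote the number of trials until the first success. Then X is a geometric random variable with

  f(x) = (1 − p)^(x−1) p,   x = 1, 2, 3, …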

Example 3.18 | Wafer Contamination
• The probability that a wafer contains a large particle of contamination is 0.01. Assume that the
wafers are independent. What is the probability that exactly 125 wafers need to be analyzed before
a particle is detected?

Let 𝑋 denote the number of samples analyzed until a large particle is detected. Then
𝑋 is a geometric random variable with parameter 𝑝 = 0.01.

P(X = 125) = (0.99)^124 (0.01) = 0.00289

Geometric Mean and Variance
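For reference, the standard results (as applied in Example 3.19 below): if X is a geometric random variable with parameter p, then

  μ = E(X) = 1/p   and   σ² = V(X) = (1 − p)/p²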

Example 3.19 | Mean and Standard Deviation
• Consider the transmission of bits in Example 3.17.
• The probability that a bit transmitted through a digital transmission channel is received in error is
𝑝 = 0.1.
• Assume that the transmissions are independent events, and let the random variable 𝑋 denote the
number of bits transmitted until the first error. Find the mean and standard deviation.

Mean: 𝜇 = 𝐸(𝑋) = 1 / 𝑝 = 1 / 0.1 = 10

Variance: 𝜎2 = 𝑉(𝑋) = (1 − 𝑝) / 𝑝2 = 0.9 / 0.01 = 90

Standard deviation: σ = √90 = 9.49

Example
A test of weld strength involves loading welded joints until a
fracture occurs. For a certain type of weld, 80% of the fractures
occur in the weld itself, while the other 20% occur in the beam. A
number of welds are tested. Let X be the number of tests up to
and including the first test that results in a beam fracture.

1. What is the distribution of X?


2. Find P(X = 3).
3. What are the mean and variance of X?
Lack of Memory Property
• For a geometric random variable, the trials are independent
• Count of the number of trials until the next success can be started at any trial without changing
the probability distribution of the random variable.
• Implication: the system presumably will not wear out.
• For all transmissions the probability of an error remains constant.
• Hence, the geometric distribution is said to lack any memory.

Example 3.20 | Lack of Memory Property
In Example 3.17, the probability that a bit is transmitted in error is 𝑝 = 0.1. Suppose 50 bits have
been transmitted. What is the mean number of bits transmitted until the next error?

The mean number of bits transmitted until the next error, after 50 bits have already been
transmitted, is 1/0.1 = 10, the same result as the mean number of bits until the first error.

Negative Binomial Distribution
• A generalization of a geometric distribution in which the random
variable is the number of Bernoulli trials required to obtain 𝑟
successes
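For reference, the standard pmf: if X denotes the number of independent Bernoulli trials, each with success probability p, required to obtain r successes, then X is a negative binomial random variable with

  f(x) = C(x − 1, r − 1) (1 − p)^(x−r) p^r,   x = r, r + 1, r + 2, …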

Mean & Variance of Negative Binomial
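For reference, the standard results: if X is a negative binomial random variable with parameters p and r, then

  μ = E(X) = r/p   and   σ² = V(X) = r(1 − p)/p²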

Example 3.22 | Camera Flashes
• The probability that a camera passes a particular test is 0.8, and the cameras perform
independently.
• What is the probability that the third failure is obtained in five or fewer tests?

Let 𝑋 denote the number of cameras tested until three failures have been obtained.
The requested probability is 𝑃(𝑋 ≤ 5). Here 𝑋 has a negative binomial distribution
with 𝑝 = 0.2 and 𝑟 = 3. Therefore,
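The numerical result is not reproduced on the slide; a minimal Python sketch of the sum, using the negative binomial pmf stated earlier:

from math import comb

p, r = 0.2, 3      # "success" here means a camera fails the test
prob = sum(comb(x - 1, r - 1) * (1 - p) ** (x - r) * p ** r for x in range(r, 6))
print(prob)        # P(X <= 5), approximately 0.058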

Example cont.
In a test of weld strength, 80% of tests result in a fracture in the
weld, while the other 20% result in a fracture in the beam.
Let X denote the number of tests up to and including the third
beam fracture.
a) What is the distribution of X?
b) Find P(X = 8).
c) Find the mean and variance of X.
Hypergeometric Distribution
• Samples are selected from a finite population without replacement. For
this reason the trials are not independent, so the number of
successes in the sample does not follow a binomial distribution.
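For reference, the standard pmf: if a set of N objects contains K successes and N − K failures, and a sample of n objects is selected at random without replacement, then the number of successes X in the sample is a hypergeometric random variable with

  f(x) = C(K, x) C(N − K, n − x) / C(N, n),   max(0, n + K − N) ≤ x ≤ min(K, n)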

Example 3.23 | Sampling without replacement
• A day’s production of 850 manufactured parts contains 50 parts that do not conform to customer requirements
• Two parts are selected at random without replacement from the day’s production.
• Let A and B denote the events that the first and second parts are nonconforming, respectively.
• What is the probability that both parts conform, one part does not conform, and both parts do not conform?

Let 𝑋 denote the number of parts that


do not conform. Therefore,

Example 3.24a | Parts from Suppliers
• A batch of parts contains 100 parts from a local supplier of circuit boards and 200 parts from a
supplier in the next state.
• If 4 parts are selected randomly, without replacement, what is the probability that they are all from
the local supplier?

Example 3.24b | Parts from Suppliers
• What is the probability that two or more parts in the sample are from the local supplier?

Example 3.24c | Parts from Suppliers
• What is the probability that at least one part in the sample is from the local supplier?

Hypergeometric Mean & Variance
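For reference, the standard results: if X is a hypergeometric random variable with parameters N, K, and n, and p = K/N, then

  μ = E(X) = np   and   σ² = V(X) = np(1 − p)(N − n)/(N − 1)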

Example
Of 50 buildings in an industrial park, 12 have electrical code
violations. If 10 buildings are selected at random for inspection,
what is the probability that exactly 3 of the 10 have code
violations? What are the mean and variance of X?
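A minimal Python sketch of the calculation (not from the slides), using the hypergeometric pmf and the mean and variance formulas above with N = 50, K = 12, n = 10:

from math import comb

N, K, n = 50, 12, 10
p_three = comb(K, 3) * comb(N - K, n - 3) / comb(N, n)   # P(X = 3)
mean = n * K / N                                         # E(X)
variance = mean * (1 - K / N) * (N - n) / (N - 1)        # V(X)
print(p_three, mean, variance)   # approximately 0.27, 2.4, 1.49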
Poisson Distribution
• Poisson distribution expresses the probability of a given number of
events occurring in a fixed interval of time or space if these events
occur with a known constant rate and independently of the time since
the last event.
• The Poisson distribution can also be used for the number of events in
other specified intervals such as distance, area or volume.

Poisson Distribution
• X: the number of events occurred in a fixed interval of time or space
• The number of births per hour during a given day
• The number of particles emitted by a radioactive source in a given time
• The number of cases of a disease in different towns
• The number of hits on a web site in one hour
• The number of goals scored by a football team in a match
• The number of accidents in a certain part of the road
• The number of customers who enter a market during an hour
Poisson Distribution
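For reference, the standard pmf: if X denotes the number of events in an interval and the mean number of events in that interval is λ > 0, then X is a Poisson random variable with

  f(x) = e^(−λ) λ^x / x!,   x = 0, 1, 2, …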

Example 3.27a | Wire Flaws
• Flaws occur at random along the length of a thin copper wire.
• Let 𝑋 denote the random variable that counts the number of flaws in a length of 𝑇 𝑚𝑚 of wire and
suppose that the average number of flaws is 2.3 𝑝𝑒𝑟 𝑚𝑚.
• Find the probability of exactly 10 𝑓𝑙𝑎𝑤𝑠 𝑖𝑛 5 𝑚𝑚 of wire.

Let 𝑋 denote the number of flaws in 5 mm of wire. Then 𝑋 has the Poisson distribution with

Therefore,
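The numbers are not reproduced on the slide; a minimal Python sketch of the calculation, using the mean λ = 2.3 flaws/mm × 5 mm = 11.5:

from math import exp, factorial

lam = 2.3 * 5                                  # mean number of flaws in 5 mm of wire
p_10 = exp(-lam) * lam ** 10 / factorial(10)   # P(X = 10)
print(p_10)                                    # approximately 0.113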

Example 3.27b | Wire Flaws
• Find the probability of at least 1 𝑓𝑙𝑎𝑤 𝑖𝑛 2 𝑚𝑚 of wire.

Let 𝑋 denote the number of flaws in 2 mm of wire. Then 𝑋 has the Poisson distribution with

Therefore,
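Similarly, for 2 mm the mean is λ = 2.3 × 2 = 4.6, and the event "at least one flaw" is easiest through its complement; a minimal Python sketch:

from math import exp

lam = 2.3 * 2                      # mean number of flaws in 2 mm of wire
p_at_least_one = 1 - exp(-lam)     # P(X >= 1) = 1 - P(X = 0)
print(p_at_least_one)              # approximately 0.9899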

Poisson Mean & Variance

• The mean and variance of the Poisson model are the same.
• For example, if particle counts follow a Poisson distribution with a mean of 25
particles per square centimeter, the variance is also 25 and the standard deviation of
the counts is 5 per square centimeter.
• If the variance of the data is much greater than the mean, then the Poisson distribution would not be a good model for the distribution of the random variable.
Example
Particles are suspended in a liquid medium at a concentration of
10 particles per mL. A large volume of the suspension is
thoroughly agitated, and then 1 mL is withdrawn. What is the
probability that exactly eight particles are withdrawn?
Example
Particles are suspended in a liquid medium at a concentration of
6 particles per mL. A large volume of the suspension is
thoroughly agitated, and then 3 mL are withdrawn. What is the
probability that exactly 15 particles are withdrawn?
Example
Grandma bakes chocolate chip cookies in batches of 100. She puts
300 chips into the dough. When the cookies are done, she gives you
one.
What is the probability that your cookie contains no chocolate chips?
Important Terms & Concepts of Chapter 3
