
NGN 211: Introduction to Engineering Probability and Statistics
Prof. Dr. Mahmoud H. Ismail Ibrahim
Department of Electrical Engineering
American University of Sharjah
3 Discrete Random Variables and Probability Distributions

CHAPTER OUTLINE
3.1 Probability Distributions and Probability Mass Functions
3.2 Cumulative Distribution Functions
3.3 Mean and Variance of a Discrete Random Variable
3.4 Discrete Uniform Distribution
3.5 Binomial Distribution
3.6 Geometric Distributions
3.8 Poisson Distribution

Learning Objectives for Chapter 3
After careful study of this chapter, you should be able to do the following:
1. Determine probabilities from probability mass functions and the reverse.
2. Determine probabilities and probability mass functions from cumulative
distribution functions and the reverse.
3. Calculate means and variances for discrete random variables.
4. Understand the assumptions for discrete probability distributions.
5. Select an appropriate discrete probability distribution to calculate
probabilities.
6. Calculate probabilities and determine means and variances for some
common discrete probability distributions.

Probability Distributions
• A random variable is a function that assigns a real number to
each outcome in the sample space of a random experiment.
• The probability distribution of a random variable 𝑋 is a
description of the probabilities associated with the possible
values of 𝑋.
• A discrete random variable has a probability distribution that
specifies the list of possible values of 𝑋 along with the probability
of each, or it can be expressed in terms of a function or formula.

Example 3.1 | Flash Recharge Time
• The time to recharge the flash is tested in three cellphone cameras. The probability that a
  camera passes the test is 0.8 and the cameras perform independently. Find the sample space
  for the experiment and the associated probabilities.
• For example, because the cameras are independent, the probability that the first and second
  cameras pass the test and the third one fails, denoted as ppf, is
  P(ppf) = (0.8)(0.8)(0.2) = 0.128
• The random variable X denotes the number of cameras that pass the test. The last column of
  Table 3.1 shows the value of X assigned to each outcome of the experiment.

Table 3.1 Camera Flash Tests
Camera 1   Camera 2   Camera 3   Probability   X
Pass       Pass       Pass       0.512         3
Fail       Pass       Pass       0.128         2
Pass       Fail       Pass       0.128         2
Fail       Fail       Pass       0.032         1
Pass       Pass       Fail       0.128         2
Fail       Pass       Fail       0.032         1
Pass       Fail       Fail       0.032         1
Fail       Fail       Fail       0.008         0
                                 Sum 1.000

Example 3.3 | Digital Channel
• There is a chance that a bit transmitted through a digital transmission channel is received in
  error. The probability of error in transmitting a bit is 0.1. Let X equal the number of bits
  received in error in the next 4 bits transmitted. Find the probability distribution of X.
• The associated probability distribution of X is shown in the table.
• The probability distribution of X is given by the possible values along with their probabilities.

x    P(X = x)
0    P(X = 0) = 0.6561
1    P(X = 1) = 0.2916
2    P(X = 2) = 0.0486
3    P(X = 3) = 0.0036
4    P(X = 4) = 0.0001
     Sum      = 1.0000

Figure 3.1 Probability distribution for bits in error.

Probability Mass Function (PMF)
For a discrete random variable X with possible values x₁, x₂, …, xₙ, a probability mass
function (PMF) is a function such that:
(1) f(xᵢ) ≥ 0
(2) Σᵢ₌₁ⁿ f(xᵢ) = 1
(3) f(xᵢ) = P(X = xᵢ)
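As a quick numerical check, a minimal Python sketch (reusing the digital-channel table from Example 3.3) verifies the three PMF properties:

```python
# PMF of the digital-channel example: number of bits in error out of 4
pmf = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}

# (1) every f(x_i) is nonnegative
assert all(p >= 0 for p in pmf.values())
# (2) the probabilities sum to 1 (allowing for rounding)
assert abs(sum(pmf.values()) - 1.0) < 1e-9
# (3) f(x_i) is read directly as P(X = x_i)
print("P(X = 2) =", pmf[2])
```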

Example 3.4 | Wafer Contamination
• Let the random variable X denote the number of wafers that need to be analyzed to detect a
  large particle of contamination. Assume that the probability that a wafer contains a large
  particle is 0.01, and that the wafers are independent. Determine the probability distribution of X.
• Let p denote a wafer in which a large particle is present and let a denote a wafer in which it
  is absent.
• The sample space is S = {p, ap, aap, aaap, …}
• The range of the values of X is {1, 2, 3, 4, …}

Probability Distribution
P(X = 1) = 0.01                = 0.01
P(X = 2) = 0.99 × 0.01         = 0.0099
P(X = 3) = (0.99)² × 0.01      = 0.009801
P(X = 4) = (0.99)³ × 0.01      = 0.009703
   :            :                   :

General formula
f(x) = P(X = x) = P(aa…ap) = (0.99)^(x−1)(0.01),   x = 1, 2, 3, …
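A short Python sketch of the general formula; the values reproduce the table above:

```python
# P(X = x) = (0.99)**(x - 1) * 0.01 for the wafer-contamination example
def f(x):
    return 0.99 ** (x - 1) * 0.01

for x in range(1, 5):
    print(f"P(X = {x}) = {f(x):.6f}")
# P(X = 1) = 0.010000, P(X = 2) = 0.009900, P(X = 3) = 0.009801, P(X = 4) = 0.009703
```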
Cumulative Distribution Function (CDF) & Properties

• The cumulative distribution function (CDF) is the probability that a random variable X with a
  given probability distribution will be less than or equal to x. Symbolically,
  F(x) = P(X ≤ x) = Σ_{xᵢ ≤ x} f(xᵢ)
• For a discrete random variable X, F(x) satisfies the following properties:
  (1) 0 ≤ F(x) ≤ 1
  (2) if x ≤ y, then F(x) ≤ F(y)


Example 3.5 | Cumulative Distribution Functions
Consider the probability distribution for the digital channel example. The last column shows
the cumulative probabilities of getting an error 𝑥 or less.
x    P(X = x)    P(X ≤ x)
0    0.6561      P(X ≤ 0) = P(X = 0) = 0.6561
1    0.2916      P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.6561 + 0.2916 = 0.9477
2    0.0486      P(X ≤ 2) = P(X ≤ 1) + P(X = 2) = 0.9477 + 0.0486 = 0.9963
3    0.0036      P(X ≤ 3) = P(X ≤ 2) + P(X = 3) = 0.9963 + 0.0036 = 0.9999
4    0.0001      P(X ≤ 4) = P(X ≤ 3) + P(X = 4) = 0.9999 + 0.0001 = 1.0000

Find the probability of three or fewer bits in error.


• The event (X ≤ 3) is the union of the mutually exclusive events (X = 0), (X = 1), (X = 2), and (X = 3).
• From the table, P(X ≤ 3) = 0.9999 (the row for x = 3).
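A minimal sketch (reusing the Example 3.3 PMF) that builds the CDF by accumulating the PMF and reads off P(X ≤ 3):

```python
from itertools import accumulate

pmf = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}

# F(x) = P(X <= x): running sum of the PMF over the ordered support
xs = sorted(pmf)
cdf = dict(zip(xs, accumulate(pmf[x] for x in xs)))

print(cdf[3])  # P(X <= 3) = 0.9999
```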

Example 3.6 | Cumulative Distribution Function
Determine the probability mass function (PMF) of X from the following cumulative distribution
function (CDF):

F(x) = 0      for x < −2
       0.2    for −2 ≤ x < 0
       0.7    for 0 ≤ x < 2
       1      for 2 ≤ x

Figure 3.3 Cumulative Distribution Function

The PMF is obtained from the jumps of the CDF:
f(−2) = 0.2 − 0   = 0.2
f(0)  = 0.7 − 0.2 = 0.5
f(2)  = 1.0 − 0.7 = 0.3

Note: Even if the random variable X can assume only integer values, the cumulative distribution
function is defined at non-integer values.
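The same differencing can be done programmatically; a small sketch assuming the CDF values above:

```python
# CDF given at the jump points; the PMF is the size of each jump
cdf_points = [(-2, 0.2), (0, 0.7), (2, 1.0)]

prev = 0.0
for x, F in cdf_points:
    print(f"f({x}) = {F - prev:.1f}")
    prev = F
# f(-2) = 0.2, f(0) = 0.5, f(2) = 0.3
```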

Mean and Variance of a Discrete Random Variable
• Used to summarize a probability distribution.
• Mean: a measure of the center or middle of the probability distribution.
  • For a discrete random variable, a weighted average of the possible values with weights
    equal to the probabilities.
• Variance: a measure of the dispersion, or variability, in the distribution.
  • For a discrete random variable, a weighted measure of each possible squared deviation
    with weights equal to the probabilities.

Mean or expected value:   μ = E(X) = Σₓ x f(x)
Variance:                 σ² = V(X) = E[(X − μ)²] = Σₓ (x − μ)² f(x) = Σₓ x² f(x) − μ²
Standard deviation:       σ = √σ²
Example 3.7 | Digital Channel
• In Example 3.3, there is a chance that a bit transmitted through a digital transmission channel
  is received in error. X is the number of bits received in error of the next 4 transmitted.
• The probabilities are shown in the f(x) column of the table.
• Use the table to calculate the mean and variance.

x    x − 0.4    (x − 0.4)²    f(x)      f(x) × (x − 0.4)²
0    −0.4       0.160         0.6561    0.1050
1     0.6       0.360         0.2916    0.1050
2     1.6       2.560         0.0486    0.1244
3     2.6       6.760         0.0036    0.0243
4     3.6       12.960        0.0001    0.0013

Mean:
μ = E(X) = 0 × f(0) + 1 × f(1) + 2 × f(2) + 3 × f(3) + 4 × f(4)
         = 0 × 0.6561 + 1 × 0.2916 + 2 × 0.0486 + 3 × 0.0036 + 4 × 0.0001 = 0.4

Variance:
σ² = V(X) = Σᵢ₌₁⁵ f(xᵢ)(xᵢ − 0.4)² = 0.36
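A minimal sketch computing the same mean and variance directly from the PMF:

```python
pmf = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}

# mu = sum of x * f(x); sigma^2 = sum of (x - mu)^2 * f(x)
mu = sum(x * p for x, p in pmf.items())
var = sum((x - mu) ** 2 * p for x, p in pmf.items())

print(mu, var)  # 0.4 0.36
```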
Expected Value of a Function of a Discrete Random
Variable
• If X is a discrete random variable with probability mass function f(x), then
  E[h(X)] = Σₓ h(x) f(x)
• The variance can be considered as the expected value of a specific function of X, namely,
  h(X) = (X − μ)².

Example 3.9 | Digital Channel
• In Example 3.7, 𝑋 is the number of bits received in error of the
next 4 transmitted.
• The probabilities are shown in the f(x) column of the table.
• What is the expected value of the square of the number of bits in error?
• Here h(X) = X², so
  E[h(X)] = Σₓ x² f(x) = 0(0.6561) + 1(0.2916) + 4(0.0486) + 9(0.0036) + 16(0.0001) = 0.52
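The same expectation in a few lines of Python, reusing the Example 3.3 PMF:

```python
pmf = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}

# E[h(X)] with h(X) = X**2
e_x2 = sum(x ** 2 * p for x, p in pmf.items())
print(e_x2)  # 0.52  (note: E[X^2] - mu^2 = 0.52 - 0.16 = 0.36, the variance)
```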

Famous Types of Discrete Random Variables:
(1) Discrete Uniform Distribution

• Let X be a discrete random variable ranging over a, a + 1, a + 2, …, b, for a ≤ b, with each
  value equally likely.
• There are b − a + 1 values in the inclusive interval.
• Therefore f(x) = 1/(b − a + 1).
Mean and Variance of Discrete Uniform Distribution
If X is a discrete uniform random variable on a, a + 1, …, b, then
• Mean:      μ = E(X) = (b + a)/2
• Variance:  σ² = V(X) = [(b − a + 1)² − 1]/12
Example 3.11 | Number of Voice Lines
Let the random variable 𝑋 denote the number of the 48 voice
lines that are in use at a particular time. Assume that 𝑋 is a
discrete uniform random variable with a range of 0 to 48. Find
E(X) and σ.

E(X) = (b + a)/2 = (48 + 0)/2 = 24
σ = √{[(48 − 0 + 1)² − 1]/12} = √200 = 14.14
Practical Interpretation
The average number of lines in use is 24, but the
dispersion (as measured by 𝜎) is large. Therefore, at
many times far more or fewer than 24 lines are used.
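A quick numerical check by direct enumeration over the 49 equally likely values:

```python
values = range(0, 49)          # 0 to 48 inclusive, each with probability 1/49
p = 1 / len(values)

mu = sum(x * p for x in values)
sigma = sum((x - mu) ** 2 * p for x in values) ** 0.5
print(mu, round(sigma, 2))     # 24.0 14.14
```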

(2) Binomial Distribution
An experiment is said to be a binomial experiment if
1. The experiment is performed a fixed number of times. Each
repetition of the experiment is called a trial.
2. The trials are independent. This means the outcome of one trial
will not affect the outcome of the other trials.
3. For each trial, there are two mutually exclusive (or disjoint)
outcomes, success or failure.
4. The probability of success is fixed for each trial of the
experiment.
(2) Binomial Distribution (contd.)
Notations:
• There are n independent trials of the experiment.
• Let p denote the probability of success so that 1 – p is the
probability of failure.
• The random variable 𝑋 that equals the number of trials that
result in a success out of 𝑛 trials is a binomial random variable
with parameters 0 < 𝑝 < 1 and 𝑛 = 1, 2, … .

(2) Binomial Distribution (contd.)
• The probability mass function is:
  f(x) = C(n, x) p^x (1 − p)^(n−x),   for x = 0, 1, …, n
  where C(n, x) = n! / [x!(n − x)!] is the binomial coefficient ("n choose x").

Remember:
C(10, 3)  = 10! / (3! 7!)   = (10 ⋅ 9 ⋅ 8 ⋅ 7!) / (3 ⋅ 2 ⋅ 1 ⋅ 7!)                        = 120
C(15, 10) = 15! / (10! 5!)  = (15 ⋅ 14 ⋅ 13 ⋅ 12 ⋅ 11 ⋅ 10!) / (5 ⋅ 4 ⋅ 3 ⋅ 2 ⋅ 1 ⋅ 10!)  = 3,003
C(100, 4) = 100! / (96! 4!) = (100 ⋅ 99 ⋅ 98 ⋅ 97 ⋅ 96!) / (4 ⋅ 3 ⋅ 2 ⋅ 1 ⋅ 96!)          = 3,921,225
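Python's standard library computes these coefficients directly (math.comb), and the PMF can be built from them:

```python
from math import comb

print(comb(10, 3))    # 120
print(comb(15, 10))   # 3003
print(comb(100, 4))   # 3921225

# binomial PMF built from the coefficient
def binom_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)
```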
Example 3.15a | Organic Pollution
Each sample of water has a 10% chance of containing a particular
organic pollutant. Assume that the samples are independent with
regard to the presence of the pollutant. Find the probability that, in the
next 18 samples, exactly 2 contain the pollutant.
Answer: Let 𝑋 denote the number of samples that contain the
pollutant in the next 18 samples analyzed. Then 𝑋 is a binomial
random variable with 𝑝 = 0.1 and 𝑛 = 18.
P(X = 2) = C(18, 2)(0.1)²(1 − 0.1)^(18−2) = 153 (0.1)² (0.9)¹⁶ = 0.2835
In Microsoft Excel®, use the BINOMDIST function to calculate
𝑃 𝑋 = 2 by 𝐁𝐈𝐍𝐎𝐌𝐃𝐈𝐒𝐓(𝟐, 𝟏𝟖, 𝟎. 𝟏, 𝐅𝐀𝐋𝐒𝐄)
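An equivalent check in Python, assuming SciPy is available:

```python
from scipy.stats import binom

# P(X = 2) for n = 18 trials with success probability p = 0.1
print(binom.pmf(2, 18, 0.1))   # approximately 0.2835
```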
Example 3.15b | Organic Pollution
Determine the probability that at least 4 samples contain the
pollutant.
Answer: The problem calls for calculating P(X ≥ 4), but it is easier to work with the
complementary event, P(X ≤ 3), so that:

P(X ≥ 4) = 1 − Σₓ₌₀³ C(18, x)(0.1)^x (0.9)^(18−x)
         = 1 − (0.150 + 0.300 + 0.284 + 0.168) = 0.098
In Microsoft Excel®, use the BINOMDIST function to calculate
𝑃 𝑋 ≥ 4 by = 𝟏 − 𝐁𝐈𝐍𝐎𝐌𝐃𝐈𝐒𝐓(𝟑, 𝟏𝟖, 𝟎. 𝟏, 𝐓𝐑𝐔𝐄)
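A Python equivalent, again assuming SciPy:

```python
from scipy.stats import binom

# P(X >= 4) = 1 - P(X <= 3); sf(3) is the survival function 1 - cdf(3)
print(1 - binom.cdf(3, 18, 0.1))   # approximately 0.098
print(binom.sf(3, 18, 0.1))        # same value
```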
Example 3.15c | Organic Pollution
Determine the probability that 3 ≤ 𝑋 < 7.
Answer:
P(3 ≤ X < 7) = Σₓ₌₃⁶ C(18, x)(0.1)^x (0.9)^(18−x)
             = 0.168 + 0.070 + 0.022 + 0.005 = 0.265
In Microsoft Excel®, use the BINOMDIST function to calculate
𝑃 3≤𝑋<7 by 𝐁𝐈𝐍𝐎𝐌𝐃𝐈𝐒𝐓 𝟔, 𝟏𝟖, 𝟎. 𝟏, 𝐓𝐑𝐔𝐄 −
𝐁𝐈𝐍𝐎𝐌𝐃𝐈𝐒𝐓(𝟐, 𝟏𝟖, 𝟎. 𝟏, 𝐓𝐑𝐔𝐄)
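The interval probability as a difference of CDF values (assuming SciPy):

```python
from scipy.stats import binom

# P(3 <= X < 7) = P(X <= 6) - P(X <= 2)
print(binom.cdf(6, 18, 0.1) - binom.cdf(2, 18, 0.1))   # approximately 0.265
```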

Note: Appendix A, Table II (pp. A-5 to A-7) presents cumulative binomial tables (for selected
values of p and n) that will simplify calculations.
Binomial Mean and Variance
If 𝑋 is a binomial random variable with parameters 𝑝 and 𝑛,
• The mean of 𝑋 is:
𝜇 = 𝐸(𝑋) = 𝑛𝑝
• The variance of 𝑋 is:
σ² = V(X) = np(1 − p)

Example 3.16 | Binomial Mean and Variance
For the number of transmitted bits received in error in Example
3.13, 𝑛 = 4 and 𝑝 = 0.1 . Find the mean and variance of the
binomial random variable.

Answer:
μ = E(X) = np = 4 ⋅ 0.1 = 0.4

σ² = V(X) = np(1 − p) = 4 ⋅ 0.1 ⋅ 0.9 = 0.36

(3) Geometric Distribution
• Binomial distribution has
• Fixed number of trials
• Random number of successes
• Geometric distribution has reversed roles
• Random number of trials
• Fixed number of successes, in this case 1
• If X denotes the number of independent Bernoulli trials (success probability p) until the first
success, then X is a geometric random variable with PMF f(x) = (1 − p)^(x−1) p, for x = 1, 2, ….

Example 3.18 | Wafer Contamination
• The probability that a wafer contains a large particle of
contamination is 0.01. Assume that the wafers are independent.
What is the probability that exactly 125 wafers need to be
analyzed before a particle is detected?

Answer: Let 𝑋 denote the number of samples analyzed until a


large particle is detected. Then 𝑋 is a geometric random variable
with parameter 𝑝 = 0.01.

P(X = 125) = (0.99)¹²⁴(0.01) = 0.00289
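The same probability via SciPy's geometric distribution (its support starts at 1, matching this definition):

```python
from scipy.stats import geom

# P(X = 125) where X counts trials up to and including the first success, p = 0.01
print(geom.pmf(125, 0.01))   # approximately 0.00289
```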


Geometric Mean and Variance
If X is a geometric random variable with parameter p, then
• Mean:      μ = E(X) = 1/p
• Variance:  σ² = V(X) = (1 − p)/p²
Example 3.19 | Mean and Standard Deviation
• Consider the transmission of bits example again. The probability that
a bit transmitted through a digital transmission channel is received in
error is 𝑝 = 0.1. Assume that the transmissions are independent
events and let the random variable 𝑋 denote the number of bits
transmitted until the first error is encountered. Find the mean and
standard deviation.
Answer:

Mean: 𝜇 = 𝐸(𝑋) = 1 / 𝑝 = 1 / 0.1 = 10


Variance: σ² = V(X) = (1 − p)/p² = 0.9/0.01 = 90
Standard deviation: σ = √90 = 9.49
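A one-line check with SciPy's geometric distribution, assuming it is available:

```python
from scipy.stats import geom

# mean 1/p and standard deviation sqrt((1 - p)/p**2) for p = 0.1
print(geom.mean(0.1), geom.std(0.1))   # 10.0 and approximately 9.49
```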
Lack of Memory Property
• For a geometric random variable, the trials are independent.
• Count of the number of trials until the next success can be
started at any trial without changing the probability distribution
of the random variable.
• For all transmissions, the probability of an error remains
constant. Hence, the geometric distribution is said to lack any
memory.
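A small numerical illustration of the property using SciPy's survival function (sf(k) = P(X > k)); the particular numbers here are just for illustration:

```python
from scipy.stats import geom

p = 0.1
# P(X > 50 + 10 | X > 50) equals P(X > 10): the count can be restarted at any trial
conditional = geom.sf(60, p) / geom.sf(50, p)
print(conditional, geom.sf(10, p))   # both approximately 0.3487
```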

Example 3.20 | Lack of Memory Property
In Example 3.19, the probability that a bit is transmitted in error
is 𝑝 = 0.1. Suppose 50 bits have been transmitted. What is the
mean number of bits transmitted until the next error?
Answer:

The mean number of bits transmitted until the next error, after
50 bits have already been transmitted, is 1/0.1 = 10, the same
result as the mean number of bits until the first error.

(4) Poisson Distribution
A widely used distribution emerges from the concept that events occur randomly in an interval
(or, more generally, in a region). The random variable of interest is the count of events that
occur within the interval. If events occur at an average rate of λ per unit of length (or time)
and X counts the events in an interval of length T, then X is a Poisson random variable with
parameter λT and PMF f(x) = e^(−λT)(λT)^x / x!, for x = 0, 1, 2, ….

Example 3.27a | Wire Flaws
• Assume that flaws occur at random along the length of a thin
copper wire. Let 𝑋 denote the random variable that counts the
number of flaws in a length of 𝑇 mm of wire and suppose that
the average number of flaws is 2.3 per mm. Find the probability
of exactly 10 flaws in 5 mm of wire.
Answer:
Let X denote the number of flaws in 5 mm of wire. Then X has the Poisson distribution with
E(X) = 5 mm × 2.3 flaws/mm = 11.5 flaws. Therefore,
P(X = 10) = e^(−11.5)(11.5)^10 / 10! ≈ 0.113
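A quick check with SciPy's Poisson PMF, assuming SciPy is available:

```python
from scipy.stats import poisson

# mean number of flaws in 5 mm of wire: 5 * 2.3 = 11.5
print(poisson.pmf(10, 11.5))   # approximately 0.113
```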

Example 3.27b | Wire Flaws
• Find the probability of at least 1 flaw in 2 mm of wire.
Answer:

Let X denote the number of flaws in 2 mm of wire. Then X has the Poisson distribution with
E(X) = 2 mm × 2.3 flaws/mm = 4.6 flaws. Therefore,
P(X ≥ 1) = 1 − P(X = 0) = 1 − e^(−4.6) ≈ 0.9899
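And the corresponding check in Python:

```python
from scipy.stats import poisson

# P(X >= 1) = 1 - P(X = 0) with mean 2 * 2.3 = 4.6
print(1 - poisson.pmf(0, 4.6))   # approximately 0.9899
```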

Poisson Mean and Variance

• If X is a Poisson random variable with parameter λT, then E(X) = V(X) = λT; the mean and
  variance of the Poisson model are the same.
• For example, if particle counts follow a Poisson distribution with a mean of 25 particles per
  square centimeter, the variance is also 25 and the standard deviation of the counts is 5 per
  square centimeter.
• If the variance of the data is much greater than the mean, the Poisson distribution is not a
  good model for the distribution of the random variable.
Important Terms & Concepts of Chapter 3
• Binomial distribution
• Cumulative distribution function—discrete random variable
• Discrete uniform distribution
• Expected value of a function of a discrete random variable
• Geometric distribution
• Lack of memory property—discrete random variable
• Mean—discrete random variable
• Poisson distribution
• Probability distribution—discrete random variable
• Probability mass function
• Standard deviation—discrete random variable
• Variance—discrete random variable

