
Chap05 - Discrete Random Variables

The document discusses discrete probability distributions including definitions of discrete random variables, probability distributions, and expected value and variance. It also covers specific discrete distributions like the binomial and Poisson distributions. Examples are provided to help explain key concepts.

Uploaded by Anh Pham Minh

Discrete Probability Distributions

McGraw-Hill/Irwin Copyright © 2015 by The McGraw-Hill Companies, Inc. All rights reserved.
Discrete Probability Distributions

Chapter Contents

5.1 Discrete Probability Distributions
5.2 Expected Value and Variance
5.3 Binomial Distribution
5.4 Poisson Distribution
5.5 Geometric Distribution
5.6 Transformations of Random Variables (Optional)

5.1 Discrete Distributions

Random Variables

• A random variable is a function or rule that assigns a numerical value to each outcome in the sample space of a random experiment.
• Nomenclature:
- Upper case letters are used to represent
random variables (e.g., X, Y).
- Lower case letters are used to represent
values of the random variable (e.g., x, y).
• A discrete random variable has a countable number of
distinct values.
5.1 Discrete Distributions

Probability Distributions
• A discrete probability distribution assigns a probability to
each value of a discrete random variable X.
• To be a valid probability distribution, two conditions must be satisfied: each probability must lie between 0 and 1 (0 ≤ P(xi) ≤ 1), and the probabilities must sum to 1 (Σ P(xi) = 1).

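These requirements can be checked mechanically: each probability must lie in [0, 1] and the probabilities must sum to 1. A minimal Python sketch (the helper name and the example distribution are ours, for illustration):

```python
# Check the two conditions for a valid discrete probability distribution:
# each P(x) must lie in [0, 1], and the probabilities must sum to 1.

def is_valid_distribution(pmf, tol=1e-9):
    """pmf: dict mapping each value x to its probability P(x)."""
    probs = list(pmf.values())
    return all(0.0 <= p <= 1.0 for p in probs) and abs(sum(probs) - 1.0) <= tol

pmf = {0: 0.25, 1: 0.50, 2: 0.25}   # hypothetical example distribution
print(is_valid_distribution(pmf))   # True
```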
5.1 Discrete Distributions

Example: Coin Flips (Table 5.1)


When a coin is flipped 3 times, the sample space is S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}. If X is the number of heads, then X is a random variable whose probability distribution is given in Table 5.1.

TABLE 5.1
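The distribution in Table 5.1 can be reproduced by enumerating the sample space; a short Python sketch:

```python
from itertools import product
from fractions import Fraction

# Enumerate all 2^3 equally likely outcomes of three coin flips and
# tabulate P(X = x) for X = number of heads (Table 5.1).
outcomes = list(product("HT", repeat=3))
pmf = {x: Fraction(0) for x in range(4)}
for outcome in outcomes:
    pmf[outcome.count("H")] += Fraction(1, len(outcomes))

print(pmf)  # P(0) = 1/8, P(1) = 3/8, P(2) = 3/8, P(3) = 1/8
```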

5.1 Discrete Distributions

Example: Coin Flips

Note that the values of X need not be equally likely. However, they must sum to unity. Note also that a discrete probability distribution is defined only at specific points on the X-axis.

FIGURE 5.2

5.1 Discrete Distributions

What is a PDF or CDF?

• A probability distribution function (PDF) is a mathematical function that shows the probability of each X-value.

• A cumulative distribution function (CDF) is a mathematical function that shows the cumulative sum of probabilities, adding from the smallest to the largest X-value, gradually approaching unity.
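The CDF is simply the running sum of the PDF; a short sketch (the PDF values below are hypothetical):

```python
from itertools import accumulate

# The CDF P(X <= x) is the running sum of the PDF P(X = x),
# accumulated from the smallest X-value upward; it ends at 1.
xs = [0, 1, 2, 3]
pdf = [0.1, 0.4, 0.3, 0.2]        # hypothetical P(X = x) values
cdf = list(accumulate(pdf))        # [P(X <= 0), P(X <= 1), ...]
for x, c in zip(xs, cdf):
    print(x, round(c, 2))          # the running sum ends at 1.0
```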

5.1 Discrete Distributions
What is a PDF or CDF?
Consider the illustrative histograms in Figure 5.3: the PDF plots the probability P(X = x) at each value of X, while the CDF plots the running total P(X ≤ x), which rises from 0 toward 1.

FIGURE 5.3 Illustrative PDF (probability distribution function) and its CDF (cumulative distribution function), for X = 0, 1, …, 14.
5.2 Expected Value and Variance

Expected Value
• The expected value E(X) of a discrete random variable is the
sum of all X-values weighted by their respective probabilities.
• E(X) is a measure of central tendency.
• If there are n distinct values of X, then the expected value is
E(X) = μ = x1P(x1) + x2P(x2) + ⋯ + xnP(xn) = Σ xiP(xi)

5.2 Expected Value and Variance
Example: Service Calls
• The distribution of Sunday emergency service calls by Ace Appliance
Repair is shown in Table 5.2. The probabilities sum to 1, as must be
true for any probability distribution.

TABLE 5.2

E(X) = μ = 0(.05) + 1(.10) + 2(.30) + 3(.25) + 4(.20) + 5(.10) = 2.75
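The calculation can be verified in a few lines of Python using the Table 5.2 probabilities:

```python
# E(X) is the sum of all X-values weighted by their probabilities,
# using the Sunday service-call distribution from Table 5.2.
pmf = {0: .05, 1: .10, 2: .30, 3: .25, 4: .20, 5: .10}
mu = sum(x * p for x, p in pmf.items())
print(round(mu, 2))  # 2.75
```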


5.2 Expected Value and Variance
Example: Service Calls
FIGURE 5.5

This particular probability distribution is not symmetric around the mean μ = 2.75. However, the mean is still the balancing point, or fulcrum.

E(X) is an average and it does not have to be an observable point.


5.2 Expected Value and Variance
Variance and Standard Deviation
• If there are n distinct values of X, then the variance of a discrete random variable is
σ² = V(X) = Σ (xi − μ)² P(xi), summing over i = 1, 2, …, n

• The variance is a weighted average of the squared deviations about the mean, denoted either σ² or V(X). It is a measure of variability.
• The standard deviation is the square root of the variance and is denoted σ.
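Continuing the service-call distribution used for E(X) in Table 5.2, a short Python sketch of the variance calculation:

```python
import math

# Variance: the probability-weighted average of squared deviations
# from the mean, using the service-call distribution of Table 5.2.
pmf = {0: .05, 1: .10, 2: .30, 3: .25, 4: .20, 5: .10}
mu = sum(x * p for x, p in pmf.items())                 # 2.75
var = sum((x - mu) ** 2 * p for x, p in pmf.items())    # sigma squared
sigma = math.sqrt(var)                                  # standard deviation
print(round(var, 4), round(sigma, 4))
```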

5.2 Expected Value and Variance
Example: Bed and Breakfast
• The Bay Street Inn is a seven-room bed-and-breakfast in the sunny California
coastal city of Santa Theresa. Demand for rooms generally is strong during
February, a prime month for tourists. However, experience shows that demand
is quite variable. The probability distribution of room rentals during February is
shown in Table 5.3 where X = the number of rooms rented (X = 0, 1, 2, 3, 4, 5,
6, 7). The worksheet shows the calculation of E(X) and Var(X).

TABLE 5.3

5.2 Expected Value and Variance
Example: Bed and Breakfast
The histogram (Figure 5.7) shows that the distribution is skewed to the left and bimodal. The mode is 7 rooms rented, but the average is only 4.71 room rentals.

FIGURE 5.7 Histogram of the number of rooms rented (X = 0, 1, …, 7)

σ = 2.06 indicates considerable variation around μ.


5.3 Binomial Distribution

Bernoulli Experiments

• A random experiment with only two outcomes is a Bernoulli experiment.
• One outcome is arbitrarily labeled a “success” (denoted X = 1)
and the other a “failure” (denoted X = 0).
• p is the P(success), 1 – p is the P(failure).
• “Success” is often defined as the less likely outcome, so that p < .5, for convenience.
• Note that P(0) + P(1) = (1 – p) + p = 1 and 0 ≤ p ≤ 1.
• The expected value (mean) and variance of a Bernoulli experiment are:
E(X) = p and V(X) = p(1 − p)
5.3 Binomial Distribution

Characteristics of the Binomial Distribution

• The binomial distribution arises when a Bernoulli experiment is repeated n times.
• Each trial is independent so the probability of success p
remains constant on each trial.
• In a binomial experiment, we are interested in X = the number of successes in n trials, so X = X1 + X2 + ⋯ + Xn.
• The probability of a particular number of successes P(X)
is determined by parameters n and p.
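The PMF behind this is the standard binomial formula; a short Python sketch (the helper name binomial_pmf is ours):

```python
from math import comb

# Binomial PMF: P(X = x) = C(n, x) * p^x * (1 - p)^(n - x),
# the probability of exactly x successes in n independent trials.
def binomial_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

# Mean and variance follow from summing n Bernoulli trials:
# E(X) = n*p and V(X) = n*p*(1 - p).
print(round(binomial_pmf(2, 4, 0.5), 4))  # 0.375
```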

5.3 Binomial Distribution

Characteristics of the Binomial Distribution

TABLE 5.8

5.3 Binomial Distribution
Example: Quick Oil Change Shop
• It is important to quick oil change shops to ensure that a
car’s service time is not considered “late” by the
customer.
• Service times are defined as either late or not late.
• X is the number of cars that are late out of the total
number of cars serviced.
• Assumptions:
- cars are independent of each other
- probability of a late car is consistent

5.3 Binomial Distribution
Example: Quick Oil Change Shop

• What is the probability that exactly 2 of the next n = 12 cars serviced are late, P(X = 2)?
• P(car is late) = p = .10
• P(car is not late) = 1 − p = .90
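Plugging these values into the binomial formula, a sketch:

```python
from math import comb

# P(exactly 2 of the next n = 12 cars are late), with p = .10:
# P(X = 2) = C(12, 2) * (.10)^2 * (.90)^10
prob = comb(12, 2) * 0.10 ** 2 * 0.90 ** 10
print(round(prob, 4))  # 0.2301
```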

5.3 Binomial Distribution

Application: Uninsured Patients

• On average, 20% of the emergency room patients at Greenwood General Hospital lack health insurance.
• In a random sample of 4 patients, what is the probability
that at least 2 will be uninsured?
• X = number of uninsured patients (“success”)
• P(uninsured) = p = 20% = 0.20.
• P(insured) = 1 – p = 1 - .20 = .80.
• n = 4 patients
• The range is X = 0, 1, 2, 3, 4 patients.

5.3 Binomial Distribution
Compound Events
• Individual probabilities can be added to obtain any
desired event probability.
• For example, the probability that the sample of 4
patients will contain at least 2 uninsured patients is
(HINT: What inequality means “at least?”).
• P(X ≥ 2) = P(2) + P(3) + P(4) = .1536 + .0256 + .0016 = .1808
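The same result follows by generating and summing the binomial terms in Python (the helper name binomial_pmf is ours):

```python
from math import comb

# P(at least 2 of n = 4 patients are uninsured), with p = .20:
# sum the individual binomial probabilities P(2) + P(3) + P(4).
def binomial_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

prob = sum(binomial_pmf(x, 4, 0.20) for x in (2, 3, 4))
print(round(prob, 4))  # 0.1808
```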

5.3 Binomial Distribution
Compound Events
• What is the probability that fewer than 2 patients are uninsured? HINT: What inequality means “fewer than?”
• P(X < 2) = P(0) + P(1) = .4096 + .4096 = .8192
• What is the probability that no more than 2 patients are uninsured? HINT: What inequality means “no more than?”
• P(X ≤ 2) = P(0) + P(1) + P(2) = .4096 + .4096 + .1536 = .9728

5.3 Binomial Distribution

Compound Events
It is helpful to sketch a diagram (Figure 5.9).

5.4 Poisson Distribution

• The Poisson distribution describes the number of occurrences within a randomly chosen unit of time (e.g., minute, hour) or space (e.g., square foot, linear mile).
• The events occur randomly and independently over a continuum of
time or space.
• We will call the continuum “time” since the most common Poisson
application is modeling arrivals per unit of time.

(In the accompanying timeline illustration, each dot (•) is an occurrence of the event of interest.)


5.4 Poisson Distribution
• Let X = the number of events per unit of time.
• X is a random variable that depends on when the unit of time is
observed.
• For example, we could get X = 3 or X = 1 or
X = 5 events, depending on where the randomly chosen unit of time
happens to fall.

• The Poisson model’s only parameter is λ (the Greek letter “lambda”), where λ represents the mean number of events per unit of time or space.
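The standard Poisson formula P(X = x) = λ^x e^(−λ) / x! can be sketched directly (the helper name poisson_pmf is ours):

```python
from math import exp, factorial

# Poisson PMF: P(X = x) = lam**x * e**(-lam) / x!,
# where lam is the mean number of events per unit of time or space.
# For the Poisson model, the mean and variance are both lam.
def poisson_pmf(x, lam):
    return lam ** x * exp(-lam) / factorial(x)

print(round(poisson_pmf(0, 1.0), 4))  # e^-1 rounds to 0.3679
```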
5.4 Poisson Distribution
Characteristics of the Poisson Distribution

TABLE 5.10

5.4 Poisson Distribution
Example: Credit Union Customers
• On Thursday morning between 9 a.m. and 10 a.m., customers arrive and enter the queue at the Oxnard University Credit Union at a mean rate of 1.7 customers per minute. Using the Poisson formulas with λ = 1.7, find the PDF, mean, and standard deviation.
• Note that the unit for the mean and standard deviation is customers/minute.
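A short sketch of the tabulation with λ = 1.7:

```python
from math import exp, factorial, sqrt

# Credit union arrivals, lam = 1.7 customers per minute:
# PDF values for the first few X, plus the mean and standard deviation.
lam = 1.7
pmf = {x: lam ** x * exp(-lam) / factorial(x) for x in range(4)}
mean, sd = lam, sqrt(lam)   # Poisson mean = lam, variance = lam

print({x: round(p, 4) for x, p in pmf.items()})  # P(0)=.1827, P(1)=.3106, P(2)=.2640, ...
print(mean, round(sd, 4))   # mean 1.7 customers/min
```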

5.4 Poisson Distribution

Compound Events

• Cumulative probabilities can be evaluated by summing individual X probabilities.
• What is the probability that two or fewer customers will
arrive in a given minute?
• P(X ≤ 2) = P(0) + P(1) + P(2) = .1827 + .3106 + .2640 = .7573
• What is the probability of at least three customers (the complementary event)?
• P(X ≥ 3) = 1 − P(X ≤ 2) = 1 − .7573 = .2427

5.4 Poisson Distribution

• The Poisson distribution may be used to approximate a binomial by setting λ = np. This approximation is helpful when the binomial calculation is difficult (e.g., when n is large).

• The general rule for a good approximation is that n should be “large” and p should be “small.” A common rule of thumb says the approximation is adequate if n ≥ 20 and p ≤ .05.
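The rule of thumb can be checked numerically; a sketch with n = 100 and p = .02 (hypothetical values chosen to satisfy n ≥ 20 and p ≤ .05), so λ = np = 2:

```python
from math import comb, exp, factorial

# Compare exact binomial probabilities with the Poisson approximation
# using lam = n*p. Here n = 100 ("large") and p = .02 ("small").
n, p = 100, 0.02
lam = n * p
rows = []
for x in range(5):
    binom = comb(n, x) * p ** x * (1 - p) ** (n - x)
    poisson = lam ** x * exp(-lam) / factorial(x)
    rows.append((x, binom, poisson))
    print(x, round(binom, 4), round(poisson, 4))  # the columns agree closely
```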

5.5 Geometric Distribution

Characteristics of the Geometric Distribution

• The geometric distribution describes the number of Bernoulli trials until the first success.
• X is the number of trials until the first success.
• X ranges over {1, 2, . . .}, since we must have at least one trial to obtain the first success. However, the number of trials is not fixed in advance.
• p is the constant probability of a success on each trial.

5.5 Geometric Distribution
Characteristics of the Geometric Distribution

Example: Telefund Calling


• At Faber University, 15% of the alumni make a donation or pledge
during the annual telefund.
• What is the probability that the first donation will occur on the 7th
call?
5.5 Geometric Distribution

Example: Telefund Calling


• What is p? p = .15.
• P(x) = p(1 − p)^(x−1)
• P(7) = .15(1 − .15)^(7−1) = .15(.85)^6 = .0566
• What are the mean and standard deviation of this distribution?
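A sketch that answers the question using the standard geometric results mean = 1/p and σ = √(1 − p)/p:

```python
from math import sqrt

# Geometric model for the telefund: P(x) = p * (1 - p)**(x - 1).
# Mean number of calls to the first donation is 1/p; the standard
# deviation is sqrt(1 - p) / p.
p = 0.15
prob_7 = p * (1 - p) ** (7 - 1)    # first donation on the 7th call
mean = 1 / p
sd = sqrt(1 - p) / p
print(round(prob_7, 4), round(mean, 2), round(sd, 2))  # 0.0566 6.67 6.15
```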

5.6 Transformation of Random Variables

Linear Transformations

• A linear transformation of a random variable X is performed by adding a constant or multiplying by a constant.
• Consider the random variable aX + b, where a and b are any constants (a ≥ 0). Then:
E(aX + b) = aE(X) + b and V(aX + b) = a²V(X)
5.6 Transformation of Random Variables

Sums of Random Variables


• If we consider the sum of two independent random variables X and Y, given as X + Y, then:
E(X + Y) = E(X) + E(Y) and V(X + Y) = V(X) + V(Y)

• Note: This can be extended to the sum of any number of independent random variables.

5.6 Transformation of Random Variables

Covariance

• If X and Y are not independent (i.e., if X and Y are correlated), then we cannot use Rule 4 to find the standard deviation of their sum.
• The covariance of two random variables, denoted by Cov(X, Y) or σXY, describes how the variables vary in relation to each other.
• Cov(X, Y) > 0 indicates that the two variables tend to move in the same direction, while Cov(X, Y) < 0 indicates that they move in opposite directions.
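A small sketch with a hypothetical joint distribution, using the standard identity Cov(X, Y) = E(XY) − E(X)E(Y):

```python
# Covariance from a (hypothetical) joint distribution of X and Y:
# Cov(X, Y) = E(XY) - E(X)*E(Y). A positive value means the two
# variables tend to move in the same direction.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

ex = sum(x * p for (x, y), p in joint.items())
ey = sum(y * p for (x, y), p in joint.items())
exy = sum(x * y * p for (x, y), p in joint.items())
cov = exy - ex * ey
print(round(cov, 2))  # 0.15, so X and Y tend to move together
```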

