CH 5
DISCRETE RANDOM
VARIABLES AND THEIR
PROBABILITY
DISTRIBUTIONS
RANDOM VARIABLES
● Definition
● A random variable is a variable whose
value is determined by the outcome of a
random experiment.
Discrete Random Variable
● Definition
● A random variable that assumes countable
values is called a discrete random
variable.
Examples of discrete random variables
Continuous Random Variable
● Definition
● A random variable that can assume any value
contained in one or more intervals is called a
continuous random variable.
Examples of continuous random variables
● Definition
● The probability distribution of a
discrete random variable lists all the
possible values that the random variable
can assume and their corresponding
probabilities.
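To make the definition concrete, here is a minimal Python sketch (not from the text) that stores a hypothetical probability distribution of a discrete random variable x and checks the two conditions every such distribution must satisfy: each probability lies between 0 and 1, and the probabilities add up to 1. The specific values and probabilities are illustrative assumptions only.

# Hypothetical probability distribution of a discrete random variable x
# (the values and probabilities are assumptions chosen for illustration)
distribution = {0: 0.18, 1: 0.39, 2: 0.24, 3: 0.14, 4: 0.05}

# Condition 1: each probability lies between 0 and 1
assert all(0 <= p <= 1 for p in distribution.values())

# Condition 2: the probabilities sum to 1
assert abs(sum(distribution.values()) - 1.0) < 1e-9

print(sum(distribution.values()))  # 1.0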
Example 5-1
● Definition
● The symbol n!, read as “n factorial,”
represents the product of all the integers
from n to 1. In other words,
● n! = n(n - 1)(n - 2)(n - 3) · · · 3 · 2 · 1
● By definition,
● 0! = 1
Examples
Evaluate 7!
7! = 7 · 6 · 5 · 4 · 3 · 2 · 1 = 5040
Evaluate 10!
10! = 10 · 9 · 8 · 7 · 6 · 5 · 4 · 3 · 2 · 1
= 3,628,800
Evaluate (12-4)!
(12-4)! = 8! = 8 · 7 · 6 · 5 · 4 · 3 · 2 · 1
= 40,320
Example 5-11
Evaluate (5-5)!
(5-5)! = 0! = 1
Note that 0! is always equal to 1.
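The factorial results worked out above can be checked with Python's standard library; the short verification sketch below uses math.factorial.

import math

# Verify the worked factorial examples
print(math.factorial(7))       # 5040
print(math.factorial(10))      # 3628800
print(math.factorial(12 - 4))  # 40320, i.e., 8!
print(math.factorial(5 - 5))   # 1, since 0! = 1 by definition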
FACTORIALS, COMBINATIONS, AND
PERMUTATIONS
● Combinations
● Definition
● Combinations give the number of ways x elements
can be selected from n elements. The notation used
to denote the total number of combinations is nCx,
read as “n combinations of x elements,” and it is
calculated as
● nCx = n! / (x! (n - x)!)
● For a binomial experiment, the probability of
exactly x successes in n trials is given by the
binomial formula
● P(x) = nCx p^x q^(n - x)
● where
● n = total number of trials
● p = probability of success
● q = 1 – p = probability of failure
● x = number of successes in n trials
● n - x = number of failures in n trials
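As an illustration of the two formulas above, the Python sketch below computes nCx with math.comb and evaluates the binomial formula P(x) = nCx p^x q^(n - x). The sample numbers (n = 10 trials, x = 3 successes, p = .05) are assumptions chosen only for demonstration.

import math

def binomial_prob(n, x, p):
    """Binomial formula: P(x) = nCx * p**x * q**(n - x), with q = 1 - p."""
    q = 1 - p
    return math.comb(n, x) * p**x * q**(n - x)

# Illustrative numbers (assumptions, not from the text)
print(math.comb(10, 3))            # 120 combinations of 3 elements from 10
print(binomial_prob(10, 3, 0.05))  # probability of exactly 3 successes in 10 trials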
● Let
● D = a selected DVD player is defective, P(D) = .05
● G = a selected DVD player is good, P(G) = .95
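Using the probabilities given for the DVD players, and assuming for illustration that three players are selected (the sample size is not stated in this excerpt), a quick sketch of the binomial calculation for exactly one defective player:

import math

p_defective = 0.05  # P(D) given above
p_good = 0.95       # P(G) given above
n = 3               # assumed sample size, for illustration only

# P(exactly 1 defective among the n selected players)
prob = math.comb(n, 1) * p_defective**1 * p_good**(n - 1)
print(round(prob, 4))  # 0.1354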