4 Random Variables


University of Benghazi

Faculty of Engineering
Electrical and Electronics Engineering Department

Probability and Random (Stochastic) Processes


EE277

Salma Elkawafi
[email protected]

ES233: Probability and Statistics - UOB


Goals
Understand the following:
• Random Variables
• Discrete random modeling
• Probability mass function and cumulative distribution function.
• Expectations and variance calculations.
• Probability distributions
❑ Bernoulli distribution
❑ Binomial distribution
❑ Multinomial distribution
❑ Poisson distribution
• Continuous random modeling
• Probability distribution function and cumulative distribution function
• Expectations and variance calculations.
• Probability distributions:
❑ Uniform distribution
❑ Exponential distribution
❑ Normal distribution
Random Variables
Random Variable Definition
• A random variable is a function that maps outcomes of a random experiment to real numbers. Equivalently, a random variable associates the points in the sample space with real numbers.
• A (real-valued) random variable, often denoted by X (or some other capital letter), is a function mapping a probability space (S, P) into the real line R.
• The range of a random variable X, shown by Range(X) or R_X, is the set of possible values of X.
• We shall use a capital letter, say X, to denote a random variable and its corresponding small letter, x in this case, for one of its values.
• Each possible value of X represents an event that is a subset of the sample space for the given experiment.

In random experiments with outcomes that are not numerical, it is very useful to map outcomes to numerical values. For example, in a coin toss experiment: H → 1 and T → 0.
Random Variables
If a sample space contains a finite number of possibilities, or an unending sequence with as many elements as there are whole numbers, it is called a discrete sample space. A random variable X is discrete if the set of all possible values of X (that is, the range of the function represented by X), denoted R_X, is countable.

If a sample space contains an infinite number of possibilities equal to the number of points on a line segment, it is called a continuous sample space. A random variable whose set of possible values is an entire interval of numbers is not discrete. When a random variable can take on values on a continuous scale, it is called a continuous random variable.
Discrete Random Variables
The probability function, or probability mass function (PMF)

Let X be a discrete random variable with range R_X = {x_1, x_2, x_3, …} (finite or countably infinite). The function

p(x_k) = P_X(x_k) = P(X = x_k),  for k = 1, 2, 3, …,

is called the probability mass function (PMF) of X.

The set of ordered pairs (𝑥, 𝑝(𝑥)) is called the probability function or probability mass
function, of the discrete random variable 𝑋.

For example: 𝑝 3 = 𝑃(𝑋 = 3).


A discrete random variable assumes each of its values with a certain probability.
Discrete Random Variables
It is often instructive to present the probability mass function in a graphical format
If:
p(0) = 1/16
p(1) = 4/16
p(2) = 6/16
p(3) = 4/16
p(4) = 1/16

(Figures: probability mass function plot and probability histogram.)
Random Variables
For example, an experiment consists of tossing two fair coins. Letting Y denote the number of heads appearing, Y is a RV with possible values 0, 1, 2 and respective probabilities

p(0) = P(Y = 0) = 1/4
p(1) = P(Y = 1) = 1/2
p(2) = P(Y = 2) = 1/4

(Figure: probability mass function plot.)

y      0    1    2
p(y)  1/4  1/2  1/4
Discrete Random Variables
The probability function, probability mass function

• It is convenient to represent all the probabilities of a random variable X by a formula. Such a formula would necessarily be a function of the numerical values x that we shall denote by f(x), g(x), p(x), … .

• For each possible outcome x:
  • p(x) ≥ 0
  • Σ_x p(x) = 1
  • P(X = x) = p(x)
  • for any set A ⊂ R_X, P(X ∈ A) = Σ_{x ∈ A} p(x).
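These properties are easy to verify numerically. A minimal sketch in Python, assuming the two-fair-coin PMF p(0) = 1/4, p(1) = 1/2, p(2) = 1/4 used elsewhere in these slides:

```python
# Verify the PMF properties for the two-fair-coin example.
# Fractions keep the arithmetic exact.
from fractions import Fraction

p = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# p(x) >= 0 for every x in the range
assert all(prob >= 0 for prob in p.values())

# The probabilities sum to 1
assert sum(p.values()) == 1

# P(X in A) is the sum of p(x) over x in A, e.g. A = {1, 2}
A = {1, 2}
print(sum(p[x] for x in A))  # 3/4
```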
Discrete Random Variables
Example
A shipment of 20 similar laptop computers to a retail outlet contains 3 that are
defective. If a school makes a random purchase of 2 of these computers, find the
probability distribution for the number of defectives.
Solution
Let 𝑋 be a random variable whose values 𝑥 are the possible numbers of defective
computers purchased by the school. Then 𝑥 can only take the numbers 0, 1, and 2
p(0) = P(X = 0) = C(3,0)C(17,2) / C(20,2) = 68/95
p(1) = P(X = 1) = C(3,1)C(17,1) / C(20,2) = 51/190
p(2) = P(X = 2) = C(3,2)C(17,0) / C(20,2) = 3/190

Thus, the probability distribution of X is

x      0      1       2
p(x)  68/95  51/190  3/190
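The same distribution can be reproduced with Python's math.comb; the helper name p_defect below is ours, chosen for illustration:

```python
# Hypergeometric-style counting for the defective-laptop example:
# choose x of the 3 defective units and 2 - x of the 17 good ones.
from fractions import Fraction
from math import comb

def p_defect(x, total=20, defective=3, drawn=2):
    good = total - defective
    return Fraction(comb(defective, x) * comb(good, drawn - x), comb(total, drawn))

print(p_defect(0))  # 68/95
print(p_defect(1))  # 51/190
print(p_defect(2))  # 3/190
```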
Discrete Random Variables
The cumulative distribution function
The cumulative distribution function 𝐹(𝑥) of a discrete random variable 𝑋 with
probability mass distribution 𝑝(𝑥) is
F(x) = P(X ≤ x) = Σ_{t ≤ x} p(t),  for −∞ < x < ∞
Experiment of tossing two fair coins:

p(0) = 1/4,  p(1) = 1/2,  p(2) = 1/4

F(x) = 0,    for x < 0
       1/4,  for 0 ≤ x < 1
       3/4,  for 1 ≤ x < 2
       1,    for x ≥ 2

(Figure: discrete cumulative distribution function.)
Discrete Random Variables
The cumulative distribution function
The distribution function 𝐹(𝑥) has the following properties:
1. 𝐹(𝑥) is nondecreasing [𝑖. 𝑒. , 𝐹(𝑥) ≤ 𝐹(𝑦) 𝑖𝑓 𝑥 ≤ 𝑦].
2. lim 𝐹 𝑥 = 0; lim 𝐹 𝑥 = 1;
𝑥→−∞ 𝑥→∞
3. For all 𝑎 ≤ 𝑏, we have 𝑃(𝑎 < 𝑋 ≤ 𝑏) = 𝐹(𝑏) − 𝐹(𝑎)
4. 𝑃(𝑋 < 𝑥) = 𝑃(𝑋 ≤ 𝑥) − 𝑃(𝑋 = 𝑥) = 𝐹(𝑥) − 𝑝(𝑥)
5. 𝑃 𝑋 > 𝑥 = 1 − 𝑃 𝑋 ≤ 𝑥 = 1 − 𝐹(𝑥)
If X takes on only a finite number of values x_1, x_2, …, x_n, then the distribution function is given by:

F(x) = 0,                             −∞ < x < x_1
       p(x_1),                        x_1 ≤ x < x_2
       p(x_1) + p(x_2),               x_2 ≤ x < x_3
       ⋮
       p(x_1) + p(x_2) + ⋯ + p(x_n),  x_n ≤ x < ∞

(Figure: discrete cumulative distribution function.)
Discrete Random Variables
Example
Let X be a discrete random variable with the following PMF:

p(x) = 0.3, for x = 3
       0.2, for x = 5
       0.3, for x = 8
       0.2, for x = 10
       0,   otherwise

Find the CDF of X.

Solution
F(x) = P(X ≤ x) = Σ_{t ≤ x} p(t),  for −∞ < x < ∞

F(x) = 0,                          −∞ < x < 3
       0.3,                        3 ≤ x < 5
       0.3 + 0.2 = 0.5,            5 ≤ x < 8
       0.3 + 0.2 + 0.3 = 0.8,      8 ≤ x < 10
       0.3 + 0.2 + 0.3 + 0.2 = 1,  x ≥ 10
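The step-function CDF above can be built mechanically from the PMF; a small sketch, using the same support points:

```python
# F(x) = P(X <= x): sum the PMF over all support points t with t <= x.
from fractions import Fraction

pmf = {3: Fraction(3, 10), 5: Fraction(1, 5), 8: Fraction(3, 10), 10: Fraction(1, 5)}

def cdf(x):
    return sum((p for t, p in pmf.items() if t <= x), Fraction(0))

print(cdf(2))    # 0
print(cdf(6))    # 1/2
print(cdf(100))  # 1
```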
Discrete Random Variables
Expected Value:

If X is a discrete random variable having a probability mass function p(x), then the expectation, or expected value, of X, denoted by E[X], is defined by

E[X] = Σ_{x: p(x) > 0} x p(x) = μ

The expected value of X is a weighted average of the possible values that X can take on, each value being weighted by the probability that X assumes it.

For instance, if the probability mass function of X is given by

p(0) = p(1) = 1/2,

then

E[X] = 0 × 1/2 + 1 × 1/2 = 1/2

Note: if the probabilities are all equal, the expected value is called the arithmetic mean, or simply the mean (μ).
Discrete Random Variables
Expected Value:

Assuming that 1 fair coin was tossed twice, we find that the sample space for our experiment is S = {HH, HT, TH, TT}. Letting X denote the number of heads, and since the 4 sample points are all equally likely, it follows that

P(X = 0) = P(TT) = 1/4
P(X = 1) = P(TH) + P(HT) = 1/2
P(X = 2) = P(HH) = 1/4

Therefore,

μ = E[X] = 0 × 1/4 + 1 × 1/2 + 2 × 1/4 = 1

This result means that a person who tosses 2 coins over and over again will, on average, get 1 head per toss.
Discrete Random Variables
Expected Value:

Some Theorems on Expectation

• If c is any constant, then


𝐸 𝑐𝑋 = 𝑐𝐸(𝑋)

• If X and Y are any random variables, then


𝐸 𝑋 + 𝑌 = 𝐸 𝑋 + 𝐸(𝑌)

• If X and Y are independent random variables, then


𝐸 𝑋𝑌 = 𝐸(𝑋)𝐸(𝑌)
Discrete Random Variables
Example:
Find 𝐸[𝑋], where 𝑋 is the outcome when we roll a fair die.

Solution:
Since p(1) = p(2) = p(3) = p(4) = p(5) = p(6) = 1/6, we obtain

E[X] = 1 × 1/6 + 2 × 1/6 + 3 × 1/6 + 4 × 1/6 + 5 × 1/6 + 6 × 1/6 = 7/2
Discrete Random Variables
The Variance and Standard Deviation:

In the figure we have the histograms of two


discrete probability distributions that
have the same mean, 𝜇 = 2, but differ
considerably in variability, or the dispersion
of their observations about the mean.
Discrete Random Variables
The Variance and Standard Deviation:

Another quantity of great importance in probability and statistics is called the variance, defined by

Var(X) = σ² = E[(X − μ)²] = Σ_{j=1}^{n} (x_j − μ)² p(x_j)

The variance is a nonnegative number. The positive square root of the variance is called the standard deviation and is given by

σ_X = σ = √Var(X) = √(E[(X − μ)²])

The variance (or the standard deviation) is a measure of the dispersion,


or scatter, of the values of the random variable about the mean . If the
values tend to be concentrated near the mean, the variance is small; while
if the values tend to be distributed far from the mean, the variance is large.
Discrete Random Variables
The Variance and Standard Deviation:

• σ² = E[(X − μ)²] = E[X²] − (E[X])² = E[X²] − μ²

• If X and Y are independent random variables,


Var 𝑋 + 𝑌 = Var 𝑋 + Var 𝑌

The variance of a sum of independent variables equals the sum of their variances.

• If c and b are any constants,


Var 𝑐𝑋 + 𝑏 = 𝑐 2 Var (𝑋)
Discrete Random Variables
Example:
Calculate 𝑉𝑎𝑟(𝑋) if 𝑋 represents the outcome when a fair die is rolled.

Solution:
Var(X) = σ² = E[X²] − μ² = Σ_{j=1}^{n} (x_j − μ)² p(x_j)

Since p(1) = p(2) = p(3) = p(4) = p(5) = p(6) = 1/6 and E[X] = 7/2,

E[X²] = 1² × 1/6 + 2² × 1/6 + 3² × 1/6 + 4² × 1/6 + 5² × 1/6 + 6² × 1/6 = 91/6

Hence,

Var(X) = E[X²] − (E[X])² = 91/6 − (7/2)² = 35/12
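Both results can be checked directly from the definitions; a short sketch:

```python
# Mean and variance of a fair-die roll from the definitions.
from fractions import Fraction

p = Fraction(1, 6)                         # each face equally likely
mu = sum(x * p for x in range(1, 7))       # E[X]
ex2 = sum(x * x * p for x in range(1, 7))  # E[X^2]
var = ex2 - mu**2                          # Var(X) = E[X^2] - (E[X])^2

print(mu)   # 7/2
print(var)  # 35/12
```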
Discrete Random Variables
Discrete random variables are often classified according to their probability
mass functions. In this course we consider some of these random variables,
including:

• The Bernoulli Random Variable/ Bernoulli distribution


• The Binomial Random Variable / Binomial distribution
• The Geometric Random Variable/ Geometric distribution
• The Poisson Random Variable/ Poisson distribution
Discrete Random Variables
The Bernoulli Random Variable/ Bernoulli distribution:

A trial, or experiment, whose outcome can be classified as either a "success" or a "failure" is performed. Let X equal 1 if the outcome is a success and 0 if it is a failure. The random variable X ∼ Bernoulli(p) is said to be a Bernoulli random variable if its probability mass function is given by

p(0) = P{X = 0} = 1 − p,
p(1) = P{X = 1} = p,

where p, 0 ≤ p ≤ 1, is the probability that the trial is a "success."


Discrete Random Variables
Example:

(Expectation of a Bernoulli Random Variable) Calculate 𝐸[𝑋] when 𝑋 is a Bernoulli


random variable with parameter 𝑝.
Solution:

Since p(0) = 1 − p and p(1) = p, we have


𝐸[𝑋] = 0(1 − 𝑝) + 1(𝑝) = 𝑝

Thus, the expected number of successes in a single trial is just the probability that the
trial will be a success.
𝑉𝑎𝑟[𝑋] = 𝑝(1 − 𝑝)
Discrete Random Variables
The Binomial Random Variable /Binomial distribution:

Suppose that n independent trials, each of which results in a "success" with probability p and in a "failure" with probability 1 − p, are to be performed. If X represents the number of successes that occur in the n trials, then X is said to be a binomial random variable with parameters n and p, written X ∼ Binomial(n, p) or b(x; n, p).
The probability mass function of a binomial random variable with parameters n and p is given by

p(x) = P_X(x) = C(n, x) p^x (1 − p)^(n−x),  x = 0, 1, …, n
Discrete Random Variables
The Binomial Random Variable /Binomial distribution:

Some Properties of the Binomial Distribution:

Mean: μ = np
Variance: σ² = np(1 − p)
Standard deviation: σ = √(np(1 − p))
Discrete Random Variables
Example:

Four fair coins are flipped. If the outcomes are assumed independent, what is the
probability that two heads and two tails are obtained?

Solution:
p(x) = P_X(x) = C(n, x) p^x (1 − p)^(n−x),  x = 0, 1, …, n

Letting X equal the number of heads ("successes") that appear, X is a binomial random variable with parameters n = 4 and p = 1/2. Hence,

P(X = 2) = C(4, 2) (1/2)² (1/2)² = 3/8
Discrete Random Variables
Example:

It is known that any item produced by a certain machine will be defective with
probability 0.1, independently of any other item. What is the probability that in a sample
of three items, at most one will be defective?

Solution:
p(x) = P_X(x) = C(n, x) p^x (1 − p)^(n−x),  x = 0, 1, …, n

If X is the number of defective items in the sample, then X is a binomial random variable with parameters n = 3 and p = 0.1. Hence, the desired probability is given by:

P(X = 0) + P(X = 1) = C(3, 0)(0.1)⁰(0.9)³ + C(3, 1)(0.1)¹(0.9)² = 0.972
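Both binomial examples can be reproduced from the PMF formula; a minimal sketch (binom_pmf is our own helper, not a library function):

```python
# Binomial PMF b(x; n, p) = C(n, x) p^x (1 - p)^(n - x).
from math import comb

def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Defective items: P(X <= 1) with n = 3, p = 0.1
prob = binom_pmf(0, 3, 0.1) + binom_pmf(1, 3, 0.1)
print(round(prob, 3))  # 0.972

# Coins: P(X = 2) with n = 4, p = 1/2
print(binom_pmf(2, 4, 0.5))  # 0.375 = 3/8
```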
Discrete Random Variables
The Geometric Random Variable / Geometric distribution:

Suppose that independent trials, each having probability p of being a success, are performed until a success occurs. If we let N be the number of trials required until the first success, then N ∼ Geometric(p) is said to be a geometric random variable with parameter p. Its probability mass function is given by

p(n) = P{N = n} = (1 − p)^(n−1) p,  n = 1, 2, …
Random Variables
For example, suppose that we toss a coin having probability p of coming up heads. Letting N denote the number of flips required to obtain the first head, and assuming that the outcomes of successive flips are independent, N is a RV taking on one of the values 1, 2, 3, … with respective probabilities

p(1) = P(N = 1) = p
p(2) = P(N = 2) = (1 − p)p
p(3) = P(N = 3) = (1 − p)²p
⋮
p(n) = P(N = n) = (1 − p)^(n−1) p

(Figure: probability mass function plot.)
Discrete Random Variables
Example:
(Expectation of a Geometric Random Variable)
Calculate the expectation of a geometric random variable having parameter 𝑝.
Solution:
E[N] = Σ_{n=1}^{∞} n p (1 − p)^(n−1) = p Σ_{n=1}^{∞} n q^(n−1)   (where q = 1 − p)

     = p Σ_{n=1}^{∞} d/dq (q^n) = p d/dq ( Σ_{n=1}^{∞} q^n ) = p d/dq ( q / (1 − q) )

     = p / (1 − q)² = 1/p

Var(N) = (1 − p) / p²
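The result E[N] = 1/p can be checked by truncating the series numerically, e.g. with p = 1/4 (a value we pick purely for illustration):

```python
# Partial sum of E[N] = sum over n of n (1 - p)^(n - 1) p.
# Terms decay geometrically, so a few hundred suffice in double precision.
p = 0.25
approx = sum(n * (1 - p)**(n - 1) * p for n in range(1, 500))
print(round(approx, 6))  # 4.0, i.e. 1/p
```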
Discrete Random Variables
The Poisson Random Variable/ Poisson distribution:
A random variable X ∼ Poisson(λ), taking on one of the values 0, 1, 2, …, is said to be a Poisson random variable with parameter λ if, for some λ > 0,

p(x; λ) = P(X = x) = e^(−λ) λ^x / x!,  x = 0, 1, …

where λ is the average number of outcomes per unit time, distance, area, or volume, and e = 2.71828… .
The Poisson distribution is one of the most widely used probability distributions.
It is usually used in scenarios where we are counting the occurrences of certain
events in an interval of time or space.
For example, counting the number of customers who visit a certain store from 1 pm to 2 pm: based on data from previous days, we know that on average λ = 15 customers visit the store in that interval.
Discrete Random Variables
The Poisson Random Variable/ Poisson distribution:

Some Properties of the Poisson Distribution:

Mean: μ = λ
Variance: σ² = λ
Standard deviation: σ = √λ
Discrete Random Variables
Example:

Suppose that the number of typographical errors on a single page of this course has a
Poisson distribution with parameter 𝜆 = 1. Calculate the probability that there is at
least one error on this page.

Solution:

p(x; λ) = P(X = x) = e^(−λ) λ^x / x!,  x = 0, 1, …

P{X ≥ 1} = 1 − P{X = 0} = 1 − e^(−1) ≈ 0.6321


Discrete Random Variables
Example:

During a laboratory experiment, the average number of radioactive particles passing


through a counter in 1 millisecond is 4. What is the probability that 6 particles enter the
counter in a given millisecond?

Solution:

p(x; λ) = P(X = x) = e^(−λ) λ^x / x!,  x = 0, 1, …

p(6; 4) = e^(−4) 4⁶ / 6! = 0.1042
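Both Poisson examples follow directly from the PMF formula; a brief sketch:

```python
# Poisson PMF p(x; lam) = e^(-lam) lam^x / x!.
from math import exp, factorial

def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

print(round(poisson_pmf(6, 4), 4))      # 0.1042 (particle counter, lam = 4)
print(round(1 - poisson_pmf(0, 1), 4))  # 0.6321 (at least one error, lam = 1)
```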
Continuous Random Variables
• A continuous random variable has a probability of 0 of assuming exactly any of its
values.
• Its probability distribution cannot be given in tabular form.
• In dealing with continuous variables, 𝑓(𝑥) is usually called the probability density
function (PDF), or simply the density function, of 𝑋.
Continuous Random Variables
Probability Density Function
The function f(x) is a probability density function (pdf) for the continuous random variable X, defined over the set of real numbers, if

• f(x) ≥ 0, for all x ∈ R
• ∫_{−∞}^{∞} f(x) dx = 1
• P(a < X < b) = ∫_a^b f(x) dx

Note: P(a < X ≤ b) = P(a < X < b) + P(X = b) = P(a < X < b).
Continuous Random Variables
Example
Suppose that the error in the reaction temperature, in °C, for a controlled laboratory experiment is a continuous random variable X having the probability density function

f(x) = x²/3,  −1 < x < 2
       0,     elsewhere

(a) Verify that f(x) is a density function.
(b) Find P(0 < X ≤ 1).

Solution
a) ∫_{−∞}^{∞} f(x) dx = ∫_{−1}^{2} (x²/3) dx = x³/9 |_{−1}^{2} = 8/9 + 1/9 = 1
b) P(0 < X ≤ 1) = ∫_0^1 f(x) dx = ∫_0^1 (x²/3) dx = x³/9 |_0^1 = 1/9
Continuous Random Variables
Cumulative Distribution Function

The cumulative distribution function F(x) of a continuous random variable X with density function f(x) is

F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt,  for −∞ < x < ∞

f(x) = dF(x)/dx

P(a < X < b) = F(b) − F(a)
Continuous Random Variables

Example
For the density function of the previous example, f(x) = x²/3 for −1 < x < 2 and 0 elsewhere, find F(x), and use it to evaluate P(0 < X ≤ 1).

Solution
For −1 < x < 2,

F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−1}^{x} (t²/3) dt = t³/9 |_{−1}^{x} = (x³ + 1)/9

Therefore,

F(x) = 0,           x < −1
       (x³ + 1)/9,  −1 ≤ x < 2
       1,           x ≥ 2

P(0 < X ≤ 1) = F(1) − F(0) = 2/9 − 1/9 = 1/9

which agrees with the result obtained by using the density function in the previous example.
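The same probabilities can also be approximated by numerical integration, without using the closed form; a sketch using a simple midpoint rule:

```python
# Midpoint-rule integration of f(x) = x^2/3 on (-1, 2).
def f(x):
    return x * x / 3 if -1 < x < 2 else 0.0

def integrate(a, b, n=100_000):
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

print(round(integrate(-1, 2), 6))  # 1.0 : total probability
print(round(integrate(0, 1), 6))   # 0.111111, i.e. 1/9
```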
Continuous Random Variables
Expected Value

Let X be a continuous random variable with probability distribution f(x). The mean, or expected value, of X is

μ = E[X] = ∫_{−∞}^{∞} x f(x) dx

Some Theorems on Expectation

• If c is any constant, then


𝐸 𝑐𝑋 = 𝑐𝐸(𝑋)
• If X and Y are any random variables, then
E[X ± Y] = E[X] ± E[Y]
• If X and Y are independent random variables, then
𝐸 𝑋𝑌 = 𝐸(𝑋)𝐸(𝑌)
Continuous Random Variables
Example
Let 𝑋 be the random variable that denotes the life in hours of a certain electronic
device. The probability density function is
f(x) = 20,000/x³,  x > 100
       0,          elsewhere

Find the expected life of this type of device.

Solution
μ = E[X] = ∫_{−∞}^{∞} x f(x) dx = ∫_{100}^{∞} x (20,000/x³) dx = ∫_{100}^{∞} (20,000/x²) dx = −20,000/x |_{100}^{∞} = 200
Continuous Random Variables
Variance and Standard deviation

Let X be a continuous random variable with probability distribution f(x) and mean μ. The variance of X is

σ² = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)² f(x) dx

Equivalently, σ² = E(X²) − μ².

The positive square root of the variance, σ, is called the standard deviation of X.
Continuous Random Variables
Example
The weekly demand for a drinking-water product, in thousands of liters, from a local chain of
efficiency stores is a continuous random variable X having the probability density
f(x) = 2(x − 1),  1 < x < 2
       0,         elsewhere

Find the mean and variance of X.

Solution
μ = E[X] = ∫_{−∞}^{∞} x f(x) dx = ∫_1^2 2x(x − 1) dx = 2(x³/3 − x²/2) |_1^2 = 5/3

E[X²] = ∫_1^2 2x²(x − 1) dx = 2(x⁴/4 − x³/3) |_1^2 = 17/6

σ² = E[X²] − μ² = 17/6 − (5/3)² = 1/18
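As a numerical cross-check of the mean and variance (a simple midpoint-rule integrator, no external libraries assumed):

```python
# E[X^k] = integral of x^k f(x) for f(x) = 2(x - 1) on (1, 2).
def f(x):
    return 2 * (x - 1) if 1 < x < 2 else 0.0

def moment(k, a=1.0, b=2.0, n=200_000):
    h = (b - a) / n
    return sum((a + (i + 0.5) * h)**k * f(a + (i + 0.5) * h) for i in range(n)) * h

mu = moment(1)
var = moment(2) - mu**2
print(round(mu, 4))   # 1.6667, i.e. 5/3
print(round(var, 4))  # 0.0556, i.e. 1/18
```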
Continuous Random Variables
Continuous random variables are often classified according to their
probability distribution functions. In this course we consider some of these
random variables, including:

• The Uniform distribution


• The Exponential distribution
• The Normal distribution
Continuous Random Variables
The Uniform distribution (Rectangular distribution)

• This distribution is characterized by a density function that is “flat,” and thus


the probability is uniform in a closed interval, say [𝐴, 𝐵].
• It can be (A,B) as well.

The density function of the continuous uniform random variable X on the interval [A, B] is

f(x) = f(x; A, B) = 1/(B − A),  A ≤ x ≤ B
                    0,          elsewhere
Continuous Random Variables
The Uniform distribution (Rectangular distribution)

• The mean and variance of the uniform distribution are

μ = (A + B)/2
σ² = (B − A)²/12

• The cumulative distribution function of a uniform random variable on the interval [A, B] is given by

F(x) = 0,                x ≤ A
       (x − A)/(B − A),  A < x < B
       1,                x ≥ B
Continuous Random Variables
The Uniform distribution (Rectangular distribution)
Example:
Suppose that a large conference room at a certain company can be reserved for no more
than 4 hours. Both long and short conferences occur quite often. In fact, it can be
assumed that the length 𝑋 of a conference has a uniform distribution on the interval [0, 4].
(a) What is the probability density function?
(b) What is the probability that any given conference lasts at least 3 hours?
Solution:
(a) f(x) = 1/4,  0 ≤ x ≤ 4
           0,    elsewhere

(b) P(X ≥ 3) = ∫_3^4 (1/4) dx = 1/4
Continuous Random Variables
The Exponential distribution
• The continuous random variable X has an exponential distribution, with parameter λ > 0, if its density function is given by

f(x) = f(x; λ) = λe^(−λx),  x > 0
                 0,         elsewhere

The mean and variance of the exponential distribution are

μ = 1/λ  and  σ² = 1/λ²

The cumulative distribution function F(x) of an exponential random variable is given by

F(a) = P(X ≤ a) = ∫_0^a λe^(−λx) dx = −e^(−λx) |_0^a = 1 − e^(−λa)
Continuous Random Variables
The Exponential distribution
Example:
Based on extensive testing, it is determined that the time Y in years before a major repair is required for a certain washing machine is characterized by the density function

f(y) = (1/4) e^(−y/4),  y > 0
       0,               elsewhere

Note that Y is an exponential random variable with λ = 1/4. The machine is considered a bargain if it is unlikely to require a major repair before the sixth year. What is the probability P(Y > 6)?

Solution:
F(y) = P(Y ≤ y) = 1 − e^(−λy)

P(Y > 6) = 1 − P(Y ≤ 6) = 1 − F(6) = 1 − (1 − e^(−6/4)) = e^(−3/2) ≈ 0.2231
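The tail probability can be computed directly from the CDF; a minimal sketch:

```python
# P(Y > y) = 1 - F(y) = e^(-lam * y) for an exponential random variable.
from math import exp

lam = 1 / 4
def tail(y):
    return exp(-lam * y)

print(round(tail(6), 4))  # 0.2231
```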
Continuous Random Variables
The Normal distribution
• The most important continuous probability distribution in the entire field of statistics
is the normal distribution.
• Its graph, called the normal curve, or bell-shaped curve which approximately
describes many phenomena that occur in nature, industry, and research.
• Often referred to as the Gaussian distribution, in honor of Karl Friedrich Gauss

The density of the normal random variable X, with mean μ and variance σ², is

f(x) = f(x; μ, σ) = N(x; μ, σ) = (1/(√(2π) σ)) e^(−(x−μ)²/(2σ²)),  −∞ < x < ∞

• Usually written as X ~ N(μ, σ²)


Continuous Random Variables
The Normal distribution
Properties of the normal distribution:
• The mode, which is the point on the horizontal axis where the curve is a maximum,
occurs at 𝑥 = 𝜇.
• The curve is symmetric about a vertical axis through the mean 𝜇.

• The curve has its points of inflection at 𝑥 = 𝜇 ± 𝜎; it is


concave downward if 𝜇 − 𝜎 < 𝑋 < 𝜇 + 𝜎 and is
concave upward otherwise
• The normal curve approaches the horizontal axis
asymptotically as we proceed in either direction away
from the mean.
• The total area under the curve and above the horizontal
axis is equal to 1.
Continuous Random Variables
The Normal distribution

Normal curves with 𝜇1 < 𝜇2 and 𝜎1 = 𝜎2


Normal curves with 𝜇1 = 𝜇2 and 𝜎1 < 𝜎2

Normal curves with 𝜇1 < 𝜇2 and 𝜎1 < 𝜎2


Continuous Random Variables
The Normal distribution
• Areas under the Normal Curve:
The curve of any continuous probability distribution or density function is constructed so that the area under the curve bounded by the two ordinates x = x₁ and x = x₂ is given by:

P(x₁ < X < x₂) = ∫_{x₁}^{x₂} N(x; μ, σ) dx = ∫_{x₁}^{x₂} (1/(√(2π) σ)) e^(−(x−μ)²/(2σ²)) dx

• There are many types of statistical software that can be used in calculating areas under the normal curve.
• However, it would be a hopeless task to attempt to set up separate tables for every value of μ and σ. Fortunately, we are able to transform all the observations of any normal random variable X into a new set of observations of a normal random variable Z with mean 0 and variance 1:

Z = (X − μ)/σ

Whenever X assumes a value x, the corresponding value of Z is given by z = (x − μ)/σ. Therefore, if X falls between the values x = x₁ and x = x₂, the random variable Z will fall between the corresponding values z₁ = (x₁ − μ)/σ and z₂ = (x₂ − μ)/σ.
Continuous Random Variables
The Normal distribution
• Consequently, we may write

P(x₁ < X < x₂) = ∫_{x₁}^{x₂} (1/(√(2π) σ)) e^(−(x−μ)²/(2σ²)) dx = ∫_{z₁}^{z₂} (1/√(2π)) e^(−z²/2) dz
               = ∫_{z₁}^{z₂} N(z; 0, 1) dz = P(z₁ < Z < z₂)

The distribution of a normal random variable with mean 0 and variance 1 is called a standard normal distribution:

f(z) = (1/√(2π)) e^(−z²/2)

(Figure: the original and transformed normal distributions.)
Continuous Random Variables
The Normal distribution
• CDF of the standard normal
• To find the CDF of the standard normal distribution, we need to integrate the PDF
function. In particular, we have
F(z) = P(Z ≤ z) = ∫_{−∞}^{z} (1/√(2π)) e^(−u²/2) du

This integral does not have a closed-form solution. Nevertheless, because of the importance of the normal distribution, the values of F(z) have been tabulated, and many calculators and software packages provide this function. We usually denote the standard normal CDF by Φ:

F(z) = Φ(z)
The standard normal table gives the area under the standard normal curve corresponding to P(Z < z) for values of z ranging from −3.49 to 3.49. For example, P(Z < 1.74) = 0.9591.
Continuous Random Variables
The Normal distribution
Example
Given a standard normal distribution, find the area
under the curve that lies
(a) to the right of 𝑧 = 1.84 and
(b) between 𝑧 = −1.97 and 𝑧 = 0.86.

Solution:
(a) The area to the right of z = 1.84 is equal to 1 minus the area in the table to the left of z = 1.84, namely, 1 − 0.9671 = 0.0329.

(b) The area between z = −1.97 and z = 0.86 is equal to the area to the left of z = 0.86 minus the area to the left of z = −1.97. From the table we find the desired area to be 0.8051 − 0.0244 = 0.7807.
Continuous Random Variables
The Normal distribution
Example:
Given a random variable 𝑋 having a normal distribution with 𝜇 = 50 and 𝜎 = 10, find
the probability that 𝑋 assumes a value between 45 and 62.
Solution:
The z values corresponding to x₁ = 45 and x₂ = 62 are

z₁ = (45 − 50)/10 = −0.5  and  z₂ = (62 − 50)/10 = 1.2

P(45 < X < 62) = P(−0.5 < Z < 1.2)

P(−0.5 < Z < 1.2) is shown by the area of the shaded region in the figure. Using the Z table we have

P(45 < X < 62) = P(−0.5 < Z < 1.2) = P(Z < 1.2) − P(Z < −0.5) = 0.8849 − 0.3085 = 0.5764
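Instead of the table, Φ can be evaluated with the error function, since Φ(z) = (1/2)(1 + erf(z/√2)); reproducing the example:

```python
# Standard-normal CDF via math.erf, then P(45 < X < 62) for X ~ N(50, 10^2).
from math import erf, sqrt

def phi(z):
    # Phi(z) = P(Z <= z) for Z ~ N(0, 1)
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 50, 10
z1, z2 = (45 - mu) / sigma, (62 - mu) / sigma
print(round(phi(z2) - phi(z1), 4))  # 0.5764
```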
Discrete Random Variables

F(x) = P(X ≤ x) = Σ_{t ≤ x} p(t),  for −∞ < x < ∞
P(a < X ≤ b) = F(b) − F(a)
P(X > x) = 1 − P(X ≤ x) = 1 − F(x)

Exercise:
Let X be a discrete random variable with range {1, 2, 3, …}. Suppose P_X(k) is given by

P_X(k) = 1/2^k,  k = 1, 2, 3, …

Using the CDF, find a) P(2 < X ≤ 5) and b) P(X > 4).

Solution
For x < 1, F(x) = 0.
For 1 ≤ x < 2, F(x) = p(1) = 1/2.
For 2 ≤ x < 3, F(x) = p(1) + p(2) = 1/2 + 1/4 = 3/4.

In general, for k ≤ x < k + 1, we have

F(x) = p(1) + p(2) + ⋯ + p(k) = 1/2 + 1/4 + ⋯ + 1/2^k = (2^k − 1)/2^k
Discrete Random Variables
Exercise (cont.):
Let X be a discrete random variable with range {1, 2, 3, …}. Suppose P_X(k) is given by

P_X(k) = 1/2^k,  k = 1, 2, 3, …

Using the CDF, find a) P(2 < X ≤ 5) and b) P(X > 4).

Solution
For k ≤ x < k + 1, F(x) = (2^k − 1)/2^k.

a) P(2 < X ≤ 5) = F(5) − F(2) = 31/32 − 3/4 = 7/32
b) P(X > 4) = 1 − P(X ≤ 4) = 1 − F(4) = 1 − 15/16 = 1/16
Discrete Random Variables
Exercise:
Suppose that a game is to be played with a single die assumed fair. In this game a
player wins $20 if a 2 turns up, $40 if a 4 turns up; loses $30 if a 6 turns up; while the
player neither wins nor loses if any other face turns up. Find the expected sum of money
to be won
Solution:
Let X be the random variable giving the amount of money won on any toss. Therefore, the expected value, or expectation, is

E[X] = 0(1/6) + 20(1/6) + 0(1/6) + 40(1/6) + 0(1/6) + (−30)(1/6) = 5

It follows that the player can expect to win $5, on average.
Discrete Random Variables
Exercise:
A school class of 120 students is driven in 3 buses to a symphonic performance. There
are 36 students in one of the buses, 40 in another, and 44 in the third bus. When the
buses arrive, one of the 120 students is randomly chosen. Let X denote the number of
students on the bus of that randomly chosen student, and find 𝐸[𝑋].
Solution:
P(X = 36) = 36/120,  P(X = 40) = 40/120,  P(X = 44) = 44/120

Hence,

E[X] = 36(3/10) + 40(1/3) + 44(11/30) = 1208/30 ≈ 40.27
Discrete Random Variables
Exercise:
Let the random variable X represent the number of defective parts for a machine when 3 parts are sampled from a production line and tested. The following is the probability distribution of X. Find the variance of X.

x      0     1     2     3
p(x)  0.51  0.38  0.10  0.01

Solution:
σ² = E[X²] − μ²

μ = (0)(0.51) + (1)(0.38) + (2)(0.10) + (3)(0.01) = 0.61

Now,
E[X²] = (0²)(0.51) + (1²)(0.38) + (2²)(0.10) + (3²)(0.01) = 0.87

Therefore,
σ² = 0.87 − (0.61)² = 0.4979
Discrete Random Variables
Exercise
Three balls are randomly chosen from an urn containing 3 white, 3 red, and 5 black balls. Suppose that we win $1 for each white ball selected and lose $1 for each red ball selected. Find the probability distribution of the total winnings from the experiment, and the probability that we win money.

Solution
If we let X denote our total winnings from the experiment, then X is a random variable taking on the possible values 0, ±1, ±2, ±3 with respective probabilities

p(0) = P(X = 0) = (C(5,3) + C(3,1)C(3,1)C(5,1)) / C(11,3) = 55/165
p(1) = P(X = 1) = P(X = −1) = (C(3,1)C(5,2) + C(3,2)C(3,1)) / C(11,3) = 39/165
p(2) = P(X = 2) = P(X = −2) = C(3,2)C(5,1) / C(11,3) = 15/165
p(3) = P(X = 3) = P(X = −3) = C(3,3) / C(11,3) = 1/165

x      −3     −2      −1      0       1       2       3
p(x)  1/165  15/165  39/165  55/165  39/165  15/165  1/165

The probability that we win money is given by Σ_{i=1}^{3} P(X = i) = 55/165 = 1/3.
Discrete Random Variables
Exercise
The probability mass function of a random variable X is given by p(i) = cλ^i / i!,  i = 0, 1, 2, …, where λ is some positive value. Find a) P(X = 0) and b) P(X > 2).

Solution
Since Σ_{i=0}^{∞} p(i) = 1,

Σ_{i=0}^{∞} cλ^i / i! = 1

Because e^x = Σ_{i=0}^{∞} x^i / i!, this implies that c = e^(−λ). Hence:

a) P(X = 0) = e^(−λ) λ⁰ / 0! = e^(−λ)
b) P(X > 2) = 1 − P(X ≤ 2) = 1 − P(X = 0) − P(X = 1) − P(X = 2) = 1 − e^(−λ) − λe^(−λ) − λ²e^(−λ)/2
Discrete Random Variables
Exercise
A stockroom clerk returns three safety helmets at random to three steel mill employees who had previously checked them. If Smith, Jones, and Brown, in that order, receive one of the three hats, list the sample points for the possible orders of returning the helmets, and find the value m of the random variable M that represents the number of correct matches. Then find the cumulative distribution function.

Solution

Sample space   m
SJB            3
SBJ            1
BJS            1
JSB            1
JBS            0
BSJ            0

For the random variable M, the cumulative distribution function is

F(m) = 0,    for m < 0
       1/3,  for 0 ≤ m < 1
       5/6,  for 1 ≤ m < 3
       1,    for m ≥ 3