Prob 3160 CH 7

This document provides an overview of continuous distributions with three main points: 1) It defines continuous random variables and their probability density functions (PDFs), which allow calculating probabilities of events as integrals of the PDF over the event space. 2) It introduces the cumulative distribution function (CDF) as another way to characterize a continuous distribution, and notes the CDF can be used to define expectations similarly to discrete random variables. 3) It proves that continuous random variables can be approximated by discrete random variables, with the expectations of the approximations converging to the true expectation of the continuous variable as the approximations improve.


CHAPTER 7

Continuous distributions

7.1. Basic theory


7.1.1. Definition, PDF, CDF. We start with the definition of a continuous random variable.

Definition (Continuous random variables)


A random variable $X$ is said to have a continuous distribution if there exists a nonnegative function $f = f_X$ such that
\[
P(a \le X \le b) = \int_a^b f(x)\,dx
\]
for every $a$ and $b$. The function $f$ is called the density function for $X$ or the PDF for $X$.

More precisely, such an $X$ is said to have an absolutely continuous distribution. Note that
\[
\int_{-\infty}^{\infty} f(x)\,dx = P(-\infty < X < \infty) = 1.
\]
In particular, $P(X = a) = \int_a^a f(x)\,dx = 0$ for every $a$.

Example 7.1. Suppose we are given that $f(x) = c/x^3$ for $x > 1$ and $0$ otherwise. Since
\[
\int_{-\infty}^{\infty} f(x)\,dx = 1
\quad\text{and}\quad
\int_{-\infty}^{\infty} f(x)\,dx = c \int_1^{\infty} \frac{1}{x^3}\,dx = \frac{c}{2},
\]
we have $c = 2$.
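As a quick numeric aside (a sketch, not part of the text), the normalization integral can be approximated with a midpoint rule; the truncated integral of $1/x^3$ over $[1, B]$ tends to $1/2$ as $B$ grows, so $c = 2$. The truncation bound below is an assumption made for illustration.

```python
# Midpoint-rule approximation of the normalization integral for
# f(x) = c/x^3, x > 1. B = 1000 is an assumed truncation bound;
# the tail beyond it is negligible.

def integrate(f, a, b, n=200_000):
    """Composite midpoint rule for the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

mass = integrate(lambda x: x ** -3, 1.0, 1_000.0)  # close to 1/2
c = 1.0 / mass
print(round(c, 3))  # close to 2
```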

PMF or PDF?
Probability mass function (PMF) and (probability) density function (PDF) are two
names for the same notion in the case of discrete random variables. We say PDF or
simply a density function for a general random variable, and we use PMF only for
discrete random variables.

Definition (Cumulative distribution function (CDF))


The distribution function of $X$ is defined as
\[
F(y) = F_X(y) := P(-\infty < X \le y) = \int_{-\infty}^{y} f(x)\,dx.
\]
It is also called the cumulative distribution function (CDF) of $X$.


We can define the CDF for any random variable, not just continuous ones, by setting $F(y) := P(X \le y)$. Recall that we introduced it in Definition 5.3 for discrete random variables. In that case it is not particularly useful, although it does serve to unify discrete and continuous random variables. In the continuous case, the fundamental theorem of calculus tells us, provided $f$ satisfies some conditions, that
\[
f(y) = F'(y).
\]

By analogy with the discrete case, we define the expectation of a continuous random variable.

7.1.2. Expectation, discrete approximation to continuous random variables.


Definition (Expectation)
For a continuous random variable $X$ with the density function $f$ we define its expectation by
\[
EX = \int_{-\infty}^{\infty} x f(x)\,dx
\]
if this integral is absolutely convergent. In this case we call $X$ integrable.

Recall that this integral is absolutely convergent if
\[
\int_{-\infty}^{\infty} |x| f(x)\,dx < \infty.
\]
In the example above,
\[
EX = \int_1^{\infty} x \cdot \frac{2}{x^3}\,dx = 2 \int_1^{\infty} x^{-2}\,dx = 2.
\]
Later, in Example 10.1, we will see that a continuous random variable with the Cauchy distribution has infinite expectation.
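The Cauchy phenomenon can be illustrated numerically (an aside, not from the text): for the Cauchy density $f(x) = 1/(\pi(1+x^2))$, the truncated integral of $|x| f(x)$ over $[-B, B]$ equals $\ln(1+B^2)/\pi$, which grows without bound as $B$ increases, so the expectation integral is not absolutely convergent.

```python
import math

def midpoint(f, a, b, n=100_000):
    """Composite midpoint rule for the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

cauchy = lambda x: 1.0 / (math.pi * (1.0 + x * x))

# Numeric check against the closed form ln(1 + B^2)/pi at B = 10.
num = midpoint(lambda x: abs(x) * cauchy(x), -10.0, 10.0)
exact = math.log(1 + 10 ** 2) / math.pi
print(num, exact)

# The truncated integrals keep growing as B increases: no finite limit.
vals = [math.log(1 + B * B) / math.pi for B in (10, 10 ** 3, 10 ** 6)]
print(vals)
```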

Proposition 7.1 (Discrete approximation to continuous random variables)


Suppose $X$ is a nonnegative continuous random variable with a finite expectation. Then there is a sequence of discrete random variables $\{X_n\}_{n=1}^{\infty}$ such that
\[
EX_n \xrightarrow[n \to \infty]{} EX.
\]

Proof. First observe that if a continuous random variable $X$ is nonnegative, then its density satisfies $f(x) = 0$ for $x < 0$. In particular, $F(y) = 0$ for $y \le 0$, though the latter is not needed for our proof. Thus for such a random variable
\[
EX = \int_0^{\infty} x f(x)\,dx.
\]
Suppose $n \in \mathbb{N}$; then we define $X_n(\omega)$ to be $k/2^n$ if $k/2^n \le X(\omega) < (k+1)/2^n$, for $k \in \mathbb{N} \cup \{0\}$. This means that we are approximating $X$ from below by the largest multiple of $2^{-n}$ that is still below the value of $X$. Each $X_n$ is discrete, and the $X_n$ increase to $X$ for each $\omega \in S$.

Consider the sequence $\{EX_n\}_{n=1}^{\infty}$. This is an increasing sequence of positive numbers, and therefore it has a limit, possibly infinite. We want to show that the limit is finite and equal to $EX$.
We have

\[
EX_n = \sum_{k=1}^{\infty} \frac{k}{2^n}\, P\!\left(X_n = \frac{k}{2^n}\right)
= \sum_{k=1}^{\infty} \frac{k}{2^n}\, P\!\left(\frac{k}{2^n} \le X < \frac{k+1}{2^n}\right)
= \sum_{k=1}^{\infty} \frac{k}{2^n} \int_{k/2^n}^{(k+1)/2^n} f(x)\,dx
= \sum_{k=1}^{\infty} \int_{k/2^n}^{(k+1)/2^n} \frac{k}{2^n}\, f(x)\,dx.
\]

If $x \in [k/2^n, (k+1)/2^n)$, then $x$ differs from $k/2^n$ by at most $1/2^n$, and therefore
\[
0 \le \int_{k/2^n}^{(k+1)/2^n} x f(x)\,dx - \int_{k/2^n}^{(k+1)/2^n} \frac{k}{2^n}\, f(x)\,dx
= \int_{k/2^n}^{(k+1)/2^n} \left(x - \frac{k}{2^n}\right) f(x)\,dx
\le \frac{1}{2^n} \int_{k/2^n}^{(k+1)/2^n} f(x)\,dx.
\]

Note that
\[
\sum_{k=1}^{\infty} \int_{k/2^n}^{(k+1)/2^n} x f(x)\,dx = \int_0^{\infty} x f(x)\,dx
\]
and
\[
\sum_{k=1}^{\infty} \frac{1}{2^n} \int_{k/2^n}^{(k+1)/2^n} f(x)\,dx
= \frac{1}{2^n} \sum_{k=1}^{\infty} \int_{k/2^n}^{(k+1)/2^n} f(x)\,dx
= \frac{1}{2^n} \int_0^{\infty} f(x)\,dx = \frac{1}{2^n}.
\]

Therefore

\[
0 \le EX - EX_n = \int_0^{\infty} x f(x)\,dx - \sum_{k=1}^{\infty} \int_{k/2^n}^{(k+1)/2^n} \frac{k}{2^n}\, f(x)\,dx
= \sum_{k=1}^{\infty} \int_{k/2^n}^{(k+1)/2^n} x f(x)\,dx - \sum_{k=1}^{\infty} \int_{k/2^n}^{(k+1)/2^n} \frac{k}{2^n}\, f(x)\,dx
\]
\[
= \sum_{k=1}^{\infty} \left( \int_{k/2^n}^{(k+1)/2^n} x f(x)\,dx - \int_{k/2^n}^{(k+1)/2^n} \frac{k}{2^n}\, f(x)\,dx \right)
\le \sum_{k=1}^{\infty} \frac{1}{2^n} \int_{k/2^n}^{(k+1)/2^n} f(x)\,dx = \frac{1}{2^n} \xrightarrow[n \to \infty]{} 0.
\]
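The dyadic approximation in this proof can be sketched numerically (an illustration, not part of the text). Take the density $f(x) = 2/x^3$, $x > 1$, from Example 7.1, whose CDF is $F(x) = 1 - 1/x^2$ and whose expectation is $EX = 2$, and compute $EX_n = \sum_k (k/2^n)\, P(k/2^n \le X < (k+1)/2^n)$; the cutoff used to truncate the infinite sum is an assumption, contributing a tail error of about $2/\text{cutoff}$.

```python
# Discrete dyadic approximation X_n from Proposition 7.1, applied to
# the density f(x) = 2/x^3, x > 1 (Example 7.1). The cutoff is an
# assumed truncation of the infinite sum.

def F(x):
    """CDF of the example density: F(x) = 1 - 1/x^2 for x >= 1."""
    return 0.0 if x < 1 else 1.0 - x ** -2

def EXn(n, cutoff=2000):
    """E[X_n] = sum over k of (k/2^n) * P(k/2^n <= X < (k+1)/2^n)."""
    step = 2.0 ** -n
    total, k = 0.0, 0
    while k * step < cutoff:
        total += k * step * (F((k + 1) * step) - F(k * step))
        k += 1
    return total

approx = [EXn(n) for n in (2, 5, 8)]
print(approx)  # increases toward EX = 2 from below
```

The values increase with $n$ and approach $2$ from below, exactly as the proof's bound $0 \le EX - EX_n \le 2^{-n}$ predicts.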

We will not prove the following, but it is an interesting exercise: if $X_m$ is any sequence of discrete random variables that increase up to $X$, then $\lim_{m \to \infty} EX_m$ will have the same value $EX$.
This fact is useful for showing linearity: if $X$ and $Y$ are positive random variables with finite expectations, then we can take $X_m$ discrete increasing up to $X$ and $Y_m$ discrete increasing up to $Y$. Then $X_m + Y_m$ is discrete and increases up to $X + Y$, so we have
\[
E(X + Y) = \lim_{m \to \infty} E(X_m + Y_m) = \lim_{m \to \infty} EX_m + \lim_{m \to \infty} EY_m = EX + EY.
\]
Note that we cannot simply reuse the approximations to $X$, $Y$, and $X + Y$ from the previous proof in this argument, since $X_m + Y_m$ might not be an approximation of the same kind.

If $X$ is not necessarily positive, we can show a similar result; we will not give the details.

Similarly to the discrete case, we have

Proposition 7.2
Suppose $X$ is a continuous random variable with density $f_X$ and $g$ is a real-valued function. Then
\[
Eg(X) = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx,
\]
as long as the expectation of the random variable $g(X)$ makes sense.

As in the discrete case, this allows us to define moments, and in particular the variance $\operatorname{Var} X := E[X - EX]^2$.
As an example of these calculations, let us look at the uniform distribution.

Uniform distribution
We say that a random variable $X$ has a uniform distribution on $[a, b]$ if $f_X(x) = \frac{1}{b-a}$ for $a \le x \le b$ and $0$ otherwise.

To calculate the expectation of $X$,
\[
EX = \int_{-\infty}^{\infty} x f_X(x)\,dx = \int_a^b x\, \frac{1}{b-a}\,dx = \frac{1}{b-a} \int_a^b x\,dx = \frac{1}{b-a}\left(\frac{b^2}{2} - \frac{a^2}{2}\right) = \frac{a+b}{2}.
\]
This is what one would expect. To calculate the variance, we first calculate
\[
EX^2 = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx = \int_a^b x^2\, \frac{1}{b-a}\,dx = \frac{a^2 + ab + b^2}{3}.
\]
We then do some algebra to obtain
\[
\operatorname{Var} X = EX^2 - (EX)^2 = \frac{(b-a)^2}{12}.
\]
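These two formulas can be checked numerically (a sketch, not part of the text; the endpoints $a = 3$, $b = 7$ are assumed values chosen only for illustration):

```python
# Midpoint-rule check of EX = (a+b)/2 and Var X = (b-a)^2/12 for the
# uniform density f(x) = 1/(b-a) on [a, b]; a = 3, b = 7 are assumed.

a, b = 3.0, 7.0
n = 100_000
h = (b - a) / n
xs = [a + (i + 0.5) * h for i in range(n)]
density = 1.0 / (b - a)

mean = sum(x * density for x in xs) * h        # (a+b)/2 = 5
second = sum(x * x * density for x in xs) * h  # (a^2+ab+b^2)/3
var = second - mean ** 2                       # (b-a)^2/12 = 4/3
print(mean, var)
```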

7.2. Further examples and applications


Example 7.2. Suppose $X$ has the following p.d.f.:
\[
f(x) = \begin{cases} \dfrac{2}{x^3} & x \ge 1, \\ 0 & x < 1. \end{cases}
\]
Find the CDF of $X$, that is, find $F_X(x)$. Use the CDF to find $P(3 \le X \le 4)$.

Solution: we have $F_X(x) = 0$ if $x \le 1$, and for $x > 1$ we compute
\[
F_X(x) = P(X \le x) = \int_1^x \frac{2}{y^3}\,dy = 1 - \frac{1}{x^2}.
\]
We can use this formula to find the following probability:
\[
P(3 \le X \le 4) = P(X \le 4) - P(X < 3) = F_X(4) - F_X(3) = \left(1 - \frac{1}{4^2}\right) - \left(1 - \frac{1}{3^2}\right) = \frac{7}{144}.
\]

Example 7.3. Suppose $X$ has density
\[
f_X(x) = \begin{cases} 2x & 0 \le x \le 1, \\ 0 & \text{otherwise}. \end{cases}
\]
Find $EX$.

Solution: we have that
\[
E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx = \int_0^1 x \cdot 2x\,dx = \frac{2}{3}.
\]

Example 7.4. The density of $X$ is given by
\[
f_X(x) = \begin{cases} \dfrac{1}{2} & \text{if } 0 \le x \le 2, \\ 0 & \text{otherwise}. \end{cases}
\]
Find $E\left[e^X\right]$.

Solution: using Proposition 7.2 with $g(x) = e^x$ we have
\[
Ee^X = \int_0^2 e^x \cdot \frac{1}{2}\,dx = \frac{1}{2}\left(e^2 - 1\right).
\]
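This value can be verified with a midpoint rule (a numeric sketch, not part of the text):

```python
import math

# E[e^X] for the density 1/2 on [0, 2] (Example 7.4), compared with
# the closed form (e^2 - 1)/2.
n = 200_000
h = 2.0 / n
Ee = sum(math.exp((i + 0.5) * h) * 0.5 for i in range(n)) * h
exact = (math.e ** 2 - 1) / 2
print(Ee, exact)
```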

Example 7.5. Suppose $X$ has density
\[
f(x) = \begin{cases} 2x & 0 \le x \le 1, \\ 0 & \text{otherwise}. \end{cases}
\]

© Copyright 2017 Phanuel Mariano, Patricia Alonso Ruiz, Copyright 2020 Masha Gordina.

Find $\operatorname{Var}(X)$.
Solution: in Example 7.3 we found $E[X] = \frac{2}{3}$. Now
\[
E[X^2] = \int_0^1 x^2 \cdot 2x\,dx = 2 \int_0^1 x^3\,dx = \frac{1}{2}.
\]
Thus
\[
\operatorname{Var}(X) = \frac{1}{2} - \left(\frac{2}{3}\right)^2 = \frac{1}{18}.
\]

Example 7.6. Suppose $X$ has density
\[
f(x) = \begin{cases} ax + b & 0 \le x \le 1, \\ 0 & \text{otherwise}, \end{cases}
\]
and that $E[X^2] = \frac{1}{6}$. Find the values of $a$ and $b$.

Solution: We need to use the fact that $\int_{-\infty}^{\infty} f(x)\,dx = 1$ and $E[X^2] = \frac{1}{6}$. The first one gives us
\[
1 = \int_0^1 (ax + b)\,dx = \frac{a}{2} + b,
\]
and the second one gives us
\[
\frac{1}{6} = \int_0^1 x^2 (ax + b)\,dx = \frac{a}{4} + \frac{b}{3}.
\]
Solving these equations gives us
\[
a = -2 \quad\text{and}\quad b = 2.
\]
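The elimination step behind this answer can be spelled out in code (a sketch, not part of the text): multiplying the second equation by 3 gives $\frac{3a}{4} + b = \frac{1}{2}$; subtracting the first gives $\frac{a}{4} = -\frac{1}{2}$, and then $b$ follows by back-substitution.

```python
# Solving the system from Example 7.6 by elimination:
#   a/2 + b   = 1      (total mass)
#   a/4 + b/3 = 1/6    (E[X^2])
# Multiplying the second by 3 gives 3a/4 + b = 1/2; subtracting the
# first yields a/4 = 1/2 - 1, hence a = -2, then b = 1 - a/2 = 2.

a = 4 * (0.5 - 1.0)
b = 1.0 - a / 2
print(a, b)  # -2.0 2.0

# Sanity check: both original equations hold.
assert a / 2 + b == 1.0
assert abs(a / 4 + b / 3 - 1.0 / 6.0) < 1e-12
```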

7.3. Exercises
Exercise 7.1. Let $X$ be a random variable with probability density function
\[
f(x) = \begin{cases} cx(5 - x) & 0 \le x \le 5, \\ 0 & \text{otherwise}. \end{cases}
\]

(A) What is the value of $c$?
(B) What is the cumulative distribution function of $X$? That is, find $F_X(x) = P(X \le x)$.
(C) Use your answer in part (B) to find $P(2 \le X \le 3)$.
(D) What is $E[X]$?
(E) What is $\operatorname{Var}(X)$?

Exercise 7.2. UConn students have designed the new U-Phone. They have determined that the lifetime of a U-Phone, measured in hours, is given by the random variable $X$ with probability density function
\[
f(x) = \begin{cases} \dfrac{10}{x^2} & x > 10, \\ 0 & x \le 10. \end{cases}
\]

(A) Find the probability that the U-Phone will last more than 20 hours.
(B) What is the cumulative distribution function of $X$? That is, find $F_X(x) = P(X \le x)$.
(C) Use part (B) to help you find $P(X > 35)$.

Exercise 7.3. Suppose the random variable $X$ has a density function
\[
f(x) = \begin{cases} \dfrac{2}{x^2} & x > 2, \\ 0 & x \le 2. \end{cases}
\]
Compute $E[X]$.

Exercise 7.4. An insurance company insures a large number of homes. The insured value, $X$, of a randomly selected home is assumed to follow a distribution with density function
\[
f(x) = \begin{cases} \dfrac{3}{x^4} & x > 1, \\ 0 & \text{otherwise}. \end{cases}
\]
Given that a randomly selected home is insured for at least 1.5, calculate the probability that it is insured for less than 2.

Exercise 7.5. The density function of $X$ is given by
\[
f(x) = \begin{cases} a + bx^2 & 0 \le x \le 1, \\ 0 & \text{otherwise}. \end{cases}
\]
If $E[X] = \frac{7}{10}$, find the values of $a$ and $b$.

Exercise 7.6. Let $X$ be a random variable with density function
\[
f(x) = \begin{cases} \dfrac{1}{a-1} & 1 < x < a, \\ 0 & \text{otherwise}. \end{cases}
\]
Suppose that $E[X] = 6 \operatorname{Var}(X)$. Find the value of $a$.

Exercise 7.7. Suppose you order a pizza from your favorite pizzeria at 7:00 pm, knowing that the time it takes for your pizza to be ready is uniformly distributed between 7:00 pm and 7:30 pm.

(A) What is the probability that you will have to wait longer than 10 minutes for your pizza?
(B) If the pizza has not yet arrived by 7:15 pm, what is the probability that you will have to wait at least an additional 10 minutes?

Exercise 7.8. The grade of deterioration $X$ of a machine part has a continuous distribution on the interval $(0, 10)$ with probability density function $f_X(x)$, where $f_X(x)$ is proportional to $\frac{x}{5}$ on the interval. The repair costs of this part are modeled by a random variable $Y$ given by $Y = 3X^2$. Compute the expected repair cost of the machine part.

Exercise 7.9. A bus arrives at some (random) time uniformly distributed between 10:00 and 10:20, and you arrive at the bus stop at 10:05.
(A) What is the probability that you have to wait at least 5 minutes until the bus comes?
(B) What is the probability that you have to wait at least 5 minutes, given that the bus was not yet there when you arrived at the stop (you are lucky today)?

Exercise∗ 7.1. For a continuous random variable $X$ with finite first and second moments, prove that
\[
E(aX + b) = aEX + b, \qquad \operatorname{Var}(aX + b) = a^2 \operatorname{Var} X
\]
for any $a, b \in \mathbb{R}$.

Exercise∗ 7.2. Let $X$ be a continuous random variable with probability density function
\[
f_X(x) = \frac{1}{4}\, x\, e^{-x/2}\, \mathbf{1}_{[0,\infty)}(x),
\]
where the indicator function is defined as
\[
\mathbf{1}_{[0,\infty)}(x) = \begin{cases} 1, & 0 \le x < \infty; \\ 0, & \text{otherwise}. \end{cases}
\]
Check that $f_X$ is a valid probability density function, and find $E(X)$ if it exists.

Exercise∗ 7.3. Let $X$ be a continuous random variable with probability density function
\[
f_X(x) = \frac{4 \ln x}{x^3}\, \mathbf{1}_{[1,\infty)}(x),
\]
where the indicator function is defined as
\[
\mathbf{1}_{[1,\infty)}(x) = \begin{cases} 1, & 1 \le x < \infty; \\ 0, & \text{otherwise}. \end{cases}
\]
Check that $f_X$ is a valid probability density function, and find $E(X)$ if it exists.

7.4. Selected solutions


Solution to Exercise 7.1(A): We must have that $\int_{-\infty}^{\infty} f(x)\,dx = 1$, thus
\[
1 = \int_0^5 cx(5 - x)\,dx = c \left[\frac{5x^2}{2} - \frac{x^3}{3}\right]_0^5,
\]
and so we must have that $c = 6/125$.
Solution to Exercise 7.1(B): We have that
\[
F_X(x) = P(X \le x) = \int_{-\infty}^{x} f(y)\,dy = \int_0^x \frac{6}{125}\, y(5 - y)\,dy = \frac{6}{125} \left[\frac{5y^2}{2} - \frac{y^3}{3}\right]_0^x = \frac{6}{125} \left(\frac{5x^2}{2} - \frac{x^3}{3}\right)
\]
for $0 \le x \le 5$.

Solution to Exercise 7.1(C): We have
\[
P(2 \le X \le 3) = P(X \le 3) - P(X < 2) = \frac{6}{125} \left(\frac{5 \cdot 3^2}{2} - \frac{3^3}{3}\right) - \frac{6}{125} \left(\frac{5 \cdot 2^2}{2} - \frac{2^3}{3}\right) = 0.296.
\]

Solution to Exercise 7.1(D): we have
\[
E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx = \int_0^5 x \cdot \frac{6}{125}\, x(5 - x)\,dx = 2.5.
\]

Solution to Exercise 7.1(E): We need to first compute
\[
E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx = \int_0^5 x^2 \cdot \frac{6}{125}\, x(5 - x)\,dx = 7.5.
\]
Then
\[
\operatorname{Var}(X) = E[X^2] - (E[X])^2 = 7.5 - (2.5)^2 = 1.25.
\]
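The Exercise 7.1 answers can be verified numerically in one pass (a sketch, not part of the text):

```python
# Midpoint-rule verification of the Exercise 7.1 answers: with
# c = 6/125, f(x) = (6/125) x (5 - x) on [0, 5] integrates to 1,
# EX = 2.5, and Var X = 1.25.

n = 100_000
h = 5.0 / n
xs = [(i + 0.5) * h for i in range(n)]

def f(x):
    return (6.0 / 125.0) * x * (5.0 - x)

mass = sum(f(x) for x in xs) * h         # ~ 1
EX = sum(x * f(x) for x in xs) * h       # ~ 2.5
EX2 = sum(x * x * f(x) for x in xs) * h  # ~ 7.5
print(mass, EX, EX2 - EX ** 2)
```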

Solution to Exercise 7.2(A): We have
\[
P(X > 20) = \int_{20}^{\infty} \frac{10}{x^2}\,dx = \frac{1}{2}.
\]
Solution to Exercise 7.2(B): We have
\[
F(x) = P(X \le x) = \int_{10}^{x} \frac{10}{y^2}\,dy = 1 - \frac{10}{x}
\]
for $x > 10$, and $F(x) = 0$ for $x \le 10$.

Solution to Exercise 7.2(C): We have
\[
P(X > 35) = 1 - P(X \le 35) = 1 - F_X(35) = 1 - \left(1 - \frac{10}{35}\right) = \frac{10}{35} = \frac{2}{7}.
\]
Solution to Exercise 7.3: $+\infty$.
Solution to Exercise 7.4: $\frac{37}{64}$.
Solution to Exercise 7.5: we need to use the fact that $\int_{-\infty}^{\infty} f(x)\,dx = 1$ and $E[X] = \frac{7}{10}$. The first one gives us
\[
1 = \int_0^1 \left(a + bx^2\right)dx = a + \frac{b}{3},
\]
and the second one gives
\[
\frac{7}{10} = \int_0^1 x \left(a + bx^2\right)dx = \frac{a}{2} + \frac{b}{4}.
\]
Solving these equations gives
\[
a = \frac{1}{5} \quad\text{and}\quad b = \frac{12}{5}.
\]
Solution to Exercise 7.6: Note that
\[
EX = \int_1^a \frac{x}{a-1}\,dx = \frac{1}{2}a + \frac{1}{2}.
\]
Also, since $\operatorname{Var}(X) = EX^2 - (EX)^2$, we need
\[
EX^2 = \int_1^a \frac{x^2}{a-1}\,dx = \frac{1}{3}a^2 + \frac{1}{3}a + \frac{1}{3}.
\]
Then
\[
\operatorname{Var}(X) = \left(\frac{1}{3}a^2 + \frac{1}{3}a + \frac{1}{3}\right) - \left(\frac{1}{2}a + \frac{1}{2}\right)^2 = \frac{1}{12}a^2 - \frac{1}{6}a + \frac{1}{12}.
\]
Then, using $E[X] = 6\operatorname{Var}(X)$, we simplify and get $\frac{1}{2}a^2 - \frac{3}{2}a = 0$, which gives us $a = 3$.
Another way to solve this problem is to note that, for the uniform distribution on $[a, b]$, the mean is $\frac{a+b}{2}$ and the variance is $\frac{(b-a)^2}{12}$. This gives us the equation $6\,\frac{(a-1)^2}{12} = \frac{a+1}{2}$. Hence $(a-1)^2 = a + 1$, which implies $a = 3$.
Solution to Exercise 7.7(A): Note that the waiting time $X$ is uniformly distributed over $(0, 30)$. Then
\[
P(X > 10) = \frac{20}{30} = \frac{2}{3}.
\]
Solution to Exercise 7.7(B): Note that $X$ is uniformly distributed over $(0, 30)$. Then
\[
P(X > 25 \mid X > 15) = \frac{P(X > 25)}{P(X > 15)} = \frac{5/30}{15/30} = \frac{1}{3}.
\]

Solution to Exercise 7.8: First of all we need to find the PDF of $X$. So far we know that
\[
f(x) = \begin{cases} \dfrac{cx}{5} & 0 \le x \le 10, \\ 0 & \text{otherwise}. \end{cases}
\]
Since
\[
\int_0^{10} c\, \frac{x}{5}\,dx = 10c,
\]
we have $c = \frac{1}{10}$. Now, applying Proposition 7.2, we get
\[
EY = \int_0^{10} \frac{3}{50}\, x^3\,dx = 150.
\]
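This expected cost can be double-checked numerically (a sketch, not part of the text):

```python
# Midpoint-rule check of E[Y] = E[3X^2] with f_X(x) = x/50 on [0, 10]
# (c = 1/10 as derived above); the closed form gives 150.

def fX(x):
    return x / 50.0

n = 100_000
h = 10.0 / n
EY = sum(3 * x * x * fX(x) for x in ((i + 0.5) * h for i in range(n))) * h
print(EY)
```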

Solution to Exercise 7.9(A): The probability that you have to wait at least 5 minutes until the bus comes is $\frac{1}{2}$. Note that with probability $\frac{1}{4}$ you have to wait less than 5 minutes, and with probability $\frac{1}{4}$ you have already missed the bus.

Solution to Exercise 7.9(B): Given that the bus had not yet arrived by 10:05, its arrival time is uniformly distributed over the remaining 15 minutes, and waiting at least 5 more minutes means it arrives after 10:10. The conditional probability is therefore $\frac{10}{15} = \frac{2}{3}$.
