
Continuous Random Variables and

Probability Distributions

Isaac Akpor Adjei

January 10, 2025

1 / 52
Learning Objectives

After careful study of this topic, you should be able to do the following:
1. Determine probabilities from probability density functions.
2. Determine probabilities from cumulative distribution functions, and cumulative
distribution functions from probability density functions, and the reverse.
3. Calculate means and variances for continuous random variables.
4. Understand the assumptions for continuous probability distributions.
5. Select an appropriate continuous probability distribution to calculate probabilities
for specific applications.
6. Calculate probabilities, means and variances for continuous probability
distributions.
7. Standardize normal random variables.
8. Use the table for the cumulative distribution function of a standard normal
distribution to calculate probabilities.
9. Approximate probabilities for Binomial and Poisson distributions.

2 / 52
Continuous Random Variables

I A continuous random variable is one which takes values in an uncountable set.
I They are used to measure physical characteristics such as height, weight, time, volume, position, etc.

Examples

1. Let Y be the height of a person (a real number).

2. Let X be the volume of juice in a can.

3. Let Y be the waiting time until the next person arrives at the server.

3 / 52
Probability Density Function

For a continuous random variable X , a probability density


function is a function such that

1. f (x) ≥ 0 means that the function is always non-negative.


2. ∫_{−∞}^{∞} f(x) dx = 1

3. P(a ≤ X ≤ b) = ∫_{a}^{b} f(x) dx = area under f(x) from a to b

4. P(X = x) = 0 for any single value x, because there is no area exactly at x.

4 / 52
Example 4-1: Electric Current
Let the continuous random variable X denote the current
measured in a thin copper wire in milliamperes(mA).
Assume that the range of X is 4.9 ≤ x ≤ 5.1 and f (x) = 5.
What is the probability that a current is less than 5mA?
Answer:

P(X < 5) = ∫_{4.9}^{5} f(x) dx = ∫_{4.9}^{5} 5 dx = 0.5

P(4.95 < X < 5.1) = ∫_{4.95}^{5.1} f(x) dx = 0.75

Figure 1: P(X < 5) illustrated

5 / 52
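As a numerical cross-check (not part of the original slides), a short Python sketch using scipy.stats, assuming scipy is installed, reproduces both probabilities:

from scipy import stats

# Uniform current on [4.9, 5.1] mA: loc = 4.9, scale = 5.1 - 4.9 = 0.2
current = stats.uniform(loc=4.9, scale=0.2)
print(current.cdf(5.0))                       # P(X < 5)          -> 0.5
print(current.cdf(5.1) - current.cdf(4.95))   # P(4.95 < X < 5.1) -> 0.75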
Cumulative Distribution Functions

The cumulative distribution function of a continuous random


variable X is,
F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(u) du   for −∞ < x < ∞

The cumulative distribution function is defined for all real


numbers.

6 / 52
Example 4-3: Electric Current
For the copper wire current measurement in Exercise 4-1, the
cumulative distribution function consists of three expressions.

F(x) = 0 for x < 4.9,   F(x) = 5(x − 4.9) for 4.9 ≤ x < 5.1,   F(x) = 1 for 5.1 ≤ x

Figure 2: Cumulative distribution function F(x)

7 / 52
Probability Density Function from the Cumulative
Distribution Function

I The probability density function (PDF) is the derivative of


the cumulative distribution function (CDF).

I The cumulative distribution function (CDF) is the integral of


the probability density function (PDF).

Given F(x), f(x) = dF(x)/dx as long as the derivative exists.

8 / 52
Exercise 4-5: Reaction Time
I The time until a chemical reaction is complete (in
milliseconds, ms) is approximated by this cumulative
distribution function:
F(x) = 0                 for x < 0
F(x) = 1 − e^{−0.01x}    for 0 ≤ x

I What is the probability density function?

f(x) = dF(x)/dx:
f(x) = 0                 for x < 0
f(x) = 0.01 e^{−0.01x}   for 0 ≤ x
I What proportion of reactions is complete within 200 ms?
P(X <200) = F (200) = 1 − e−2 = 0.8647
9 / 52
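These values can be checked with a minimal Python sketch (scipy assumed available; not from the slides); the rate 0.01 per ms corresponds to scale = 100 ms:

from scipy import stats

reaction = stats.expon(scale=100)   # F(x) = 1 - exp(-0.01 x), x in ms
print(reaction.cdf(200))            # P(X < 200) -> 1 - e**-2 = 0.8647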
Mean & Variance

Suppose X is a continuous random variable with probability


density function f(x). The mean or expected value of X ,
denoted as µ or E(X) , is
µ = E(X) = ∫_{−∞}^{∞} x f(x) dx

The variance of X, denoted as V(X) or σ^2, is

σ^2 = V(X) = ∫_{−∞}^{∞} (x − µ)^2 f(x) dx = ∫_{−∞}^{∞} x^2 f(x) dx − µ^2

The standard deviation of X is σ = √(σ^2)

10 / 52
Example 4-6: Electric Current

For the copper wire current measurement, the PDF is


f (x) = 0.05 for 0 ≤ x ≤ 20. Find the mean and variance.
E(X) = ∫_{0}^{20} x · f(x) dx = 0.05 x^2/2 |_{0}^{20} = 10

V(X) = ∫_{0}^{20} (x − 10)^2 f(x) dx = 0.05 (x − 10)^3/3 |_{0}^{20} = 33.33

11 / 52
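A quick numerical check of these integrals (a sketch using scipy.integrate, not part of the slides):

from scipy.integrate import quad

f = lambda x: 0.05                                     # density on [0, 20]
mean, _ = quad(lambda x: x * f(x), 0, 20)              # -> 10.0
var, _ = quad(lambda x: (x - mean)**2 * f(x), 0, 20)   # -> 33.33
print(mean, var)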
Mean of a Function of a Continuous Random Variable

If X is a continuous random variable with probability distribution


function f(x),
E[h(X)] = ∫_{−∞}^{∞} h(x) f(x) dx
Example 4-7:
Let X be the current measured in mA. The PDF is f (x) = 0.05
for 0 ≤ x ≤ 20. What is the expected value of power when the
resistance is 100 ohms? Use the result that power in watts
P = 10^{-6} R I^2, where I is the current in milliamperes and R is
the resistance in ohms. Now, h(X) = 10^{-6} · 100 · X^2 = 10^{-4} X^2.

E[h(X)] = ∫_{0}^{20} 10^{-4} x^2 f(x) dx = 10^{-4}(0.05) x^3/3 |_{0}^{20} = 0.0133 watts

12 / 52
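Because E[h(X)] weights h(x) by the density f(x), the expected power can be verified numerically (a sketch assuming scipy is available; not in the original slides):

from scipy.integrate import quad

h = lambda x: 1e-6 * 100 * x**2        # power in watts for a current of x mA
f = lambda x: 0.05                     # uniform density on [0, 20]
expected_power, _ = quad(lambda x: h(x) * f(x), 0, 20)
print(expected_power)                  # about 0.0133 watts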
Continuous Uniform Distribution
I This is the simplest continuous distribution and analogous
to its discrete counterpart.
I A continuous random variable X with probability density
function
f(x) = 1 / (b-a) for a ≤ x ≤ b

Figure 4: Continuous uniform Probability Density Function

13 / 52
Mean & Variance

I Mean & variance are:

µ = E(X) = (a + b)/2

and

σ^2 = V(X) = (b − a)^2/12

14 / 52
Example 4-9: Uniform Current
The random variable X has a continuous uniform distribution on
[4.9, 5.1]. The probability density function of X is f(x) = 5,
4.9 ≤ x ≤ 5.1. What is the probability that a measurement of
current is between 4.95 & 5.0 mA?
P(4.95 < X < 5.0) = ∫_{4.95}^{5.0} f(x) dx = 5(0.05) = 0.25
The mean and variance formulas can be applied with a = 4.9
and b = 5.1. Therefore,
µ = E(X) = 5 mA and V(X) = (0.2)^2/12 = 0.0033 mA^2
12

Figure 5 15 / 52
Cumulative distribution function of the Uniform distribution

F(x) = ∫_{a}^{x} 1/(b − a) du = (x − a)/(b − a)

The cumulative distribution function is

F(x) = 0                 for x < a
F(x) = (x − a)/(b − a)   for a ≤ x < b
F(x) = 1                 for b ≤ x

Figure 6: Cumulative distribution function


16 / 52
Normal Distribution

A random variable X with probability density function


f(x) = (1/(√(2π) σ)) e^{−(x − µ)^2/(2σ^2)},   −∞ < x < ∞

is a normal random variable with parameters µ and σ,
where −∞ < µ < ∞ and σ > 0. Also,
E(X) = µ and V(X) = σ^2
and the notation N(µ, σ^2) is used to denote the distribution.

17 / 52
Empirical Rule
For any normal random variable,
P(µ − σ < X < µ + σ) = 0.6827
P(µ − 2σ < X < µ + 2σ) = 0.9545
P(µ − 3σ < X < µ + 3σ) = 0.9973

Figure 7: Probabilities associated with a normal distribution


18 / 52
Standard Normal Random Variable

A normal random variable with


µ = 0 and σ 2 = 1
is called a standard normal random variable and is denoted as
Z. The cumulative distribution function of a standard normal
random variable is denoted as:
Φ(z) = P(Z ≤ z)
Values are found in Appendix Table III and by using Excel and
Minitab.

19 / 52
Example 4-11: Standard Normal Distribution
Assume Z is a standard normal random variable.
Find P(Z ≤ 1.50). Answer: 0.93319

Figure 8: Standard normal Probability density function

Find P(Z ≤ 1.53). Answer: 0.93699


Find P(Z ≤ 0.02). Answer: 0.50798
NOTE : The column headings refer to the hundredths digit of the value of z in
P(Z ≤ z).
For example, P(Z ≤ 1.53) is found by reading down the z column to the row 1.5 and
then selecting the probability from the column labeled 0.03 to be 0.93699. 20 / 52
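The same table lookups can be reproduced in Python (a sketch, assuming scipy.stats; not part of the slides):

from scipy import stats

print(stats.norm.cdf(1.50))   # 0.93319
print(stats.norm.cdf(1.53))   # 0.93699
print(stats.norm.cdf(0.02))   # 0.50798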
Standardizing a Normal Random Variable

Suppose X is a normal random variable with mean µ and


variance σ^2, the random variable

Z = (X − µ)/σ

is a normal random variable with E(Z) = 0 and V(Z) = 1.

The probability is obtained by using Appendix Table III with

z = (x − µ)/σ

21 / 52
Example 4-14: Normally Distributed Current-1
Suppose that the current measurements in a strip of wire are
assumed to follow a normal distribution with µ = 10 and σ = 2
mA, what is the probability that the current measurement is
between 9 and 11 mA?
Answer:
 
P(9 < X < 11) = P((9 − 10)/2 < (X − 10)/2 < (11 − 10)/2)
             = P(−0.5 < Z < 0.5)
             = P(Z < 0.5) − P(Z < −0.5)
             = 0.69146 − 0.30854 = 0.38292

Using Excel
0.38292 NORMDIST(11,10,2,TRUE)-NORMDIST(9,10,2,TRUE)
22 / 52
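Equivalently, in Python (a minimal sketch with scipy.stats, not from the slides):

from scipy import stats

p = stats.norm.cdf(11, loc=10, scale=2) - stats.norm.cdf(9, loc=10, scale=2)
print(p)   # 0.38292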
Example 4-14: Normally Distributed Current-2
Determine the value x for which the probability that a current
measurement is below x is 0.98.
Answer:
 
P(X < x) = P((X − 10)/2 < (x − 10)/2)

         = P(Z < (x − 10)/2) = 0.98

z = 2.055 is the closest value.
x = 2(2.055) + 10 = 14.11 mA

Using Excel
14.107 =NORMINV(0.98,10,2)

23 / 52
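The inverse calculation is a one-liner in Python (sketch assuming scipy.stats; not part of the slides):

from scipy import stats

print(stats.norm.ppf(0.98, loc=10, scale=2))   # about 14.107 mA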
Normal Approximations

I The binomial and Poisson distributions become more
bell-shaped and symmetric as their means increase.
I For manual calculations, the normal approximation is
practical: exact probabilities of the binomial and Poisson
with large means require technology (Minitab, Excel).
I The normal distribution is a good approximation for:
- Binomial if np > 5 and n(1 − p) > 5.
- Poisson if λ ≥ 5.

24 / 52
Normal Approximation to the Binomial Distribution
If X is a binomial random variable with parameters n and p,

Z = (X − np)/√(np(1 − p))

is approximately a standard normal random variable. To
approximate a binomial probability with a normal distribution, a
continuity correction is applied as follows:

P(X ≤ x) = P(X ≤ x + 0.5) ≈ P(Z ≤ (x + 0.5 − np)/√(np(1 − p)))

and

P(X < x) = P(X ≤ x − 0.5) ≈ P(Z ≤ (x − 0.5 − np)/√(np(1 − p)))

The approximation is good for np > 5 and n(1 − p) > 5

25 / 52
Example 4-18: Applying the Approximation
In a digital communication channel, assume that the number of bits received
in error can be modeled by a binomial random variable. The probability that a
bit is received in error is 10^{-5}. If 16 million bits are transmitted, what is the
probability that 150 or fewer errors occur?

P(X ≤ 150) = P(X ≤ 150.5)

           ≈ P( (X − 160)/√(160(1 − 10^{-5})) ≤ (150.5 − 160)/√(160(1 − 10^{-5})) )

           = P(Z ≤ −9.5/12.6491) = P(Z ≤ −0.75104) = 0.2263

Using Excel
0.2263 =NORMDIST(150.5,160,SQRT(160*(1-0.00001)),TRUE)
-0.7% =(0.2263-0.228)/0.228 = percent error in the approximation

26 / 52
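A short Python sketch (scipy assumed; not in the original slides) compares the continuity-corrected normal approximation with the exact binomial probability:

import math
from scipy import stats

n, p = 16_000_000, 1e-5
mu = n * p                                      # 160
sigma = math.sqrt(n * p * (1 - p))              # 12.6491
approx = stats.norm.cdf((150.5 - mu) / sigma)   # continuity correction -> about 0.2263
exact = stats.binom.cdf(150, n, p)              # exact binomial        -> about 0.228
print(approx, exact)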
Normal Approximation to Hypergeometric

Recall that the hypergeometric distribution is well approximated by the
binomial with p = K/N when the sample size is small relative to the
population size. Thus the normal distribution can also be used to
approximate the hypergeometric distribution.

Figure 9: Conditions for approximating hypergeometric and binomial probabilities

27 / 52
Normal Approximation to the Poisson

If X is a Poisson random variable with E(X ) = λ and V (X ) = λ,


Z = (X − λ)/√λ
is approximately a standard normal random variable.
The same continuity correction used for binomial distribution
can also be applied. The approximation is good for λ ≥ 5

28 / 52
Example 4-20: Normal Approximation to Poisson
Assume that the number of asbestos particles in a square meter of dust on a
surface follows a Poisson distribution with a mean of 1000. If a square meter
of dust is analyzed, what is the probability that 950 or fewer particles are
found?
P(X ≤ 950) = Σ_{x=0}^{950} e^{−1000} 1000^x / x!   ... too hard manually!

The probability can be approximated as

P(X ≤ 950) = P(X ≤ 950.5)

           ≈ P(Z ≤ (950.5 − 1000)/√1000)

           = P(Z ≤ −1.57) = 0.058

Using Excel
0.0578 =POISSON(950,1000,TRUE)
0.0588 =NORMDIST(950.5,1000,SQRT(1000),TRUE)
1.6% =(0.0588-0.0578)/0.0578 = percent error
29 / 52
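The same comparison in Python (a sketch, assuming scipy.stats; not from the slides):

import math
from scipy import stats

lam = 1000
exact = stats.poisson.cdf(950, lam)                       # about 0.0578
approx = stats.norm.cdf((950.5 - lam) / math.sqrt(lam))   # about 0.0588
print(exact, approx)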
Exponential Distribution Definition

The random variable X that equals the distance between


successive events of a Poisson process with mean number of
events λ >0 per unit interval is an exponential random variable
with parameter λ. The probability density function of X is:

f (x) = λe−λx for 0 ≤ x<∞

30 / 52
Exponential distribution - Mean & Variance

If the random variable X has an exponential distribution with


parameter λ,
µ = E(X) = 1/λ   and   σ^2 = V(X) = 1/λ^2
Note:
I Poisson distribution : Mean and variance are same.
I Exponential distribution : Mean and standard deviation are
same.

31 / 52
Example 4-21: Computer Usage-1
In a large corporate computer network, user log-ons to the system can be
modeled as a Poisson process with a mean of 25 log-ons per hour. What is
the probability that there are no log-ons in the next 6 minutes (0.1 hour)?
Let X denote the time in hours from the start of the interval until the first
log-on.
P(X > 0.1) = ∫_{0.1}^{∞} 25 e^{−25x} dx = e^{−25(0.1)} = 0.082
The cumulative distribution function
also can be used to obtain the same
results as follows
P(X > 0.1) = 1 − F (0.1) = 0.082

Figure 10: Desired probability

Using Excel
0.0821 =1-EXPONDIST(0.1,25,TRUE)

32 / 52
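In Python (sketch assuming scipy.stats; not part of the slides), the rate of 25 log-ons per hour becomes scale = 1/25 hour:

from scipy import stats

logon = stats.expon(scale=1/25)
print(logon.sf(0.1))                       # P(X > 0.1 hour)      -> 0.082
print(logon.cdf(3/60) - logon.cdf(2/60))   # P(2 min < X < 3 min) -> about 0.148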
Example 4-21: Computer Usage-2
Continuing, what is the probability that the time until the next log-on is
between 2 and 3 minutes (0.033 & 0.05 hours)?

P(0.033 < X < 0.05) = ∫_{0.033}^{0.05} 25 e^{−25x} dx

                    = −e^{−25x} |_{0.033}^{0.05} = 0.152

An alternative solution is

P(0.033<X <0.05) = F (0.05) − F (0.033) = 0.152

Using Excel
0.148 =EXPONDIST(3/60,25,TRUE)-EXPONDIST(2/60,25,TRUE)
(difference due to round-off error)
33 / 52
Example 4-21: Computer Usage-3

I Continuing, what is the interval of time such that the


probability that no log-on occurs during the interval is 0.90?
P(X > x) = e^{−25x} = 0.90, so −25x = ln(0.9)
x = −0.10536 / (−25) = 0.00421 hour = 0.25 minute
I What are the mean and standard deviation of the time until
the next log-on?
µ = 1/λ = 1/25 = 0.04 hour = 2.4 minutes
σ = 1/λ = 1/25 = 0.04 hour = 2.4 minutes

34 / 52
Lack of Memory Property
An interesting property of an exponential random variable
concerns conditional probabilities.
For an exponential random variable X ,
P(X > t1 + t2 |X > t1 ) = P(X > t2 )

Figure 11: Lack of memory property of an exponential distribution.

35 / 52
Example 4-22: Lack of Memory Property

Let X denote the time between detections of a particle with a Geiger counter.
Assume X has an exponential distribution with E(X ) = 1.4 minutes. What is
the probability that a particle is detected in the next 30 seconds?

P(X < 0.5) = F(0.5) = 1 − e^{−0.5/1.4} = 0.30

Using Excel
0.300 =EXPONDIST(0.5,1/1.4,TRUE)

No particle has been detected in the last 3 minutes. Will the probability
increase since it is "due"?

P(X < 3.5 | X > 3) = P(3 < X < 3.5)/P(X > 3) = (F(3.5) − F(3))/(1 − F(3)) = 0.035/0.117 = 0.30
No, the probability that a particle will be detected depends only on the
interval of time, not its detection history.

36 / 52
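The memoryless property can be checked numerically (a Python sketch with scipy.stats, not from the slides):

from scipy import stats

x = stats.expon(scale=1.4)                     # mean time between detections: 1.4 min
print(x.cdf(0.5))                              # P(X < 0.5)         -> 0.30
print((x.cdf(3.5) - x.cdf(3.0)) / x.sf(3.0))   # P(X < 3.5 | X > 3) -> 0.30 as well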
Gamma Function

The gamma function is the generalization of the factorial


function for r > 0, not just non-negative integers.
Γ(r) = ∫_{0}^{∞} x^{r−1} e^{−x} dx   for r > 0

Properties of the gamma function
Γ(r) = (r − 1)Γ(r − 1)                  recursive property
Γ(r) = (r − 1)!  for integer r          factorial function
Γ(1) = 0! = 1
Γ(1/2) = π^{1/2} ≈ 1.77                 useful for manual calculations

37 / 52
Gamma Distribution

The random variable X with probability density function:


f(x) = λ^r x^{r−1} e^{−λx} / Γ(r)   for x > 0

is a gamma random variable with parameters λ >0 and r > 0.

38 / 52
Mean & Variance of the Gamma

If X is a gamma random variable with parameters λ and r,

µ = E(X ) = r /λ
and
σ 2 = V (X ) = r /λ2

39 / 52
Example 4-24: Gamma Application-1
The time to prepare a micro-array slide for high-output genomics is a Poisson
process with a mean of 2 hours per slide. What is the probability that 10
slides require more than 25 hours?
Let X denote the time to prepare 10 slides. Because of the assumption of a
Poisson process, X has a gamma distribution with λ = 1/2, r = 10, and the
requested probability is P(X > 25).
Using the Poisson distribution, let the random variable N denote the number
of slides made in 10 hours. The time until 10 slides are made exceeds 25
hours if and only if the number of slides made in 25 hours is ≤ 9.

Using Excel
0.2014 =POISSON(9, 12.5, TRUE)

P(X > 25) = P(N ≤ 9)
E(N) = 25(1/2) = 12.5 slides in 25 hours
P(N ≤ 9) = Σ_{k=0}^{9} e^{−12.5} (12.5)^k / k! = 0.2014

40 / 52
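Both routes give the same answer in Python (a sketch assuming scipy.stats; not part of the slides); note that scipy parameterizes the gamma distribution by shape a = r and scale = 1/λ:

from scipy import stats

print(stats.gamma.sf(25, a=10, scale=2))   # P(X > 25)                    -> 0.2014
print(stats.poisson.cdf(9, 12.5))          # P(N <= 9), N ~ Poisson(12.5) -> 0.2014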
Example 4-24: Gamma Application-2
Using the gamma distribution, the same result is obtained.

Using Excel
0.2014 =1-GAMMADIST(25, 10, 2, TRUE)

P(X > 25) = 1 − P(X ≤ 25)

          = 1 − ∫_{0}^{25} (0.5^{10} x^9 e^{−0.5x} / Γ(10)) dx

          = 1 − 0.7986
          = 0.2014

What are the mean and standard deviation of the time to prepare 10 slides?

E(X) = r/λ = 10/0.5 = 20 hours

V(X) = r/λ^2 = 10/0.25 = 40 hours^2

SD(X) = √V(X) = √40 = 6.32 hours
41 / 52
Example 4-24: Gamma Application-3
The slides will be completed by what length of time with 95%
probability? That is: P(X ≤ x) = 0.95

Figure 12: Minitab: Graph > Probability Distribution Plot > View Probability

Using Excel
31.41 =GAMMAINV(0.95, 10, 2) 42 / 52
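The 95th percentile can also be obtained with scipy's inverse CDF (a sketch, not from the slides):

from scipy import stats

print(stats.gamma.ppf(0.95, a=10, scale=2))   # about 31.41 hours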
Weibull Distribution
The random variable X with probability density function
f(x) = (β/δ) (x/δ)^{β−1} e^{−(x/δ)^β},   for x > 0

is a Weibull random variable with
scale parameter δ > 0 and shape parameter β > 0.
The cumulative distribution function is:

F(x) = 1 − e^{−(x/δ)^β}

The mean and variance are given by

µ = E(X) = δ Γ(1 + 1/β)   and

σ^2 = V(X) = δ^2 Γ(1 + 2/β) − δ^2 [Γ(1 + 1/β)]^2
43 / 52
Example 4-25: Bearing Wear

I The time to failure (in hours) of a bearing in a mechanical shaft is modeled as a


Weibull random variable with β = 1/2 and δ = 5,000 hours.
I What is the mean time until failure?

E(X) = 5000 · Γ(1 + 1/β) = 5000 · Γ(1 + 2) = 5000 · 2! = 10,000 hours

Using Excel
10,000 =5000*EXP(GAMMALN(3))

I What is the probability that a bearing will last at least 6,000 hours?

P(X > 6000) = 1 − F(6000) = e^{−(6000/5000)^{0.5}} = e^{−1.0954} = 0.334

Using Excel
0.334 =1-WEIBULL(6000,1/2,5000,TRUE)

Only 33.4% of all bearings last at least 6000 hours.

44 / 52
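In Python (sketch assuming scipy.stats; not part of the slides), the Weibull shape is c = β and the scale is δ:

from scipy import stats

bearing = stats.weibull_min(c=0.5, scale=5000)
print(bearing.mean())     # delta * Gamma(1 + 1/beta) = 5000 * Gamma(3) = 10000 hours
print(bearing.sf(6000))   # P(X > 6000) -> 0.334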
Lognormal Distribution

Let W denote a normal random variable with mean θ and


variance ω 2 , then X = exp(W ) is a lognormal random variable
with probability density function
f(x) = (1/(x ω √(2π))) e^{−(ln(x) − θ)^2/(2ω^2)},   for x > 0, with −∞ < θ < ∞ and ω > 0

The mean and variance of X are

E(X) = e^{θ + ω^2/2}   and

V(X) = e^{2θ + ω^2} (e^{ω^2} − 1)

45 / 52
Example 4-26: Semiconductor Laser-1
The lifetime of a semiconductor laser has a lognormal
distribution with θ = 10 and ω = 1.5 hours.
What is the probability that the lifetime exceeds 10,000 hours?

P(X > 10,000) = 1 − P[exp(W) ≤ 10,000]

              = 1 − P[W ≤ ln(10,000)]

              = 1 − Φ((ln(10,000) − 10)/1.5)
              = 1 − Φ(−0.5264)
              = 1 − 0.2993
              = 0.701

1-NORMDIST(LN(10000), 10, 1.5, TRUE)= 0.701


46 / 52
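A Python sketch (scipy assumed; not from the slides): scipy's lognorm uses s = ω and scale = exp(θ):

import math
from scipy import stats

laser = stats.lognorm(s=1.5, scale=math.exp(10))
print(laser.sf(10_000))   # P(X > 10,000) -> about 0.701
print(laser.mean())       # exp(10 + 1.5**2 / 2) -> about 67,846 hours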
Example 4-26: Semiconductor Laser-2
I What lifetime is exceeded by 99% of lasers?

P(X > x) = P[exp(W) > x] = P[W > ln(x)]
         = 1 − Φ((ln(x) − 10)/1.5) = 0.99

From Appendix Table III, 1 − Φ(z) = 0.99 when z = −2.33.
Hence (ln(x) − 10)/1.5 = −2.33 and
x = exp(6.505) = 668.48 hours

-2.3263 =NORMSINV(0.99)
6.5105 =-2.3263*1.5+10 = ln(x)
672.15 =EXP(6.5105)
(difference due to round-off error)
I What is the mean and variance of the lifetime?

E(X) = e^{θ + ω^2/2} = e^{10 + 1.5^2/2} = exp(11.125) = 67,846.29

V(X) = e^{2θ + ω^2} (e^{ω^2} − 1) = e^{2(10) + 1.5^2} (e^{1.5^2} − 1)
     = exp(22.25) · [exp(2.25) − 1] = 39,070,059,886.6
SD(X) = 197,661.5
47 / 52
Beta Distribution

The random variable X with probability density function

f(x) = [Γ(α + β) / (Γ(α) · Γ(β))] x^{α−1} (1 − x)^{β−1},   for x in [0, 1]

is a beta random variable with parameters α > 0 and β > 0.

48 / 52
Example 4-27: Beta Computation-1
Consider the completion time of a large commercial real estate development.
The proportion of the maximum allowed time to complete a task is a beta
random variable with α = 2.5 and β = 1. What is the probability that the
proportion of the maximum time exceeds 0.7?
Let X denote the proportion.
P(X > 0.7) = ∫_{0.7}^{1} [Γ(α + β)/(Γ(α) · Γ(β))] x^{α−1} (1 − x)^{β−1} dx

           = ∫_{0.7}^{1} [Γ(3.5)/(Γ(2.5) · Γ(1))] x^{1.5} dx

           = [2.5(1.5)(0.5)√π / (1.5(0.5)√π)] · (x^{2.5}/2.5) |_{0.7}^{1}

           = 1 − (0.7)^{2.5} = 0.59

Using Excel
0.590 1-BETADIST(0.7,2.5,1,0,1)
49 / 52
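The same probability in Python (a minimal sketch with scipy.stats, not part of the slides):

from scipy import stats

print(stats.beta.sf(0.7, 2.5, 1))   # P(X > 0.7) = 1 - 0.7**2.5 -> 0.590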
Mean & Variance of the Beta Distribution

If X has a beta distribution with parameters α and β,


µ = E(X) = α/(α + β)

σ^2 = V(X) = αβ / [(α + β)^2 (α + β + 1)]

Example 4-28: In the above example, α = 2.5 and β = 1.
What are the mean and variance of this distribution?

µ = 2.5/(2.5 + 1) = 2.5/3.5 = 0.71

σ^2 = 2.5(1) / [(2.5 + 1)^2 (2.5 + 1 + 1)] = 2.5/(3.5^2 · 4.5) = 0.045

50 / 52
Mode of the Beta Distribution
If α > 1 and β > 1, then the beta distribution is mound-shaped and
has an interior peak, called the mode of the distribution. Otherwise,
the mode occurs at an endpoint.
General formula:
Mode = (α − 1)/(α + β − 2),   for the interval [0, 1]

For the above Example 4-28 the mode is

Mode = (α − 1)/(α + β − 2) = (2.5 − 1)/(2.5 + 1 − 2) = 1.5/1.5 = 1

case            alpha   beta   mode
Example 4-28    2.5     1      1.00   =(2.5-1)/(2.5+1.0-2)

51 / 52
Important Terms & Concepts
I Beta distribution
I Chi-squared distribution
I Continuity correction
I Continuous uniform distribution
I Cumulative probability distribution for a continuous random variable
I Erlang distribution
I Exponential distribution
I Gamma distribution
I Lack of memory property of a continuous random variable
I Lognormal distribution
I Mean for a continuous random variable
I Mean of a function of a continuous random variable
I Normal approximation to binomial & Poisson probabilities
I Normal distribution
I Probability density function
I Probability distribution of a continuous random variable
I Standard deviation of a continuous random variable
I Standardizing
I Standard normal distribution
I Variance of a continuous random variable
I Weibull distribution
52 / 52
