STA2017 Continuous Random Variable Notes
Uploaded by Ajani McPherson

STA2017

PROBABILITY DISTRIBUTION
THEORY
NOTES
Continuous Random Variable
A Random Variable is a function that maps each sample point of the sample space to a value in the set of real numbers. For continuous random variables the set of values will be the union of a countable (finite or infinite) number of intervals.
The Cumulative Probability Distribution Function (CDF) is given by
F(x) = P(X ≤ x).
The probability density function (pdf), represented by f(x), is given by
f(x) = dF(x)/dx
It is found by differentiating the cumulative (probability) distribution function.
Deriving The Cumulative Distribution Function
It follows then that the cumulative distribution function may be derived from the probability density function by integration:
F(x) = ∫ from −∞ to x of f(t) dt
The cumulative distribution function will always be a non-decreasing function, such that as the values of the variable increase the probability will never decrease. The minimum value is 0 and the maximum value is 1. For any continuous random variable X with support S(X) having probability density function f(x), it follows that
∫ over S(X) of f(x) dx = 1
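As a quick numerical sketch of the pdf–CDF relationship (assuming Python and an illustrative pdf f(x) = 2x on (0, 1), which is not taken from the notes), integrating the pdf recovers the CDF F(x) = x², the total area is 1, and F is non-decreasing:

```python
# Illustrative pdf (an assumption, not from the notes): f(x) = 2x on (0, 1),
# for which the exact CDF is F(x) = x^2.
def f(x):
    return 2.0 * x

def F(x, n=100_000):
    # midpoint Riemann sum approximating the integral of f from 0 to x
    h = x / n
    return sum(f((i + 0.5) * h) for i in range(n)) * h

total = F(1.0)   # total probability, ~1
half = F(0.5)    # F(0.5) = 0.25
```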
Finding Probabilities and Expected Values
For an interval (a, b),
P(a < X < b) = ∫ from a to b of f(x) dx
The probability for a single interval is equivalent to finding the area under the curve representing the probability density function within the defined interval.
The expected value of a continuous random variable X is defined as
E(X) = ∫ over S(X) of x f(x) dx
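A minimal numeric sketch of both definitions, again using the illustrative pdf f(x) = 2x on (0, 1) (an assumption for demonstration): P(0.25 < X < 0.75) = 0.75² − 0.25² = 0.5, and E(X) = ∫ 2x² dx = 2/3.

```python
# Illustrative pdf f(x) = 2x on (0, 1); probabilities and E(X) are areas
# under curves, approximated here with a midpoint Riemann sum.
def f(x):
    return 2.0 * x

def integrate(g, lo, hi, n=100_000):
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

prob = integrate(f, 0.25, 0.75)                  # P(0.25 < X < 0.75) = 0.5
mean = integrate(lambda x: x * f(x), 0.0, 1.0)   # E(X) = 2/3
```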

Of special interest are piecewise continuous distributions. Consider the example below.
The probability density function for a continuous random variable X is given.
Determine the value of k and the Cumulative Distribution Function.
Solution
Integrating the pdf over each piece of the support, the total probability must equal 1:
3k + k = 1, therefore k = 0.25
On the first part of the support,
F(2) = 0.75
On the remaining part of the support,
F(3) = 1
We therefore have the complete CDF.
Expected Values - Functions of Random Variables
For any function g(X) of the random variable X, the expected value is evaluated from
E[g(X)] = ∫ over S(X) of g(x) f(x) dx
Specifically, deriving the moment generating function for a continuous random variable, it would be
M(t) = E(e^(tX)) = ∫ over S(X) of e^(tx) f(x) dx
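A short check of the E[g(X)] formula with g(x) = x² and the illustrative pdf f(x) = 2x on (0, 1) (an assumed example, not from the notes); exactly, E(X²) = ∫ 2x³ dx = 1/2.

```python
# E[g(X)] = ∫ g(x) f(x) dx, approximated by a midpoint Riemann sum
# for g(x) = x^2 and the illustrative pdf f(x) = 2x on (0, 1).
def f(x):
    return 2.0 * x

n = 100_000
h = 1.0 / n
e_x2 = sum(((i + 0.5) * h) ** 2 * f((i + 0.5) * h) for i in range(n)) * h
```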
Some Special Distributions
We will now look at special continuous probability distributions such as the Continuous Uniform, Gamma, Exponential, Chi-Square, Normal, and Log-Normal distributions.
The Continuous Uniform Distribution
The Continuous Uniform distribution, also known as the Rectangular distribution, has two parameters, which are the end points a and b of the interval over which the uniform random variable is defined. The pdf is equal to the constant that makes the area of the rectangle equal to unity: it is the height of a line segment parallel to, and above, the horizontal axis.
Uniform Distribution (continued)
The probability density function (pdf) is given by
f(x) = 1/(b − a) for a ≤ x ≤ b, where a and b are constants.
Therefore the moment generating function is
M(t) = E(e^(tX)) = (e^(bt) − e^(at)) / ((b − a)t), t ≠ 0
Uniform Distribution (continued)
Substituting t = 0 in the first derivative of the moment generating function gives the expected value, the first population moment. Take note, however, that substituting t = 0 directly gives an undefined value (the form 0/0). Therefore the limit of the derivative as t tends to zero is taken. This is done using L'Hopital's Rule, which involves differentiating both numerator and denominator until the undefined form is no longer apparent.
By L'Hopital's Rule,
E(X) = lim (t → 0) M′(t) = (a + b)/2
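A numeric sketch of this limit, with illustrative endpoints a = 2, b = 5 (assumed values): the uniform MGF is undefined at t = 0, so M′(t) is estimated by a central difference at a small t and compared against the mean (a + b)/2 = 3.5.

```python
import math

# Uniform(a, b) MGF: M(t) = (e^{bt} - e^{at}) / ((b - a) t), undefined at t = 0.
a, b = 2.0, 5.0  # illustrative endpoints

def M(t):
    return (math.exp(b * t) - math.exp(a * t)) / ((b - a) * t)

# central-difference estimate of M'(t) at a small t (t = 0 itself is undefined)
t, h = 1e-4, 1e-6
mean_est = (M(t + h) - M(t - h)) / (2 * h)
```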
The Gamma Distribution
The Gamma distribution is a skewed distribution defined in terms of a special function, the gamma function, which is given by
Γ(α) = ∫ from 0 to ∞ of x^(α−1) e^(−x) dx, α > 0
It follows that Γ(1) = 1 and Γ(n) = (n − 1)! for any positive integer n.
By definition
Γ(α) = (α − 1)Γ(α − 1)
which follows from integration by parts.
Review: Integration by Parts (The Gamma Function)
Integration by parts is as follows. For functions u and v of any variable, say x:
∫ u dv = uv − ∫ v du
Using u = x^(α−1) and dv = e^(−x) dx, we have du = (α − 1)x^(α−2) dx and v = −e^(−x).
Then
Γ(α) = [−x^(α−1) e^(−x)] from 0 to ∞ + (α − 1) ∫ from 0 to ∞ of x^(α−2) e^(−x) dx = (α − 1)Γ(α − 1)
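The recurrence can be checked directly in Python, whose standard library exposes the gamma function as math.gamma:

```python
import math

# Check the integration-by-parts identity Γ(α) = (α - 1) Γ(α - 1)
# at a few non-integer and integer points.
checks = []
for alpha in (2.5, 4.0, 7.3):
    checks.append(abs(math.gamma(alpha) - (alpha - 1) * math.gamma(alpha - 1)))

fact_check = abs(math.gamma(5) - 24.0)  # Γ(n) = (n - 1)! for integer n
```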
Gamma distribution ~ a true probability distribution
The probability density function for the gamma random variable with parameters α and β is given by
f(x) = x^(α−1) e^(−x/β) / (Γ(α) β^α), x > 0
Using the substitution y = x/β it follows that P(X > 0) = 1, satisfying the criterion for a continuous probability distribution:
P(X > 0) = ∫ from 0 to ∞ of x^(α−1) e^(−x/β) / (Γ(α) β^α) dx
So P(X > 0) = (1/Γ(α)) ∫ from 0 to ∞ of y^(α−1) e^(−y) dy = Γ(α)/Γ(α) = 1
Form of the Gamma Integral
It then follows, by cross-multiplying the constant term, that we have a useful result:
∫ from 0 to ∞ of x^(α−1) e^(−x/β) dx = Γ(α) β^α
or, for constants a and b,
∫ from 0 to ∞ of x^(a−1) e^(−x/b) dx = Γ(a) b^a
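The useful result is easy to verify numerically; here with illustrative constants a = 3, b = 2 (assumed values), for which the exact answer is Γ(3) · 2³ = 16.

```python
import math

# Numeric check of ∫₀^∞ x^{a-1} e^{-x/b} dx = Γ(a) b^a for a = 3, b = 2.
a_, b_ = 3.0, 2.0
n, hi = 200_000, 60.0    # the tail beyond 60 is negligible for these values
h = hi / n
approx = sum(((i + 0.5) * h) ** (a_ - 1) * math.exp(-(i + 0.5) * h / b_)
             for i in range(n)) * h
exact = math.gamma(a_) * b_ ** a_
```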
What about the mean of the gamma random variable X?
E(X) = ∫ from 0 to ∞ of x · x^(α−1) e^(−x/β) / (Γ(α) β^α) dx = Γ(α + 1) β^(α+1) / (Γ(α) β^α) = αβ
The moment generating function is given by
M(t) = E(e^(tX)) = ∫ from 0 to ∞ of e^(tx) x^(α−1) e^(−x/β) / (Γ(α) β^α) dx
Now we rewrite the power in the exponential so that negative x is divided by a constant:
tx − x/β = −x(1 − βt)/β = −x / (β/(1 − βt))
so, applying the gamma-integral result with b = β/(1 − βt),
M(t) = Γ(α) (β/(1 − βt))^α / (Γ(α) β^α) = (1 − βt)^(−α), t < 1/β
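A Monte Carlo sketch of E(X) = αβ, using the shape/scale parameterisation above (α = 3, β = 0.2 match the example parameters used later in these notes); random.gammavariate takes (shape, scale) in the same order.

```python
import random

# Sample mean of gamma(α = 3, β = 0.2) draws should approach αβ = 0.6.
random.seed(0)
alpha, beta = 3.0, 0.2
samples = [random.gammavariate(alpha, beta) for _ in range(200_000)]
sample_mean = sum(samples) / len(samples)
```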
Integration by parts – Gamma distribution probabilities
When evaluating probabilities for gamma random variables, integer values of α allow use of the integration by parts technique, but non-integer values usually require numerical estimation methods.
An example of a gamma density function is shown for α = 3 and β = 0.2.
Integration by parts would have to be applied twice in order to evaluate a probability using this p.d.f.
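Applying integration by parts (α − 1) times to an integer-shape gamma tail yields the closed form P(X > x) = e^(−x/β) Σ from k = 0 to α−1 of (x/β)^k / k!. A sketch checking this against direct numerical integration for α = 3, β = 0.2 and an illustrative cut-off x = 0.5 (assumed value):

```python
import math

# Closed-form tail from repeated integration by parts (integer α):
# P(X > x) = e^{-x/β} Σ_{k=0}^{α-1} (x/β)^k / k!
alpha, beta, x = 3, 0.2, 0.5
r = x / beta
tail_closed = math.exp(-r) * sum(r ** k / math.factorial(k) for k in range(alpha))

def pdf(t):
    return t ** (alpha - 1) * math.exp(-t / beta) / (math.gamma(alpha) * beta ** alpha)

# direct numeric integration of the pdf from x to a large cutoff
n, hi = 100_000, 5.0
h = (hi - x) / n
tail_numeric = sum(pdf(x + (i + 0.5) * h) for i in range(n)) * h
```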
Exponential Distribution
The Exponential distribution is a special case of the Gamma distribution. When α = 1, the gamma density function represents an exponential random variable with parameter β:
f(x) = (1/β) e^(−x/β), x > 0
It follows that the expected value and variance for this exponential random variable are, respectively,
E(X) = β and Var(X) = β²
The moment generating function would be
M(t) = (1 − βt)^(−1), t < 1/β
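A simulation sketch of E(X) = β and Var(X) = β² for an illustrative β = 2 (assumed value); note that random.expovariate takes the rate λ = 1/β rather than β itself.

```python
import random

# Exponential with mean β = 2: random.expovariate expects the rate 1/β.
random.seed(1)
beta = 2.0
xs = [random.expovariate(1.0 / beta) for _ in range(200_000)]
m = sum(xs) / len(xs)                        # should approach β
v = sum((x - m) ** 2 for x in xs) / len(xs)  # should approach β²
```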
Memoryless Property
The Exponential distribution is said to have the memoryless property, represented by
P(X > s | X > t) = P(X > s − t), where s > t.
The expression represents the conditional probability that an event occurs after time s given that it has not taken place before time t.
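Because the exponential tail is P(X > x) = e^(−x/β), the property follows from the conditional-probability formula; a sketch with illustrative values β = 2, s = 5, t = 3 (assumed values):

```python
import math

# Memoryless check: P(X > s | X > t) = P(X > s)/P(X > t) = P(X > s - t).
beta, s, t = 2.0, 5.0, 3.0

def tail(x):
    return math.exp(-x / beta)   # P(X > x) for the exponential

lhs = tail(s) / tail(t)          # conditional probability
rhs = tail(s - t)                # unconditional "restarted" probability
```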
The Chi-Square Distribution
Another distribution that is a special case of the gamma distribution is the Chi-square distribution. If α = v/2 and β = 2, the distribution is a Chi-square with v degrees of freedom, which is the chi-square parameter.
The p.d.f. is
f(x) = x^(v/2 − 1) e^(−x/2) / (Γ(v/2) 2^(v/2)), x > 0
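Since a chi-square variable with v degrees of freedom arises as a sum of v squared standard normals, and its gamma form gives E(X) = αβ = v, a simulation sketch (with an illustrative v = 4) can confirm the mean:

```python
import random

# Chi-square(v) as the sum of v squared standard normals; E(X) = v.
random.seed(5)
v = 4
xs = [sum(random.gauss(0, 1) ** 2 for _ in range(v)) for _ in range(100_000)]
chi_mean = sum(xs) / len(xs)
```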
The Normal and Standard Normal Distribution
The general normal random variable with parameters μ and σ² has probability density function given by
f(x) = (1/(σ√(2π))) e^(−(x − μ)²/(2σ²)), −∞ < x < ∞
which is symmetrical about the line x = μ.
A special case is the standard normal random variable Z with mean μ = 0 and variance σ² = 1.
It can be shown that the density integrates to 1 by means of a polar transformation and use of the Jacobian (see the proof that follows).
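A numeric companion to the polar-coordinate argument: the standard normal density should integrate to 1 over the real line.

```python
import math

# Midpoint-rule integral of the standard normal pdf over [-10, 10];
# the mass outside that range is negligibly small.
def phi(z):
    return math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)

n, lo, hi = 200_000, -10.0, 10.0
h = (hi - lo) / n
area = sum(phi(lo + (i + 0.5) * h) for i in range(n)) * h
```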
Pi in the normal density function
An interesting result is that
∫ from −∞ to ∞ of e^(−z²/2) dz = √(2π)
Proof:
Since the standard normal probability density function is an even function, writing
I = ∫ from −∞ to ∞ of e^(−z²/2) dz, we have
I² = ∫∫ e^(−(y² + z²)/2) dy dz
Using the substitution y = r cos θ, z = r sin θ with dy dz = r dr dθ (the Jacobian is r), the double integral evaluates to 2π.
By comparison and simplification we accomplish our target: I = √(2π), so the normal density integrates to 1.
Normal Distribution Mean
Using the substitution z = (x − μ)/σ, so that x = μ + σz and dx = σ dz,
E(X) = ∫ (μ + σz) (1/√(2π)) e^(−z²/2) dz = μ ∫ (1/√(2π)) e^(−z²/2) dz + σ ∫ z (1/√(2π)) e^(−z²/2) dz
The first integral is that of the standard normal density and equals 1. The integral remaining contains an odd function and therefore has a value of zero: for z < 0 and for z > 0, the integrals on either side of 0 have equal and opposite values, cancelling out to 0.
Therefore E(X) = μ.
Normal Distribution Variance
For the variance we first find E(X²), using the substitution z = (x − μ)/σ:
E(X²) = μ² ∫ (1/√(2π)) e^(−z²/2) dz + 2μσ ∫ z (1/√(2π)) e^(−z²/2) dz + σ² ∫ z² (1/√(2π)) e^(−z²/2) dz
The first integral is that of a multiple of the standard normal density function and equals 1. The second integral is equal to zero based on the fact that it contains an odd function. The third integral is made up of an even function symmetrical about z = 0.
Normal Distribution Variance (continued)
For the third integral, use integration by parts with u = z and dv = z e^(−z²/2) dz to show that ∫ z² (1/√(2π)) e^(−z²/2) dz = 1.
Putting it all together, E(X²) = μ² + σ².
Finally Var(X) = E(X²) − [E(X)]² = μ² + σ² − μ² = σ².
The moment generating function is given by
M(t) = E(e^(tX)) = ∫ e^(tx) (1/(σ√(2π))) e^(−(x − μ)²/(2σ²)) dx
Completing the square in the exponent,
tx − (x − μ)²/(2σ²) = μt + σ²t²/2 − (x − (μ + σ²t))²/(2σ²)
so
M(t) = e^(μt + σ²t²/2) ∫ (1/(σ√(2π))) e^(−(x − (μ + σ²t))²/(2σ²)) dx = e^(μt + σ²t²/2)
since the remaining integral is that of a normal density and equals 1.
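The normal MGF M(t) = e^(μt + σ²t²/2) encodes the moments: M′(0) = μ and M″(0) = μ² + σ². A finite-difference sketch with illustrative parameters μ = 1.5, σ = 2 (assumed values):

```python
import math

# Normal MGF and its derivatives at t = 0, estimated by finite differences.
mu, sigma = 1.5, 2.0

def M(t):
    return math.exp(mu * t + sigma ** 2 * t ** 2 / 2.0)

h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)              # ≈ M'(0) = E(X) = μ
m2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2    # ≈ M''(0) = E(X²) = μ² + σ²
```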
The Log-Normal distribution
The Log-Normal distribution is the distribution of a random variable whose logarithm is normally distributed. The probability density function is given by
f(x) = (1/(xσ√(2π))) e^(−(ln x − μ)²/(2σ²)), x > 0
The transformation Z = (ln X − μ)/σ converts the distribution to a standard normal distribution.
Log-Normal~Mean, Variance & Probability
Using the substitution Y = ln X, we have E(X) = E(e^Y), where Y ~ N(μ, σ²), meaning that it is a normally distributed variable with mean μ and variance σ². From the normal moment generating function evaluated at t = 1,
E(X) = e^(μ + σ²/2)
To evaluate P(0 < X < x),
P(0 < X < x) = P(ln X < ln x) = P(Z < (ln x − μ)/σ), where Z is the standard normal variable.
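A simulation sketch of that probability formula, with illustrative values μ = 0, σ = 1, and upper limit 2 (assumed values): random.lognormvariate(μ, σ) draws e^(N(μ, σ²)), and the standard normal CDF Φ can be written with math.erf.

```python
import math
import random

# Check P(X < c) = Φ((ln c - μ)/σ) for a log-normal X by simulation.
random.seed(2)
mu, sigma, c = 0.0, 1.0, 2.0
xs = [random.lognormvariate(mu, sigma) for _ in range(200_000)]
p_sim = sum(1 for x in xs if x < c) / len(xs)
p_exact = 0.5 * (1.0 + math.erf((math.log(c) - mu) / (sigma * math.sqrt(2.0))))
```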
The Weibull Distribution
The Weibull random variable, in the common scale–shape parameterisation with scale α and shape β, has probability density function given by
f(x) = (β/α)(x/α)^(β−1) e^(−(x/α)^β), x > 0
and cumulative distribution function F(x) = 1 − e^(−(x/α)^β), x > 0.
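A simulation sketch under the scale–shape parameterisation (the notes' exact parameterisation may differ): with scale α and shape β, E(X) = α Γ(1 + 1/β), and random.weibullvariate takes (scale, shape) in that order. The values α = 2, β = 1.5 are illustrative assumptions.

```python
import math
import random

# Weibull(scale α, shape β): the sample mean should approach α Γ(1 + 1/β).
random.seed(3)
alpha, beta = 2.0, 1.5
xs = [random.weibullvariate(alpha, beta) for _ in range(200_000)]
w_mean = sum(xs) / len(xs)
expected = alpha * math.gamma(1.0 + 1.0 / beta)
```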
The Beta Distribution
The Beta distribution random variable is defined on the interval (0, 1) with probability density function given by
f(x) = x^(a−1) (1 − x)^(b−1) / B(a, b), 0 < x < 1
where B(a, b) = Γ(a)Γ(b)/Γ(a + b).
Using the integral form it can be shown that E(X) = a/(a + b) and Var(X) = ab/((a + b)²(a + b + 1)).
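A simulation sketch of E(X) = a/(a + b), with illustrative parameters a = 2, b = 5 (assumed values), for which the mean is 2/7:

```python
import random

# Sample mean of beta(a, b) draws should approach a/(a + b).
random.seed(4)
a, b = 2.0, 5.0
xs = [random.betavariate(a, b) for _ in range(200_000)]
beta_mean = sum(xs) / len(xs)
```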
