
Lecture Note 6
Special Distributions (Discrete and Continuous)
MIT 14.30 Spring 2006
Herman Bennett

15 Discrete Distributions

We have already seen the binomial distribution and the uniform distribution.

15.1 Hypergeometric Distribution

Let the RV X be the total number of successes in a sample of n elements drawn without replacement from a population of N elements containing a total of M successes. Then, the pmf of X, called the hypergeometric distribution, is given by:

f(x) = P(X = x) = \binom{M}{x} \binom{N-M}{n-x} / \binom{N}{n}, for x = 0, 1, ..., n. (40)

With mean and variance:

E(X) = nM/N and Var(X) = [(N-n)/(N-1)] · n · (M/N) · [1 - (M/N)]
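As a quick numerical check of pmf (40) and the moment formulas, here is a short Python sketch (the values N = 20, M = 7, n = 5 are hypothetical):

```python
from math import comb

# Hypothetical example values: population size N, successes M, sample size n.
N, M, n = 20, 7, 5

def hyper_pmf(x, N, M, n):
    # Pmf (40): C(M, x) * C(N - M, n - x) / C(N, n).
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

probs = [hyper_pmf(x, N, M, n) for x in range(n + 1)]
mean = sum(x * p for x, p in enumerate(probs))
var = sum((x - mean) ** 2 * p for x, p in enumerate(probs))

assert abs(sum(probs) - 1) < 1e-12                      # pmf sums to 1
assert abs(mean - n * M / N) < 1e-12                    # E(X) = nM/N
assert abs(var - (N - n) / (N - 1) * n * (M / N) * (1 - M / N)) < 1e-12
```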

Caution: These notes are not necessarily self-explanatory notes. They are to be used as a complement to (and not as a substitute for) the lectures.


15.2 Negative Binomial Distribution

The binomial distribution counts the number of successes in a fixed number of trials (n). Suppose that, instead, we count the number of trials required to get a fixed number of successes (r). Let the RV X be the total number of trials required to get r successes. The pmf of X, called the negative binomial distribution, is given by:

f(x) = P(X = x) = \binom{x-1}{r-1} p^r (1-p)^{x-r}, for x = r, r+1, r+2, ... (41)

With mean and variance:

E(X) = r/p and Var(X) = r(1-p)/p²

r = 1 ⇒ Geometric distribution: waiting for the first success.
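A similar sketch for pmf (41), truncating the infinite support at a large value (r = 3 and p = 0.4 are hypothetical):

```python
from math import comb

# Hypothetical values: r successes required, success probability p.
r, p = 3, 0.4

def negbin_pmf(x, r, p):
    # Pmf (41): C(x-1, r-1) * p^r * (1-p)^(x-r), for x = r, r+1, ...
    return comb(x - 1, r - 1) * p**r * (1 - p) ** (x - r)

xs = range(r, 400)                       # truncate the infinite support
total = sum(negbin_pmf(x, r, p) for x in xs)
mean = sum(x * negbin_pmf(x, r, p) for x in xs)

assert abs(total - 1) < 1e-9             # pmf sums to 1
assert abs(mean - r / p) < 1e-6          # E(X) = r/p
```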

15.3 Poisson Distribution

A RV X is said to have a Poisson distribution with parameter λ (λ > 0) if the pmf of X is:

X ~ P(λ): f(x) = P(X = x) = e^{-λ} λ^x / x!, for x = 0, 1, 2, ... (42)

With mean and variance:

E(X) = λ and Var(X) = λ

λ can be interpreted as a rate per unit of time or per unit of area.


If X1 and X2 are independent RVs that have a Poisson distribution with means λ1 and λ2, respectively, then the RV Y = X1 + X2 has a Poisson distribution with mean λ1 + λ2 (function of RVs, Lecture Note 5).
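This additivity can be verified numerically by convolving the two pmfs (λ1 = 2 and λ2 = 3 are hypothetical):

```python
from math import exp, factorial

def pois_pmf(x, lam):
    # Pmf (42): e^{-λ} λ^x / x!
    return exp(-lam) * lam**x / factorial(x)

lam1, lam2 = 2.0, 3.0                    # hypothetical means λ1, λ2

# Convolution: P(Y = y) = Σ_x P(X1 = x) P(X2 = y - x), which should
# coincide with the Poisson(λ1 + λ2) pmf.
for y in range(15):
    conv = sum(pois_pmf(x, lam1) * pois_pmf(y - x, lam2)
               for x in range(y + 1))
    assert abs(conv - pois_pmf(y, lam1 + lam2)) < 1e-12
```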
Note: \sum_{x=0}^{∞} f(x) = e^{-λ} \sum_{x=0}^{∞} λ^x / x! = e^{-λ} e^{λ} = 1.
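The identity above, and the fact that E(X) = λ (Example 15.1), can be checked numerically (λ = 3.5 is hypothetical; the infinite sum is truncated where the terms are negligible):

```python
from math import exp, factorial

lam = 3.5                                 # hypothetical rate λ
pmf = [exp(-lam) * lam**x / factorial(x) for x in range(60)]

assert abs(sum(pmf) - 1) < 1e-12          # e^{-λ} Σ λ^x/x! = 1
assert abs(sum(x * p for x, p in enumerate(pmf)) - lam) < 1e-9  # E(X) = λ
```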

The Poisson distribution is not derived from a natural experiment, as the two previous distributions were.

Example 15.1. Let X be distributed Poisson(λ). Compute E(X).

Example 15.2. Assume the number of customers that visit a store daily is a random variable distributed Poisson(λ). It is known that the store receives on average 20 customers per day, so λ = 20. What is the probability i) that tomorrow there will be 20 visits? ii) that during the next 2 days there will be 30 visits? iii) that tomorrow before midday there will be at least 7 visits?

15.3.1 Poisson Distribution and Poisson Process

A common source of confusion...

A Poisson process with rate λ per unit time is a counting process that satisfies the following two properties: i) The number of arrivals in any fixed time interval of length t has a Poisson distribution with mean λt. ii) The numbers of arrivals in any two disjoint time intervals are independent.

Poisson process: use mean λt when your experiment covers t units of time.

Example 15.3. Answer Example 15.2 assuming now that the number of customers that visit a certain store follows a Poisson process (with the same average of 20 visits per day).
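Under the Poisson-process assumption, the three counts in Example 15.3 have means λt with t = 1, 2, and 1/2 days. A sketch of how the probabilities could be tabulated (the answers are left to the reader to interpret):

```python
from math import exp, factorial

def pois_pmf(x, lam):
    # Pmf (42): e^{-λ} λ^x / x!
    return exp(-lam) * lam**x / factorial(x)

lam = 20.0                                # 20 visits per day

# i) exactly 20 visits tomorrow: t = 1 day, mean λt = 20
p_i = pois_pmf(20, lam * 1)

# ii) exactly 30 visits over the next 2 days: t = 2, mean λt = 40
p_ii = pois_pmf(30, lam * 2)

# iii) at least 7 visits before midday: t = 1/2, mean λt = 10
p_iii = 1 - sum(pois_pmf(x, lam * 0.5) for x in range(7))
```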

Poisson vs. binomial approach: as n → ∞ and p → 0 with np → λ, the binomial distribution converges to the Poisson distribution.
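A numerical illustration of this limit (λ = 4 is hypothetical; already at n = 10000 the two pmfs nearly coincide):

```python
from math import comb, exp, factorial

lam, n = 4.0, 10000          # hypothetical λ; large n, small p = λ/n
p = lam / n

binom = [comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(15)]
poisson = [exp(-lam) * lam**x / factorial(x) for x in range(15)]

# Pointwise gap between the Binomial(n, λ/n) and Poisson(λ) pmfs.
max_gap = max(abs(b, ) if False else abs(b - q) for b, q in zip(binom, poisson))
assert max_gap < 1e-3
```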


16 Continuous Distributions

We have already seen the uniform distribution.

16.1 Normal Distribution

A RV X is said to have a normal distribution with parameters μ and σ² (σ² > 0) if the pdf of X is:

X ~ N(μ, σ²): f(x) = (1 / (\sqrt{2π} σ)) e^{-(x-μ)²/(2σ²)}, for -∞ < x < ∞ (43)

With mean, variance, and MGF:

E(X) = μ, Var(X) = σ², and E(e^{tX}) = e^{μt + σ²t²/2}
Why is the normal distribution so important? 1. The normal distribution has the familiar bell shape. It gives a theoretical basis to the empirical observation that many random phenomena obey, at least approximately, a normal probability distribution: the further away a particular outcome is from the mean, the less likely it is to occur, and this holds symmetrically whether the deviation is above or below the mean. Examples: height or weight of individuals in a population; error made in measuring a physical quantity; level of protein in a particular seed; etc.


2. The normal distribution gives a good approximation to other distributions, such as the Poisson and the binomial. 3. The normal distribution is analytically much more tractable than other bell-shaped distributions. 4. Central limit theorem (more on this later in LN7). 5. The normal distribution is very helpful to represent population distributions (linked to point 1).

Graphic properties. 1. Bell-shaped and symmetric. 2. Centered at the mean (μ), which coincides with the median. 3. Dispersion/flatness depends only on the variance (σ²). 4. P(μ - σ < X < μ + σ) = 0.6826 5. P(μ - 2σ < X < μ + 2σ) = 0.9544

If X ~ N(μ, σ²), then the RV Z = (X - μ)/σ is distributed Z ~ N(0, 1). This distribution, N(0, 1), is called the standard normal distribution, and its cdf is sometimes denoted F_Z(z) = Φ(z).

The cdf of the normal distribution does not have a closed-form expression, and its values must be looked up in an N(0, 1) table (see attached table).
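In practice Φ(z) can also be evaluated with the error function rather than a table, via the identity Φ(z) = (1 + erf(z/√2))/2; a short Python sketch:

```python
from math import erf, sqrt

def Phi(z):
    # Standard normal cdf: Φ(z) = (1 + erf(z / √2)) / 2.
    return 0.5 * (1 + erf(z / sqrt(2)))

assert abs(Phi(0) - 0.5) < 1e-12
assert abs(Phi(1) - (1 - Phi(-1))) < 1e-12          # Φ(-z) = 1 - Φ(z)
assert abs((Phi(1) - Phi(-1)) - 0.6826) < 1e-3      # P(μ-σ < X < μ+σ)
```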


Note that Φ(-z) = 1 - Φ(z). In fact, F_Y(-y) = 1 - F_Y(y) for any Y ~ N(0, σ²).

If X_i ~ N(μ_i, σ_i²) and all n X_i are mutually independent, then the RV H is distributed:

H = \sum_{i=1}^{n} (a_i X_i + b_i) ~ N( \sum_{i=1}^{n} a_i μ_i + \sum_{i=1}^{n} b_i , \sum_{i=1}^{n} a_i² σ_i² ). (44)
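A Monte Carlo check of (44), with hypothetical coefficients a_i, b_i and parameters μ_i, σ_i:

```python
import random
from statistics import fmean, pvariance

random.seed(0)

# Hypothetical independent normals (μ_i, σ_i) and coefficients a_i, b_i.
params = [(1.0, 2.0), (-2.0, 0.5), (0.5, 1.5)]
a = [2.0, 1.0, -3.0]
b = [1.0, 0.0, 4.0]

draws = []
for _ in range(200_000):
    h = sum(ai * random.gauss(mu, sd) + bi
            for (mu, sd), ai, bi in zip(params, a, b))
    draws.append(h)

# Theoretical moments from (44).
mean_theory = sum(ai * mu for (mu, _), ai in zip(params, a)) + sum(b)
var_theory = sum(ai**2 * sd**2 for (_, sd), ai in zip(params, a))

assert abs(fmean(draws) - mean_theory) < 0.1
assert abs(pvariance(draws) - var_theory) < 0.5
```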

Example 16.1. Using the tools developed in Lecture Note 5, derive the distribution of Z = (X - μ)/σ as a transformation of the RV X ~ N(μ, σ²).

Example 16.2. Compute E(X) where X ~ N(μ, σ²).


Example 16.3. Assume that the RV X has a normal distribution with mean 5 and standard deviation 2. Find P(1 < X < 8) and P(|X - 5| < 2).

Example 16.4. Assume two types of light bulbs (A and B). The life of bulb type A is distributed normal with mean 100 (hours) and variance 16. The life of bulb type B is distributed normal with mean 110 (hours) and variance 30. i) What is the probability that bulb type A lasts for more than 110 hours? ii) If a bulb type A and a bulb type B are turned on at the same time, what is the probability that type A lasts longer than type B? iii) What is the probability that both bulbs last more than 105 hours?


The binomial distribution can be approximated by a normal distribution. Rule of thumb: min(np, n(1 - p)) ≥ 5.
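A sketch comparing the exact binomial cdf with the normal approximation (here with the standard continuity correction; n = 50 and p = 0.3 are hypothetical, so min(np, n(1 - p)) = 15 ≥ 5):

```python
from math import comb, erf, sqrt

n, p = 50, 0.3                        # hypothetical; rule of thumb holds
mu, sd = n * p, sqrt(n * p * (1 - p))

def Phi(z):
    # Standard normal cdf via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

# Exact P(X <= 20) vs. normal approximation with continuity correction.
exact = sum(comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(21))
approx = Phi((20 + 0.5 - mu) / sd)

assert abs(exact - approx) < 0.01
```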

16.2 Lognormal Distribution

If X is a RV and ln(X) is distributed N(μ, σ²), then X has a lognormal distribution with pdf (RV transformation):

f(x) = (1 / (\sqrt{2π} σ x)) e^{-(ln(x) - μ)²/(2σ²)}, for 0 < x < ∞, -∞ < μ < ∞, σ > 0 (45)

ln(X) ~ N(μ, σ²) ⇔ X ~ LnN(μ, σ²). With mean and variance:

E(X) = e^{μ + σ²/2} and Var(X) = e^{2(μ + σ²)} - e^{2μ + σ²}.

If X ~ N(μ, σ²), then e^X ~ LnN(μ, σ²).
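Since X = e^Z with Z ~ N(μ, σ²), the moment formulas above can be checked by simulation (μ = 0.5 and σ = 0.4 are hypothetical):

```python
import random
from math import exp
from statistics import fmean, pvariance

random.seed(1)
mu, sigma = 0.5, 0.4                          # hypothetical parameters

# Draw e^Z with Z ~ N(μ, σ²): lognormal draws.
xs = [exp(random.gauss(mu, sigma)) for _ in range(400_000)]

mean_theory = exp(mu + sigma**2 / 2)          # E(X) = e^{μ + σ²/2}
var_theory = exp(2 * (mu + sigma**2)) - exp(2 * mu + sigma**2)

assert abs(fmean(xs) - mean_theory) < 0.01
assert abs(pvariance(xs) - var_theory) < 0.02
```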

16.3 Gamma Distribution

A RV X is said to have a gamma distribution with parameters α and β (α, β > 0) if the pdf of X is:

f(x) = (1 / (Γ(α) β^α)) x^{α-1} e^{-x/β}, for 0 < x < ∞ (46)

where Γ(α) = \int_0^∞ x^{α-1} e^{-x} dx, finite if α > 0.

Γ(α) = (α - 1)! if α is a positive integer, and Γ(0.5) = \sqrt{π}.

With mean and variance:

E(X) = αβ and Var(X) = αβ²

Assume a Poisson process; let Y, the number of events per unit of time, have a Poisson distribution with parameter λ. Denote by X the waiting time for the rth event to occur. Then, X is distributed gamma with parameters α = r and β = 1/λ.
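This waiting-time connection can be illustrated by simulation: the gaps between arrivals of a Poisson process with rate λ are exponential with mean 1/λ, so the waiting time for the rth arrival is a sum of r such gaps (λ = 2 and r = 3 are hypothetical):

```python
import random
from statistics import fmean, pvariance

random.seed(2)
lam, r = 2.0, 3                          # hypothetical rate and event count

# Waiting time for the r-th arrival = sum of r independent exponential
# inter-arrival gaps, i.e. gamma with α = r and β = 1/λ.
waits = [sum(random.expovariate(lam) for _ in range(r))
         for _ in range(200_000)]

alpha, beta = r, 1 / lam
assert abs(fmean(waits) - alpha * beta) < 0.02         # E(X) = αβ
assert abs(pvariance(waits) - alpha * beta**2) < 0.02  # Var(X) = αβ²
```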

16.4 Exponential Distribution

A RV X is said to have an exponential distribution with parameter β (β > 0) if the pdf of X is:

f(x) = (1/β) e^{-x/β}, for 0 < x < ∞ (47)

With mean and variance:

E(X) = β and Var(X) = β²

The exponential distribution is a gamma distribution with α = 1.
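Setting α = 1 in pdf (46) indeed reproduces pdf (47), since Γ(1) = 1 and x^0 = 1; a quick pointwise check (β = 2.5 is hypothetical):

```python
from math import exp, gamma as Gamma

beta = 2.5                               # hypothetical β

def gamma_pdf(x, alpha, beta):
    # Pdf (46): x^{α-1} e^{-x/β} / (Γ(α) β^α)
    return x ** (alpha - 1) * exp(-x / beta) / (Gamma(alpha) * beta**alpha)

def expo_pdf(x, beta):
    # Pdf (47): (1/β) e^{-x/β}
    return exp(-x / beta) / beta

for x in (0.1, 1.0, 4.2):
    assert abs(gamma_pdf(x, 1, beta) - expo_pdf(x, beta)) < 1e-12
```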

16.5 Chi-squared Distribution

A RV X is said to have a chi-squared distribution with parameter p > 0 (degrees of freedom) if the pdf of X is:

X ~ χ²(p): f(x) = (1 / (Γ(p/2) 2^{p/2})) x^{p/2 - 1} e^{-x/2}, for 0 < x < ∞ and p integer. (48)

With mean and variance:

E(X) = p and Var(X) = 2p

The chi-squared distribution is a gamma distribution with α = p/2 and β = 2.

If Y ~ N(0, 1), then the RV Z = Y² is distributed: Z = Y² ~ χ²(1) (random variable transformation). (49)
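A simulation check of (49): squaring standard normal draws should give mean p = 1 and variance 2p = 2:

```python
import random
from statistics import fmean, pvariance

random.seed(3)

# Square standard normal draws: each Z = Y² should be χ²(1).
z2 = [random.gauss(0, 1) ** 2 for _ in range(400_000)]

assert abs(fmean(z2) - 1) < 0.02       # E(Z) = p = 1
assert abs(pvariance(z2) - 2) < 0.1    # Var(Z) = 2p = 2
```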

If X1 ~ χ²(p) and X2 ~ χ²(q) are independent, then the RV H = X1 + X2 is distributed:

H = X1 + X2 ~ χ²(p+q) (random vector transformation). (50)

Extensively used in Econometrics. Concept of single distribution vs. family of distributions (indexed by one or more parameters).

16.6 Bivariate Normal Distribution

A bivariate random vector (X1, X2) is said to have a bivariate normal distribution if the pdf of (X1, X2) is:

f(x1, x2) = (1 / (2π σ1 σ2 \sqrt{1 - ρ²})) e^{-b/(2(1 - ρ²))} (51)

where ρ = Corr(X1, X2) and

b = (x1 - μ1)²/σ1² - 2ρ(x1 - μ1)(x2 - μ2)/(σ1 σ2) + (x2 - μ2)²/σ2²

ρ = 0 ⇔ X1 and X2 independent (only in the normal case): f_{X1,X2}(x1, x2) = f_{X1}(x1) f_{X2}(x2).
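A direct check of this factorization using pdf (51) with ρ = 0 (the evaluation points and parameters are hypothetical):

```python
from math import exp, pi, sqrt

def bvn_pdf(x1, x2, m1, m2, s1, s2, rho):
    # Bivariate normal pdf (51).
    b = ((x1 - m1) ** 2 / s1**2
         - 2 * rho * (x1 - m1) * (x2 - m2) / (s1 * s2)
         + (x2 - m2) ** 2 / s2**2)
    return exp(-b / (2 * (1 - rho**2))) / (2 * pi * s1 * s2 * sqrt(1 - rho**2))

def n_pdf(x, m, s):
    # Univariate normal pdf (43).
    return exp(-((x - m) ** 2) / (2 * s**2)) / (s * sqrt(2 * pi))

# With ρ = 0 the joint pdf factors into the product of the marginals.
joint = bvn_pdf(1.2, -0.7, 0.0, 1.0, 1.0, 2.0, 0.0)
assert abs(joint - n_pdf(1.2, 0.0, 1.0) * n_pdf(-0.7, 1.0, 2.0)) < 1e-12
```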
