LNCom Distr Poiss Proc

1 Common Discrete Distributions

1.1 Binomial and Bernoulli Distribution


Total number of successes in a sequence of n Bernoulli trials; in other words: the total number of heads when tossing a biased coin n times, with probability of heads equal to p. We have the pmf:

f(x) = P(X = x) = \binom{n}{x} p^x (1-p)^{n-x},  x = 0, 1, 2, ..., n.

E(X) = n p,  Var(X) = n p (1-p),  MGF: M(s) = (1 - p + p e^s)^n.


R-command: dbinom(x,size=n,prob=p)

The special case n = 1 is the Bernoulli distribution.
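These formulas are easy to check numerically. The following Python sketch (the helper name binom_pmf is illustrative, not a library function) computes the pmf directly and recovers E(X) = n p and Var(X) = n p (1-p):

```python
from math import comb

def binom_pmf(x, n, p):
    # P(X = x) = C(n, x) p^x (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.3
support = range(n + 1)
pmf = [binom_pmf(x, n, p) for x in support]
mean = sum(x * f for x, f in zip(support, pmf))
var = sum((x - mean)**2 * f for x, f in zip(support, pmf))
# mean ≈ n*p = 3.0 and var ≈ n*p*(1-p) = 2.1
```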

1.2 Geometric Distribution


Suppose that independent trials, each having probability p, 0 < p < 1, of being a success are performed until one success occurs. If we let X equal the number of trials required, then it is possible to show that the pmf of the geometric distribution is:

f(x) = P(X = x) = p (1-p)^{x-1},  x = 1, 2, ... .

It is also easy to find the formula for the CDF of the geometric distribution:

F(x) = P(X ≤ x) = 1 - (1-p)^x,  x = 1, 2, ... .

It is also not very difficult to prove the formulas for the expectation and the variance of geometric random variates:

E(X) = 1/p,  Var(X) = (1-p)/p^2,  MGF: M(s) = \frac{p e^s}{1 - (1-p) e^s}.

R-command: dgeom(x-1,prob=p)

The geometric distribution has the memoryless property (see book p40).
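The memoryless property can be verified directly from the survival function P(X > x) = (1-p)^x, since P(X > s+t | X > s) = P(X > t). A short Python sketch (geom_sf is an illustrative helper):

```python
def geom_sf(x, p):
    # survival function P(X > x) = (1 - p)^x
    return (1 - p)**x

p, s, t = 0.25, 4, 3
lhs = geom_sf(s + t, p) / geom_sf(s, p)  # P(X > s + t | X > s)
rhs = geom_sf(t, p)                      # P(X > t)
# memoryless: lhs == rhs == (1 - p)^t
```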

1.3 Negative Binomial Distribution


Suppose that independent trials, each having probability p, 0 < p < 1, of being a success are
performed until a total of r successes is accumulated. If we let X equal the number of trials
required, then it is possible to show that the pmf of the negative binomial distribution is:
f(x) = P(X = x) = \binom{x-1}{r-1} p^r (1-p)^{x-r},  x = r, r+1, ... .

The geometric distribution is the special case of the negative binomial distribution with parameter r = 1. It is not difficult to see that a negative binomial random variate with parameter r is the sum of r independent geometric random variates. Thus it is possible to prove the formulas for the expectation and the variance of negative binomial random variates:

E(X) = r/p,  Var(X) = r(1-p)/p^2,  MGF: M(s) = \left( \frac{p e^s}{1 - (1-p) e^s} \right)^r.
R-command: dnbinom(x-r,size=r,prob=p)
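A quick numerical check of E(X) = r/p from the pmf, truncating the infinite support where the tail is negligible (Python sketch; nbinom_pmf is an illustrative helper):

```python
from math import comb

def nbinom_pmf(x, r, p):
    # P(X = x) = C(x-1, r-1) p^r (1-p)^(x-r), x = r, r+1, ...
    return comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)

r, p = 3, 0.4
xs = range(r, 400)  # truncate the infinite support; the tail beyond is negligible
total = sum(nbinom_pmf(x, r, p) for x in xs)
mean = sum(x * nbinom_pmf(x, r, p) for x in xs)
# total ≈ 1 and mean ≈ r/p = 7.5
```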

1.4 Poisson Distribution


The Poisson distribution is the limiting case of the binomial distribution for n p = λ held constant as n tends to infinity and p tends to zero. For the pmf we have:

f(x) = P(X = x) = \frac{λ^x}{x!} e^{-λ},  x = 0, 1, 2, ... .

E(X) = λ,  Var(X) = λ,  MGF: M(s) = e^{λ(e^s - 1)}.
R-command: dpois(x, lambda=λ)
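The limiting relation to the binomial distribution can be illustrated numerically: for large n and p = λ/n, the binomial pmf is already very close to the Poisson pmf (Python sketch; poisson_pmf is an illustrative helper):

```python
from math import comb, exp, factorial

def poisson_pmf(x, lam):
    # P(X = x) = lam^x e^(-lam) / x!
    return lam**x * exp(-lam) / factorial(x)

lam, x = 2.5, 3
n = 10_000
p = lam / n                                    # keep n*p = lambda fixed
binom = comb(n, x) * p**x * (1 - p)**(n - x)   # Binomial(n, p) pmf at x
exact = poisson_pmf(x, lam)
# binom approaches exact as n grows
```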

1.5 Hypergeometric Distribution


The hypergeometric distribution is the distribution of the number of marked balls obtained when drawing n balls WITHOUT replacement from an urn containing a total of N balls of which r are marked.
With the parameters N (total number of balls), n (number of balls drawn) and r (number of marked balls) we have the pmf:
f(x) = P(X = x) = \frac{\binom{r}{x} \binom{N-r}{n-x}}{\binom{N}{n}}  for max(0, r + n - N) ≤ x ≤ min(n, r).

E(X) = n \frac{r}{N},  Var(X) = n \frac{r}{N} \left(1 - \frac{r}{N}\right) \frac{N-n}{N-1}.
R-command: dhyper(x, m=r, n=N-r, k=n)
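The pmf and the mean formula E(X) = n r/N can be checked directly over the finite support (Python sketch; hyper_pmf is an illustrative helper):

```python
from math import comb

def hyper_pmf(x, N, r, n):
    # P(X = x) = C(r, x) C(N-r, n-x) / C(N, n)
    return comb(r, x) * comb(N - r, n - x) / comb(N, n)

N, r, n = 20, 7, 5
lo, hi = max(0, r + n - N), min(n, r)
support = range(lo, hi + 1)
pmf = [hyper_pmf(x, N, r, n) for x in support]
mean = sum(x * f for x, f in zip(support, pmf))
# mean = n*r/N = 5*7/20 = 1.75
```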

2 Common Continuous Distributions


2.1 Normal (Gaussian) Distribution
The normal distribution is very important in probability and statistics, as it is the approximate distribution of the average of a large number of independent random variables. pdf:

f(x) = \frac{1}{\sqrt{2π} σ} e^{-\frac{1}{2} \left(\frac{x-µ}{σ}\right)^2}.

E(X) = µ,  Var(X) = σ^2,  MGF: M(s) = e^{µ s + σ^2 s^2 / 2}.
R-command: pnorm((x-µ)/σ)
Reproduction property: The sum of independent normal variates is normal.
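The standardization behind pnorm((x-µ)/σ) can be reproduced with the error function from the standard library (Python sketch; norm_cdf is an illustrative helper, not a library function):

```python
from math import erf, sqrt

def norm_cdf(x, mu=0.0, sigma=1.0):
    # Phi((x - mu)/sigma): the same standardization as pnorm((x - mu)/sigma)
    z = (x - mu) / sigma
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

p_mid = norm_cdf(10.0, mu=10.0, sigma=2.0)                   # symmetry: 0.5
mass = norm_cdf(14.0, 10.0, 2.0) - norm_cdf(6.0, 10.0, 2.0)  # mass in mu ± 2 sigma
# mass ≈ 0.9545, the familiar "95% within two standard deviations"
```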

2.2 Uniform Distribution
The uniform distribution is an especially simple continuous distribution, as the probability of any interval in its domain is proportional only to the length of that interval. Simple and popular, but rarely observed for real data. pdf:

f(x) = \frac{1}{b-a},  a ≤ x ≤ b.

CDF:

F(x) = \frac{x-a}{b-a},  a ≤ x ≤ b.

E(X) = \frac{a+b}{2},  Var(X) = \frac{(b-a)^2}{12},  MGF: M(s) = \frac{e^{b s} - e^{a s}}{(b-a) s}.
R-command: runif(n,min=a,max=b)
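The defining property, that the probability of a subinterval depends only on its length, follows directly from the CDF (Python sketch; unif_cdf is an illustrative helper):

```python
def unif_cdf(x, a, b):
    # F(x) = (x - a) / (b - a) for a <= x <= b
    return (x - a) / (b - a)

a, b = 2.0, 10.0
p1 = unif_cdf(5.0, a, b) - unif_cdf(3.0, a, b)  # interval (3, 5)
p2 = unif_cdf(9.0, a, b) - unif_cdf(7.0, a, b)  # interval (7, 9)
# both intervals have length 2, so p1 == p2 == 2/(b - a) = 0.25
```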

2.3 Exponential Distribution


Very popular as lifetime and waiting time distribution. Its pdf is:

f(x) = λ e^{-λ x},  x ≥ 0.

CDF:

F(x) = 1 - e^{-λ x},  x ≥ 0.

E(X) = 1/λ,  Var(X) = 1/λ^2,  MGF: M(s) = \frac{1}{1 - s/λ},  s < λ.
R-command: pexp(x,rate=λ)

The exponential distribution has the memoryless property! See book p. 44.
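As in the geometric case, the memoryless property follows from the survival function P(X > x) = e^{-λx} (Python sketch; exp_sf is an illustrative helper):

```python
from math import exp

def exp_sf(x, lam):
    # P(X > x) = e^(-lam * x)
    return exp(-lam * x)

lam, s, t = 0.5, 2.0, 3.0
lhs = exp_sf(s + t, lam) / exp_sf(s, lam)  # P(X > s + t | X > s)
rhs = exp_sf(t, lam)                       # P(X > t)
# memoryless: lhs == rhs
```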

2.4 Gamma and Chi-square Distribution


The sum of k independent exponential random variates, all with rate parameter λ, follows
the Gamma distribution with shape parameter α = k and rate parameter λ.
pdf:
f(x) = \frac{λ^α}{Γ(α)} x^{α-1} e^{-λ x},  x ≥ 0.

E(X) = α/λ,  Var(X) = α/λ^2,  MGF: M(s) = \frac{1}{(1 - s/λ)^α},  s < λ.

R-command: pgamma(x,shape=α,rate=λ)
The special case α = 1 is the exponential distribution.

A chi-square distribution with n degrees of freedom is a Gamma distribution with α = n/2 and λ = 1/2.
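As a purely illustrative check, E(X) = α/λ can be recovered from the pdf by a crude Riemann sum (Python sketch using the standard-library gamma function; gamma_pdf is an illustrative helper):

```python
from math import gamma, exp

def gamma_pdf(x, alpha, lam):
    # f(x) = lam^alpha / Gamma(alpha) * x^(alpha - 1) * e^(-lam x)
    return lam**alpha / gamma(alpha) * x**(alpha - 1) * exp(-lam * x)

alpha, lam = 3.0, 2.0
dx = 1e-3
# crude Riemann sum for E(X) over (0, 30]; the tail beyond 30 is negligible
mean = sum(i * dx * gamma_pdf(i * dx, alpha, lam) * dx for i in range(1, 30001))
# mean ≈ alpha/lam = 1.5
```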

IE255 Table of Standard Distributions

Binomial: pmf \binom{n}{x} p^x (1-p)^{n-x}, x = 0, 1, ..., n;  E(X) = n p;  Var(X) = n p (1-p);  MGF (1 - p + p e^s)^n.
R-command: dbinom(x,size=n,prob=p)

Geometric: pmf p (1-p)^{x-1}, x = 1, 2, ...;  E(X) = 1/p;  Var(X) = (1-p)/p^2;  MGF p e^s / (1 - (1-p) e^s).
R-command: dgeom(x-1,prob=p)

Neg. Bin.: pmf \binom{x-1}{r-1} p^r (1-p)^{x-r}, x = r, r+1, ...;  E(X) = r/p;  Var(X) = r(1-p)/p^2;  MGF (p e^s / (1 - (1-p) e^s))^r.
R-command: dnbinom(x-r,size=r,prob=p)

Poisson: pmf \frac{λ^x}{x!} e^{-λ}, x = 0, 1, ...;  E(X) = λ;  Var(X) = λ;  MGF e^{λ(e^s - 1)}.
R-command: dpois(x,lambda=λ)

Hypergeom.: pmf \binom{r}{x} \binom{N-r}{n-x} / \binom{N}{n}, max(0, r+n-N) ≤ x ≤ min(n, r);  E(X) = n r/N;  Var(X) = n (r/N)(1 - r/N)(N-n)/(N-1);  MGF: -.
R-command: dhyper(x, m=r, n=N-r, k=n)

Normal: pdf \frac{1}{\sqrt{2π} σ} e^{-\frac{1}{2}((x-µ)/σ)^2}, x ∈ R;  E(X) = µ;  Var(X) = σ^2;  MGF e^{µ s + σ^2 s^2/2}.
R-command: dnorm(x,mean=µ,sd=σ)

Log Normal: pdf \frac{1}{\sqrt{2π} σ x} e^{-\frac{1}{2}((\log x - µ)/σ)^2}, x > 0;  E(X) = e^{µ + σ^2/2};  Var(X) = e^{2µ + σ^2}(e^{σ^2} - 1).
R-command: dlnorm(x,meanlog=µ,sdlog=σ)

Uniform: pdf \frac{1}{b-a}, a ≤ x ≤ b;  E(X) = \frac{a+b}{2};  Var(X) = \frac{(b-a)^2}{12};  MGF (e^{b s} - e^{a s}) / ((b-a) s).
R-command: runif(n,min=a,max=b)

Exponential: pdf λ e^{-λ x}, CDF F(x) = 1 - e^{-λ x}, x ≥ 0;  E(X) = 1/λ;  Var(X) = 1/λ^2;  MGF 1/(1 - s/λ), s < λ.
R-command: pexp(x,rate=λ)

Gamma: pdf \frac{λ^α}{Γ(α)} x^{α-1} e^{-λ x}, x ≥ 0;  E(X) = α/λ;  Var(X) = α/λ^2;  MGF 1/(1 - s/λ)^α, s < λ.
R-command: dgamma(x,shape=α,rate=λ)

Chi-square: a chi-square distribution with n degrees of freedom is Gamma-distributed with α = n/2 and λ = 1/2.
R-command: dchisq(x,df=n)

Pareto: pdf \frac{β α^β}{(x+α)^{β+1}}, x ≥ 0;  CDF F(x) = 1 - (α/(x+α))^β;  E(X) = α/(β-1);  Var(X) = α^2 β / ((β-1)^2 (β-2));  quantile function F^{-1}(u) = α (1-u)^{-1/β} - α.

3 Poisson Processes
The Poisson process is a continuous-time stochastic process that can be used to describe the random occurrence of events in time. Examples of events are e.g. the arrival of customers, the occurrence of an accident, the arrival of a message on a cellphone, etc.
The only parameter of the Poisson process is the rate λ, which is equal to the expected number of events per time unit.
The three properties that define the Poisson Process are:
1) Only one event can occur at any time point.
2) The numbers of events in any two non-overlapping intervals are independent of each other.
3) The expected number of events in any time interval depends only on the length of that interval.
Using these three defining properties it is not difficult to show that:
The number of events of the Poisson process in a time interval of length l follows the Poisson distribution with parameter λ̃ = λ · l.
The waiting time for the next event in the Poisson process follows an exponential distribution with parameter λ.
Due to the independence assumption of the Poisson process it is easy to see that the waiting time for the r-th event is the sum of r independent exponential random variables, all with rate parameter λ. Thus the waiting time for the r-th event follows a Gamma distribution with shape parameter r and rate λ.
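These facts suggest a simple simulation: generate the process by adding exponential inter-arrival times and count the events in an interval of length l; the average count should be close to λ · l. A Python sketch (count_events is an illustrative helper; the seed is fixed only for reproducibility):

```python
import random

random.seed(1)  # reproducible illustration

def count_events(lam, length):
    # one realization: add exponential inter-arrival times until `length` is passed
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)  # waiting time until the next event
        if t > length:
            return n
        n += 1

lam, length, runs = 2.0, 5.0, 20_000
avg = sum(count_events(lam, length) for _ in range(runs)) / runs
# avg ≈ lam * length = 10, the Poisson mean for an interval of length 5
```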
