Distribution of Sums, Ratios and Order Statistics

This document summarizes techniques for determining the distributions of functions and order statistics of random variables. It discusses how to find the distribution of sums, ratios, and order statistics of independent random variables by using transformations and the Jacobian method to transform the joint density into the desired density. Examples are provided to illustrate finding the distributions of sums and ratios of common distributions like the uniform, exponential, and normal distributions, as well as order statistics like the maximum and minimum values.


3. Distribution of Sums, Ratios and Order Statistics


HW: Read about common continuous distributions from GW 5.4: Uniform, Exponential, Normal, Cauchy, Gamma, Beta. (Also covered in more detail in 5.3-5.6 of Ross.)

Sec 6.4 of GW, 6.2 & 6.5 of HPS, 6.3 & 6.6 of Ross
Distributions of functions of random variables. Let (X, Y) be a random vector with joint density f_{X,Y}. Given a function g : R² → R, what is the distribution of Z = g(X, Y)? Transform to (Z, W) for some suitably defined W so that the transformation is one-to-one, apply the Jacobian method to obtain the joint density of (Z, W), then integrate out W.

1 Sums of random variables


Example 1. Z = X + Y . Define W = Y . The transformation is one-to-one and the
Jacobian is 1. The density of Z is
f_Z(z) = ∫_{−∞}^{∞} f(z − w, w) dw

Theorem 1. Let X and Y be independent random variables with pdfs f_X and f_Y. Then the pdf of Z = X + Y is given by the convolution formula

f_Z(a) = ∫_{−∞}^{∞} f_X(a − y) f_Y(y) dy

Theorem 2. Let X and Y be independent random variables with mgfs m_X and m_Y. Then the mgf of Z = X + Y is given by

m_Z(t) = m_X(t) m_Y(t).

By uniqueness of the mgf, this can be used to derive the distribution of Z.
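Theorem 2 can be sanity-checked numerically. A Monte Carlo sketch for Exp(λ) variables follows; the choices λ = 2, t = 0.5 and the sample size are arbitrary illustrations (t < λ is required for the mgf to exist):

```python
import random
import math

random.seed(0)
lam, t, n = 2.0, 0.5, 200_000  # arbitrary choices; need t < lam

xs = [random.expovariate(lam) for _ in range(n)]
ys = [random.expovariate(lam) for _ in range(n)]

# Monte Carlo estimates of the three mgfs
m_x = sum(math.exp(t * x) for x in xs) / n
m_y = sum(math.exp(t * y) for y in ys) / n
m_z = sum(math.exp(t * (x + y)) for x, y in zip(xs, ys)) / n

# Exp(lam) has mgf lam/(lam - t), so m_Z(t) should be (lam/(lam - t))^2,
# which is the Gamma(2, lam) mgf -- consistent with Example 3 below
exact = (lam / (lam - t)) ** 2
print(m_z, m_x * m_y, exact)
```

Both the direct estimate m_z and the product m_x · m_y should agree with the closed form up to Monte Carlo error.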


Example 2. If X and Y are independent unif(0,1) random variables, then the pdf of Z = X + Y is given by

f_Z(z) = z for 0 ≤ z ≤ 1,
f_Z(z) = 2 − z for 1 ≤ z ≤ 2,
f_Z(z) = 0 otherwise.
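This triangular density can be recovered directly from the convolution formula by numerical integration; a minimal sketch (the step count is an arbitrary choice):

```python
def f_unif(x):
    # pdf of unif(0,1)
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_sum(z, steps=10_000):
    # midpoint Riemann sum for the convolution integral
    # f_Z(z) = int f_X(z - y) f_Y(y) dy, where f_Y = 1 on [0, 1]
    h = 1.0 / steps
    return sum(f_unif(z - (i + 0.5) * h) * h for i in range(steps))

for z in (0.25, 0.5, 1.0, 1.5):
    print(z, f_sum(z))
```

The printed values should match the triangular pdf: z on [0, 1] and 2 − z on [1, 2].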

Example 3. If X and Y are independent Exp(λ) random variables, then the distribution
of Z = X + Y is Gamma(2,λ).
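A quick simulation check of Example 3 via moments: Gamma(2, λ) has mean 2/λ and variance 2/λ² (λ = 1.5 and the sample size are arbitrary choices):

```python
import random
import statistics

random.seed(4)
lam, n = 1.5, 200_000  # arbitrary rate and sample size

# sums of two independent Exp(lam) variables
zs = [random.expovariate(lam) + random.expovariate(lam) for _ in range(n)]

# Gamma(2, lam) has mean 2/lam and variance 2/lam**2
print(statistics.fmean(zs), 2 / lam)
print(statistics.variance(zs), 2 / lam**2)
```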
Definition 1. The Gamma(α, λ) distribution has pdf f_X(x) = (λ^α / Γ(α)) x^{α−1} e^{−λx}, 0 ≤ x < ∞, where Γ(α) = ∫_0^∞ u^{α−1} e^{−u} du.
Example 4. If X and Y are independent Gamma(α1 , λ) and Gamma(α2 , λ) random
variables, then the distribution of Z = X + Y is Gamma(α1 + α2 , λ).

1. Exp(λ) is Gamma(1, λ).

2. χ²_n is Gamma(n/2, 1/2).

3. α is the shape parameter and λ is the rate (inverse scale) parameter.

4. ∫_0^1 u^{α1−1} (1 − u)^{α2−1} du = Γ(α1)Γ(α2) / Γ(α1 + α2) = β(α1, α2). This gives the beta(α1, α2) density f_U(u) = (1 / β(α1, α2)) u^{α1−1} (1 − u)^{α2−1}, 0 ≤ u ≤ 1.

5. Γ(n) = (n − 1)!
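The beta-integral identity in item 4 can be checked numerically against math.gamma; the shape parameters α1 = 2.5, α2 = 3.5 and the step count are arbitrary choices:

```python
import math

a1, a2 = 2.5, 3.5      # arbitrary shape parameters
steps = 200_000
h = 1.0 / steps

# midpoint-rule approximation of int_0^1 u^(a1-1) (1-u)^(a2-1) du
integral = sum(
    ((i + 0.5) * h) ** (a1 - 1) * (1 - (i + 0.5) * h) ** (a2 - 1) * h
    for i in range(steps)
)

# closed form: Gamma(a1) Gamma(a2) / Gamma(a1 + a2)
exact = math.gamma(a1) * math.gamma(a2) / math.gamma(a1 + a2)
print(integral, exact)
```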

2 Ratio of random variables


Example 5. Z = X/Y . Define W = Y . The transformation is one-to-one. Inverse is
X = W Z, Y = W and the Jacobian is | W |. The density of Z is
f_Z(z) = ∫_{−∞}^{∞} |w| f(wz, w) dw

Example 6. Z = X/Y where X and Y are independent standard normals. Then f_Z(z) = 1/(π(1 + z²)), that is, the ratio has the standard Cauchy distribution.
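A simulation check of Example 6 via quantiles: the standard Cauchy has median 0 and quartiles at ±1 (the seed and sample size are arbitrary choices):

```python
import random

random.seed(1)
n = 200_000  # arbitrary sample size

# ratio of two independent standard normals
zs = sorted(random.gauss(0, 1) / random.gauss(0, 1) for _ in range(n))

# empirical quartiles; standard Cauchy has Q1 = -1, median = 0, Q3 = +1
q1, med, q3 = zs[n // 4], zs[n // 2], zs[3 * n // 4]
print(q1, med, q3)
```

Quantiles are used rather than the sample mean because the Cauchy distribution has no mean.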

3 Distribution of order statistics


Let U1 , · · · , Un be independent continuous random variables, each having distribution
function F and density function f . Let X1 , · · · , Xn be random variables obtained by
permuting U1 , · · · , Un so as to be in increasing order. In particular,

X1 (ω) = min(U1 (ω), · · · , Un (ω)) and Xn (ω) = max(U1 (ω), · · · , Un (ω)).

The random variable X_k is called the k-th order statistic. Another related variable of interest is the range R, defined by R(ω) = X_n(ω) − X_1(ω).
It follows from the continuity assumption on U_1, ..., U_n that, with probability one, the U_i's are distinct and hence X_1 < X_2 < · · · < X_n.
The probability that exactly j of the U_i's are less than x equals (n choose j) F(x)^j (1 − F(x))^{n−j}.


Thus

F_{X_k}(x) = P(X_k ≤ x) = Σ_{j=k}^{n} (n choose j) F(x)^j (1 − F(x))^{n−j}.
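This binomial-sum formula can be checked against simulation; a sketch for unif(0,1) variables (the choices n = 5, k = 3, x = 0.4 are arbitrary):

```python
import random
from math import comb

random.seed(2)
n_vars, k, x = 5, 3, 0.4   # arbitrary illustration
F = x                      # CDF of unif(0,1) at x

# closed-form P(X_k <= x) from the binomial sum
exact = sum(comb(n_vars, j) * F**j * (1 - F) ** (n_vars - j)
            for j in range(k, n_vars + 1))

# empirical frequency of the event {X_k <= x}
trials = 100_000
hits = sum(
    sorted(random.random() for _ in range(n_vars))[k - 1] <= x
    for _ in range(trials)
)
print(exact, hits / trials)
```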

Example 7.

1. F_{X_n}(x) = F(x)^n

2. f_{X_n}(x) = n F(x)^{n−1} f(x)

3. F_{X_1}(x) = 1 − (1 − F(x))^n

4. f_{X_1}(x) = n (1 − F(x))^{n−1} f(x)

5. f_{X_k}(x) = k (n choose k) f(x) F(x)^{k−1} (1 − F(x))^{n−k}

6. F_{X_1,X_n}(x, y) = F(y)^n − (F(y) − F(x))^n, for x ≤ y

7. f_{X_1,X_n}(x, y) = n(n − 1) f(y) f(x) (F(y) − F(x))^{n−2}, for x ≤ y
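The joint CDF of (X_1, X_n) in item 6 can also be verified by simulation; a sketch for unif(0,1) variables (the choices n = 4, x = 0.3, y = 0.8 are arbitrary):

```python
import random

random.seed(3)
n_vars, x, y = 4, 0.3, 0.8  # arbitrary illustration
trials = 100_000

# empirical frequency of {X_1 <= x, X_n <= y} = {min <= x, max <= y}
hits = 0
for _ in range(trials):
    u = [random.random() for _ in range(n_vars)]
    if min(u) <= x and max(u) <= y:
        hits += 1

# closed form: F(y)^n - (F(y) - F(x))^n, with F(t) = t on [0, 1]
exact = y**n_vars - (y - x) ** n_vars
print(hits / trials, exact)
```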
