Problem 2: Moments of the Beta Distribution
Let X ∼ Beta(r, s) with density
f_X(x) = \frac{\Gamma(r+s)}{\Gamma(r)\Gamma(s)}\, x^{r-1} (1-x)^{s-1}, \quad 0 < x < 1.
Mean
The mean is
E(X) = \int_0^1 x f_X(x)\,dx = \frac{\Gamma(r+s)}{\Gamma(r)\Gamma(s)} \int_0^1 x^{r} (1-x)^{s-1}\,dx.
The remaining integral is a Beta integral,
\int_0^1 x^{r} (1-x)^{s-1}\,dx = \frac{\Gamma(r+1)\Gamma(s)}{\Gamma(r+s+1)} = \frac{r\,\Gamma(r)\Gamma(s)}{\Gamma(r+s+1)},
so
E(X) = \frac{\Gamma(r+s)}{\Gamma(r)\Gamma(s)} \cdot \frac{r\,\Gamma(r)\Gamma(s)}{\Gamma(r+s+1)} = \frac{r}{r+s}.
Variance
Similarly, one can show that
E(X^2) = \frac{r(r+1)}{(r+s)(r+s+1)}.
Thus,
\mathrm{Var}(X) = E(X^2) - (E(X))^2 = \frac{r(r+1)}{(r+s)(r+s+1)} - \left(\frac{r}{r+s}\right)^2 = \frac{rs}{(r+s)^2 (r+s+1)}.
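The closed-form mean and variance can be checked numerically. A minimal sketch (the parameter values r = 2, s = 3 are illustrative, and the construction of a Beta draw from two Gamma draws is a standard device, not part of the problem):

```python
import random

# Closed-form Beta(r, s) moments derived above.
r, s = 2.0, 3.0
mean_formula = r / (r + s)                          # r/(r+s) = 0.4
var_formula = r * s / ((r + s) ** 2 * (r + s + 1))  # rs/((r+s)^2 (r+s+1)) = 0.04

# Monte Carlo check: X = G1/(G1+G2) is Beta(r, s) when
# G1 ~ Gamma(r, 1) and G2 ~ Gamma(s, 1) are independent.
random.seed(0)
samples = []
for _ in range(200_000):
    g1 = random.gammavariate(r, 1.0)
    g2 = random.gammavariate(s, 1.0)
    samples.append(g1 / (g1 + g2))

mean_mc = sum(samples) / len(samples)
var_mc = sum((x - mean_mc) ** 2 for x in samples) / len(samples)
```

The empirical moments should agree with the formulas to within Monte Carlo error.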
Let X ∼ Gamma(α, λ) with density
f_X(x) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}, \quad x > 0.
(a) Density of Y
Since Y = kX, we have X = y/k and
\frac{dx}{dy} = \frac{1}{k}.
Thus,
f_Y(y) = f_X\!\left(\frac{y}{k}\right) \frac{1}{k} = \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \left(\frac{y}{k}\right)^{\alpha-1} e^{-\lambda y/k}\, \frac{1}{k} = \frac{\lambda^{\alpha}}{\Gamma(\alpha)\, k^{\alpha}}\, y^{\alpha-1} e^{-\lambda y/k}, \quad y > 0,
so Y ∼ Gamma(α, λ/k).
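The scaling result can be sanity-checked by simulation. A sketch with illustrative values α = 2, λ = 3, k = 5 (note that Python's `random.gammavariate` takes a scale parameter, i.e. 1/rate):

```python
import random

# Check: if X ~ Gamma(alpha, rate=lam), then Y = k*X ~ Gamma(alpha, rate=lam/k).
# random.gammavariate(alpha, beta) uses *scale* beta = 1/rate.
alpha, lam, k = 2.0, 3.0, 5.0
random.seed(0)
ys = [k * random.gammavariate(alpha, 1.0 / lam) for _ in range(200_000)]

mean_mc = sum(ys) / len(ys)
var_mc = sum((y - mean_mc) ** 2 for y in ys) / len(ys)

# Gamma(alpha, rate=lam/k) has mean alpha*k/lam and variance alpha*(k/lam)^2.
mean_theory = alpha * k / lam
var_theory = alpha * (k / lam) ** 2
```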
Problem 4: Simulation and Transformation
The random variable X has density
f_X(x) = \frac{5}{x^{6}}\,\mathbf{1}_{[1,\infty)}(x).
With Y = \ln X, the change-of-variables formula gives
f_Y(y) = f_X(e^{y})\, e^{y} = 5 e^{-6y}\, e^{y} = 5 e^{-5y}, \quad y > 0,
so Y ∼ Exp(5).
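Since the problem mentions simulation, here is one way this could be checked: simulate X by the inverse-CDF method (F(x) = 1 − x^{-5}, so X = (1 − U)^{-1/5}) and verify that ln X behaves like an Exp(5) variable. This is a sketch of the check, not necessarily the problem's intended procedure:

```python
import random
import math

# X has density 5/x^6 on [1, inf), so F(x) = 1 - x^(-5); the inverse-CDF
# method gives X = (1 - U)^(-1/5) for U ~ Uniform(0, 1).
# (1 - random.random() lies in (0, 1], avoiding a zero base.)
random.seed(0)
xs = [(1.0 - random.random()) ** (-0.2) for _ in range(200_000)]

# If Y = ln X then Y ~ Exp(5), whose mean is 1/5.
ys = [math.log(x) for x in xs]
mean_y = sum(ys) / len(ys)
```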
Problem 5: Linear Transformation of a Uniform Random Variable
Let X ∼ U(a, b) with density
f_X(x) = \frac{1}{b-a}, \quad x \in [a, b].
Consider the transformation
Y = cX + d, \quad c \neq 0.
By the change-of-variables formula,
f_Y(y) = f_X\!\left(\frac{y-d}{c}\right) \frac{1}{|c|} = \frac{1}{|c|(b-a)}
for y between ca + d and cb + d, so Y is again uniform on that interval.
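A quick empirical check of the uniformity claim, with illustrative values (a negative c is chosen to exercise the |c| in the formula):

```python
import random

# With X ~ U(a, b) and Y = cX + d, Y should be uniform on the interval
# between c*a + d and c*b + d.
a, b, c, d = 1.0, 4.0, -2.0, 3.0  # illustrative values; c < 0 on purpose
random.seed(0)
ys = [c * random.uniform(a, b) + d for _ in range(200_000)]

lo = min(c * a + d, c * b + d)  # -5.0
hi = max(c * a + d, c * b + d)  #  1.0
mean_mc = sum(ys) / len(ys)
mean_theory = (lo + hi) / 2.0   # midpoint of a uniform interval
```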
Problem 7: Gamma Function and the Normal Density
(a) Showing Γ(1/2) = \sqrt{\pi}
Recall the definition of the Gamma function:
\Gamma\!\left(\tfrac{1}{2}\right) = \int_0^{\infty} t^{-1/2} e^{-t}\,dt.
Substituting t = x^{2} (so dt = 2x\,dx) gives
\Gamma\!\left(\tfrac{1}{2}\right) = \int_0^{\infty} x^{-1} e^{-x^{2}}\, 2x\,dx = 2 \int_0^{\infty} e^{-x^{2}}\,dx.
It is known that
\int_{-\infty}^{\infty} e^{-x^{2}}\,dx = \sqrt{\pi} \implies \int_0^{\infty} e^{-x^{2}}\,dx = \frac{\sqrt{\pi}}{2}.
Thus,
\Gamma\!\left(\tfrac{1}{2}\right) = 2 \cdot \frac{\sqrt{\pi}}{2} = \sqrt{\pi}.
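The identity can be confirmed numerically using the substituted form 2∫₀^∞ e^{−x²} dx (the truncation point and grid size below are arbitrary choices, not part of the derivation):

```python
import math

# Numerical check of Gamma(1/2) = sqrt(pi) via the substituted form
# Gamma(1/2) = 2 * integral_0^inf exp(-x^2) dx, midpoint rule on [0, 10]
# (the tail beyond 10 is of order e^{-100}, i.e. negligible).
n, upper = 100_000, 10.0
h = upper / n
integral = sum(math.exp(-(((i + 0.5) * h) ** 2)) for i in range(n)) * h
gamma_half = 2.0 * integral
```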
The normal density is
f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right).
Making the change of variable z = \frac{x-\mu}{\sigma} (so that dz = \frac{dx}{\sigma}), we have
\int_{-\infty}^{\infty} f(x)\,dx = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} \exp\!\left(-\frac{z^2}{2}\right) dz = 1,
confirming that f is a density.
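As a numerical companion to the normalization argument, the density can be integrated directly for one illustrative choice of µ and σ (the values 1.5 and 2.0 are assumptions for the check only):

```python
import math

# Numerical check that the N(mu, sigma^2) density integrates to 1.
mu, sigma = 1.5, 2.0  # illustrative values

def f(x):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

# Midpoint rule over [mu - 10*sigma, mu + 10*sigma]; tails beyond 10 sigma
# contribute on the order of 1e-23.
n = 200_000
lo, hi = mu - 10 * sigma, mu + 10 * sigma
h = (hi - lo) / n
total = sum(f(lo + (i + 0.5) * h) for i in range(n)) * h
```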
Problem 8: Calculating E(|X|) for X ∼ N(0, 1) in Three Ways
Let X ∼ N(0, 1) and define Y = |X|. We want to show that E(Y) = \sqrt{2/\pi}.
Since Y ≥ 0, the tail formula for expectations gives
E(Y) = \int_0^{\infty} P(Y > y)\,dy = \int_0^{\infty} P(|X| > y)\,dy = 2 \int_0^{\infty} (1 - \Phi(y))\,dy.
A standard result (or evaluating the integral via integration by parts) shows that
\int_0^{\infty} (1 - \Phi(y))\,dy = \frac{1}{\sqrt{2\pi}},
so that
E(Y) = 2 \cdot \frac{1}{\sqrt{2\pi}} = \sqrt{\frac{2}{\pi}}.
(c) Direct Integration with the Transformation g(x) = |x|
Here,
E(Y) = E(|X|) = \int_{-\infty}^{\infty} |x|\,\phi(x)\,dx = 2 \int_0^{\infty} x\,\phi(x)\,dx = \frac{2}{\sqrt{2\pi}} \int_0^{\infty} x e^{-x^2/2}\,dx.
As in part (b), with the substitution u = x^2/2 (so du = x\,dx), we find
\int_0^{\infty} e^{-u}\,du = 1,
yielding
E(|X|) = \frac{2}{\sqrt{2\pi}} = \sqrt{\frac{2}{\pi}}.
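All three derivations give the same constant, which a short Monte Carlo run can confirm (the sample size is an arbitrary choice):

```python
import random
import math

# Monte Carlo check of E|X| = sqrt(2/pi) ~ 0.7979 for X ~ N(0, 1).
random.seed(0)
n = 200_000
mean_abs = sum(abs(random.gauss(0.0, 1.0)) for _ in range(n)) / n
target = math.sqrt(2.0 / math.pi)
```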
Problem 9: Relationship Between the Beta and Gamma Functions
(a) Expressing Γ(a)Γ(b) as a Double Integral
By definition,
\Gamma(a) = \int_0^{\infty} t^{a-1} e^{-t}\,dt, \qquad \Gamma(b) = \int_0^{\infty} s^{b-1} e^{-s}\,ds.
Multiplying these,
\Gamma(a)\Gamma(b) = \int_0^{\infty} \int_0^{\infty} t^{a-1} s^{b-1} e^{-(t+s)}\,ds\,dt.
Now change variables to x = t + s and y = t/(t+s), i.e. t = xy and s = x(1-y), for which the Jacobian is
\left|\frac{\partial(t, s)}{\partial(x, y)}\right| = x.
Simplify:
\Gamma(a)\Gamma(b) = \int_0^1 y^{a-1} (1-y)^{b-1}\,dy \int_0^{\infty} x^{a+b-1} e^{-x}\,dx.
Recognize that
\int_0^{\infty} x^{a+b-1} e^{-x}\,dx = \Gamma(a+b),
and
\int_0^1 y^{a-1} (1-y)^{b-1}\,dy = B(a, b).
Thus,
\Gamma(a)\Gamma(b) = \Gamma(a+b)\,B(a, b) \implies B(a, b) = \frac{\Gamma(a)\Gamma(b)}{\Gamma(a+b)}.
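The identity is easy to verify numerically for a particular pair (a, b); the values 2.5 and 3.5 below are arbitrary (chosen > 1 so the Beta integrand stays bounded):

```python
import math

# Check B(a, b) = Gamma(a)Gamma(b)/Gamma(a+b) for one illustrative pair.
a, b = 2.5, 3.5
rhs = math.gamma(a) * math.gamma(b) / math.gamma(a + b)

# Midpoint rule for B(a, b) = integral_0^1 y^(a-1) (1-y)^(b-1) dy.
n = 100_000
h = 1.0 / n
lhs = sum(((i + 0.5) * h) ** (a - 1) * (1 - (i + 0.5) * h) ** (b - 1)
          for i in range(n)) * h
```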
Problem 10: Continuous Mixtures
Let f(x|λ) be an exponential density with parameter λ > 0, and suppose λ is random with density g(λ). Define
h(x) = \int_0^{\infty} f(x|\lambda)\, g(\lambda)\,d\lambda.
Since for each fixed λ, f(x|λ) is a density in x, we have f(x|λ) ≥ 0 and \int_{-\infty}^{\infty} f(x|\lambda)\,dx = 1. Also, g(λ) is nonnegative and integrates to 1. Interchanging the order of integration (by Fubini's theorem) gives
\int_{-\infty}^{\infty} h(x)\,dx = \int_0^{\infty} g(\lambda) \int_{-\infty}^{\infty} f(x|\lambda)\,dx\,d\lambda = \int_0^{\infty} g(\lambda)\,d\lambda = 1.
Since h is also nonnegative, h remains a density.
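A concrete instance makes the mixture tangible. Taking g(λ) = e^{−λ} (an Exp(1) mixing density, an illustrative choice not specified in the problem), the mixture has the closed form h(x) = ∫₀^∞ λe^{−λx} e^{−λ} dλ = 1/(1+x)², which integrates to 1 over x > 0. The sketch below evaluates the λ-integral numerically and compares it to that closed form:

```python
import math

# Mixture of Exp(lam) densities with mixing density g(lam) = exp(-lam):
# h(x) = integral_0^inf lam * exp(-lam * x) * exp(-lam) d(lam) = 1/(1+x)^2.

def h(x, n=5000, upper=60.0):
    # Midpoint rule in lambda; the integrand is lam * exp(-lam*(x+1)).
    step = upper / n
    return sum((i + 0.5) * step * math.exp(-((i + 0.5) * step) * (x + 1.0))
               for i in range(n)) * step

x_vals = [0.0, 0.5, 2.0]
approx = [h(x) for x in x_vals]
exact = [1.0 / (1.0 + x) ** 2 for x in x_vals]
```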
Problem 11: Minimizing the Mean Squared Error
Let X ∈ L^2 and define
h(a) = E\big[(X - a)^2\big], \quad a \in \mathbb{R}.
Expanding the square around E(X), we have
h(a) = E\big[\big((X - E(X)) + (E(X) - a)\big)^2\big] = E\big[(X - E(X))^2\big] + 2(E(X) - a)\,E\big[X - E(X)\big] + (E(X) - a)^2.
Since E\big[X - E(X)\big] = 0, this simplifies to
h(a) = \mathrm{Var}(X) + (E(X) - a)^2.
Clearly, h(a) is minimized when (E(X) - a)^2 is minimized, i.e., when a = E(X), and the minimum value is
h(E(X)) = \mathrm{Var}(X).
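The same decomposition holds exactly for a finite sample, where the sample mean minimizes the average squared deviation. A sketch (the N(2, 1.5²) sample is an arbitrary illustration):

```python
import random

# Empirical analogue of h(a) = Var(X) + (E(X) - a)^2: for a sample,
# (1/n) sum (x - a)^2 = sample variance + (sample mean - a)^2 exactly.
random.seed(0)
xs = [random.gauss(2.0, 1.5) for _ in range(10_000)]
mean = sum(xs) / len(xs)

def h(a):
    return sum((x - a) ** 2 for x in xs) / len(xs)

h_mean = h(mean)          # the minimum, i.e. the sample variance
h_lo = h(mean - 0.5)      # should exceed h_mean by exactly 0.25
h_hi = h(mean + 0.5)      # likewise
```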
Problem 12: Moments of a Normal Distribution
Let X ∼ N(\mu_X, \sigma_X^2) and Y ∼ N(0, 1).
(a) Standardizing X
Define
Z = \frac{X - \mu_X}{\sigma_X}.
Since linear transformations of normal variables are normal, it follows that
Z ∼ N(0, 1).
(b) Moments of Y
For even r,
E(Y^r) = \frac{r!}{2^{r/2}\,(r/2)!},
while E(Y^r) = 0 for odd r. This result is often derived using the moment generating function of Y or via integration in polar coordinates.
(c) Moments of (X − µX )
Since
X − µX = σX Z,
we have for any integer r ≥ 0,
r
σ X r!
r
, r even,
r r
E(Z r ) 2r/2 !
E (X − µX ) = σX = 2
0, r odd.
The kurtosis is given by
\mathrm{Kurtosis} = \frac{E\big[(X - \mu_X)^4\big]}{\sigma_X^4}
(the excess kurtosis subtracts 3 from this). Using the formula from part (c) with r = 4,
E\big[(X - \mu_X)^4\big] = \sigma_X^4 \cdot \frac{4!}{2^{2}\, 2!} = \sigma_X^4 \cdot \frac{24}{4 \cdot 2} = 3\sigma_X^4.
Thus,
\frac{E\big[(X - \mu_X)^4\big]}{\sigma_X^4} = 3.
Neither skewness nor kurtosis depends on µ_X or σ_X: the skewness is 0 (odd central moments vanish) and the kurtosis is 3, confirming that these are properties shared with the standard normal distribution.
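Both standardized moments can be checked by simulation for an arbitrary choice of µ and σ (the values 1 and 2 below are illustrative):

```python
import random

# Monte Carlo check of the standardized central moments of N(mu, sigma^2):
# E[((X-mu)/sigma)^3] = 0 (skewness) and E[((X-mu)/sigma)^4] = 3 (kurtosis).
mu, sigma = 1.0, 2.0  # illustrative values
random.seed(0)
n = 200_000
zs = [(random.gauss(mu, sigma) - mu) / sigma for _ in range(n)]
skew = sum(z ** 3 for z in zs) / n
kurt = sum(z ** 4 for z in zs) / n
```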