
QF314800: Mathematical Statistics I

Assignment 07: Multiple Random Variables


Instructor: Chung-Han Hsieh ([email protected])
Teaching Assistant: Zong Gan ([email protected])

We will use a completion grade for this assignment. That is, if one attempts each individual
problem, then one receives full credit for this assignment.
Reading Task: Before you attempt the problems below, we encourage you to review Sections 4.2
and 4.5 in the Chapter 4 handout and read Sections 3.1 and 3.2 in the Chapter 3 handout.
Caution: If you use a large language model like ChatGPT to help you prepare the assignment,
you are responsible for (i) indicating that you are making use of it, (ii) ensuring that the
answer it produces is correct and justifiable, and (iii) writing your final answer entirely on
your own.

Problem 7.1 (Properties of Gamma Random Variable). Let X ∼ gamma(α, β).


(i) Show that the mgf of X is

MX(t) = (1/(1 − βt))^α,   t < 1/β.

(ii) Show that E[X] = αβ.

(iii) Show that var(X) = αβ².
Proof. (i). Begin by observing that

MX(t) = E[e^(tX)]
      = ∫_0^∞ (1/(Γ(α)β^α)) e^(tx) x^(α−1) e^(−x/β) dx
      = (1/(Γ(α)β^α)) ∫_0^∞ x^(α−1) e^(tx) e^(−x/β) dx.

Introduce a change of variable by setting u := x/β; then du = dx/β and we have

MX(t) = (1/Γ(α)) ∫_0^∞ u^(α−1) e^(−(1−tβ)u) du.

Take c := 1/(1 − tβ) > 0; i.e., 1 − tβ > 0. Then

MX(t) = c^α · (1/(Γ(α)c^α)) ∫_0^∞ u^(α−1) e^(−u/c) du = c^α = 1/(1 − tβ)^α,

where the remaining integral equals one because its integrand, (1/(Γ(α)c^α)) u^(α−1) e^(−u/c),
is exactly the gamma(α, c) pdf. This holds for 1 − tβ > 0 (or equivalently, t < 1/β).

(ii). Observe that M′X(t) = αβ (1/(1 − βt))^(α+1). Thus, E[X] = M′X(0) = αβ.

(iii). To find the variance, we first calculate the second moment. Indeed,

M″X(t) = α(α + 1)β² (1/(1 − βt))^(α+2).

Hence, E[X²] = M″X(0) = α(α + 1)β². Therefore, the variance of X is

var(X) = E[X²] − (E[X])² = α(α + 1)β² − (αβ)² = αβ².
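Remark (numerical sanity check). The closed forms in (i)-(iii) are easy to check by simulation. The following is a minimal sketch of our own (not part of the solution), assuming NumPy is available; the constants α = 2.5, β = 1.5 are arbitrary choices for illustration, and NumPy's shape/scale arguments play the roles of α and β here.

import numpy as np

# Compare E[X] = alpha*beta and var(X) = alpha*beta^2 against
# Monte Carlo estimates from NumPy's gamma(shape=alpha, scale=beta) sampler.
alpha, beta = 2.5, 1.5
rng = np.random.default_rng(0)
x = rng.gamma(shape=alpha, scale=beta, size=1_000_000)

print(x.mean(), alpha * beta)        # both ~ 3.75
print(x.var(), alpha * beta**2)      # both ~ 5.625

# The mgf at a fixed t < 1/beta can be checked the same way:
t = 0.2                              # requires t < 1/beta ~ 0.667
print(np.exp(t * x).mean(), (1 / (1 - beta * t))**alpha)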

Problem 7.2 (Toy Example of Bivariate Random Variables). Let (X, Y) be a bivariate random
vector with pdf fX,Y(x, y) = 4xy for x ∈ (0, 1) and y ∈ (0, 1), and zero elsewhere.
(i) Find P (0 < X < 1/2, 1/4 < Y < 1).
(ii) Find P (X = Y ).
(iii) Find P (X < Y ).
(iv) Find P (X ≤ Y ).
Proof. (i). Observe that

P(0 < X < 1/2, 1/4 < Y < 1) = ∫_{1/4}^1 ∫_0^{1/2} 4xy dx dy = 15/64.

(ii). Since X and Y are continuous random variables, their difference X − Y is also a continuous
random variable. Hence P(X = Y) = P(X − Y = 0) = 0.

(iii). Observe that

P(X < Y) = ∫_0^1 ∫_0^y 4xy dx dy = 1/2.

(iv). Combining (ii) and (iii), we have

P(X ≤ Y) = P(X < Y) + P(X = Y) = 1/2.
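Remark (numerical sanity check). Since fX,Y(x, y) = 4xy = (2x)(2y) factors, X and Y are independent, each with pdf 2t on (0, 1); by inverse-cdf sampling, X = √U for U ~ Uniform(0, 1). A minimal Monte Carlo sketch of our own, under these observations:

import numpy as np

# fX,Y(x, y) = 4xy factors as (2x)(2y): X, Y independent with cdf t^2,
# so each can be sampled as the square root of a uniform.
rng = np.random.default_rng(1)
n = 1_000_000
x = np.sqrt(rng.uniform(size=n))
y = np.sqrt(rng.uniform(size=n))

print(np.mean((x < 0.5) & (y > 0.25)), 15 / 64)   # part (i), ~ 0.234375
print(np.mean(x < y), 0.5)                        # part (iii)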

Problem 7.3 (Conditional Expectation and Variance). Let X, Y have the same joint pdf as
Problem A.3; i.e.,

fX,Y(x, y) := 6y if 0 < y < x < 1, and 0 otherwise.

(i) Find the conditional expectations E[Y|X = x] and E[Y|X].
(ii) Define Z := 2X/3. Then find the cdf of Z. Also, find the pdf fZ, mean E[Z], and variance
var(Z).
(iii) Find E[E[Y|X]] and var(E[Y|X]).
(iv) Show that E[Y] = E[E[Y|X]] = E[Z] and var(Z) = var(E[Y|X]) ≤ var(Y).
Proof. (i). Note that, by Problem A.3, fX(x) = 3x² for x ∈ (0, 1). The conditional pdf of Y
given X = x is

f(y|x) = fX,Y(x, y)/fX(x) = 6y/(3x²) = 2y/x²

for 0 < y < x and x ∈ (0, 1). In addition, the conditional mean of Y given X = x is

E[Y|X = x] = ∫ y f(y|x) dy = ∫_0^x y · (2y/x²) dy = (2/x²)(x³/3) = 2x/3

for x ∈ (0, 1). Hence, E[Y|X] = 2X/3.

(ii). Let Z := 2X/3. Since x ∈ (0, 1), the support of Z is (0, 2/3). The cdf of Z is given by

FZ(z) = P(Z ≤ z) = P(2X/3 ≤ z) = P(X ≤ 3z/2) = FX(3z/2)

for z ∈ (0, 2/3). The pdf of Z is given by

fZ(z) = (d/dz) FZ(z) = (3/2) fX(3z/2) = (3/2) · 3(3z/2)² = 81z²/8.

The mean of Z is

E[Z] = ∫ z fZ(z) dz = ∫_0^{2/3} z · (81z²/8) dz = (81/8) · (2/3)⁴/4 = 1/2,

and the second moment is

E[Z²] = ∫ z² fZ(z) dz = ∫_0^{2/3} z² · (81z²/8) dz = (81/8) · (2/3)⁵/5 = 4/15.

Hence, the variance of Z is var(Z) = E[Z²] − (E[Z])² = 4/15 − (1/2)² = 1/60.

(iii). By part (i), we know that E[Y|X] = 2X/3. Hence,

E[E[Y|X]] = E[2X/3] = (2/3)E[X] = (2/3)(3/4) = 1/2.

On the other hand, var(E[Y|X]) = var(2X/3) = (4/9)var(X) = (4/9)(3/80) = 1/60.

(iv). Note that E[Y] = 1/2 (by Problem A.3) and E[Z] = 1/2 by part (ii). Therefore, we conclude
that

E[Y] = E[E[Y|X]] = E[Z] = 1/2.

Similarly, var(Y) = 1/20 (by Problem A.3), var(Z) = 1/60 by part (ii), and var(E[Y|X]) = 1/60
by part (iii). Therefore, we conclude

1/60 = var(Z) = var(E[Y|X]) ≤ var(Y) = 1/20.
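Remark (numerical sanity check). The identities E[E[Y|X]] = E[Y] and var(E[Y|X]) ≤ var(Y) can be checked by simulation. A sketch of our own, assuming inverse-cdf sampling: FX(x) = x³ gives X = U^(1/3), and F(y|x) = y²/x² on (0, x) gives Y = x√V.

import numpy as np

# Sample from fX,Y(x, y) = 6y on 0 < y < x < 1:
#   fX(x) = 3x^2      =>  X = U^(1/3)      (inverse cdf)
#   f(y|x) = 2y/x^2   =>  Y = x * sqrt(V)  (inverse cdf, given X = x)
rng = np.random.default_rng(2)
n = 1_000_000
x = rng.uniform(size=n) ** (1 / 3)
y = x * np.sqrt(rng.uniform(size=n))

cond_mean = 2 * x / 3                  # E[Y | X] = 2X/3
print(y.mean(), cond_mean.mean())      # both ~ 1/2
print(cond_mean.var(), 1 / 60)         # var(E[Y|X]) ~ 1/60
print(y.var(), 1 / 20)                 # var(Y) ~ 1/20 >= 1/60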

Problem 7.4 (Transformation of Discrete Random Vector). Let (X, Y) be a bivariate discrete
random vector with joint pmf

fX,Y(x, y) := (3/5)^(x+y) (2/5)^(2−x−y) for (x, y) ∈ {(0, 0), (0, 1), (1, 0), (1, 1)}, and 0 otherwise.

Consider the transformation U := X − Y and V := X + Y.

(i) Determine the support of (X, Y).
(ii) Determine the support of (U, V).
(iii) Find the joint pmf of (U, V).

Proof. (i). The support of (X, Y), denoted by S, is

S := {(0, 0), (0, 1), (1, 0), (1, 1)}.

(ii). Note that u = x − y and v = x + y. Then it follows that x = (u + v)/2 and y = (v − u)/2.
Hence, for (x, y) = (0, 0), it follows that (u, v) = (0, 0). For (x, y) = (0, 1), it follows that
(u, v) = (−1, 1). For (x, y) = (1, 0), it follows that (u, v) = (1, 1). For (x, y) = (1, 1), it follows
that (u, v) = (0, 2). Hence, the support of (U, V), denoted by B, is

B := {(0, 0), (−1, 1), (1, 1), (0, 2)}.

(iii). The joint pmf of (U, V) is as follows: for (u, v) in the support B,

P(U = u, V = v) = P(X − Y = u, X + Y = v)
               = P(X = (u + v)/2, Y = (v − u)/2)
               = fX,Y((u + v)/2, (v − u)/2)
               = (3/5)^((u+v)/2 + (v−u)/2) (2/5)^(2 − (u+v)/2 − (v−u)/2)
               = (3/5)^v (2/5)^(2−v).
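Remark (numerical sanity check). For a finite support like this one, the whole transformation can simply be enumerated. A small sketch of our own, using exact rational arithmetic:

from fractions import Fraction

# Enumerate the pmf of (U, V) = (X - Y, X + Y) from the pmf of (X, Y).
p, q = Fraction(3, 5), Fraction(2, 5)
pmf_xy = {(x, y): p**(x + y) * q**(2 - x - y)
          for (x, y) in [(0, 0), (0, 1), (1, 0), (1, 1)]}

pmf_uv = {}
for (x, y), prob in pmf_xy.items():
    u, v = x - y, x + y
    pmf_uv[(u, v)] = pmf_uv.get((u, v), 0) + prob

# Expect (0,0) -> 4/25, (-1,1) -> 6/25, (1,1) -> 6/25, (0,2) -> 9/25,
# matching (3/5)^v (2/5)^(2-v) at each support point; total mass is 1.
print(pmf_uv, sum(pmf_uv.values()))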

Preliminaries to Problem 7.5. Let u := g1(x, y) and v := g2(x, y) be a one-to-one mapping
T : (x, y) ↦ (u, v) taking some domain D ⊂ R² onto some range R ⊂ R². The transformation
can be inverted; i.e., there exists an inverse T⁻¹ : (u, v) ↦ (x, y) such that x = h1(u, v) and
y = h2(u, v) for some functions h1, h2. Then the Jacobian of this inverse is the determinant

J = det [ ∂x/∂u  ∂x/∂v ; ∂y/∂u  ∂y/∂v ] = (∂x/∂u)(∂y/∂v) − (∂x/∂v)(∂y/∂u),

where det(A) means the determinant of the matrix A. Recall that, in calculus, if g : R² → R and
T maps the set A ⊂ D onto the set B ⊂ R, then the change of variables within integrals leads to

∫∫_A g(x, y) dx dy = ∫∫_B g(h1(u, v), h2(u, v)) |J| du dv.   (∗)

Problem 7.5 (Change of Variable Formula). Let X and Y be two random variables having
joint pdf fX,Y, and let g1, g2 be functions mapping R² to R. Define

U := g1(X, Y) and V := g2(X, Y).

Show that the pair (U, V) has joint density function

fU,V(u, v) = fX,Y(h1(u, v), h2(u, v)) |J| if (u, v) is in the range of T, and 0 otherwise.

Proof. Let A ⊂ D and B ⊂ R be typical sets with T(A) = B. Then (X, Y) ∈ A if and only if
(U, V) ∈ B. Thus,

P((U, V) ∈ B) = P((X, Y) ∈ A) = ∫∫_A fX,Y(x, y) dx dy
             = ∫∫_B fX,Y(h1(u, v), h2(u, v)) |J| du dv,

where the last equality holds by (∗). Comparing this with the definition of the joint pdf of (U, V),

P((U, V) ∈ B) = ∫∫_B fU,V(u, v) du dv

for suitable B ⊂ R², this implies

fU,V(u, v) = fX,Y(h1(u, v), h2(u, v)) |J|.

Problem 7.6 (Product Formula). Let X and Y be continuous random variables with joint pdf
fX,Y(x, y) for x, y ∈ R. Define the transformations U := XY and V := X.

(i) Find the joint pdf fU,V.
(ii) Show that

fU(u) = ∫_{−∞}^∞ fX,Y(v, u/v) · |1/v| dv,

which is the so-called product formula (the analogue, for products, of the convolution formula
in Problem A.5).

Proof. (i). Since u = xy and v = x, the inverses are x = v and y = u/v for (u, v) in the support
of (U, V), denoted by B:

B := {(u, v) : −∞ < u < ∞, −∞ < v < ∞, v ≠ 0}.

Note that excluding v = 0 is not an issue for the integral later, since the set {v = 0} has
measure zero. The Jacobian is

J = det [ 0  1 ; 1/v  −u/v² ] = −1/v.

Now, the joint pdf is

fU,V(u, v) = fX,Y(v, u/v) |J| = fX,Y(v, u/v) |−1/v|.

(ii). To find the marginal pdf fU, we integrate out the variable v in the joint pdf fU,V. That is,

fU(u) = ∫_{−∞}^∞ fU,V(u, v) dv = ∫_{−∞}^∞ fX,Y(v, u/v) · |1/v| dv,

and the proof is complete.
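Remark (numerical sanity check). For X, Y independent Uniform(0, 1), the product formula gives fU(u) = ∫_u^1 (1/v) dv = −ln u on (0, 1), hence P(U ≤ t) = t − t ln t. A sketch of our own verifying this cdf by simulation:

import numpy as np

# Product of two independent Uniform(0,1) variables:
# the product formula gives fU(u) = -ln(u), so P(U <= t) = t - t*ln(t).
rng = np.random.default_rng(3)
u = rng.uniform(size=1_000_000) * rng.uniform(size=1_000_000)

for t in (0.1, 0.5, 0.9):
    print(np.mean(u <= t), t - t * np.log(t))   # pairs agree to ~3 decimals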

Problem 7.7 (Sum and Difference of Normal Random Vector). Let X and Y be two indepen-
dent, standard normal random variables. Consider the transformation

U := g1(X, Y) := X + Y,
V := g2(X, Y) := X − Y.

(i) Find the joint pdf fX,Y(x, y).
(ii) Determine the support of (X, Y) and the support of (U, V). Then find the inverses
x = h1(u, v) and y = h2(u, v).
(iii) Find the Jacobian J.
(iv) Find the joint pdf of (U, V).
(v) Show that U, V are independent.
Proof. (i). Since X and Y are independent standard normal random variables, the joint pdf of
(X, Y) is simply

fX,Y(x, y) = fX(x) fY(y) = (1/√(2π)) exp(−x²/2) · (1/√(2π)) exp(−y²/2).

(ii). The support of (X, Y) is the entire R². The support of (U, V) is again the entire R².
Since u = x + y and v = x − y, it follows that

x = h1(u, v) := (u + v)/2;   y = h2(u, v) := (u − v)/2.

(iii). The Jacobian is given by

J = det [ ∂h1/∂u  ∂h1/∂v ; ∂h2/∂u  ∂h2/∂v ] = det [ 1/2  1/2 ; 1/2  −1/2 ] = −1/2.

(iv). The joint pdf of (U, V) is given by

fU,V(u, v) = fX,Y(h1(u, v), h2(u, v)) |J|
          = (1/√(2π)) exp(−((u + v)/2)²/2) · (1/√(2π)) exp(−((u − v)/2)²/2) · (1/2)
          = (1/2)(1/(2π)) exp(−(u² + 2uv + v²)/8 − (u² − 2uv + v²)/8)
          = (1/2)(1/(2π)) exp(−(u² + v²)/4)

for all u, v ∈ R.

(v). To show that U, V are independent, the factorization criterion for independence tells us
that it suffices to show that fU,V(u, v) = g(u)q(v) for some functions g and q. Rewrite

fU,V(u, v) = (1/2)(1/(2π)) exp(−(u² + v²)/4)
          = [(1/(√2 √(2π))) exp(−u²/4)] · [(1/(√2 √(2π))) exp(−v²/4)]
          =: g(u) q(v),

which completes the proof.
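Remark (numerical sanity check). Each factor above is the N(0, 2) density, so U and V are independent N(0, 2) random variables. A quick simulation sketch of our own confirms the variances and the (near-)zero sample correlation:

import numpy as np

# U = X + Y and V = X - Y for independent standard normals X, Y:
# both should be N(0, 2), and uncorrelated (hence independent, being jointly normal).
rng = np.random.default_rng(4)
x = rng.standard_normal(1_000_000)
y = rng.standard_normal(1_000_000)
u, v = x + y, x - y

print(u.var(), v.var())          # both ~ 2
print(np.corrcoef(u, v)[0, 1])   # ~ 0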

Definition 7.1 (Covariance). Let X and Y be two random variables with mean µX and µY ,
respectively. The covariance of X and Y , denoted by cov(X, Y ), is defined by

cov(X, Y ) := E[(X − µX )(Y − µY )].

Problem 7.8 (Covariance and Variance).


(i) If Y = X, show that cov(X, X) = var(X).
(ii) For any random variables X, Y with means µX and µY, respectively, show that

cov(X, Y) = E[XY] − µX µY.

(iii) (Linear Combinations of Two Random Variables) Let X, Y be two random variables and
a, b ∈ R. Let Z := aX + bY be a linear combination of X, Y. Show that

var(Z) = a²var(X) + b²var(Y) + 2ab·cov(X, Y).

Proof. (i). Note that cov(X, X) = E[(X − µX)(X − µX)] = E[(X − µX)²] = var(X).

(ii). Observe that

cov(X, Y) = E[(X − µX)(Y − µY)]
         = E[XY − X µY − µX Y + µX µY]
         = E[XY] − E[X]µY − µX E[Y] + µX µY
         = E[XY] − µX µY − µX µY + µX µY
         = E[XY] − µX µY.

(iii). Observe that

var(Z) = var(aX + bY)
      = E[(aX + bY − E[aX + bY])²]
      = E[(aX + bY − aµX − bµY)²]
      = E[(a(X − µX) + b(Y − µY))²]
      = a²E[(X − µX)²] + 2abE[(X − µX)(Y − µY)] + b²E[(Y − µY)²]
      = a²var(X) + 2ab·cov(X, Y) + b²var(Y).
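Remark (numerical sanity check). Part (iii) can be illustrated on correlated data; the sketch below is our own, with arbitrarily chosen constants a = 2, b = −3 and Y built from X plus independent noise.

import numpy as np

# Check var(aX + bY) = a^2 var(X) + b^2 var(Y) + 2ab cov(X, Y).
rng = np.random.default_rng(5)
x = rng.standard_normal(1_000_000)
y = 0.6 * x + rng.standard_normal(1_000_000)   # correlated with x
a, b = 2.0, -3.0

z = a * x + b * y
lhs = z.var()
rhs = a**2 * x.var() + b**2 * y.var() + 2 * a * b * np.cov(x, y)[0, 1]
print(lhs, rhs)   # agree up to Monte Carlo error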

Definition 7.2 (Correlation Coefficient). Let X and Y be two random variables with means µX
and µY and variances σX² > 0 and σY² > 0, respectively. Then the correlation coefficient of X
and Y, call it ρXY, is the number defined by

ρXY := cov(X, Y)/(σX σY).
Problem 7.9 (Properties of Correlation Coefficient).
(i) Show that −1 ≤ ρXY ≤ 1. Hint: You may find the Cauchy-Schwarz inequality |E[UV]|² ≤
E[U²]E[V²] for random variables U and V useful.
(ii) If E[XY] = E[X]E[Y], show that cov(X, Y) = 0 and ρXY = 0.

Proof. (i). To show −1 ≤ ρXY ≤ 1, it suffices to show |cov(X, Y)| ≤ σX σY. Observe that

|cov(X, Y)|² = |E[(X − µX)(Y − µY)]|²
            ≤ E[(X − µX)²] E[(Y − µY)²]
            = var(X) var(Y)
            = σX² σY²,

where the inequality holds by the Cauchy-Schwarz inequality |E[UV]|² ≤ E[U²]E[V²] applied
with U := X − µX and V := Y − µY. Therefore, it follows that

|cov(X, Y)| ≤ σX σY.

(ii). By Problem 7.8(ii), cov(X, Y) = E[XY] − µX µY = 0, and hence ρXY = cov(X, Y)/(σX σY) = 0.

Problem 7.10 (Uncorrelated Bivariate Normal). Let Zi ∼ N(0, 1) for i ∈ {1, 2}, where Z1, Z2
are independent. Now define two random variables

X := aX Z1 + bX Z2 + µX and Y := aY Z1 + bY Z2 + µY,

where aX, bX, µX, aY, bY and µY are constants.

(i) Show that E[X] = µX and var(X) = aX² + bX².
(ii) Show that E[Y] = µY and var(Y) = aY² + bY².
(iii) Show that cov(X, Y) = aX aY + bX bY.
(iv) Define

aX := √(1/2) σX,  bX := √(1/2) σX,  aY := √(1/2) σY,  bY := −√(1/2) σY,

where σX², σY² > 0. Show that (X, Y) has the (uncorrelated) bivariate normal pdf with parameters
µX, µY, σX², σY². That is, for x, y ∈ R,

fX,Y(x, y) = (1/(2πσX σY)) exp( −(1/2) [ ((x − µX)/σX)² + ((y − µY)/σY)² ] ).

Proof. (i). First, E[X] = aX E[Z1] + bX E[Z2] + µX = µX. Next, note that E[Z1²] = var(Z1) =
1 = E[Z2²] and E[Z1 Z2] = E[Z1]E[Z2] = 0 since Z1 and Z2 are independent. Hence,

var(X) = E[X²] − µX²
      = E[(aX Z1 + bX Z2 + µX)²] − µX²
      = aX² E[Z1²] + bX² E[Z2²] + 2aX bX E[Z1 Z2] + 2aX µX E[Z1] + 2bX µX E[Z2] + µX² − µX²
      = aX² + bX².

(ii). An almost identical proof to part (i) works. Hence, E[Y] = µY and var(Y) = aY² + bY².

(iii). Observe that

cov(X, Y) = E[XY] − E[X]E[Y]
         = E[(aX Z1 + bX Z2 + µX)(aY Z1 + bY Z2 + µY)] − µX µY
         = E[aX aY Z1² + aX bY Z1 Z2 + aX µY Z1] + E[bX aY Z2 Z1 + bX bY Z2² + bX µY Z2]
           + E[µX aY Z1 + µX bY Z2 + µX µY] − µX µY
         = aX aY + bX bY.

(iv). Define the functions: for z1, z2 ∈ R,

x = g1(z1, z2) := aX z1 + bX z2 + µX,   y = g2(z1, z2) := aY z1 + bY z2 + µY.

Then the corresponding inverse functions are

z1 = [bY(x − µX) − bX(y − µY)]/(aX bY − bX aY) = [σY(x − µX) + σX(y − µY)]/(√2 σX σY),
z2 = [aX(y − µY) − aY(x − µX)]/(aX bY − bX aY) = [σY(x − µX) − σX(y − µY)]/(√2 σX σY).

Hence, the corresponding Jacobian is

J = det [ ∂z1/∂x  ∂z1/∂y ; ∂z2/∂x  ∂z2/∂y ]
  = det [ 1/(√2 σX)  1/(√2 σY) ; 1/(√2 σX)  −1/(√2 σY) ]
  = (1/(√2 σX))(−1/(√2 σY)) − (1/(√2 σY))(1/(√2 σX))
  = −1/(σX σY).

Observe that

fX,Y(x, y) = fZ1,Z2(z1, z2) |J| = (1/(2π)) exp(−(z1² + z2²)/2) · (1/(σX σY)),

where

z1² + z2² = [ (σY(x − µX) + σX(y − µY))² + (σY(x − µX) − σX(y − µY))² ] / (2σX² σY²)
         = [ 2σY²(x − µX)² + 2σX²(y − µY)² ] / (2σX² σY²)
         = (x − µX)²/σX² + (y − µY)²/σY².

Therefore,

fX,Y(x, y) = (1/(2πσX σY)) exp( −(1/2) [ (x − µX)²/σX² + (y − µY)²/σY² ] ),

which is the desired pdf.
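Remark (numerical sanity check). The construction in (iv) is easy to simulate; the sketch below is our own, with the parameter values chosen arbitrarily for illustration.

import numpy as np

# Build (X, Y) from independent standard normals Z1, Z2 as in part (iv)
# and check the moments: means, variances, and cov(X, Y) = aX*aY + bX*bY = 0.
rng = np.random.default_rng(6)
mu_x, mu_y, sig_x, sig_y = 1.0, -2.0, 2.0, 0.5
ax = bx = sig_x / np.sqrt(2)
ay, by = sig_y / np.sqrt(2), -sig_y / np.sqrt(2)

z1 = rng.standard_normal(1_000_000)
z2 = rng.standard_normal(1_000_000)
x = ax * z1 + bx * z2 + mu_x
y = ay * z1 + by * z2 + mu_y

print(x.mean(), y.mean())     # ~ 1.0, -2.0
print(x.var(), y.var())       # ~ sig_x^2 = 4, sig_y^2 = 0.25
print(np.cov(x, y)[0, 1])     # ~ 0 (uncorrelated)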

A Appendix
Problem A.1 (Marginal Distribution). Consider a continuous random vector (X, Y) which is
uniformly distributed over the unit disk in R² with the joint pdf

fX,Y(x, y) := 1/π if x² + y² ≤ 1, and 0 elsewhere.

Find the marginal pdf of X.

Proof. Note that the support S := {(x, y) : x² + y² ≤ 1} = {(x, y) : −√(1 − x²) ≤ y ≤
√(1 − x²), x ∈ (−1, 1)}. Hence,

fX(x) = ∫_{−∞}^∞ fX,Y(x, y) dy = ∫_{−√(1−x²)}^{√(1−x²)} (1/π) dy = (2/π)√(1 − x²)

for x ∈ (−1, 1).

Problem A.2 (Joint CDF). Let FX,Y (x, y) be the joint distribution function of X and Y . For
all real constants a < b and c < d, verify that

P (a < X ≤ b, c < Y ≤ d) = FX,Y (b, d) − FX,Y (b, c) − FX,Y (a, d) + FX,Y (a, c).

Moreover, if X, Y are independent; i.e., FX,Y (x, y) = FX (x)FY (y), show that for real constants
a < b and c < d,

P (a < X ≤ b, c < Y ≤ d) = P (a < X ≤ b)P (c < Y ≤ d).

Proof. (i). Writing the rectangle event by inclusion-exclusion: {a < X ≤ b, c < Y ≤ d} is
{X ≤ b, Y ≤ d} with the two strips {X ≤ a, Y ≤ d} and {X ≤ b, Y ≤ c} removed, and the
doubly removed corner {X ≤ a, Y ≤ c} added back; the claimed identity follows.

(ii). By part (i) and the fact that X, Y are independent, we have

P (a < X ≤ b, c < Y ≤ d) = FX,Y (b, d) − FX,Y (b, c) − FX,Y (a, d) + FX,Y (a, c)
= FX (b)FY (d) − FX (b)FY (c) − FX (a)FY (d) + FX (a)FY (c)
= FX (b)[FY (d) − FY (c)] − FX (a)[FY (d) − FY (c)]
= [FX (b) − FX (a)][FY (d) − FY (c)]
= [P (X ≤ b) − P (X ≤ a)][P (Y ≤ d) − P (Y ≤ c)]
= P (a < X ≤ b)P (c < Y ≤ d).

Problem A.3 (Expectation and Variance). Let X, Y have the joint pdf

fX,Y(x, y) := 6y if 0 < y < x < 1, and 0 otherwise.

(i) Find the marginal pdf of X. Then find the mean E[X] and variance var(X).
(ii) Find the marginal pdf of Y. Then find the mean E[Y] and variance var(Y).

Proof. (i). The marginal pdf of X is given by

fX(x) = ∫_{−∞}^∞ fX,Y(x, y) dy = ∫_0^x 6y dy = 3x²

for x ∈ (0, 1) and zero elsewhere. Hence, E[X] = ∫ x fX(x) dx = ∫_0^1 x · 3x² dx = 3/4 and
E[X²] = ∫ x² fX(x) dx = ∫_0^1 x² · 3x² dx = 3/5. Hence, var(X) = E[X²] − (E[X])² =
3/5 − (3/4)² = 3/80.

(ii). To find the marginal pdf of Y, we calculate

fY(y) = ∫ fX,Y(x, y) dx = ∫_y^1 6y dx = 6y(1 − y)

for y ∈ (0, 1). Hence, E[Y] = ∫ y fY(y) dy = ∫_0^1 y · 6y(1 − y) dy = 6(1/3 − 1/4) = 1/2 and
E[Y²] = ∫ y² fY(y) dy = ∫_0^1 y² · 6y(1 − y) dy = 6(1/4 − 1/5) = 6/20. Therefore,

var(Y) = E[Y²] − (E[Y])² = 6/20 − (1/2)² = 1/20.

Problem A.4 (Bivariate Transformation). Consider a random vector (X1, X2) which is uni-
formly distributed over the unit square S := {(x1, x2) : x1 ∈ (0, 1), x2 ∈ (0, 1)} with pdf

fX1,X2(x1, x2) := 1 if x1 ∈ (0, 1) and x2 ∈ (0, 1), and 0 elsewhere.

Consider the transformation

Z := X1 + X2.

(i) Sketch the support set.
(ii) Find the cdf of Z.
(iii) Find the pdf of Z.

Proof. (i). [Figure 1: sketch of the unit-square support S of (X1, X2).]

(ii). Let z ∈ R. The cdf of Z is

FZ(z) := P(Z ≤ z) = P(X1 + X2 ≤ z).

Now, for z < 0, FZ(z) = 0. For z ∈ [0, 1), the region {x1 + x2 ≤ z} within the unit square is a
triangle with legs of length z, so

P(X1 + X2 ≤ z) = ∫_0^z ∫_0^{z−x2} 1 dx1 dx2 = z²/2.

For z ∈ [1, 2), the complementary region {x1 + x2 > z} is a triangle with legs of length 2 − z, so

P(X1 + X2 ≤ z) = 1 − ∫_{z−1}^1 ∫_{z−x2}^1 1 dx1 dx2 = 1 − (2 − z)²/2.

Lastly, for z ≥ 2, we have FZ(z) = 1. To sum up, we have

FZ(z) = 0 for z < 0;  z²/2 for z ∈ [0, 1);  1 − (2 − z)²/2 for z ∈ [1, 2);  1 for z ≥ 2.

(iii). Since FZ is differentiable except possibly at the kink points z ∈ {0, 1, 2}, the pdf of Z
may be written as

fZ(z) = z for z ∈ [0, 1);  2 − z for z ∈ [1, 2);  0 otherwise.
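Remark (numerical sanity check). This is the triangular (Irwin-Hall, n = 2) density. A quick histogram comparison, as a sketch of our own:

import numpy as np

# Z = X1 + X2 for independent Uniform(0,1) variables has the triangular pdf
# f(z) = z on [0,1) and 2 - z on [1,2).
rng = np.random.default_rng(7)
z = rng.uniform(size=1_000_000) + rng.uniform(size=1_000_000)

hist, edges = np.histogram(z, bins=40, range=(0, 2), density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
tri = np.where(mids < 1, mids, 2 - mids)
print(np.max(np.abs(hist - tri)))   # small (Monte Carlo + binning error)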

Problem A.5 (Convolution Formula). Let X and Y be continuous random variables with joint
pdf fX,Y(x, y) for x, y ∈ R. Define the transformations U := X + Y and V := Y.

(i) Find the joint pdf fU,V.
(ii) Show that

fU(u) = ∫_{−∞}^∞ fX,Y(u − v, v) dv,

which is the so-called convolution formula.

Proof. (i). Since u = x + y and v = y are affine, the inverses are x = u − v and y = v for (u, v)
in the support of (U, V), denoted by B:

B := {(u, v) : −∞ < u < ∞, −∞ < v < ∞}.

Therefore, the Jacobian is

J = det [ 1  −1 ; 0  1 ] = 1.

Now, the joint pdf is

fU,V(u, v) = fX,Y(u − v, v) |J| = fX,Y(u − v, v).

(ii). To find the marginal pdf fU, we integrate out the variable v in the joint pdf fU,V. That is,

fU(u) = ∫_{−∞}^∞ fU,V(u, v) dv = ∫_{−∞}^∞ fX,Y(u − v, v) dv,

and the proof is complete.
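Remark (numerical sanity check). For independent X and Y the formula reads fU(u) = ∫ fX(u − v) fY(v) dv. For two independent standard normals this should recover the N(0, 2) density, which can be verified by numerical integration; a sketch of our own, assuming SciPy is available:

import numpy as np
from scipy import integrate, stats

# Convolution formula for independent standard normals:
# fU(u) = ∫ phi(u - v) phi(v) dv should equal the N(0, 2) density.
phi = stats.norm.pdf
for u in (0.0, 1.0, 2.5):
    val, _ = integrate.quad(lambda v: phi(u - v) * phi(v), -np.inf, np.inf)
    print(val, stats.norm.pdf(u, scale=np.sqrt(2)))   # pairs agree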
