Introduction
Problem 5.13: While all the derivations in this problem appear in the text and other problems, we do them from scratch here in order to reinforce the concepts.
(a) Since U1 and U2 are independent, so are Z = −2 ln U1 and Θ = 2πU2. Clearly, Θ is uniform over [0, 2π]. Since U1 takes values in [0, 1], the random variable Z takes values in [0, ∞). The CDF of Z is given by

P[Z ≤ z] = P[−2 ln U1 ≤ z] = P[U1 ≥ e^{−z/2}] = 1 − e^{−z/2}, z ≥ 0
We recognize that Z ∼ Exp(1/2), an exponential random variable with mean E[Z] = 2.

Remark: This provides a good opportunity to emphasize that one does not always have to write down explicit expressions for the joint CDF or density in order to specify the joint distribution. Any specification that would allow one to write down such expressions if needed is sufficient. In the preceding, we provided such a specification by stating that Z and Θ are independent, with Z ∼ Exp(1/2) and Θ ∼ Unif[0, 2π].
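Although not part of the original solution, a quick numerical sanity check of part (a) is easy to run: the sketch below draws a large number of uniforms, forms Z = −2 ln U1, and compares the empirical mean and the empirical CDF at an arbitrary test point against the theoretical values derived above.

% sanity check (not in the original solution): verify Z = -2 log(U1) ~ Exp(1/2)
M = 1e6;                 % number of samples used for the check
U1 = rand(M,1);          % uniform over [0,1]
Z = -2*log(U1);          % claimed Exp(1/2) samples
mean(Z)                  % should be close to E[Z] = 2
z0 = 1.5;                % arbitrary test point
mean(Z <= z0)            % empirical CDF at z0
1 - exp(-z0/2)           % theoretical CDF, 1 - e^{-z/2}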
(b) Let us do this from scratch instead of using results from prior examples and problems.
p(z, θ) = p(z)p(θ) = (1/2)e^{−z/2} · (1/(2π)), z ≥ 0, 0 ≤ θ ≤ 2π    (1)

and

p(x1, x2) = p(z, θ) / |det(J(x1, x2; z, θ))|, evaluated at z = x1^2 + x2^2, θ = tan^{−1}(x2/x1)    (2)
where the Jacobian can be computed as
J(x1, x2; z, θ) =
    [ (1/(2√z)) cos θ    (1/(2√z)) sin θ ]
    [ −√z sin θ          √z cos θ        ]

so that det(J(x1, x2; z, θ)) = (1/2)cos^2 θ + (1/2)sin^2 θ = 1/2. Plugging this and (1) into (2), we obtain
that the joint density of X1, X2 is given by
p(x1, x2) = (1/(2π)) e^{−(x1^2 + x2^2)/2}, −∞ < x1, x2 < ∞

We recognize that this is a product of two N(0, 1) densities, so that X1, X2 are i.i.d. N(0, 1) random variables.
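As an optional cross-check not in the original solution, the Jacobian determinant can be verified numerically: the sketch below approximates the partial derivatives of (x1, x2) = (√z cos θ, √z sin θ) with respect to (z, θ) by finite differences at an arbitrary test point and confirms that the determinant is close to 1/2.

% numerical spot-check (not in the original solution) of det(J) = 1/2
z = 0.7; th = 1.1;                   % arbitrary test point
h = 1e-6;                            % finite-difference step size
x = @(z,th) [sqrt(z)*cos(th); sqrt(z)*sin(th)];      % the transformation
J = [(x(z+h,th)-x(z,th))/h, (x(z,th+h)-x(z,th))/h];  % approximate Jacobian
det(J)                               % should be close to 0.5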
(c) A code fragment using the preceding to generate N(0, 1) random variables is provided below, and
the histogram generated is shown in Figure 7.
[Figure 7: Histogram based on 2000 N(0, 1) random variables generated using the method in Problem 5.13.]
% generating Gaussian random variables from uniform random variables
N = 1000;                  % half the number of Gaussians needed
% generate uniform random variables
U1 = rand(N,1);
U2 = rand(N,1);
Z = -2*log(U1);            % exponentials, mean 2
theta = 2*pi*U2;           % uniform over [0,2 pi]
% transform to standard Gaussian
X1 = sqrt(Z).*cos(theta);
X2 = sqrt(Z).*sin(theta);
X = [X1;X2];               % 2N independent N(0,1) random variables
hist(X,100);               % histogram with hundred bins
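As an optional extension (not part of the original solution), the agreement with the N(0, 1) density can be made visually explicit by normalizing the histogram and overlaying the theoretical density; the sketch below assumes the vector X from the fragment above is still in the workspace.

% optional check (not in the original solution): normalized histogram vs. N(0,1) density
[counts, centers] = hist(X, 100);       % bin counts and bin centers
binwidth = centers(2) - centers(1);
bar(centers, counts/(2*N*binwidth));    % normalize counts to a density estimate
hold on;
t = -4:0.01:4;
plot(t, exp(-t.^2/2)/sqrt(2*pi), 'r');  % theoretical N(0,1) density
hold off;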
(d) We estimate E[X^2] as the empirical average of X^2 by adding the following code fragment:

estimated_power = sum(X.^2)/(2*N)    % empirical second moment over all 2N samples

The answer should be close to the theoretical value E[X^2] = var(X) + (E[X])^2 = 1 + 0^2 = 1.
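To get a sense of how close the estimate should be to 1, one can also compute a rough standard error for this empirical mean (an addition, not part of the original solution); since var(X^2) = E[X^4] − (E[X^2])^2 = 3 − 1 = 2 for a standard Gaussian, the standard error is about sqrt(2/(2N)) ≈ 0.03 for N = 1000.

% rough standard error of the estimate (not in the original solution)
std_err = std(X.^2)/sqrt(2*N)    % about sqrt(2/(2N)), roughly 0.03 for N = 1000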
(e) The desired probability P[X^3 + X > 3] can be estimated by adding the following code fragment:

E = (X.^3 + X > 3);           % indicates whether the desired event occurs
probability = sum(E)/(2*N)    % fraction of times the desired event occurs

We get an answer of about 0.11.
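The simulation-based estimate can be checked against an exact computation (an addition to the original solution): since x^3 + x is strictly increasing, the event X^3 + X > 3 is equivalent to X > r, where r is the unique real root of x^3 + x − 3 = 0, so the desired probability is Q(r). The sketch below computes this in base MATLAB, using 0.5*erfc(r/√2) for Q(r).

% exact answer check (not in the original solution)
r = roots([1 0 1 -3]);          % roots of x^3 + x - 3 = 0
r = real(r(abs(imag(r)) < 1e-9));  % keep the unique real root, about 1.21
Q = 0.5*erfc(r/sqrt(2))         % Q(r) = P[X > r], about 0.11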