Practice Problems ch2 ISE 502
The area of the unit disk is $\pi(1^2) = \pi$, so we have
$$C \cdot \pi = 1 \implies C = \frac{1}{\pi}.$$
The event $\{D \le x\}$ corresponds to the set of all points within a disk of radius $x$, which has area $\pi x^2$. Thus, the probability that the point lies within this smaller disk is
$$P\{D \le x\} = \frac{\pi x^2}{\pi} = x^2, \qquad 0 \le x \le 1.$$
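As a sanity check, the CDF $P\{D \le x\} = x^2$ can be verified by simulation. The sketch below is not part of the original solution; it samples points uniformly on the unit disk by rejection from the enclosing square, and the seed, sample size, and reference points $x$ are illustrative choices:

import random

random.seed(0)                       # illustrative seed for reproducibility
n, dists = 200_000, []
while len(dists) < n:
    u, v = random.uniform(-1, 1), random.uniform(-1, 1)
    if u * u + v * v <= 1:           # rejection: keep points inside the disk
        dists.append((u * u + v * v) ** 0.5)   # distance D from the origin

for x in (0.25, 0.5, 0.9):
    emp = sum(d <= x for d in dists) / n
    print(f"P(D <= {x}) ~ {emp:.4f}   (x^2 = {x * x:.4f})")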
Problem 37
Problem: Let $X_1, X_2, \dots, X_n$ be independent random variables, each having a uniform distribution over $(0,1)$. Define
$$M = \max\{X_1, X_2, \dots, X_n\}.$$
Find the cumulative distribution function and the probability density function of $M$.
Solution:
Since the $X_i$ are independent and uniformly distributed on $(0,1)$, we have for each $i$,
$$P\{X_i \le x\} = x, \qquad 0 \le x \le 1.$$
The maximum is at most $x$ exactly when every $X_i$ is at most $x$, so independence gives
$$F_M(x) = P\{M \le x\} = \prod_{i=1}^{n} P\{X_i \le x\} = x^n, \qquad 0 \le x \le 1.$$
The probability density function (PDF) of $M$ is the derivative of its CDF:
$$f_M(x) = \frac{d}{dx} F_M(x) = \frac{d}{dx}\, x^n = n x^{n-1}, \qquad 0 \le x \le 1.$$
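Both formulas are easy to check empirically. The sketch below is not from the original solution; $n = 5$, the seed, and the trial count are illustrative choices. It compares the empirical CDF of $M$ with $x^n$, and the sample mean with $\int_0^1 x \cdot n x^{n-1}\,dx = n/(n+1)$:

import random

random.seed(1)
n, trials = 5, 100_000               # illustrative choices
samples = [max(random.random() for _ in range(n)) for _ in range(trials)]

for x in (0.5, 0.8, 0.95):
    emp = sum(m <= x for m in samples) / trials
    print(f"P(M <= {x}) ~ {emp:.4f}   (x^n = {x ** n:.4f})")

# E[M] = integral of x * n x^(n-1) over (0,1) = n/(n+1)
print("mean:", sum(samples) / trials, "  theory:", n / (n + 1))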
Problem 50
Problem: Let $c$ be a constant. Show that
(a) $\operatorname{Var}(cX) = c^2 \operatorname{Var}(X)$;
(b) $\operatorname{Var}(c + X) = \operatorname{Var}(X)$.
Solution:
(a) Using $\operatorname{Var}(X) = E[X^2] - (E[X])^2$ and the linearity of expectation,
$$\operatorname{Var}(cX) = E[(cX)^2] - (E[cX])^2 = c^2 E[X^2] - c^2 (E[X])^2 = c^2 \operatorname{Var}(X).$$
(b) Since $E[c + X] = c + E[X]$,
$$\operatorname{Var}(c + X) = E\big[(c + X - E[c + X])^2\big] = E\big[(X - E[X])^2\big] = \operatorname{Var}(X).$$
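Both identities can be checked numerically. The sketch below is not part of the original solution; the data vector and $c = 3$ are arbitrary illustrative values:

import statistics

x = [1.2, -0.7, 3.4, 0.0, 2.1, -1.5]   # arbitrary data standing in for X
c = 3.0                                 # illustrative constant

var = statistics.pvariance              # population variance of the sample
print(var([c * v for v in x]), "==", c * c * var(x))   # part (a)
print(var([c + v for v in x]), "==", var(x))           # part (b)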
Problem 53
Problem: If $X$ is uniformly distributed over $(0,1)$, calculate $E[X^n]$ and $\operatorname{Var}(X^n)$.
Solution:
For a Uniform$(0,1)$ random variable $X$, the expectation of $X^n$ is given by
$$E[X^n] = \int_0^1 x^n \, dx = \frac{1}{n+1}.$$
Replacing $n$ by $2n$ gives the second moment of $X^n$, namely $E[X^{2n}] = \frac{1}{2n+1}$, so we get
$$\operatorname{Var}(X^n) = E[X^{2n}] - \big(E[X^n]\big)^2 = \frac{1}{2n+1} - \left(\frac{1}{n+1}\right)^2.$$
In summary,
$$E[X^n] = \frac{1}{n+1}, \qquad \operatorname{Var}(X^n) = \frac{1}{2n+1} - \left(\frac{1}{n+1}\right)^2.$$
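A quick Monte Carlo check of both formulas; this sketch is not part of the original solution, and $n = 3$, the seed, and the trial count are illustrative:

import random, statistics

random.seed(2)
n, trials = 3, 200_000                  # illustrative choices
powers = [random.random() ** n for _ in range(trials)]

print("E[X^n]   ~", statistics.fmean(powers), "   theory:", 1 / (n + 1))
print("Var(X^n) ~", statistics.pvariance(powers),
      "   theory:", 1 / (2 * n + 1) - (1 / (n + 1)) ** 2)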
Problem 55
Problem: Suppose that the joint probability mass function of $X$ and $Y$ is given by
$$P(X = i, Y = j) = \binom{j}{i} e^{-2\lambda} \frac{\lambda^j}{j!}, \qquad 0 \le i \le j.$$
(a) Find the probability mass function (PMF) of $Y$.
(b) Find the probability mass function (PMF) of $X$.
(c) Find the probability mass function (PMF) of $Y - X$.
Solution:
(a) Summing the joint PMF over $i$:
$$P(Y = j) = \sum_{i=0}^{j} \binom{j}{i} e^{-2\lambda} \frac{\lambda^j}{j!}.$$
Using the binomial theorem,
$$\sum_{i=0}^{j} \binom{j}{i} 1^i\, 1^{j-i} = 2^j,$$
we get
$$P(Y = j) = e^{-2\lambda} \frac{(2\lambda)^j}{j!}.$$
Thus, $Y \sim \text{Poisson}(2\lambda)$.
(b) Summing the joint PMF over $j \ge i$ and writing $\binom{j}{i}/j! = 1/\big(i!\,(j-i)!\big)$,
$$P(X = i) = \sum_{j=i}^{\infty} \binom{j}{i} e^{-2\lambda} \frac{\lambda^j}{j!} = e^{-2\lambda} \frac{\lambda^i}{i!} \sum_{j=i}^{\infty} \frac{\lambda^{j-i}}{(j-i)!} = e^{-2\lambda} \frac{\lambda^i}{i!}\, e^{\lambda} = e^{-\lambda} \frac{\lambda^i}{i!}.$$
Thus, $X \sim \text{Poisson}(\lambda)$.
(c) For $k \ge 0$,
$$P(X = i, Y - X = k) = P(X = i, Y = k + i).$$
From the given joint PMF,
$$P(X = i, Y = k + i) = \binom{k+i}{i} e^{-2\lambda} \frac{\lambda^{k+i}}{(k+i)!}.$$
Since $\binom{k+i}{i}/(k+i)! = 1/(i!\,k!)$, this factors as
$$P(X = i, Y - X = k) = \left(e^{-\lambda} \frac{\lambda^i}{i!}\right)\left(e^{-\lambda} \frac{\lambda^k}{k!}\right).$$
This shows that $X \sim \text{Poisson}(\lambda)$ and $Y - X \sim \text{Poisson}(\lambda)$, and that they are independent. Thus, the PMF of $Y - X$ is
$$P(Y - X = k) = e^{-\lambda} \frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \dots$$
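This structure can also be checked by simulation: the given joint PMF is exactly what one obtains by drawing $Y \sim \text{Poisson}(2\lambda)$ and then $X \mid Y = j \sim \text{Binomial}(j, 1/2)$, since $e^{-2\lambda}\frac{(2\lambda)^j}{j!} \binom{j}{i} (1/2)^j = \binom{j}{i} e^{-2\lambda} \frac{\lambda^j}{j!}$. The sketch below is not part of the original solution; $\lambda = 2$, the seed, and the Knuth-style Poisson sampler are illustrative choices. It checks that $X$ and $Y - X$ each have mean and variance $\lambda$, with near-zero covariance:

import math, random, statistics

random.seed(3)
lam, trials = 2.0, 100_000              # illustrative choices

def poisson(mu):
    # Knuth's method: multiply uniforms until the product drops below e^(-mu)
    k, prod, target = 0, random.random(), math.exp(-mu)
    while prod > target:
        k += 1
        prod *= random.random()
    return k

xs, diffs = [], []
for _ in range(trials):
    y = poisson(2 * lam)                              # Y ~ Poisson(2*lam)
    x = sum(random.random() < 0.5 for _ in range(y))  # X | Y=y ~ Binomial(y, 1/2)
    xs.append(x)
    diffs.append(y - x)

print("X    : mean", statistics.fmean(xs), " var", statistics.pvariance(xs))
print("Y - X: mean", statistics.fmean(diffs), " var", statistics.pvariance(diffs))
cov = (statistics.fmean(a * b for a, b in zip(xs, diffs))
       - statistics.fmean(xs) * statistics.fmean(diffs))
print("cov(X, Y - X) ~", cov)           # ~ 0, consistent with independence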
Problem 63
Problem: Calculate the moment generating function (MGF) of a geometric
random variable.
Solution:
Let $X$ be a geometric random variable with probability mass function (PMF)
$$P(X = n) = (1 - p)^{n-1} p, \qquad n = 1, 2, 3, \dots$$
The moment generating function (MGF) is given by
$$\phi(t) = E[e^{tX}] = \sum_{n=1}^{\infty} e^{tn} P(X = n) = \sum_{n=1}^{\infty} e^{tn} (1-p)^{n-1} p = p e^{t} \sum_{n=1}^{\infty} \big[(1-p)e^{t}\big]^{n-1}.$$
The remaining summation is a geometric series with first term $1$ and common ratio $(1-p)e^{t}$, which converges for $(1-p)e^{t} < 1$:
$$\sum_{n=0}^{\infty} r^n = \frac{1}{1-r}, \qquad |r| < 1.$$
Applying this result,
$$\sum_{n=1}^{\infty} \big[(1-p)e^{t}\big]^{n-1} = \frac{1}{1 - (1-p)e^{t}}.$$
Thus, the MGF simplifies to
$$\phi(t) = \frac{p e^{t}}{1 - (1-p)e^{t}}, \qquad \text{for } e^{t} < \frac{1}{1-p}, \text{ i.e., } t < -\ln(1-p).$$
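The closed form can be compared with a direct empirical estimate of $E[e^{tX}]$. This sketch is not part of the original solution; $p = 0.4$, the seed, and the test points $t$ are illustrative, all chosen below $-\ln(1-p) \approx 0.511$:

import math, random, statistics

random.seed(4)
p, trials = 0.4, 200_000                # illustrative choices

def geometric(p):
    # number of Bernoulli(p) trials up to and including the first success
    n = 1
    while random.random() >= p:
        n += 1
    return n

samples = [geometric(p) for _ in range(trials)]
for t in (-0.5, 0.1, 0.3):              # all satisfy t < -ln(1 - p) ~ 0.511
    emp = statistics.fmean(math.exp(t * x) for x in samples)
    theory = p * math.exp(t) / (1 - (1 - p) * math.exp(t))
    print(f"t = {t}: empirical {emp:.4f}   formula {theory:.4f}")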
Problem 76
Problem: Let $X$ and $Y$ be independent random variables with means $\mu_X$ and $\mu_Y$ and variances $\sigma_X^2$ and $\sigma_Y^2$. Show that
$$\operatorname{Var}(XY) = \sigma_X^2 \sigma_Y^2 + \mu_Y^2 \sigma_X^2 + \mu_X^2 \sigma_Y^2.$$
Solution:
Since $X$ and $Y$ are independent, the expectation of the product is
$$E[XY] = E[X]E[Y] = \mu_X \mu_Y.$$
Next, we compute $E[(XY)^2] = E[X^2 Y^2]$. By independence, and using $E[X^2] = \sigma_X^2 + \mu_X^2$ (and similarly for $Y$), we have
$$E[X^2 Y^2] = E[X^2]\,E[Y^2] = (\sigma_X^2 + \mu_X^2)(\sigma_Y^2 + \mu_Y^2).$$
Now, computing the variance:
$$\operatorname{Var}(XY) = E[(XY)^2] - (E[XY])^2 = \sigma_X^2 \sigma_Y^2 + \mu_X^2 \sigma_Y^2 + \mu_Y^2 \sigma_X^2 + \mu_X^2 \mu_Y^2 - \mu_X^2 \mu_Y^2.$$
Simplifying,
$$\operatorname{Var}(XY) = \sigma_X^2 \sigma_Y^2 + \mu_Y^2 \sigma_X^2 + \mu_X^2 \sigma_Y^2.$$
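A Monte Carlo check of the identity; this sketch is not from the original solution, and the Gaussian choice for $X$ and $Y$, the seed, and the parameter values are illustrative only, since the identity holds for any independent $X$ and $Y$ with the stated moments:

import random, statistics

random.seed(5)
mu_x, s_x, mu_y, s_y = 1.0, 2.0, -3.0, 0.5   # illustrative parameters
trials = 200_000

prods = [random.gauss(mu_x, s_x) * random.gauss(mu_y, s_y) for _ in range(trials)]
theory = s_x**2 * s_y**2 + mu_y**2 * s_x**2 + mu_x**2 * s_y**2
print("Var(XY) ~", statistics.pvariance(prods), "   formula:", theory)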
Problem 79
Problem: With $K(t) = \log\big(E[e^{tX}]\big)$, show that $K'(0) = E[X]$ and $K''(0) = \operatorname{Var}(X)$.
Solution:
The moment generating function (MGF) of $X$ is $M_X(t) = E[e^{tX}]$, so the cumulant generating function (CGF) is $K(t) = \log M_X(t)$. Differentiating,
$$K'(t) = \frac{M_X'(t)}{M_X(t)} = \frac{E[X e^{tX}]}{E[e^{tX}]}.$$
Evaluating at $t = 0$:
$$K'(0) = \frac{E[X e^{0}]}{E[e^{0}]} = \frac{E[X]}{1} = E[X].$$
Differentiating again by the quotient rule,
$$K''(t) = \frac{E[X^2 e^{tX}]\,E[e^{tX}] - \big(E[X e^{tX}]\big)^2}{\big(E[e^{tX}]\big)^2},$$
so at $t = 0$,
$$K''(0) = \frac{E[X^2] - (E[X])^2}{1} = E[X^2] - (E[X])^2 = \operatorname{Var}(X).$$
Thus, we conclude that $K'(0) = E[X]$ and $K''(0) = \operatorname{Var}(X)$.
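As a concrete check, one can take a distribution with a known MGF and differentiate $K$ numerically. The sketch below is not part of the original solution; it uses Poisson$(\lambda)$, for which $M(t) = \exp\big(\lambda(e^t - 1)\big)$ and hence $E[X] = \operatorname{Var}(X) = \lambda$, with illustrative values of $\lambda$ and the step size $h$:

import math

lam, h = 2.5, 1e-4                      # illustrative rate and step size
M = lambda t: math.exp(lam * (math.exp(t) - 1))   # Poisson(lam) MGF
K = lambda t: math.log(M(t))                      # cumulant generating function

K1 = (K(h) - K(-h)) / (2 * h)           # central difference for K'(0)
K2 = (K(h) - 2 * K(0) + K(-h)) / h**2   # central difference for K''(0)
print("K'(0)  ~", K1, "   E[X]   =", lam)
print("K''(0) ~", K2, "   Var(X) =", lam)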