Practice Problems ch2 ISE 502

The document contains various problems and solutions related to probability and statistics, including uniform distributions, variance, moment generating functions, and joint probability mass functions. Key results include the probability that a point is within a certain distance from the origin, the cumulative distribution function of the maximum of independent uniform random variables, and properties of variance with respect to constants. Additionally, it covers the moment generating function of geometric random variables and the variance of the product of independent random variables.


Problem 36

Problem: A point is uniformly distributed within the disk of radius 1. That is, its density is

f(x, y) = C,   for x^2 + y^2 ≤ 1.

Find the probability that its distance from the origin is less than x, where 0 ≤ x ≤ 1.
Solution: Since the density is uniform, the constant C is determined by the condition

∫∫_{x^2 + y^2 ≤ 1} C dx dy = 1.

The area of the disk of radius 1 is

π(1^2) = π,

so we have

C · π = 1   ⟹   C = 1/π.
The event {D ≤ x} corresponds to the set of all points within a disk of radius x. Its area is

πx^2.
Thus, the probability that the point is within this smaller disk is

P{D ≤ x} = (area of disk of radius x) / (area of disk of radius 1) = πx^2 / π = x^2.

P{D ≤ x} = x^2,   0 ≤ x ≤ 1.
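As a quick sanity check of this result, the short Python sketch below (NumPy assumed; all names and sample sizes are illustrative choices) draws points uniformly from the unit disk by rejection sampling and compares the empirical fraction within distance x of the origin against x^2.

    import numpy as np

    rng = np.random.default_rng(0)

    # Rejection-sample points uniformly from the unit disk.
    pts = rng.uniform(-1.0, 1.0, size=(200_000, 2))
    pts = pts[(pts ** 2).sum(axis=1) <= 1.0]

    dist = np.sqrt((pts ** 2).sum(axis=1))
    for x in (0.3, 0.5, 0.9):
        print(f"x = {x}: empirical P(D <= x) = {(dist <= x).mean():.4f}, "
              f"x^2 = {x ** 2:.4f}")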

Problem 37
Problem: Let X_1, X_2, . . . , X_n be independent random variables, each having a uniform distribution over (0, 1). Define

M = max{X_1, X_2, . . . , X_n}.

Show that the cumulative distribution function (CDF) of M, denoted by F_M(x), is given by

F_M(x) = x^n,   0 ≤ x ≤ 1.

What is the probability density function (PDF) of M?
Solution: The event {M ≤ x} means that all of the X_i are less than or equal to x:

P{M ≤ x} = P{X_1 ≤ x, X_2 ≤ x, . . . , X_n ≤ x}.

Since the X_i are independent, this probability factors into a product, and since each X_i is uniform on (0, 1),

P{X_i ≤ x} = x,   for 0 ≤ x ≤ 1.

Thus,

F_M(x) = P{M ≤ x} = ∏_{i=1}^{n} P{X_i ≤ x} = x^n,   0 ≤ x ≤ 1.

F_M(x) = x^n,   0 ≤ x ≤ 1.
The probability density function (PDF) of M is the derivative of its CDF:
f_M(x) = d/dx F_M(x) = d/dx (x^n) = n x^{n−1},   0 ≤ x ≤ 1.

f_M(x) = n x^{n−1},   0 ≤ x ≤ 1.
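A numerical check of F_M (a sketch; NumPy assumed, and the choice n = 5 is arbitrary): simulate the maximum of n independent Uniform(0, 1) variables and compare the empirical CDF to x^n.

    import numpy as np

    rng = np.random.default_rng(1)
    n, trials = 5, 200_000

    # Each row is one sample of (X_1, ..., X_n); take the row-wise maximum.
    M = rng.uniform(0.0, 1.0, size=(trials, n)).max(axis=1)

    for x in (0.5, 0.8, 0.95):
        print(f"x = {x}: empirical F_M(x) = {(M <= x).mean():.4f}, "
              f"x^n = {x ** n:.4f}")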

Problem 50
Problem: Let c be a constant. Show that
(a) Var(cX) = c^2 Var(X);
(b) Var(c + X) = Var(X).

Solution:

(a) We start with the definition of the variance:

Var(cX) = E[(cX − E[cX])^2].

Since E[cX] = c E[X], we can write

Var(cX) = E[(cX − c E[X])^2] = E[c^2 (X − E[X])^2].

Because c^2 is a constant, it factors out of the expectation:

Var(cX) = c^2 E[(X − E[X])^2] = c^2 Var(X).

(b) Using the definition of variance again, we have:

Var(c + X) = E[(c + X − E[c + X])^2].

Since E[c + X] = c + E[X], it follows that:

Var(c + X) = E[(c + X − (c + E[X]))^2] = E[(X − E[X])^2] = Var(X).

Thus, we have shown that:

Var(cX) = c^2 Var(X)   and   Var(c + X) = Var(X).
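Both identities also hold exactly for the sample variance of any fixed data set, so they can be confirmed numerically; the sketch below (NumPy assumed, with an arbitrary data sample and constant c) prints pairs of values that should match.

    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.normal(loc=3.0, scale=2.0, size=10_000)  # arbitrary test data
    c = 7.5

    # (a) Var(cX) = c^2 Var(X)
    print(np.var(c * x), c ** 2 * np.var(x))
    # (b) Var(c + X) = Var(X)
    print(np.var(c + x), np.var(x))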

Problem 53
Problem: If X is uniformly distributed over (0, 1), calculate E[X^n] and Var(X^n).

Solution:
For a uniform (0, 1) random variable X, the expectation of X^n is given by:

E[X^n] = ∫_0^1 x^n dx.

Evaluating the integral:

E[X^n] = 1/(n + 1).

Next, we calculate E[(X^n)^2]:

E[X^{2n}] = ∫_0^1 x^{2n} dx = 1/(2n + 1).

Using the formula for variance:

Var(X^n) = E[X^{2n}] − (E[X^n])^2,

we get:

Var(X^n) = 1/(2n + 1) − (1/(n + 1))^2.

E[X^n] = 1/(n + 1),   Var(X^n) = 1/(2n + 1) − (1/(n + 1))^2.
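A Monte Carlo sanity check of both formulas (a sketch; NumPy assumed, and n = 3 is an arbitrary choice):

    import numpy as np

    rng = np.random.default_rng(3)
    n = 3
    x = rng.uniform(0.0, 1.0, size=1_000_000)

    print("E[X^n]:  ", (x ** n).mean(), "vs", 1 / (n + 1))
    print("Var(X^n):", (x ** n).var(), "vs", 1 / (2 * n + 1) - (1 / (n + 1)) ** 2)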

Problem 55
Problem: Suppose that the joint probability mass function of X and Y is given by:

P(X = i, Y = j) = \binom{j}{i} e^{−2λ} λ^j / j!,   0 ≤ i ≤ j.

(a) Find the probability mass function (PMF) of Y.
(b) Find the probability mass function (PMF) of X.
(c) Find the probability mass function (PMF) of Y − X.

Solution:

(a) PMF of Y: To find P(Y = j), sum P(X = i, Y = j) over all possible values of i:

P(Y = j) = Σ_{i=0}^{j} \binom{j}{i} e^{−2λ} λ^j / j!.

Using the binomial theorem:

Σ_{i=0}^{j} \binom{j}{i} 1^i 1^{j−i} = 2^j,

we get:

P(Y = j) = e^{−2λ} (2λ)^j / j!.

Thus, Y ∼ Poisson(2λ).

P(Y = j) = e^{−2λ} (2λ)^j / j!.
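A quick numerical check of part (a), using only the Python standard library (the value λ = 1.3 is an arbitrary choice for the check): summing the given joint PMF over i should reproduce the Poisson(2λ) PMF.

    from math import comb, exp, factorial

    lam = 1.3  # arbitrary lambda for the check

    def joint_pmf(i, j):
        # P(X = i, Y = j) = C(j, i) e^{-2 lam} lam^j / j!
        return comb(j, i) * exp(-2 * lam) * lam ** j / factorial(j)

    for j in range(6):
        marginal = sum(joint_pmf(i, j) for i in range(j + 1))
        poisson_2lam = exp(-2 * lam) * (2 * lam) ** j / factorial(j)
        print(j, round(marginal, 10), round(poisson_2lam, 10))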

(b) PMF of X: To find P(X = i), sum P(X = i, Y = j) over all j ≥ i:

P(X = i) = Σ_{j=i}^{∞} \binom{j}{i} e^{−2λ} λ^j / j!.

Since \binom{j}{i} / j! = 1 / (i! (j − i)!), rewriting the sum in terms of k = j − i gives:

P(X = i) = (1/i!) e^{−2λ} Σ_{k=0}^{∞} λ^{k+i} / k! = (λ^i / i!) e^{−2λ} Σ_{k=0}^{∞} λ^k / k!.

Since the remaining sum is the Taylor series for e^{λ}, we obtain:

P(X = i) = e^{−λ} λ^i / i!.

Thus, X ∼ Poisson(λ).

P(X = i) = e^{−λ} λ^i / i!.

(c) PMF of Y − X: Define k = Y − X. Then,

P(X = i, Y − X = k) = P(X = i, Y = k + i).

From the given joint PMF:

P(X = i, Y = k + i) = \binom{k+i}{i} e^{−2λ} λ^{k+i} / (k + i)!.

Since \binom{k+i}{i} / (k + i)! = 1 / (i! k!), this factors as:

P(X = i, Y − X = k) = (e^{−λ} λ^i / i!) · (e^{−λ} λ^k / k!).

Because the joint PMF of X and Y − X factors into a Poisson(λ) term in i times a Poisson(λ) term in k, X and Y − X are independent, each with a Poisson(λ) distribution.

Thus, the PMF of Y − X is:

P(Y − X = k) = e^{−λ} λ^k / k!.
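Parts (b) and (c) can be checked by direct enumeration of the joint PMF. The sketch below (standard library only; λ and the truncation level are arbitrary choices, with the truncation large enough that the neglected tail is negligible) compares P(Y − X = k) with the Poisson(λ) PMF and spot-checks the independence factorization.

    from math import comb, exp, factorial

    lam = 1.3
    I_MAX = 60  # truncation level for the sums over i

    def joint_pmf(i, j):
        return comb(j, i) * exp(-2 * lam) * lam ** j / factorial(j)

    def poisson(mean, k):
        return exp(-mean) * mean ** k / factorial(k)

    # P(Y - X = k) = sum over i of P(X = i, Y = k + i)
    def pmf_diff(k):
        return sum(joint_pmf(i, k + i) for i in range(I_MAX))

    for k in range(5):
        print(k, round(pmf_diff(k), 10), round(poisson(lam, k), 10))

    # Independence spot-check: P(X = i, Y - X = k) vs P(X = i) * P(Y - X = k)
    i, k = 2, 3
    print(joint_pmf(i, k + i), poisson(lam, i) * poisson(lam, k))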

Problem 63
Problem: Calculate the moment generating function (MGF) of a geometric
random variable.

Solution:
Let X be a geometric random variable with probability mass function (PMF):

P(X = n) = (1 − p)^{n−1} p,   n = 1, 2, 3, . . .
The moment generating function (MGF) is given by:

ϕ(t) = E[e^{tX}] = Σ_{n=1}^{∞} e^{tn} P(X = n).

Substituting the PMF:

ϕ(t) = Σ_{n=1}^{∞} e^{tn} (1 − p)^{n−1} p.

Factoring out constants:

ϕ(t) = p e^t Σ_{n=1}^{∞} [(1 − p) e^t]^{n−1}.

The summation is a geometric series with first term 1 and common ratio (1 − p)e^t, which converges for (1 − p)e^t < 1:

Σ_{n=0}^{∞} r^n = 1 / (1 − r),   for |r| < 1.

Applying this result:

Σ_{n=1}^{∞} [(1 − p)e^t]^{n−1} = 1 / (1 − (1 − p)e^t).
Thus, the MGF simplifies to:

ϕ(t) = p e^t / (1 − (1 − p)e^t),   for e^t < 1/(1 − p).

ϕ(t) = p e^t / (1 − (1 − p)e^t),   for t < −ln(1 − p).
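A numerical sanity check of the closed form (a sketch; p = 0.3 and t = 0.2 are arbitrary admissible choices): the truncated series Σ e^{tn} P(X = n) should agree closely with p e^t / (1 − (1 − p)e^t).

    from math import exp, log

    p, t = 0.3, 0.2
    assert t < -log(1 - p)  # convergence condition

    # Direct (truncated) sum of e^{tn} (1-p)^(n-1) p; the tail beyond n = 500 is negligible here.
    direct = sum(exp(t * n) * (1 - p) ** (n - 1) * p for n in range(1, 500))

    closed_form = p * exp(t) / (1 - (1 - p) * exp(t))
    print(direct, closed_form)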

Problem 76
Problem: Let X and Y be independent random variables with means µ_X and µ_Y and variances σ_X^2 and σ_Y^2. Show that:

Var(XY) = σ_X^2 σ_Y^2 + µ_Y^2 σ_X^2 + µ_X^2 σ_Y^2.

Solution:
Since X and Y are independent, the expectation of the product is given by:

E[XY] = E[X] E[Y] = µ_X µ_Y.

Next, we compute E[(XY)^2]:

E[(XY)^2] = E[X^2 Y^2].

Again by independence, we have:

E[X^2 Y^2] = E[X^2] E[Y^2].


Using the identity:

E[X^2] = Var(X) + (E[X])^2 = σ_X^2 + µ_X^2,

and similarly,

E[Y^2] = σ_Y^2 + µ_Y^2,

we obtain:

E[X^2 Y^2] = (σ_X^2 + µ_X^2)(σ_Y^2 + µ_Y^2).
Now, computing the variance:

Var(XY) = E[(XY)^2] − (E[XY])^2.

Substituting the values:

Var(XY) = (σ_X^2 + µ_X^2)(σ_Y^2 + µ_Y^2) − µ_X^2 µ_Y^2.

Expanding the product:

Var(XY) = σ_X^2 σ_Y^2 + µ_X^2 σ_Y^2 + µ_Y^2 σ_X^2 + µ_X^2 µ_Y^2 − µ_X^2 µ_Y^2.

Simplifying:

Var(XY) = σ_X^2 σ_Y^2 + µ_Y^2 σ_X^2 + µ_X^2 σ_Y^2.

Var(XY) = σ_X^2 σ_Y^2 + µ_Y^2 σ_X^2 + µ_X^2 σ_Y^2.
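A Monte Carlo check of the formula (a sketch; NumPy assumed, and the independent normal variables with arbitrary means and variances are simply a convenient choice, since any distributions with the stated moments would do):

    import numpy as np

    rng = np.random.default_rng(4)
    mu_x, sigma_x = 2.0, 1.5
    mu_y, sigma_y = -1.0, 0.5
    n = 2_000_000

    x = rng.normal(mu_x, sigma_x, size=n)
    y = rng.normal(mu_y, sigma_y, size=n)  # generated independently of x

    empirical = np.var(x * y)
    formula = (sigma_x ** 2 * sigma_y ** 2
               + mu_y ** 2 * sigma_x ** 2
               + mu_x ** 2 * sigma_y ** 2)
    print(empirical, formula)  # the two values should agree closely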

Problem 79
Problem: With K(t) = log(E[e^{tX}]), show that:

K′(0) = E[X],   K′′(0) = Var(X).

Solution:
The moment generating function (MGF) of X is:

M_X(t) = E[e^{tX}].

The cumulant generating function (CGF) is:

K(t) = log M_X(t) = log E[e^{tX}].


Differentiating K(t):

K′(t) = E[X e^{tX}] / E[e^{tX}].

Evaluating at t = 0:

K′(0) = E[X e^{0·X}] / E[e^{0·X}] = E[X] / 1 = E[X].
Now, differentiating again:

K′′(t) = ( E[X^2 e^{tX}] E[e^{tX}] − (E[X e^{tX}])^2 ) / (E[e^{tX}])^2.

Evaluating at t = 0:

K′′(0) = ( E[X^2] − (E[X])^2 ) / 1 = E[X^2] − (E[X])^2 = Var(X).
Thus, we conclude:

K′(0) = E[X],   K′′(0) = Var(X).
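These identities can be verified numerically for a concrete distribution by computing K(t) exactly and approximating its derivatives at 0 with finite differences (a sketch; NumPy assumed, and the three-point discrete distribution is an arbitrary example).

    import numpy as np

    # A small discrete distribution so that E[e^{tX}] can be computed exactly.
    values = np.array([0.0, 1.0, 3.0])
    probs = np.array([0.2, 0.5, 0.3])

    def K(t):
        return np.log(np.sum(probs * np.exp(t * values)))

    h = 1e-5
    K1 = (K(h) - K(-h)) / (2 * h)              # central-difference estimate of K'(0)
    K2 = (K(h) - 2 * K(0.0) + K(-h)) / h ** 2  # estimate of K''(0)

    mean = np.sum(probs * values)
    var = np.sum(probs * values ** 2) - mean ** 2
    print(K1, mean)  # should match E[X]
    print(K2, var)   # should match Var(X)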
