
IEOR E4102 Stochastic Modeling for MSE Spring 2025

Dr. A. B. Dieker

Homework 2
due on Friday February 7, 2025, 11:59pm EST
Include all intermediate steps of the computations in your answers. If the answer is readily
available on the web (e.g., on Wikipedia), then credit is only given for the intermediate steps.

1. Problem 3.27 of our text.


(This means problem 27 from Chapter 3; the page number printed on top of the page is 167.)
Solution. Let us define N_{TTH} as the number of trials required until we get the first TTH, and
N_{TT} as the number of trials required until we get the first TT. Let us also call A_{TT→TTH} the
additional number of trials required after seeing TT until we get the first TTH. We then have

N_{TTH} = N_{TT} + A_{TT→TTH},

which implies by linearity of expectation that

E[N_{TTH}] = E[N_{TT}] + E[A_{TT→TTH}].

We know from class (or Example 3.15 of the book) that

E[N_{TT}] = 1/(1 − p) + 1/(1 − p)^2.

There are two ways to calculate E[A_{TT→TTH}]. The first relies on the observation that A_{TT→TTH}
is geometric with parameter p, since after getting the first TT we wait until we get an H and
then we have obtained the first TTH. This argument implies that E[A_{TT→TTH}] = 1/p.
The second way is to condition on the coin flip after the first TT. We find that

E[A_{TT→TTH}] = E[A_{TT→TTH} | heads after first TT] P(heads after first TT)
              + E[A_{TT→TTH} | tails after first TT] P(tails after first TT)
              = 1 × p + (E[A_{TT→TTH}] + 1) × (1 − p).

Solving for E[A_{TT→TTH}] yields E[A_{TT→TTH}] = 1/p.


We conclude that

E[N_{TTH}] = 1/(1 − p) + 1/(1 − p)^2 + 1/p = 1/(p(1 − p)^2).
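
As an optional sanity check (not part of the required solution), here is a small Monte Carlo sketch in Python that estimates E[N_{TTH}] by simulation and compares it to 1/(p(1 − p)^2); the choice p = 0.3, the seed, and the number of trials are arbitrary assumptions made only for this illustration.

import random

def flips_until_tth(p, rng):
    # Flip a coin with P(heads) = p until the pattern T, T, H first appears;
    # return the total number of flips used.
    last3 = []
    n = 0
    while True:
        n += 1
        flip = 'H' if rng.random() < p else 'T'
        last3 = (last3 + [flip])[-3:]
        if last3 == ['T', 'T', 'H']:
            return n

rng = random.Random(0)
p = 0.3
trials = 200_000
avg = sum(flips_until_tth(p, rng) for _ in range(trials)) / trials
print(avg, 1 / (p * (1 - p) ** 2))  # the two numbers should be close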
2. Problem 3.39 of our text, except (g) and (h).
Solution.
(a) Let N denote the number of cycles, and let X denote the position of 1. We first note that
X is uniform on {1, . . . , n} by symmetry. As a result, we have for n ≥ 1, by conditioning
on X,

m_n = E(N) = \sum_{i=1}^{n} E[N | X = i] P(X = i) = (1/n) \sum_{i=1}^{n} E[N | X = i] = (1/n) \sum_{i=1}^{n-1} (1 + m_{n-i}) + 1/n,

where the last equality uses the fact that if X = i, there is one cycle until we see card i and
then we 'restart' the experiment with n − i cards. If i = n then there is no need to restart.
We furthermore note that this recursion can also be written as m_n = 1 + (1/n) \sum_{k=1}^{n-1} m_k.
(b) Using the formula above, and taking m_0 = 0, we have that m_1 = 1, m_2 = 3/2, m_3 = 11/6 and
m_4 = 25/12.
(c) One conjecture from parts (a) and (b) can be: m_n = 1 + 1/2 + . . . + 1/n (this conjecture is also verified numerically in the optional sketch after part (f)).
(d) Base case is satisfied, as m_0 = 0 (or m_1 = 1). Assume the induction hypothesis that
m_i = 1 + 1/2 + . . . + 1/i for all i < n. Then we have

m_n = 1 + (1/n) \sum_{i=1}^{n-1} (1 + 1/2 + . . . + 1/i) = 1 + (1/n) ((n − 1) + (n − 2)/2 + . . . + 1/(n − 1))
    = 1 + (1 + 1/2 + . . . + 1/(n − 1)) − (n − 1)/n = 1 + 1/2 + . . . + 1/n,

where the third equality uses (n − j)/(nj) = 1/j − 1/n for each term.

(e) At each i for which X_i = 1 a cycle is added to the total number of cycles. We conclude
that N = \sum_{i=1}^{n} X_i.
(f) By linearity of expectation, we have m_n = E(N) = \sum_{i=1}^{n} E(X_i) = \sum_{i=1}^{n} P(X_i = 1). Note
that X_i = 1 if and only if i is the last of 1, 2, . . . , i, and each permutation of 1, 2, . . . , i is
equally likely to appear in the deck of cards. Once again we conclude that m_n = \sum_{i=1}^{n} 1/i.
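
As an optional check of parts (b)–(d) (not part of the required solution), the sketch below evaluates the recursion m_n = 1 + (1/n) \sum_{k=1}^{n-1} m_k with exact fractions and compares it to the harmonic sum 1 + 1/2 + . . . + 1/n; the range of n tested is an arbitrary choice.

from fractions import Fraction

def m_recursive(n):
    # m_0 = 0 and m_j = 1 + (1/j)(m_1 + ... + m_{j-1}) for j >= 1, as in part (a)
    m = [Fraction(0)]
    for j in range(1, n + 1):
        m.append(1 + sum(m[1:j], Fraction(0)) / j)
    return m[n]

def harmonic(n):
    # 1 + 1/2 + ... + 1/n, the conjecture from part (c)
    return sum(Fraction(1, i) for i in range(1, n + 1))

for n in range(1, 11):
    assert m_recursive(n) == harmonic(n)
print(m_recursive(2), m_recursive(3), m_recursive(4))  # 3/2, 11/6, 25/12 as in part (b)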

3. Problem 3.44 of our text.


Solution. Let X be the number of people entering a shop during a given day, and U_1, U_2, . . .
be the money spent by the customers. It is reasonable to assume independence between X
and the U_i, and that the U_i are i.i.d. Also, since X ∼ Ps(10), we know that E(X) = 10,
Var(X) = 10, and since U_i ∼ Unif(0, 100), E(U_i) = 50, Var(U_i) = 5000/6. From class we know
that E(\sum_{i=1}^{X} U_i | X) = X E(U_1), using that the U_i's are all independent of X and i.i.d. Thus we find
that

E(\sum_{i=1}^{X} U_i) = E(E(\sum_{i=1}^{X} U_i | X)) = E(X E(U_1)) = E(U_1) E(X) = 500.

For the variance, we use the law of total variance, which gives us:

Var(\sum_{i=1}^{X} U_i) = E[Var(\sum_{i=1}^{X} U_i | X)] + Var[E(\sum_{i=1}^{X} U_i | X)].

From class we know that Var(\sum_{i=1}^{X} U_i | X) = X Var(U_1), again using that the U_i's are all independent
of X and i.i.d. Continuing, again using E(\sum_{i=1}^{X} U_i | X) = X E(U_1), we find that

Var(\sum_{i=1}^{X} U_i) = Var(U_1) E(X) + (E(U_1))^2 Var(X) = 10^5 / 3.
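
As an optional numerical check (not part of the required solution), the following sketch simulates the daily total under the stated assumptions X ∼ Poisson(10) and U_i ∼ Unif(0, 100); the seed and the number of simulated days are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
days = 200_000
totals = np.empty(days)
for d in range(days):
    x = rng.poisson(10)                       # number of customers entering that day
    totals[d] = rng.uniform(0, 100, x).sum()  # total amount they spend
print(totals.mean(), totals.var())  # should be close to 500 and 10**5/3 ≈ 33333.3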

4. Problem 3.56 of our text.


Solution.

(a) Let Y be a Bernoulli random variable defined by Y = 1 if it rains tomorrow and Y = 0 if it
doesn't rain tomorrow.

We are given that X is conditionally Poisson(9) given Y = 1, hence E(X|Y = 1) =
Var(X|Y = 1) = 9. Similarly, since X is conditionally Poisson(3) given Y = 0, we have
E(X|Y = 0) = Var(X|Y = 0) = 3. By conditioning on Y , we find that

E(X) = E(X|Y = 1)P (Y = 1) + E(X|Y = 0)P (Y = 0) = 9 × 0.6 + 3 × 0.4 = 6.6.

(b) Conditioning as before, we have that

P(X = 0) = P(X = 0|Y = 1)P(Y = 1) + P(X = 0|Y = 0)P(Y = 0)
         = 0.6 e^{-9} + 0.4 e^{-3} ≈ 0.02.

(c) Using the conditional variance formula, Var(X) = E[Var(X|Y )] + Var[E(X|Y )]. As seen
before, we have E[X|Y ] = Var(X|Y ) = 3 if Y = 0 and 9 if Y = 1.
A different way of saying this is E[X|Y ] = 3 + 6Y . From Var(X|Y ) = E[X|Y ] we immedi-
ately find that E[Var(X|Y )] = E[E[X|Y ]] = E[X] = 6.6. We next observe that

Var(E[X|Y ]) = Var(3 + 6Y ) = 36 × (6/10) × (1 − 6/10) = 8.64.

Our conclusion is that

Var(X) = E[Var(X|Y )] + Var[E(X|Y )] = 6.6 + 8.64 = 15.24.

You can also find this without the conditional variance formula. We have Var(X) =
E(X^2) − (E(X))^2. Moreover, E(X^2 |Y = 1) = Var(X|Y = 1) + (E(X|Y = 1))^2 = 90 and
E(X^2 |Y = 0) = Var(X|Y = 0) + (E(X|Y = 0))^2 = 12. Conditioning as before yields

E(X^2) = E(X^2 |Y = 1)P(Y = 1) + E(X^2 |Y = 0)P(Y = 0) = 90 × 0.6 + 12 × 0.4 = 58.8.

We conclude that Var(X) = 58.8 − 6.6^2 = 15.24.
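
As an optional check (not part of the required solution), the short computation below reproduces the numbers from parts (a)–(c) directly from the mixture description; it uses only the data given in the problem.

import math

p_rain = 0.6
EX = 9 * p_rain + 3 * (1 - p_rain)                        # part (a): 6.6
P0 = p_rain * math.exp(-9) + (1 - p_rain) * math.exp(-3)  # part (b): about 0.02
EX2 = (9 + 81) * p_rain + (3 + 9) * (1 - p_rain)          # E(X^2|Y) = Var(X|Y) + (E(X|Y))^2
VarX = EX2 - EX ** 2                                      # part (c): 15.24
print(EX, P0, VarX)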

5. Ungraded challenge problem. Problem 3.48 of our text.


Solution. It is given that E(Y_i |X) = X for all i = 1, . . . , n. Using the conditional variance
formula, we have that

Var(Y_i) = Var[E(Y_i |X)] + E[Var(Y_i |X)] = Var(X) + E[Var(Y_i |X)].

Now we note that Var(Y_i |X) = E(Y_i^2 |X) − (E(Y_i |X))^2 = E(Y_i^2 |X) − X^2 and we therefore
obtain that E[Var(Y_i |X)] = E(Y_i^2) − E(X^2). Using Var(X) = E(X^2) − (E(X))^2, we deduce
that Var(Y_i) = E(Y_i^2) − (E(X))^2.

We also find that

E[(Y_i − X)^2] = E(Y_i^2) + E(X^2) − 2 E[Y_i X] = E(Y_i^2) + E(X^2) − 2 E[E(Y_i X|X)]
             = E(Y_i^2) + E(X^2) − 2 E[X E(Y_i |X)] = E(Y_i^2) − E(X^2),

so combining these two observations yields

E[(Y_i − X)^2] = E(Y_i^2) − (E(X))^2 + (E(X))^2 − E(X^2) = Var(Y_i) − Var(X).

Clearly the left-hand side is minimized when we pick Y_i such that its variance is the lowest.
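
As an optional illustration (not part of the required solution), the sketch below checks the identity E[(Y_i − X)^2] = Var(Y_i) − Var(X) under one concrete model with E(Y_i | X) = X, namely Y_i = X plus mean-zero noise; the gamma and normal distributions and their parameters are arbitrary assumptions made only for this illustration.

import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
X = rng.gamma(shape=2.0, scale=1.5, size=n)  # any X with finite variance works here
Y = X + rng.normal(0.0, 2.0, size=n)         # E(Y|X) = X since the added noise has mean 0
lhs = np.mean((Y - X) ** 2)
rhs = Y.var() - X.var()
print(lhs, rhs)  # the two estimates should be close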
