EE3110 Jul 2024 Tutorial 2 Solutions
1. Let A be the event that the batch will be accepted. Then, A = A1 ∩ A2 ∩ A3 ∩ A4 where Ai is the event
that the ith item is not defective. Hence, by the multiplication rule, we have
P(A) = P(A1)P(A2 | A1)P(A3 | A1 ∩ A2)P(A4 | A1 ∩ A2 ∩ A3).
2. Intuitively, there is something wrong with this rationale. The reason is that it is not based on a correctly
specified probabilistic model. In particular, the event where both of the other prisoners are to be released
is not properly accounted for in the calculation of the posterior probability of release.
To be precise, let A, B, and C be the prisoners, and let A be the one who considers asking the guard.
Suppose that all prisoners are a priori equally likely to be released. Suppose also that if B and C are to
be released, then the guard chooses B or C with equal probability to reveal to A. Then, there are four
possible outcomes:
1. A and B are to be released, and the guard says B (probability 1/3)
2. A and C are to be released, and the guard says C (probability 1/3)
3. B and C are to be released, and the guard says B (probability 1/6)
4. B and C are to be released, and the guard says C (probability 1/6)
Thus,
P(A is to be released | guard says B) = P(A is to be released and guard says B)/P(guard says B)
= (1/3)/(1/3 + 1/6) = 2/3.
Similarly, P(A is to be released | guard says C) = 2/3.
Thus, regardless of the identity revealed by the guard, the probability that A is released is equal to 2/3,
the a priori probability of being released.
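The posterior can also be checked by enumerating the four outcomes listed above (a minimal Python sketch; the probabilities are exactly those in the list):

from fractions import Fraction

# Outcomes: (pair to be released, guard's answer, probability), as listed above.
outcomes = [
    ({"A", "B"}, "B", Fraction(1, 3)),
    ({"A", "C"}, "C", Fraction(1, 3)),
    ({"B", "C"}, "B", Fraction(1, 6)),
    ({"B", "C"}, "C", Fraction(1, 6)),
]
says_B = sum(prob for released, answer, prob in outcomes if answer == "B")
a_and_says_B = sum(prob for released, answer, prob in outcomes if answer == "B" and "A" in released)
print(a_and_says_B / says_B)    # 2/3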
3. Let A be the event that the first toss is a head and B be the event that second toss is a head. Shipra
claims that P(A ∩ B|A) ≥ P(A ∩ B|A ∪ B). Let us analyze the claim:
P(A ∩ B | A) = P(A ∩ B ∩ A)/P(A) = P(A ∩ B)/P(A). (1)
which is exactly the definition of B suggesting A. Hence, the forward implication is proved. A
similar proof can be used for the backward implication.
(b) Aim: To show B suggests A if and only if B^c does not suggest A.
Hence, we’ve shown A suggests B. Using (1), we can tell that B suggests A. Hence, we’ve proved
the forward implication. The same proof can be started from the end and reversed to prove the
backward implication.
(c) Given:
P(Treasure is in first place) = β, P(Treasure is in second place) = 1 − β.
P(Finding treasure given treasure is present) = p
If j ≠ i, we have
P(B | A) = P(A ∩ B)/P(A) = P(B)/P(A) = p_j / (1 − p_i d_i).
Similarly, if i = j, we have
P(B | A) = P(A ∩ B)/P(A) = P(B)P(A | B)/P(A) = p_i(1 − d_i) / (1 − p_i d_i).
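As a sanity check, both posteriors can be evaluated numerically. The short Python sketch below assumes, as the denominators above suggest, that d_i is the probability that a search of place i finds the object when it is actually there (so a search of place i fails with probability 1 − p_i d_i); the prior and detection values are arbitrary illustrative numbers.

# Numeric check of the two posteriors above. Assumed reading: d_i is the
# probability that a search of place i finds the object given it is there.
p = [0.4, 0.35, 0.25]   # illustrative priors p_1, p_2, p_3 (they sum to 1)
d = [0.9, 0.8, 0.7]     # illustrative detection probabilities d_1, d_2, d_3
i = 0                   # index of the place that was searched unsuccessfully
p_fail = 1 - p[i] * d[i]                     # P(A) = 1 - p_i d_i
for j in range(len(p)):
    if j == i:
        post = p[i] * (1 - d[i]) / p_fail    # p_i (1 - d_i) / (1 - p_i d_i)
    else:
        post = p[j] / p_fail                 # p_j / (1 - p_i d_i)
    print("place", j + 1, ":", post)
# The posteriors sum to 1, as they should.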
6. Let A be the event that the first n − 1 tosses produce an even number of heads, and let E be the event
that the nth toss is a head. We can obtain an even number of heads in n tosses in two distinct ways:
1) there is an even number of heads in the first n − 1 tosses, and the nth toss results in tails: this is the
event A ∩ E^c; 2) there is an odd number of heads in the first n − 1 tosses, and the nth toss results in
heads: this is the event A^c ∩ E. Using also the independence of A and E,
q_n = P(A ∩ E^c) + P(A^c ∩ E) = q_{n−1}(1 − p) + (1 − q_{n−1})p,
where p is the probability of heads and q_k is the probability of an even number of heads in k tosses.
We now use induction. For n = 0, we have q_0 = 1, which agrees with the given formula for q_n. Assume
that the formula holds with n replaced by n − 1, i.e.,
q_{n−1} = (1 + (1 − 2p)^(n−1))/2.
Then, using the recursion above,
q_n = q_{n−1}(1 − p) + (1 − q_{n−1})p = (1 − 2p)q_{n−1} + p = (1 − 2p)·(1 + (1 − 2p)^(n−1))/2 + p = (1 + (1 − 2p)^n)/2,
which is the given formula and completes the induction.
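The closed form is easy to cross-check against this recursion numerically; a minimal Python sketch with an arbitrary value of p:

# Cross-check of q_n = (1 + (1 - 2p)**n) / 2 against the recursion
# q_n = q_{n-1}(1 - p) + (1 - q_{n-1}) p, for an arbitrary head probability p.
p = 0.3            # illustrative value of the head probability
q = 1.0            # q_0 = 1: zero tosses give zero heads, which is even
for n in range(1, 11):
    q = q * (1 - p) + (1 - q) * p
    closed_form = (1 + (1 - 2 * p) ** n) / 2
    assert abs(q - closed_form) < 1e-12
print("recursion matches the closed form for n = 1, ..., 10")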
(b) Let B be the event that the roll results in a sum of 4 or less. We need to calculate P(A|B)
P(A|B) = P(A ∩ B)/P(B) = (2/36)/(6/36) = 1/3.
(d) Let D be the event that the two dice land on different numbers.
#(D) = 30 and #(C ∩ D) = 10 so
P(C|D) = (10/36)/(30/36) = 1/3.
8.
P(ill | test+) = P(test+ | ill)P(ill) / [P(test+ | ill)P(ill) + P(test+ | healthy)P(healthy)]
= (99/100)·10^(−5) / [(99/100)·10^(−5) + (1/100)·(1 − 10^(−5))]
≈ 1/1011.
The chance of being ill is rather small. Indeed it is more likely that the test was incorrect.
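Plugging the numbers into Bayes' rule directly gives the same figure (a quick Python check):

# P(ill | test+) with P(ill) = 1e-5, P(test+ | ill) = 0.99, P(test+ | healthy) = 0.01.
p_ill = 1e-5
p_pos_given_ill = 0.99
p_pos_given_healthy = 0.01
posterior = (p_pos_given_ill * p_ill) / (
    p_pos_given_ill * p_ill + p_pos_given_healthy * (1 - p_ill)
)
print(posterior)          # about 9.89e-4, i.e. roughly 1 in 1011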
9. Let A denote the event that the family has at least one boy, and B denote the event that it has at least
one girl. Then,
Pr(B) = 1 − (1/2)^n,
Pr(A ∩ B) = 1 − Pr(All Boys) − Pr(All Girls) = 1 − (1/2)^n − (1/2)^n = 1 − (1/2)^(n−1).
Hence,
Pr(A|B) = Pr(A ∩ B)/Pr(B) = (1 − (1/2)^(n−1)) / (1 − (1/2)^n).
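The formula can be confirmed by enumerating all 2^n equally likely boy/girl sequences (a brief Python sketch):

from fractions import Fraction
from itertools import product

def p_boy_given_girl(n):
    # Enumerate all 2**n equally likely boy/girl sequences.
    sequences = list(product("BG", repeat=n))
    at_least_one_girl = [s for s in sequences if "G" in s]
    also_a_boy = [s for s in at_least_one_girl if "B" in s]
    return Fraction(len(also_a_boy), len(at_least_one_girl))

for n in range(2, 6):
    formula = (1 - Fraction(1, 2) ** (n - 1)) / (1 - Fraction(1, 2) ** n)
    assert p_boy_given_girl(n) == formula
print("enumeration matches the formula for n = 2, ..., 5")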
10. Let X and Y denote the number of tosses required on the first experiment and second experiment,
respectively. Then X = n if and only if the first n − 1 tosses of the first experiment are tails and the
nth toss is a head, which has probability (1/2)^n. Furthermore, Y > n if and only if the first n tosses of
the second experiment are all tails, which also has probability (1/2)^n. Hence,
Pr(Y > X) = Σ_{n=1}^∞ Pr(Y > n | X = n) Pr(X = n) = Σ_{n=1}^∞ (1/2^n)·(1/2^n) = Σ_{n=1}^∞ 1/4^n = 1/3.
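A short simulation of the two experiments reproduces the value 1/3 (minimal Python sketch):

import random

# Monte Carlo check of Pr(Y > X), where X and Y are the numbers of tosses needed
# to obtain the first head in two independent fair-coin experiments.
def first_head():
    n = 1
    while random.random() < 0.5:     # tails with probability 1/2: keep tossing
        n += 1
    return n

trials = 200_000
count = sum(first_head() < first_head() for _ in range(trials))   # event {X < Y}
print(count / trials)    # close to 1/3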
A = (A ∩ C) ∪ (A ∩ C^c)
Note that by the definition of P_B and the fact that it is indeed another probability map (i.e., it satisfies
the axioms of probability), we have
P_B(A | C) = P(A | B ∩ C).
Applying the total probability theorem under P_B to the decomposition of A above, we have
P(A | B) = P_B(A) = P_B(A | C)P_B(C) + P_B(A | C^c)P_B(C^c) = P(A | B ∩ C)P(C | B) + P(A | B ∩ C^c)P(C^c | B).
Comparing with the expression given in the question,
α = P(C|B) = P(C)/P(B) (as C is a subset of B).
12. The rth urn contains r − 1 red and N − r blue balls. Hence the total number of balls in each urn is N − 1.
(a)
P(2nd ball is blue | rth urn is chosen)
= P(2nd is blue | rth urn, 1st is red)·P(1st is red | rth urn) + P(2nd is blue | rth urn, 1st is blue)·P(1st is blue | rth urn)
= [(N − r)/(N − 2)]·[(r − 1)/(N − 1)] + [(N − r − 1)/(N − 2)]·[(N − r)/(N − 1)]
= (N − r)/(N − 1).
P(second ball is blue) = Σ_{r=1}^{N} P(second ball is blue | rth urn is selected)·P(rth urn is selected)
= Σ_{r=1}^{N} [(N − r)/(N − 1)]·(1/N) = 1/2.
(b)
P(first ball is blue) = Σ_{r=1}^{N} P(first ball is blue | rth urn is selected)·P(rth urn is selected)
= Σ_{r=1}^{N} [(N − r)/(N − 1)]·(1/N) = 1/2.
P(both the balls are blue | rth urn is selected) = (N − r)(N − r − 1)/[(N − 1)(N − 2)] for r = 1, ..., N − 2, and 0 for r = N − 1, N.
P(both the balls are blue) = Σ_{r=1}^{N−2} [(N − r)(N − r − 1)/((N − 1)(N − 2))]·(1/N)
= [1/(N(N − 1)(N − 2))]·Σ_{r=1}^{N−2} (N − r)(N − r − 1)
= 1/3.
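For a concrete N, all three answers (1/2, 1/2, and 1/3) can be verified exactly by summing over the urns; a small Python sketch with N = 10:

from fractions import Fraction

N = 10   # any N >= 3 gives the same three answers
p_first_blue = sum(Fraction(N - r, N - 1) * Fraction(1, N) for r in range(1, N + 1))
p_second_blue = p_first_blue   # the sum in part (a) is the identical expression
p_both_blue = sum(
    Fraction((N - r) * (N - r - 1), (N - 1) * (N - 2)) * Fraction(1, N)
    for r in range(1, N + 1)   # the summand is 0 for r = N - 1 and r = N
)
print(p_first_blue, p_second_blue, p_both_blue)   # 1/2 1/2 1/3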
13. (a)
(b)
(c) Majority Decision Decoder: To transmit one symbol, the transmitter sends the same symbol 3 times,
so the receiver receives 3 symbols. If at least 2 of the 3 received symbols are 1, the receiver decodes the
transmitted symbol as 1; otherwise it decodes it as 0.
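A small sketch of this repetition-3 scheme with majority decoding over a binary symmetric channel; the crossover probability eps below is only an illustrative assumption, not a value given in the problem:

import random
from collections import Counter

def encode(bit):
    return [bit] * 3                     # send the same symbol three times

def channel(bits, eps):
    # Each transmitted symbol is flipped independently with probability eps.
    return [b ^ (random.random() < eps) for b in bits]

def decode(received):
    # Majority decision: output the symbol that occurs at least twice.
    return Counter(received).most_common(1)[0][0]

eps = 0.1            # assumed crossover probability, for illustration only
trials = 100_000
errors = sum(decode(channel(encode(1), eps)) != 1 for _ in range(trials))
print(errors / trials)   # close to 3*eps**2*(1 - eps) + eps**3 = 0.028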
15. Let D be the event that a person is HIV positive, and T be the event that the person tests positive.
P(D | T) = P(D ∩ T)/P(T) = (0.99)(0.003) / [(0.99)(0.003) + (0.01)(0.997)] ≈ 23%.
A short reason why this surprising result holds is that the error in the test is much greater than the
percentage of people with HIV. This leads to many more false positives than “expected”.
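A quick simulation with the same numbers (prevalence 0.3%, sensitivity 99%, false positive rate 1%) reproduces the ≈ 23% figure:

import random

# Simulate the population and condition on a positive test result.
trials = 1_000_000
positive = 0
ill_and_positive = 0
for _ in range(trials):
    ill = random.random() < 0.003                          # P(D) = 0.3%
    test_pos = random.random() < (0.99 if ill else 0.01)   # sensitivity / false positive rate
    if test_pos:
        positive += 1
        ill_and_positive += ill
print(ill_and_positive / positive)    # close to 0.23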
16. The game can be described as having probability 1/2 of winning 1 dollar and a probability 1/2 of losing
1 dollar. A player begins with a given number of dollars, and intends to play the game repeatedly until
the player either goes broke or increases his holdings to N dollars.
For any given amount n of current holdings, the conditional probability of reaching N dollars before
going broke is independent of how we acquired the n dollars, so there is a unique probability P(N | n)
of reaching N on the condition that we currently hold n dollars. Of course, for any finite N we see that
P(N | 0) = 0 and P(N | N ) = 1. The problem is to determine the values of P(N | n) for n between 0
and N .
We are considering this setting for N = 200, and we would like to find P(200 | 50). Denote y(n) :=
P(200 | n), which is the probability you get to 200 without first getting ruined if you start with n dollars.
We saw that y(0) = 0 and y(200) = 1. Suppose the player has n dollars at the moment, the next round
will leave the player with either n + 1 or n − 1 dollars, both with probability 1/2. Thus the current
probability of winning is the same as a weighted average of the probabilities of winning in the player's two
possible next states. So we can safely say that
y(n) = (1/2)·y(n + 1) + (1/2)·y(n − 1).
Multiplying by 2 and subtracting y(n) + y(n − 1) from each side, we have
y(n + 1) − y(n) = y(n) − y(n − 1).
This says that the slope of the graph of y(n) is the same on adjacent intervals (remember that n must
be an integer). In other words, the graph of y(n) must be a line. Since y(0) = 0 and y(200) = 1, we have
y(n) = n/200, and therefore y(50) = 1/4. Another way to see what the function y(n) is, is to use the
telescoping sum
y(n) = (y(n) − y(n − 1)) + (y(n − 1) − y(n − 2)) + ... + (y(1) − y(0)) = n·(y(1) − y(0)) = n·y(1),
since all these differences are the same and y(0) = 0. To find y(1) we can use the fact that y(200) = 1,
so y(1) = 1/200, and therefore y(n) = n/200 and y(50) = 1/4.
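A rough Monte Carlo check that y(50) = 1/4 for the fair game with target N = 200 (minimal Python sketch):

import random

def reaches_target(start, target):
    # Play the fair game until the player is either broke (0) or reaches the target.
    n = start
    while 0 < n < target:
        n += random.choice((-1, 1))   # win or lose 1 dollar, each with probability 1/2
    return n == target

trials = 2_000
wins = sum(reaches_target(50, 200) for _ in range(trials))
print(wins / trials)    # close to 0.25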
17. Let K be the event that an examinee knows the answer, G the event of the examinee guesses, and C the
event that the answer is correct.
P(K) = p, P(G) = 1 − p
P(C|K) = 1, P(C|G) = 1/m
We need to find
P(K|C) = P(C|K)P(K) / [P(C|K)P(K) + P(C|G)P(G)] = p / (p + (1 − p)/m).
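A short simulation of the exam model confirms the formula; p = 0.6 and m = 4 below are illustrative values only:

import random

# Know the answer with probability p; otherwise guess uniformly among m choices.
p, m = 0.6, 4            # illustrative parameter values
trials = 500_000
correct = knew = 0
for _ in range(trials):
    knows = random.random() < p
    is_correct = knows or (random.random() < 1 / m)
    if is_correct:
        correct += 1
        knew += knows
print(knew / correct, p / (p + (1 - p) / m))   # both close to 6/7 ≈ 0.857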