Comm2 Assgn1
Communication-II [EC31204]
Assignment-1
Group no.:17
Members:
1 Question 1
Let G denote the event that a randomly selected student is a genius, and C the event that the student is a chocolate lover.
Given:
P(G) = 0.6, P(C) = 0.7, P(G ∩ C) = 0.4
Now, the probability that the selected student belongs neither to G nor to C is:
P((G ∪ C)ᶜ) = 1 − P(G ∪ C) = 1 − [P(G) + P(C) − P(G ∩ C)] = 1 − (0.6 + 0.7 − 0.4) = 0.1
Thus, the probability that a randomly selected student is neither a genius nor a chocolate lover is 0.1.
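A quick numeric sketch of the inclusion–exclusion computation, with the three probabilities taken from the problem statement:

```python
# P(neither G nor C) = 1 - P(G ∪ C) = 1 - [P(G) + P(C) - P(G ∩ C)]
p_g, p_c, p_gc = 0.6, 0.7, 0.4
p_union = p_g + p_c - p_gc       # inclusion-exclusion
p_neither = 1 - p_union
print(round(p_neither, 4))       # → 0.1
```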
2 Question 2 [solved by Shreyans]
For the experiment of rolling a die, we have:
Ω = {1, 2, 3, 4, 5, 6}
Let X be the random variable denoting the outcome of the roll. Let us consider the σ-algebra to be
the power set of Ω.
Let p and q denote the probability of getting an even face and an odd face respectively. Given that:
p = 2q (1)
Since the probabilities of the six faces must sum to 1:
3p + 3q = 1 (2)
Solving equations (1) and (2), we get:
q = 1/9, p = 2/9
Thus, the probability model for this die is as follows:
x        1    2    3    4    5    6
P(X = x) 1/9  2/9  1/9  2/9  1/9  2/9
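A small sanity check of this probability model using exact fractions:

```python
from fractions import Fraction

# Model from the table: odd faces have probability 1/9, even faces 2/9.
q = Fraction(1, 9)   # probability of each odd face
p = Fraction(2, 9)   # probability of each even face
pmf = {x: (p if x % 2 == 0 else q) for x in range(1, 7)}

assert p == 2 * q                 # the given constraint p = 2q
assert sum(pmf.values()) == 1     # a valid probability model
print(pmf[1], pmf[2])             # → 1/9 2/9
```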
3 Question 3 [solved by Shreyans]
For the given experiment, let U denote the set of all possible outcomes from the die, then:
U = {1, 2, 3, 4}
Any roll of the die can result in any number x ∈ U . As per the question, the experiment is finished
when the roll results in an even number. Hence, the last roll of the die must result in an even number
in U , and all the throws before that must result in an odd number in U .
Let Ao denote the event of getting an odd number, that is, some y ∈ {1, 3}. Let Ae denote the
event of getting an even number, that is, some w ∈ {2, 4}.
Thus, the sample space Ω can be represented as:
Ω = {Ae, Ao Ae, Ao Ao Ae, Ao Ao Ao Ae, …},
that is, each outcome consists of k occurrences of Ao followed by a single Ae, for k = 0, 1, 2, …
4 Question 4 [solved by Shreyans]
Let A, B, and C be the opponents, and let a, b, and c be the corresponding probabilities of winning
against each opponent. Without loss of generality, assume a > b > c. Hence, A is the weakest opponent and C is the strongest.
Let E denote the event of a win against opponent E, and Ē denote a loss against E. For any
permutation P1 P2 P3 of ABC, let:
- The first game be played against P1 (with win probability p1 ),
- The second against P2 (with win probability p2 ),
- The last against P3 (with win probability p3 ).
Let W denote the event of winning the tournament. To win the tournament, we must win at least two
games in a row. Hence:
W = (P̄1 P2 P3) ∪ (P1 P2 P̄3) ∪ (P1 P2 P3)
Clearly, all of the above events are disjoint. Thus, using the additivity axiom, we get:
P(W ) = (1 − p1 )p2 p3 + p1 p2 (1 − p3 ) + p1 p2 p3
Simplifying :
P(W ) = p2 (p1 + p3 − p1 p3 ) = p2 (p1 + p3 ) − p1 p2 p3
Notice that the term p1 p2 p3 appears in every permutation as it is (since the order of multiplication
does not matter). Hence, to maximize the probability of winning the tournament, we need to maximize
the term p2 (p1 + p3 ).
Again, since the order of p1 and p3 does not matter (they are added), the possible options are:
a(b + c) if A plays second,
b(a + c) if B plays second,
c(a + b) if C plays second.
Clearly:
a(b + c) > b(a + c) because a > b,
a(b + c) > c(a + b) because a > c.
Hence, playing A second turns out to be optimal, and the order of B and C does not matter, as shown.
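The conclusion can be checked by enumerating all six orders. The win probabilities below are made-up illustrative values satisfying a > b > c:

```python
from itertools import permutations

# Hypothetical win probabilities with a > b > c (A is the weakest opponent).
probs = {"A": 0.9, "B": 0.5, "C": 0.2}

def win_tournament(order):
    p1, p2, p3 = (probs[x] for x in order)
    # P(W) = p2*(p1 + p3) - p1*p2*p3, as derived above
    return p2 * (p1 + p3) - p1 * p2 * p3

best = max(permutations("ABC"), key=win_tournament)
assert best[1] == "A"    # the weakest opponent plays second
print(best, win_tournament(best))
```

Note that the two orders with A in the middle, BAC and CAB, give the same value, matching the remark that the order of B and C does not matter.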
5 Question 5
(Written by Harsh Raj)
Let Xi be the indicator of a Head on the ith toss (Xi = 1 if the ith toss is a Head).
Claim:
P(X1 = 1, X2 = 1 | X1 = 1) ≥ P(X1 = 1, X2 = 1 | X1 = 1 or X2 = 1)
Assumptions:
1. X1 and X2 are i.i.d.
2. P(Xi = 1) = p, with 0 < p ≤ 1 (a general, possibly biased, coin)
Solution:
Since X1 and X2 are independent, we can say:
P(X1 = 1, X2 = 1 | X1 = 1) = P(X2 = 1) = p
and :
P(X1 = 1, X2 = 1) = P(X1 = 1) · P(X2 = 1) = p2
The probability of getting at least one Head is P(X1 = 1 or X2 = 1) = 2p − p². Hence:
P(X1 = 1, X2 = 1 | X1 = 1 or X2 = 1) = P(X1 = 1, X2 = 1)/P(X1 = 1 or X2 = 1) = p²/(2p − p²)
The claim P(X1 = 1, X2 = 1 | X1 = 1) ≥ P(X1 = 1, X2 = 1 | X1 = 1 or X2 = 1) becomes:
p ≥ p²/(2p − p²)
Multiplying both sides by 2p − p², which is positive for 0 < p < 2:
=⇒ 2p² − p³ ≥ p²
=⇒ p² ≥ p³
=⇒ 1 ≥ p
This inequality holds for all valid probabilities (0 < p ≤ 1).
Conclusion:
Alice is right about her claim; moreover, the fairness of the toss (the value of p) does not affect its correctness, since the inequality holds for every valid p.
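An exact numeric check of the inequality p ≥ p²/(2p − p²) over a grid of p values:

```python
# Check the inequality p >= p^2 / (2p - p^2) on a grid of p in (0, 1).
violations = []
for k in range(1, 100):
    p = k / 100
    lhs = p                       # P(both heads | first toss is a head)
    rhs = p**2 / (2*p - p**2)     # P(both heads | at least one head)
    if lhs < rhs - 1e-12:
        violations.append(p)
print(violations)                 # → []
```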
6 Question 6
(solved by: Venu Gopal)
Let xi : the event that the i-th item picked is non-defective, where i = 1, 2, 3, 4.
Given:
• Total number of items: 100
• Number of defective items, D = 5
• Four items are drawn without replacement; the batch is accepted only if all four are non-defective.
Let X denote the event that the batch is accepted. Then:
P (X) = P (x1 ∩ x2 ∩ x3 ∩ x4 )
Using the multiplication rule:
P (X) = P (x1 ) · P (x2 | x1 ) · P (x3 | x1 ∩ x2 ) · P (x4 | x1 ∩ x2 ∩ x3 )
Substituting probabilities:
P (x1 ) = 95/100
P (x2 | x1 ) = 94/99
P (x3 | x1 ∩ x2 ) = 93/98
P (x4 | x1 ∩ x2 ∩ x3 ) = 92/97
P (X) = (95/100) · (94/99) · (93/98) · (92/97)
Simplifying:
P (X) ≈ 0.81
Thus, the probability of accepting the batch is P (X) ≈ 0.81.
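The product can be verified exactly; it also equals the hypergeometric probability C(95, 4)/C(100, 4) of drawing four non-defective items:

```python
from fractions import Fraction
from math import comb

# Sequential product from the multiplication rule above.
p_accept = (Fraction(95, 100) * Fraction(94, 99)
            * Fraction(93, 98) * Fraction(92, 97))

# Equivalent counting view: choose 4 of the 95 good items out of C(100, 4).
assert p_accept == Fraction(comb(95, 4), comb(100, 4))
print(float(p_accept))            # ≈ 0.8119
```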
7 Question 7
(solved by: Venu Gopal)
Let XB : Number of heads Bob gets.
XA : Number of heads Alice gets.
Given:
• Total coins: 2n + 1.
• Bob tosses n + 1 coins.
• Alice tosses n coins.
• Each coin has probability of heads: p = 1/2.
Let,
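Assuming the question asks for P(XB > XA): by head–tail symmetry, Bob gets more heads than Alice if and only if he does not get more tails, and exactly one of these two events must occur since he tosses one extra coin, so P(XB > XA) = 1/2. A quick simulation sketch (n = 10 is an illustrative choice):

```python
import random

random.seed(0)
n, trials = 10, 200_000
wins = 0
for _ in range(trials):
    bob = sum(random.random() < 0.5 for _ in range(n + 1))   # Bob: n + 1 fair coins
    alice = sum(random.random() < 0.5 for _ in range(n))     # Alice: n fair coins
    wins += bob > alice
print(wins / trials)   # close to 0.5
```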
8 Question 8
(solved by: Venu Gopal)
Let x1 : the event that the correct path is chosen under strategy 1 (follow a single dog),
x2 : the event that the correct path is chosen under strategy 2 (follow the two dogs when they agree; when they disagree, pick a path by a fair coin flip).
Given:
• Each dog chooses the correct path with probability P (x1 ) = p1
• Both dogs choose their paths independently.
P (both dogs agree) = P (both choose the correct path) + P (both choose the wrong path)
= p1² + (1 − p1)²
P (both dogs disagree) = 1 − P (both dogs agree) = 1 − [p1² + (1 − 2p1 + p1²)] = 2p1 − 2p1²
When the dogs disagree, exactly one of them is correct, so the coin flip selects the correct path with probability 1/2. Hence:
P (x2 ) = p1² + (1/2)(2p1 − 2p1²) = p1² + p1 − p1² = p1
We can see that P (x1 ) = P (x2 ). Hence, both strategies are equally likely to find the correct path.
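A Monte Carlo sketch of both strategies, under the rule that a disagreement is resolved by a fair coin flip (p1 = 0.7 is an assumed illustrative value):

```python
import random

random.seed(1)
p1, trials = 0.7, 200_000

# Strategy 1: follow a single dog.
s1 = sum(random.random() < p1 for _ in range(trials)) / trials

# Strategy 2: follow the two dogs if they agree; on disagreement pick at random.
hits = 0
for _ in range(trials):
    d1 = random.random() < p1          # dog 1 correct?
    d2 = random.random() < p1          # dog 2 correct?
    if d1 == d2:
        hits += d1                     # agree: both right or both wrong
    else:
        hits += random.random() < 0.5  # exactly one is right; random pick
s2 = hits / trials

print(s1, s2)   # both close to p1
```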
9 Question 9
(Written by Harsh Raj)
Let Xi : ith sent signal, Yi : ith received signal, Zi : ith signal received correctly.
Given:
P(X = 0) = p, P(X = 1) = 1 − p,
P(Y = 0 | X = 0) = 1 − ϵ0 , P(Y = 0 | X = 1) = ϵ1 ,
P(Y = 1 | X = 0) = ϵ0 , P(Y = 1 | X = 1) = 1 − ϵ1 .
Part A
A symbol is received correctly if a 0 is sent and received, or a 1 is sent and received. Hence the probability that the k-th symbol is received correctly is:
P(Zk) = p(1 − ϵ0) + (1 − p)(1 − ϵ1)
Part B
Assuming the symbols are transmitted independently, the probability of receiving the sequence 1011 correctly is:
P = (1 − ϵ1) · (1 − ϵ0) · (1 − ϵ1) · (1 − ϵ1) = (1 − ϵ0)(1 − ϵ1)³
Part C
If the sent signal is 000, the message is decoded correctly if at least two of the three 0s are received correctly. The probability is:
P(correct decoding) = (1 − ϵ0)³ + 3 ϵ0 (1 − ϵ0)²
Part D
As the probability of toggling X = 0 (ϵ0) decreases, the probability of correctly decoding "000" increases.
Part E
Here (as the formula indicates) the message 000 is transmitted with probability p and 111 with probability 1 − p, and the received sequence is 101. Using Bayes' rule:
P(received = 101) = p · ϵ0 · (1 − ϵ0) · ϵ0 + (1 − p) · (1 − ϵ1) · ϵ1 · (1 − ϵ1)
=⇒ P(sent = 000 | received = 101) = [p · ϵ0² · (1 − ϵ0)] / [p · ϵ0² · (1 − ϵ0) + (1 − p) · (1 − ϵ1)² · ϵ1]
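A numeric sketch of this Bayes computation, with assumed illustrative values p = 0.5, ϵ0 = 0.1, ϵ1 = 0.2:

```python
# Posterior probability that 000 was sent, given the received sequence 101.
p, eps0, eps1 = 0.5, 0.1, 0.2              # assumed illustrative values

like_000 = eps0 * (1 - eps0) * eps0        # P(receive 101 | sent 000)
like_111 = (1 - eps1) * eps1 * (1 - eps1)  # P(receive 101 | sent 111)

evidence = p * like_000 + (1 - p) * like_111
posterior_000 = p * like_000 / evidence
print(round(posterior_000, 4))             # → 0.0657
```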
10 Question 10 [Solved by Aditya]
We are working with two types of users:
• n1 voice users: These users occasionally need a voice connection.
• n2 data users: These users occasionally need a data connection.
Each user connects independently:
• A voice user connects with probability p1 and requires r1 bits/sec when active.
• A data user connects with probability p2 and requires r2 bits/sec when active.
The system has a total capacity of C bits/sec. If the total demand exceeds C, the system becomes
overwhelmed. Our goal is to calculate the probability of this happening.
Let’s define the key random variables:
• V : The number of active voice users. Since each voice user connects independently, V follows a
binomial distribution:
V ∼ Binomial(n1 , p1 ).
• D: The number of active data users, which similarly follows a binomial distribution:
D ∼ Binomial(n2 , p2 ).
• T : The total demand on the system, which is the sum of the contributions from voice and data
users:
T = r1 · V + r2 · D.
The system is considered overloaded when the total demand exceeds its capacity:
P (Overload) = P (T > C).
Since V and D are independent binomials, T has mean and variance
µT = r1 · n1 · p1 + r2 · n2 · p2 ,
σT² = r1² · n1 · p1 (1 − p1 ) + r2² · n2 · p2 (1 − p2 ).
For large n1 and n2 , the Central Limit Theorem tells us T is approximately normal. Standardizing T (to convert it into a standard normal random variable Z), we have:
P (Overload) = P (Z > (C − µT )/σT ),
12
where Z ∼ N (0, 1) (a standard normal random variable).
Finally, we use the cumulative distribution function (CDF) of the standard normal distribution,
Φ(z), to express the probability:
P (Overload) = 1 − Φ((C − µT )/σT ).
• σT is the standard deviation of the total demand, which reflects how much variation we expect.
• (C − µT )/σT is the "z-score" that measures how far the system capacity C is from the average demand µT , in units of standard deviations.
• The overload probability, P (Overload), gives us the likelihood that the demand will exceed capacity, causing the system to be overwhelmed.
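The normal approximation can be computed directly with the error function. All parameter values below (user counts, rates, capacity) are assumed for illustration:

```python
from math import erf, sqrt

# Assumed illustrative parameters.
n1, p1, r1 = 100, 0.1, 100     # voice users: count, connect prob, bits/sec
n2, p2, r2 = 50, 0.2, 1000     # data users: count, connect prob, bits/sec
C = 12_000                     # system capacity in bits/sec

mu = r1 * n1 * p1 + r2 * n2 * p2
var = r1**2 * n1 * p1 * (1 - p1) + r2**2 * n2 * p2 * (1 - p2)
sigma = sqrt(var)

def phi(z):
    # Standard normal CDF expressed via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

p_overload = 1 - phi((C - mu) / sigma)
print(p_overload)              # probability of demand exceeding capacity
```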
11 Question 11 [Solved by Aditya]
(a) Binomial Random Variable
The binomial random variable represents the number of successes in n independent trials of a
Bernoulli process, where each trial has the same probability p of success.
Event Description: Tossing a biased coin n times, where the probability of landing heads (success)
is p. The binomial random variable counts how many times the coin lands on heads out of the n tosses.
X ∼ Binomial(n, p),   P (X = k) = C(n, k) pᵏ (1 − p)ⁿ⁻ᵏ,   k = 0, 1, . . . , n
(b) Geometric Random Variable
The geometric random variable represents the number of trials needed to get the first success in repeated independent Bernoulli trials, each with success probability p.
Event Description: Tossing a biased coin repeatedly until it first lands heads; X counts the number of tosses needed.
X ∼ Geometric(p),   P (X = k) = (1 − p)ᵏ⁻¹ p,   k = 1, 2, . . .
(f) Events and Their Probabilities
1. The first success does not occur till the m-th trial (m is a constant)
This means that the first m − 1 trials are failures, and the m-th trial is the first success. The probability of this event is:
P (first success at trial m) = (1 − p)ᵐ⁻¹ p
2. The third success occurs between the m-th and (m + k)-th trial (m and k are constants)
Let N3 denote the trial on which the third success occurs, so the event is m ≤ N3 ≤ m + k − 1. The third success occurs at trial n precisely when exactly two successes occur in the first n − 1 trials and trial n is a success, so by the Pascal (negative binomial) distribution:
P (N3 = n) = C(n − 1, 2) p³ (1 − p)ⁿ⁻³
Summing over the window:
P (third success between m and m + k) = Σₙ₌ₘ^{m+k−1} C(n − 1, 2) p³ (1 − p)ⁿ⁻³
Here:
• C(n − 1, 2) p² (1 − p)ⁿ⁻³ : probability of exactly two successes in the first n − 1 trials.
• p : probability of a success on trial n itself.
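A Monte Carlo sketch of the window probability, checking the Pascal-pmf sum P(third success in trials m .. m+k−1) = Σ C(n−1, 2) p³ (1−p)^(n−3) under assumed values of p, m, and k:

```python
import random
from math import comb

random.seed(2)
p, m, k, trials = 0.3, 10, 5, 200_000   # assumed illustrative values

# Pascal (negative binomial) pmf summed over the window m .. m+k-1.
exact = sum(comb(n - 1, 2) * p**3 * (1 - p)**(n - 3)
            for n in range(m, m + k))

hits = 0
for _ in range(trials):
    seq = [random.random() < p for _ in range(m + k - 1)]
    count, third = 0, None
    for i, s in enumerate(seq, start=1):
        count += s
        if count == 3:
            third = i              # 1-indexed trial of the third success
            break
    hits += third is not None and third >= m
print(round(hits / trials, 3), round(exact, 3))
```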
Range of Y
After the k-th success occurs at trial m_k, the (k + 1)-th success must occur on a subsequent trial.
Therefore, the range of Y is:
Y ∈ {m_k + 1, m_k + 2, . . . }
Distribution of Y
The (k + 1)-th success occurs only after trial m_k, so the distribution of Y is a shifted geometric
distribution. The probability that the (k + 1)-th success occurs at the n-th trial (n > m_k) is:
P (Y = n) = (1 − p)^(n − m_k − 1) p
Here:
• (1 − p)^(n − m_k − 1) : probability of n − m_k − 1 failures after the k-th success.
• p : probability of success on the n-th trial.
(h) Distribution of Z
The random variable Z represents the number of trials required to get the first occurrence of three
successive 1s. For n = 3, all three trials must be successes, so P (Z = 3) = p³. For n ≥ 4, trial n − 3 must be a failure (otherwise the run would have been completed earlier), trials n − 2, n − 1, n must all be successes, and no run of three successes may occur within the first n − 4 trials. Hence:
P (Z = n) = P (Z > n − 4) · (1 − p) p³,   n ≥ 4,
where P (Z > m) = 1 − Σⱼ₌₃^m P (Z = j); there is no simple closed form, but the pmf can be computed recursively.
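The recursion P(Z = n) = P(Z > n−4)(1−p)p³ for n ≥ 4 (trial n−3 fails, trials n−2 .. n succeed, no earlier run) can be checked against brute-force enumeration over all sequences, with p an assumed illustrative value:

```python
from itertools import product

p = 0.4   # assumed illustrative success probability

def brute_pmf(n):
    # P(first run of three successive 1s completes exactly at trial n),
    # by enumerating all 2^n outcome sequences.
    total = 0.0
    for seq in product((0, 1), repeat=n):
        first = next((i + 1 for i in range(2, n)
                      if seq[i - 2] == seq[i - 1] == seq[i] == 1), None)
        if first == n:
            prob = 1.0
            for s in seq:
                prob *= p if s == 1 else 1 - p
            total += prob
    return total

# Recursion: P(Z = n) = P(Z > n-4) * (1-p) * p^3 for n >= 4.
pmf = {3: p**3}
tail = {0: 1.0, 1: 1.0, 2: 1.0}          # tail[m] = P(Z > m)
for n in range(3, 13):
    if n >= 4:
        pmf[n] = tail[n - 4] * (1 - p) * p**3
    tail[n] = tail[n - 1] - pmf[n]

for n in range(3, 13):
    assert abs(pmf[n] - brute_pmf(n)) < 1e-9
print("recursion matches enumeration")
```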
Problem
You have a biased coin with P (Heads) = 0.6. You flip the coin until you see two heads in a row. What
is the expected number of flips?
Solution
This problem is modeled as a Markov process with three states:
• State 0: No heads have been observed.
• State 1: One head has been observed.
• State 2: Two consecutive heads (HH) have been observed, and the process stops.
Let E0 and E1 represent the expected number of flips starting from State 0 and State 1, respectively:
E0 = 1 + pE1 + (1 − p)E0
E1 = 1 + p · 0 + (1 − p)E0
Rearranging the first equation:
E0 = (1 + pE1 )/p
E1 = 1 + (1 − p)E0
Substituting E1 = 1 + (1 − p)E0 into E0 :
E0 = [1 + p (1 + (1 − p)E0 )]/p
Simplifying:
E0 = (1 + p)/p + (1 − p)E0
Rearranging:
E0 − (1 − p)E0 = (1 + p)/p
E0 · p = (1 + p)/p
E0 = (1 + p)/p²
Final Result
For p = 0.6, the expected number of flips is:
E[Number of flips] = (1 + 0.6)/(0.6)² = 1.6/0.36 ≈ 4.44
Thus, on average, 4.44 flips are needed to observe two consecutive heads.
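A Monte Carlo sketch of the result E[flips until HH] = (1 + p)/p² for p = 0.6:

```python
import random

random.seed(3)
p, trials = 0.6, 200_000

total = 0
for _ in range(trials):
    flips, streak = 0, 0
    while streak < 2:              # stop at two consecutive heads
        flips += 1
        streak = streak + 1 if random.random() < p else 0
    total += flips
estimate = total / trials
print(estimate)                    # close to (1 + 0.6) / 0.36 ≈ 4.44
```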