
Indian Institute of Information Technology Allahabad

Probability and Statistics (PAS) - C3 Review Test Marking Scheme

1. State whether the following statements are true or false. In each case write the justification precisely.

(a) Let E1, E2, . . ., En be a sequence of events. Then P(Ei) = 0, i = 1, 2, . . ., n if and only if P(∪_{i=1}^n Ei) = 0. [3]
Solution.
(⇒) True.
0 ≤ P(∪_{i=1}^n Ei) ≤ Σ_{i=1}^n P(Ei) = 0 [1]
⇒ P(∪_{i=1}^n Ei) = 0. [1]
(⇐) True.
For i = 1, 2, . . ., n, Ei ⊆ ∪_{j=1}^n Ej ⇒ P(Ei) ≤ P(∪_{j=1}^n Ej) = 0 ⇒ P(Ei) = 0. [1]
(b) If X is a continuous random variable, then P (X ∈ C) > 0, where C is a countable
subset of R. [2]
Solution. False.
X continuous random variable ⇒ P (X = x) = FX (x) − FX (x−) = 0, ∀ x ∈ R,
where FX is the c.d.f of X. [1]
⇒ P(X ∈ C) = Σ_{x∈C} P(X = x) = 0. [1]
(c) The expected value of a continuous type random variable is not unique. [1]
Solution. False.
E(X) = ∫_{−∞}^{∞} x fX(x) dx = ∫_{0}^{∞} (1 − FX(t)) dt − ∫_{−∞}^{0} FX(t) dt. [1]
Since the distribution function of any random variable is unique, E(X) is unique.
(d) Let X be a continuous random variable with probability density function f. Then X has another probability density function g, g ≠ f, such that f(x) ≤ g(x) for all x ∈ R. [3]
Solution. True. Let x0 ∈ R be such that f(x0) > 0. Define g : R → R as g(x) = f(x) ∀ x ∈ R \ {x0} and g(x0) = α > f(x0). Since g differs from f only at the single point x0, a set of Lebesgue measure zero, g is non-negative and integrates to 1, so g is a probability density function. Then for x0 ≤ x, [1]
FX(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−∞}^{x0} f(t) dt + ∫_{x0}^{x} f(t) dt = ∫_{−∞}^{x0} g(t) dt + ∫_{x0}^{x} g(t) dt = ∫_{−∞}^{x} g(t) dt,
and similarly for x < x0, so g is also a density of X. [2]

2. A slip of paper is given to person A, who marks it with either a plus or a minus sign; the probability of her writing a plus sign is 1/3. A passes the slip to B, who may either leave it alone or change the sign before passing it to C. Next, C passes the slip to D after perhaps changing the sign; finally, D passes it to a referee after perhaps changing the sign. The referee sees a plus sign on the slip. It is known that B, C, and D each change the sign with probability 2/3. Find the probability that A originally wrote a plus. [6]
Solution: Let us define the following events:
E1 : A wrote a plus sign, E2 : A wrote a minus sign,
E : the referee observes a plus sign on the slip.
Given, P(E1) = 1/3, P(E2) = 2/3.
We have to find P(E1|E).
Now, P(E|E1) = P[referee observes the plus sign given that A wrote the plus sign on the slip]
= P[(plus sign was not changed at all) ∪ (plus sign was changed exactly twice in passing from A to the referee through B, C and D)]
= P(E3 ∪ E4) (say) = P(E3) + P(E4).
Let A1, A2 and A3 respectively denote the events that B, C, and D change the sign on the slip. Then we are given
P(A1) = P(A2) = P(A3) = 2/3.
We have
P(E3) = P(A1^c ∩ A2^c ∩ A3^c) = (1/3)^3 = 1/27,
P(E4) = P[(A1 ∩ A2 ∩ A3^c) ∪ (A1 ∩ A2^c ∩ A3) ∪ (A1^c ∩ A2 ∩ A3)] = 3 · (2/3)^2 · (1/3) = 4/9.
Therefore,
P(E|E1) = 1/27 + 4/9 = 13/27. [2]
Similarly,
P(E|E2) = P[referee observes the plus sign given that A wrote the minus sign on the slip]
= P[(minus sign was changed exactly once) ∪ (minus sign was changed thrice)]
= P(E5 ∪ E6) (say) = P(E5) + P(E6).
P(E5) = P[(A1 ∩ A2^c ∩ A3^c) ∪ (A1^c ∩ A2 ∩ A3^c) ∪ (A1^c ∩ A2^c ∩ A3)] = 3 · (2/3) · (1/3)^2 = 2/9,
P(E6) = P(A1 ∩ A2 ∩ A3) = (2/3)^3 = 8/27.
Therefore,
P(E|E2) = 2/9 + 8/27 = 14/27. [2]
Hence, by Bayes' theorem,
P(E1|E) = P(E1)P(E|E1) / [P(E1)P(E|E1) + P(E2)P(E|E2)] = (1/3 · 13/27) / (1/3 · 13/27 + 2/3 · 14/27) = 13/(13 + 28) = 13/41. [1+1]
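A quick numerical check of this answer: the sketch below is a minimal C simulation (the sign-flip coins are modelled with rand(); the seed and trial count are arbitrary) that estimates P(E1|E) and should print a value close to 13/41 ≈ 0.317.

#include <stdio.h>
#include <stdlib.h>

/* Returns 1 with probability num/den (approximately, via rand()). */
static int bernoulli(int num, int den) {
    return rand() % den < num;
}

int main(void) {
    long trials = 1000000, plus_seen = 0, plus_written_and_seen = 0;
    srand(42);
    for (long t = 0; t < trials; t++) {
        int wrote_plus = bernoulli(1, 3);      /* A writes a plus with probability 1/3 */
        int sign = wrote_plus;
        for (int person = 0; person < 3; person++)
            if (bernoulli(2, 3))               /* B, C, D each flip with probability 2/3 */
                sign = !sign;
        if (sign) {                            /* referee sees a plus */
            plus_seen++;
            if (wrote_plus) plus_written_and_seen++;
        }
    }
    printf("estimated P(E1|E) = %.4f, exact 13/41 = %.4f\n",
           (double)plus_written_and_seen / plus_seen, 13.0 / 41.0);
    return 0;
}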
3. Among the digits 1, 2, 3, 4, 5, first one digit is chosen and then a second selection is made among the remaining four digits. Assuming that all twenty possible outcomes have equal probabilities, find the probability that an odd digit will be selected (i) the first time, (ii) the second time, and (iii) both times. [4]
Solution: The total number of possible outcomes when selecting two digits out of five without replacement is 5 · 4 = 20.
(i) The number of cases in which an odd digit is selected on the first trial is 3 · 4 = 12. Hence, the required probability is 12/20 = 3/5. [1]
(ii) An odd digit selected on the second trial can happen in two ways: (a) when the first digit selected is odd, (b) when the first digit selected is even. Hence, the required probability is 3/5 · 2/4 + 2/5 · 3/4 = 3/5. [1+1]
(iii) When both selections result in an odd digit, the required probability is 3/5 · 2/4 = 6/20 = 3/10. [1]
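The three answers can also be confirmed by brute-force enumeration of the 20 ordered pairs; a minimal C sketch:

#include <stdio.h>

int main(void) {
    int first_odd = 0, second_odd = 0, both_odd = 0, total = 0;
    for (int a = 1; a <= 5; a++)
        for (int b = 1; b <= 5; b++) {
            if (a == b) continue;                    /* selection without replacement */
            total++;
            if (a % 2 == 1) first_odd++;
            if (b % 2 == 1) second_odd++;
            if (a % 2 == 1 && b % 2 == 1) both_odd++;
        }
    /* Prints 12/20, 12/20, 6/20, i.e., 3/5, 3/5, 3/10. */
    printf("%d/%d  %d/%d  %d/%d\n", first_odd, total, second_odd, total, both_odd, total);
    return 0;
}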
4. Let X and Y be independent random variables with probability mass functions
pX(x) = 1/3 if x = 1, 2, 3; 0 otherwise,
pY(y) = 1/2 if y = 0; 1/3 if y = 1; 1/6 if y = 2; 0 otherwise.
Find the probability mass function of W = X + Y, and Cov(X, W). [10]


Solution: The p.m.f of W is
pW(w) = 1/6 if w = 1; 5/18 if w = 2; 1/3 if w = 3; 1/6 if w = 4; 1/18 if w = 5; 0 otherwise. [1+1+1+1+1]
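Each entry above follows from the convolution formula pW(w) = Σ_x pX(x) pY(w − x), which holds because X and Y are independent:
pW(1) = pX(1)pY(0) = 1/3 · 1/2 = 1/6,
pW(2) = pX(1)pY(1) + pX(2)pY(0) = 1/9 + 1/6 = 5/18,
pW(3) = pX(1)pY(2) + pX(2)pY(1) + pX(3)pY(0) = 1/18 + 1/9 + 1/6 = 1/3,
pW(4) = pX(2)pY(2) + pX(3)pY(1) = 1/18 + 1/9 = 1/6,
pW(5) = pX(3)pY(2) = 1/18,
and these five values sum to 1.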

Cov(X, W) = Cov(X, X + Y) = Cov(X, X) = Var(X) (since X and Y are independent). [1]
E(X²) = 1/3 (1 + 4 + 9) = 14/3. [1]
E(X) = 1/3 (1 + 2 + 3) = 6/3 = 2. [1]
Var(X) = E(X²) − (E(X))² [1]
⇒ Var(X) = 14/3 − 36/9 = 2/3. [1]
5. Let X be a discrete random variable with probability mass function pX and support EX, and Y be a continuous random variable with probability density function fY. Find the probability mass function/probability density function (whichever is applicable) of X + Y. [5]
Solution: Let Z = X + Y. Then the c.d.f of Z is
FZ(z) = P(Z ≤ z)
= P(X + Y ≤ z)
= Σ_{x∈EX} P(X + Y ≤ z | X = x) pX(x) [1]
= Σ_{x∈EX} P(Y ≤ z − x | X = x) pX(x) [1]
= Σ_{x∈EX} F_{Y|X=x}(z − x) pX(x) [1]
= ∫_{−∞}^{z} Σ_{x∈EX} f_{Y|X=x}(t − x) pX(x) dt. [1]
Hence Z is a continuous random variable, and the p.d.f of Z is
fZ(z) = Σ_{x∈EX} f_{Y|X=x}(z − x) pX(x). [1]
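In particular, if X and Y are independent, then f_{Y|X=x} = fY and the formula reduces to the discrete-continuous convolution fZ(z) = Σ_{x∈EX} fY(z − x) pX(x); for example, if X ∼ Bernoulli(p) is independent of Y, then fZ(z) = (1 − p) fY(z) + p fY(z − 1).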
6. Using the Central Limit Theorem, find lim_{n→∞} e^{−n} Σ_{k=0}^{n} n^k/k!. [6]
Solution. Consider a sequence of i.i.d random variables {Xn} such that Xn ∼ Poisson(1) for all n. [1]
Then, E(Xn) = Var(Xn) = 1 and Σ_{i=1}^{n} Xi ∼ Poisson(n) for all n. [1/2 + 1/2 + 1]
Now,
lim_{n→∞} e^{−n} Σ_{k=0}^{n} n^k/k! = lim_{n→∞} P(Σ_{i=1}^{n} Xi ≤ n) [1]
= lim_{n→∞} P((Σ_{i=1}^{n} Xi − n)/√n ≤ 0) [1]
= Φ(0) = 1/2, [1]
since by the Central Limit Theorem (Σ_{i=1}^{n} Xi − n)/√n converges in distribution to N(0, 1) and 0 is a continuity point of Φ.
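The limit can also be illustrated numerically: the partial sums e^{−n} Σ_{k=0}^{n} n^k/k! can be evaluated with the Poisson recurrence p_k = p_{k−1} · n/k. A minimal C sketch (the values of n are arbitrary; e^{−500} is still representable as a double, so no log-space tricks are needed here) prints sums that approach 1/2:

#include <stdio.h>
#include <math.h>

/* Computes e^{-n} * sum_{k=0}^{n} n^k / k!  via p_0 = e^{-n}, p_k = p_{k-1} * n / k. */
static double poisson_partial_sum(int n) {
    double p = exp(-(double)n), sum = p;
    for (int k = 1; k <= n; k++) {
        p *= (double)n / k;
        sum += p;
    }
    return sum;
}

int main(void) {
    int ns[] = {10, 100, 500};
    for (int i = 0; i < 3; i++)
        printf("n = %3d: %.6f\n", ns[i], poisson_partial_sum(ns[i]));
    return 0;
}

(Compile with -lm.)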
7. A new computer virus attacks a folder consisting of 200 files. Each file gets damaged
with probability 0.2 independently of other files. What is the approximate probability
that fewer than 50 files get damaged? [6]
Solution: Let X = the number of damaged files. Then X ∼ Bin(200, 0.2).
Now, E(X) = np = 40 and σ = √(np(1 − p)) = √32 ≈ 5.65. [1+1]
Applying the Central Limit Theorem with the continuity correction,
P(X < 50) ≈ P(X < 49.5) [1]
= P((X − 40)/5.65 < (49.5 − 40)/5.65) [1]
= Φ(1.68) = 0.953. [1+1]
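For reference, Φ(1.68) can be computed from erf in math.h, and the exact binomial probability P(X ≤ 49) can be accumulated from the pmf recurrence p_k = p_{k−1} · (n − k + 1)/k · p/(1 − p); a minimal C sketch for comparison (compile with -lm):

#include <stdio.h>
#include <math.h>

int main(void) {
    /* Normal approximation with continuity correction. */
    double z = (49.5 - 40.0) / sqrt(32.0);
    double approx = 0.5 * (1.0 + erf(z / sqrt(2.0)));

    /* Exact Bin(200, 0.2) probability P(X <= 49). */
    int n = 200;
    double p = 0.2, pk = pow(1.0 - p, n), exact = pk;
    for (int k = 1; k <= 49; k++) {
        pk *= (double)(n - k + 1) / k * p / (1.0 - p);
        exact += pk;
    }
    printf("normal approximation = %.4f, exact binomial = %.4f\n", approx, exact);
    return 0;
}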
8. Let X1, X2, . . ., Xn be independent random variables with E(Xi) = µ, Var(Xi) = σ² and sample mean defined as X̄ = (1/n) Σ_{i=1}^{n} Xi. Prove or disprove that S² = (1/n) Σ_{i=1}^{n} (Xi − X̄)² is an unbiased estimator of σ². [4]
Solution:
S² = (1/n) Σ_{i=1}^{n} (Xi − X̄)² = (1/n) Σ_{i=1}^{n} Xi² − X̄². [1]
E(S²) = (1/n) Σ_{i=1}^{n} E(Xi²) − E(X̄²)
= (1/n) Σ_{i=1}^{n} [Var(Xi) + (E(Xi))²] − [Var(X̄) + (E(X̄))²] [1]
= (1/n) Σ_{i=1}^{n} [σ² + µ²] − [σ²/n + µ²] [1]
= σ² − σ²/n = ((n − 1)/n) σ² ≠ σ². [1]
Therefore, S² = (1/n) Σ_{i=1}^{n} (Xi − X̄)² is not an unbiased estimator of σ².
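It follows from E(S²) = ((n − 1)/n) σ² that the rescaled statistic is unbiased:
E( n/(n − 1) · S² ) = E( 1/(n − 1) Σ_{i=1}^{n} (Xi − X̄)² ) = σ²,
i.e., the sample variance with the n − 1 denominator is an unbiased estimator of σ².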

9. Let A and B be two events such that P(A is true) = p1 and P(B is true) = p2.
Consider the following two codes: [10]
C1:
while(A is true)
{
printf("Bye\n");
}

C2:
if(B is true)
{
printf("Hello\n");
}
else
{
printf("Bye\n");
}

Let X and Y respectively denote the number of times “Hello” and “Bye” are printed
when we run the combined program C1 and C2 , that is,
begin
C1; C2;
end

(a) Find the probability mass function of X.


Solution: X denotes the number of times “Hello” is printed, so the support of X is EX = {0, 1}, with probability mass function

P(X = 1) = p2, because “Hello” is printed (by C2) exactly when B is true, which has probability p2,

P(X = 0) = 1 − p2.

Therefore, X is a Bernoulli random variable with parameter p2. [1]


(b) Find the joint probability mass function of X and Y .
Solution: Let Z be the number of times program C1 prints “Bye”. Then Z is a geometric random variable with parameter p1 (each evaluation of “A is true” succeeds independently with probability p1), i.e.,

P(Z = k) = p1^k (1 − p1), k = 0, 1, 2, . . . [1]

Y is the number of times “Bye” is printed when both programs are executed. Therefore,
if X = 1, then Y = Z takes values in {0, 1, . . .}; [1]
if X = 0, then Y = Z + 1 takes values in {1, 2, . . .}. [1]
Since X depends only on B while Z depends only on the evaluations of A, X and Z are independent. Thus, for k ≥ 0,

P(X = 1, Y = k) = P(X = 1, Z = k) = P(X = 1)P(Z = k) = p2 (1 − p1) p1^k,

and for k ≥ 1,

P(X = 0, Y = k) = P(X = 0, Z = k − 1) = P(X = 0)P(Z = k − 1) = (1 − p2)(1 − p1) p1^{k−1}.

Therefore the joint p.m.f of X and Y is [1+1]

p(X,Y)(x, y) = p2 (1 − p1) p1^y if x = 1, y ≥ 0; (1 − p2)(1 − p1) p1^{y−1} if x = 0, y ≥ 1.

(c) Find the probability mass function of Y .


Solution:

P(Y = 0) = P(X = 1, Y = 0) = p2 (1 − p1),
P(Y = k) = P(X = 0, Y = k) + P(X = 1, Y = k)
= (1 − p2)(1 − p1) p1^{k−1} + p2 (1 − p1) p1^k
= (1 − p2(1 − p1))(1 − p1) p1^{k−1}, k ≥ 1.

The p.m.f of Y is [1+1]

pY(y) = p2 (1 − p1) if y = 0; (1 − p2(1 − p1))(1 − p1) p1^{y−1} if y ≥ 1.
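As a check, these probabilities sum to 1:
p2(1 − p1) + Σ_{y=1}^{∞} (1 − p2(1 − p1))(1 − p1) p1^{y−1} = p2(1 − p1) + (1 − p2(1 − p1)) = 1.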
(d) Find E(Y ).
Solution:

E(Y) = Σ_{k=0}^{∞} k P(Y = k)
= Σ_{k=1}^{∞} k (1 − p2(1 − p1))(1 − p1) p1^{k−1}
= (1 − p2(1 − p1)) Σ_{k=1}^{∞} k (1 − p1) p1^{k−1}
= (1 − p2(1 − p1)) / (1 − p1)
= p1/(1 − p1) + (1 − p2). [2]
Alternatively,

E(Y) = E(Y | X = 0)P(X = 0) + E(Y | X = 1)P(X = 1)
= E(Z + 1)P(X = 0) + E(Z)P(X = 1)
= E(Z) + P(X = 0) = p1/(1 − p1) + (1 − p2).
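The formula can also be checked by simulating the combined program; the C sketch below uses hypothetical values p1 = 0.4 and p2 = 0.7 (so the formula gives 0.4/0.6 + 0.3 ≈ 0.9667) and models each evaluation of “A is true” as an independent event, as in the solution above:

#include <stdio.h>
#include <stdlib.h>

/* Returns 1 with probability prob. */
static int happens(double prob) {
    return (double)rand() / ((double)RAND_MAX + 1.0) < prob;
}

int main(void) {
    double p1 = 0.4, p2 = 0.7;                /* assumed example values */
    long trials = 1000000, total_bye = 0;
    srand(7);
    for (long t = 0; t < trials; t++) {
        long bye = 0;
        while (happens(p1)) bye++;            /* C1 prints "Bye" while A is true */
        if (!happens(p2)) bye++;              /* C2 prints "Bye" when B is false */
        total_bye += bye;
    }
    printf("simulated E(Y) = %.4f, formula = %.4f\n",
           (double)total_bye / trials, p1 / (1.0 - p1) + (1.0 - p2));
    return 0;
}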
Z-table:
Z .00 .01 .02 .03 .04 .05 .06 .07 .08 .09
1.6 .945 .946 .947 .948 .949 .950 .951 .952 .953 .954
