
NATIONAL UNIVERSITY OF SINGAPORE

SEMESTER I, 2023/2024

MA2116/ST2131 Probability Tutorial 7

1. Let U be a uniform random variable on (0, 1).


(i) Find the distribution of min(U, 1 − U).
(ii) Find the distribution of max(U, 1 − U).
Solution. (i) Let X = min(U, 1 − U), so that X = U if 0 < U ≤ 1/2 and X = 1 − U if 1/2 < U < 1.
Then 0 < X ≤ 1/2.
For 0 < x < 1/2,

F_X(x) = P(X ≤ x) = P(U ≤ x) + P(1 − U ≤ x)

= P(U ≤ x) + P(U ≥ 1 − x) = x + (1 − (1 − x)) = 2x,

or

1 − F_X(x) = P(X > x) = P(U > x, 1 − U > x) = P(x < U < 1 − x) = 1 − 2x.

Then f_X(x) = 2. Alternatively, for 0 < x < 1/2 and small ε > 0,

P(x < X < x + ε) = P(x < U < x + ε) + P(x < 1 − U < x + ε)

= P(x < U < x + ε) + P(1 − x − ε < U < 1 − x) = ε + ε = 2ε.

Then f_X(x) = 2. It follows that X is a uniform random variable on (0, 1/2).


(ii) Let Y = max(U, 1 − U), so that Y = 1 − U if 0 < U ≤ 1/2 and Y = U if 1/2 < U < 1.
Then 1/2 ≤ Y < 1.
For 1/2 ≤ y < 1,

1 − F_Y(y) = P(Y > y) = P(U > y) + P(1 − U > y)

= P(U > y) + P(U < 1 − y) = (1 − y) + (1 − y) = 2 − 2y,

or

F_Y(y) = P(Y < y) = P(U < y, 1 − U < y) = P(1 − y < U < y) = 2y − 1.



Then f_Y(y) = 2. Alternatively, for 1/2 < y < 1 and small ε > 0,

P(y < Y < y + ε) = P(y < U < y + ε) + P(y < 1 − U < y + ε)

= P(y < U < y + ε) + P(1 − y − ε < U < 1 − y) = ε + ε = 2ε.

Then f_Y(y) = 2. It follows that Y is a uniform random variable on (1/2, 1). □
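Both conclusions are easy to confirm numerically. A minimal Python simulation sketch (assuming NumPy is available; the sample size and seed are arbitrary choices, not part of the tutorial):

    import numpy as np

    rng = np.random.default_rng(0)
    u = rng.uniform(0.0, 1.0, size=1_000_000)

    x = np.minimum(u, 1 - u)   # should be uniform on (0, 1/2)
    y = np.maximum(u, 1 - u)   # should be uniform on (1/2, 1)

    # If X is uniform on (0, 1/2) then P(X <= t) = 2t; if Y is uniform on (1/2, 1) then P(Y <= t) = 2t - 1.
    for t in (0.1, 0.25, 0.4):
        print(f"P(X <= {t}) ~ {np.mean(x <= t):.4f}   theory {2 * t:.4f}")
    for t in (0.6, 0.75, 0.9):
        print(f"P(Y <= {t}) ~ {np.mean(y <= t):.4f}   theory {2 * t - 1:.4f}")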

2. Suppose that the arrival times of buses A and B are uniformly distributed between 8:00 and
8:30. A passenger is waiting at a bus stop and will take the first arriving bus. Assume that
the arrival times of buses A and B are independent. Find the distribution of the waiting
time of the passenger.

Solution. Let X_A and X_B be the arrival times (in minutes after 8:00) of buses A and B. They are
uniformly distributed on (0, 30). Let Y be the waiting time of the passenger. Then for any
0 < y < 30,

P(Y > y) = P(X_A > y, X_B > y) = P(X_A > y)P(X_B > y) = ((30 − y)/30)².

Hence,

f_Y(y) = (30 − y)/450,  0 < y < 30. □
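A quick Monte Carlo check of the tail P(Y > y) = ((30 − y)/30)² (a sketch only; NumPy, the sample size and the grid of test points are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000
    xa = rng.uniform(0, 30, n)     # arrival time of bus A, minutes after 8:00
    xb = rng.uniform(0, 30, n)     # arrival time of bus B
    wait = np.minimum(xa, xb)      # the passenger boards the first bus to arrive

    for y in (5, 10, 15, 20, 25):
        print(y, round(float(np.mean(wait > y)), 4), round(((30 - y) / 30) ** 2, 4))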
3. Consider a post office having two clerks, and suppose that when A enters the system he
discovers that B is being served by one of the clerks and C by the other. Suppose also that A
is told that his service will begin as soon as either B or C leaves. If the amount of time a clerk
spends with a customer is exponentially distributed with parameter λ, find the probability
that, of the three customers, A is the last to leave the post office.

Solution. Consider the time at which A first finds a free clerk. At this point either B or C
would have just left and the other one would still be in service. Assume that B is still being
served. By the lack of memory of the exponential random variable, it follows that the amount
of additional time that B has to spend in the post office is also exponential with parameter
λ. That is, it is the same as if B was just starting his service at this point. By symmetry, the
probability that he finishes before A must equal 1/2. □
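The memoryless argument can be confirmed by simulating the two-clerk system directly. A minimal Python sketch (NumPy assumed; λ = 1 is an arbitrary choice, and the answer does not depend on λ):

    import numpy as np

    rng = np.random.default_rng(2)
    n = 500_000
    lam = 1.0  # service rate; the probability is the same for any lambda > 0

    b = rng.exponential(1 / lam, n)   # remaining service time of B
    c = rng.exponential(1 / lam, n)   # remaining service time of C
    a = rng.exponential(1 / lam, n)   # A's own service time, which starts at min(b, c)

    a_departs = np.minimum(b, c) + a      # A leaves after waiting for the first free clerk
    last_other = np.maximum(b, c)         # departure time of the slower of B and C
    print("P(A leaves last) ~", np.mean(a_departs > last_other))   # about 0.5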

4. The mode of a continuous random variable having probability density function f is the value
of x for which f (x) attains its maximum. Calculate the mode of
(a) a normal random variable with parameters (µ, σ²);
(b) a gamma random variable with parameters (α, λ), where α ≥ 1;
(c) a beta random variable with parameters (a, b), where a > 1 and b > 1.

Solution. (a) f(x) = (1/(√(2π) σ)) e^{−(x−µ)²/2σ²}.
Note that −(x − µ)²/2σ² has the maximum value 0 at x = µ. Then f(x) also attains the
maximum at x = µ.
(b) f(x) = (1/Γ(α)) λe^{−λx} (λx)^{α−1}, x > 0. Then

f′(x) = (λ^α/Γ(α)) e^{−λx} [(α − 1)x^{α−2} − λx^{α−1}] = (λ^α/Γ(α)) e^{−λx} x^{α−2} (α − 1 − λx).

If 0 < x < (α − 1)/λ, then f′(x) > 0; if x > (α − 1)/λ, then f′(x) < 0. So f(x) attains the
maximum at x = (α − 1)/λ.
(c) f(x) = (1/B(a, b)) x^{a−1} (1 − x)^{b−1}, 0 < x < 1. Then

f′(x) = (1/B(a, b)) [(a − 1)x^{a−2}(1 − x)^{b−1} − (b − 1)x^{a−1}(1 − x)^{b−2}]

= (1/B(a, b)) x^{a−2}(1 − x)^{b−2} [(a − 1)(1 − x) − (b − 1)x]

= (1/B(a, b)) x^{a−2}(1 − x)^{b−2} [(a − 1) − (a + b − 2)x].

So f′ has a single zero in (0, 1), at x = (a − 1)/(a + b − 2). Comparing f(0) = 0, f(1) = 0 and
f((a − 1)/(a + b − 2)) > 0, we conclude that f(x) attains the maximum at x = (a − 1)/(a + b − 2). □
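The three modes can also be located by a crude grid search over each density. A sketch using only the Python standard library (the particular parameter values, grid range and step count are illustrative choices, not from the tutorial):

    import math

    def argmax_on_grid(f, lo, hi, steps=100_000):
        """Return the grid point in (lo, hi) where f is largest."""
        best_x, best_f = lo, -math.inf
        for k in range(1, steps):
            x = lo + (hi - lo) * k / steps
            fx = f(x)
            if fx > best_f:
                best_x, best_f = x, fx
        return best_x

    mu, sigma = 2.0, 1.5
    alpha, lam = 3.0, 2.0
    a, b = 2.5, 4.0

    normal = lambda x: math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)
    gamma_ = lambda x: lam * math.exp(-lam * x) * (lam * x) ** (alpha - 1) / math.gamma(alpha)
    beta_  = lambda x: x ** (a - 1) * (1 - x) ** (b - 1) * math.gamma(a + b) / (math.gamma(a) * math.gamma(b))

    print(round(argmax_on_grid(normal, mu - 5, mu + 5), 3), "vs", mu)
    print(round(argmax_on_grid(gamma_, 1e-9, 10), 3), "vs", (alpha - 1) / lam)
    print(round(argmax_on_grid(beta_, 1e-9, 1 - 1e-9), 3), "vs", (a - 1) / (a + b - 2))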

5. Let Z be the standard normal random variable. Show that E[Z^{n+2}] = (n + 1)E[Z^n].
Solution. Recall that the probability density function of Z is f(x) = (1/√(2π)) e^{−x²/2}. Then

E[Z^{n+2}] = (1/√(2π)) ∫_{−∞}^{∞} x^{n+2} e^{−x²/2} dx.

Let u = x^{n+1} and dv/dx = xe^{−x²/2}. Then du/dx = (n + 1)x^n and v = −e^{−x²/2}. Using integration by parts,

E[Z^{n+2}] = (1/√(2π)) [ −x^{n+1} e^{−x²/2} |_{−∞}^{∞} + (n + 1) ∫_{−∞}^{∞} x^n e^{−x²/2} dx ].

Note that for any positive integer n,

lim_{x→∞} x^{n+1}/e^{x²/2} = lim_{x→∞} ( x / e^{x²/(2(n+1))} )^{n+1} = lim_{x→∞} ( 1 / ( e^{x²/(2(n+1))} · x/(n + 1) ) )^{n+1} = 0,

where the last step follows from L'Hôpital's rule. Similarly,

lim_{x→−∞} x^{n+1}/e^{x²/2} = 0.

Then

E[Z^{n+2}] = ((n + 1)/√(2π)) ∫_{−∞}^{∞} x^n e^{−x²/2} dx = (n + 1)E[Z^n]. □
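The recursion is consistent with the familiar even moments E[Z²] = 1, E[Z⁴] = 3, E[Z⁶] = 15, and is easy to check by Monte Carlo. A short Python sketch (NumPy assumed; sample size and seed are arbitrary):

    import numpy as np

    rng = np.random.default_rng(3)
    z = rng.standard_normal(2_000_000)

    for n in range(0, 5):
        lhs = np.mean(z ** (n + 2))        # estimate of E[Z^(n+2)]
        rhs = (n + 1) * np.mean(z ** n)    # (n + 1) E[Z^n]
        print(n, round(float(lhs), 3), round(float(rhs), 3))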

6. Let X be a nonnegative continuous random variable. Prove the following statements.


(i) E[X^n] = ∫_0^∞ n t^{n−1} P(X > t) dt.
(ii) P(X > a) ≤ E[X^n]/a^n for any a > 0 and positive integer n.
(iii) If X is exponential with parameter 1, then E[X^n] = n!.
(iv) n! ≥ (n/e)^n.

Solution. (i) Let f(x) be the probability density function of X. Interchanging the order of
integration,

∫_0^∞ n t^{n−1} P(X > t) dt = ∫_0^∞ n t^{n−1} ∫_t^∞ f(x) dx dt

= ∫_0^∞ ∫_0^x n t^{n−1} f(x) dt dx = ∫_0^∞ x^n f(x) dx = E[X^n].

(ii) Let f(x) be the probability density function of X. Then

E[X^n] = ∫_0^∞ x^n f(x) dx ≥ ∫_a^∞ x^n f(x) dx ≥ ∫_a^∞ a^n f(x) dx = a^n P(X > a).

(iii) By (i), E[X^n] = ∫_0^∞ n t^{n−1} e^{−t} dt = n ∫_0^∞ t^{n−1} e^{−t} dt = nE[X^{n−1}]. Consequently,

E[X^n] = n(n − 1) ⋯ 1 · E[X^0] = n!.

(iv) Let X be an exponential random variable with parameter 1. Then E[X^n] = n!, and by (ii) with a = n,

n!/n^n = E[X^n]/n^n ≥ P(X ≥ n) = e^{−n}.

That is, n! ≥ n^n e^{−n} = (n/e)^n. □
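Parts (i) and (iii) are easy to check numerically for the Exp(1) case, and (iv) then follows. A Python sketch (NumPy assumed; the truncation of the integral at t = 50, the grid, and the values of n are arbitrary choices):

    import math
    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.exponential(1.0, 1_000_000)   # Exp(1) samples

    t = np.linspace(0.0, 50.0, 200_001)
    dt = t[1] - t[0]

    for n in (1, 2, 3, 4, 5):
        # (i): E[X^n] = integral of n t^(n-1) P(X > t) dt, with P(X > t) = e^{-t} here
        tail_integral = np.sum(n * t ** (n - 1) * np.exp(-t)) * dt
        print(n,
              round(float(np.mean(x ** n)), 1),   # Monte Carlo estimate of E[X^n]
              round(float(tail_integral), 1),     # right-hand side of (i)
              math.factorial(n),                  # (iii): both should be close to n!
              (n / math.e) ** n <= math.factorial(n))   # (iv)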

7. Let N(t) be a Poisson process with parameter λ > 0. Recall that the number of events
occurring in any time interval of length t, say [h, h + t], is Poisson distributed with

P(N(t) = n) = e^{−λt} (λt)^n/n!.

Let S_n denote the arrival time of the nth event.


(i) Explain why N(t) < n is equivalent to S_n > t.
(ii) Using (i) or otherwise, find the probability density function of S_n.

Solution. (i) N(t) < n means there are at most n − 1 events occurring in [0, t]; in other
words, the nth event occurs after time t, that is, S_n > t.
(ii) P(S_n > t) = P(N(t) < n) = Σ_{i=0}^{n−1} P(N(t) = i) = Σ_{i=0}^{n−1} e^{−λt} (λt)^i/i!. So the probability
density function of S_n is given by, for t > 0,

f_n(t) = −(d/dt) P(S_n > t) = Σ_{i=0}^{n−1} λe^{−λt} [ (λt)^i/i! − i(λt)^{i−1}/i! ]

= λ Σ_{i=0}^{n−1} e^{−λt} (λt)^i/i! − λ Σ_{i=1}^{n−1} e^{−λt} (λt)^{i−1}/(i − 1)!

= λ Σ_{i=0}^{n−1} e^{−λt} (λt)^i/i! − λ Σ_{j=0}^{n−2} e^{−λt} (λt)^j/j! = λe^{−λt} (λt)^{n−1}/(n − 1)!.

Hence, S_n is a gamma random variable with parameters (n, λ).


Alternatively, for small ε > 0, P(N(ε) = 1) ≈ λε and P(N(ε) ≥ 2) ≪ λε, so P(N(ε) ≥ 2) ≈ 0.
Note that t < S_n < t + ε means there are i < n events in (0, t) and at least n − i events
in (t, t + ε). So

P(t < S_n < t + ε) = Σ_{i=0}^{n−1} P(N(t) = i, N(ε) ≥ n − i)

≈ P(N(t) = n − 1)P(N(ε) = 1) ≈ e^{−λt} (λt)^{n−1}/(n − 1)! · λε,

since only the i = n − 1 term contributes to first order in ε. Hence, the probability density
function of S_n is

f_n(t) = λe^{−λt} (λt)^{n−1}/(n − 1)!,  t > 0.
Note that this also implies that Γ(n) = (n − 1)! for positive integer n. □
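A simulation of the arrival times confirms the gamma density, using the standard fact that the interarrival times of a Poisson process are i.i.d. Exp(λ). A Python sketch (NumPy assumed; λ = 2, n = 3 and the window width are arbitrary choices):

    import math
    import numpy as np

    rng = np.random.default_rng(5)
    lam, n, trials = 2.0, 3, 1_000_000

    # S_n is the sum of n i.i.d. Exp(lambda) interarrival times.
    s_n = rng.exponential(1 / lam, size=(trials, n)).sum(axis=1)

    # Compare the empirical density near a few points with
    # f_n(t) = lam * exp(-lam*t) * (lam*t)**(n-1) / (n-1)!.
    eps = 0.01
    for t in (0.5, 1.0, 2.0, 3.0):
        emp = np.mean((s_n > t) & (s_n < t + eps)) / eps
        theory = lam * math.exp(-lam * t) * (lam * t) ** (n - 1) / math.factorial(n - 1)
        print(t, round(float(emp), 3), round(theory, 3))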

8. A standard Cauchy random variable X has probability density function


f(x) = 1/(π(1 + x²)),  −∞ < x < ∞.
Prove that 1/X is also a standard Cauchy random variable.

Solution. Let Y = 1/X. For y > 0,

1 − F_Y(y) = P(Y > y) = P(0 < X < 1/y) = ∫_0^{1/y} 1/(π(1 + x²)) dx = (1/π) tan^{−1}(1/y).

It follows that

f_Y(y) = −(1/π) · (−1/y²)/(1 + (1/y)²) = 1/(π(1 + y²)),  y > 0.
For y < 0,

F_Y(y) = P(Y < y) = P(1/y < X < 0) = P(0 < X < −1/y) = 1 − F_Y(−y).

It follows that

f_Y(y) = −F_Y′(−y) · (−1) = f_Y(−y) = 1/(π(1 + y²)),  y < 0.

Alternatively, note that g(x) = 1/x is one-to-one with inverse h(y) = 1/y. If y ≠ 0, then

f_Y(y) = f_X(1/y) · |h′(y)| = 1/(π(1 + 1/y²)) · 1/y² = 1/(π(1 + y²)). □
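The claim can be checked by comparing the empirical CDFs of X and 1/X against the standard Cauchy CDF F(t) = 1/2 + tan^{−1}(t)/π. A Python sketch (NumPy assumed; samples are drawn with NumPy's standard_cauchy generator, and the test points are arbitrary):

    import numpy as np

    rng = np.random.default_rng(6)
    x = rng.standard_cauchy(1_000_000)
    y = 1.0 / x                      # claim: Y is again standard Cauchy

    for t in (-3.0, -1.0, -0.5, 0.5, 1.0, 3.0):
        print(t,
              round(float(np.mean(x <= t)), 4),
              round(float(np.mean(y <= t)), 4),
              round(0.5 + float(np.arctan(t)) / np.pi, 4))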

9. A fair coin is tossed three times. Find the joint probability mass function of X and Y .
(a) X is the number of heads in all three tosses, and Y is the number of tails.
(b) X is the number of heads on the first two tosses, and Y is the number of heads on all
three tosses.
(c) X is the absolute difference between the number of heads and the number of tails in all
three tosses, and Y is the number of tails.

Solution. (a) Note that X + Y = 3. For i + j = 3,

P(X = i, Y = j) = P(X = i) = C(3, i)(1/2)³.

Then

            Y = 0   Y = 1   Y = 2   Y = 3
    X = 0     0       0       0      1/8
    X = 1     0       0      3/8      0
    X = 2     0      3/8      0       0
    X = 3    1/8      0       0       0
(b) Note that Y = X or Y = X + 1. For j = i, i + 1,

P(X = i, Y = j) = P(X = i, Y − X = j − i) = C(2, i) C(1, j − i)(1/2)³ = C(2, i)(1/2)³.

            Y = 0   Y = 1   Y = 2   Y = 3
    X = 0    1/8     1/8      0       0
    X = 1     0      1/4     1/4      0
    X = 2     0       0      1/8     1/8
(c) Note that X = |Y − (3 − Y)| = |3 − 2Y|. For i = |3 − 2j|,

P(X = i, Y = j) = P(Y = j) = C(3, j)(1/2)³.

            Y = 0   Y = 1   Y = 2   Y = 3
    X = 1     0      3/8     3/8      0
    X = 3    1/8      0       0      1/8
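All three tables can be reproduced by brute-force enumeration of the 2³ = 8 equally likely outcomes. A short Python sketch using only the standard library (variable names are illustrative):

    from collections import Counter
    from fractions import Fraction
    from itertools import product

    p = Fraction(1, 8)                      # each of the 8 outcomes has probability 1/8
    pmf_a, pmf_b, pmf_c = Counter(), Counter(), Counter()

    for tosses in product("HT", repeat=3):  # all 8 equally likely outcomes
        heads = tosses.count("H")
        tails = tosses.count("T")
        first_two_heads = tosses[:2].count("H")
        pmf_a[(heads, tails)] += p               # (a) X = heads, Y = tails
        pmf_b[(first_two_heads, heads)] += p     # (b) X = heads in first two tosses, Y = heads in all three
        pmf_c[(abs(heads - tails), tails)] += p  # (c) X = |heads - tails|, Y = tails

    for name, pmf in [("(a)", pmf_a), ("(b)", pmf_b), ("(c)", pmf_c)]:
        print(name, {k: str(v) for k, v in sorted(pmf.items())})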

10. The joint probability density function of X and Y is given by

f(x, y) = C e^{−y},  −y < x < y,  y > 0.



(i) Find the value of C.
(ii) Find the marginal probability density function of X and E[X].
(iii) Find the marginal probability density function of Y and E[Y].
Solution. (i) Note that 1 = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy. Then

1 = ∫_0^∞ ∫_{−y}^{y} C e^{−y} dx dy = ∫_0^∞ 2C y e^{−y} dy = [−2C(y + 1)e^{−y}]_0^∞ = 2C.

So C = 1/2.
(ii) For any x ∈ R, −y < x < y means y > |x|. So

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_{|x|}^{∞} (1/2)e^{−y} dy = (1/2)e^{−|x|}.

Since f_X(x) is even, x f_X(x) is odd. Then

E[X] = ∫_{−∞}^{∞} x · (1/2)e^{−|x|} dx = 0.
(iii) For any y > 0,

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_{−y}^{y} (1/2)e^{−y} dx = ye^{−y}.

Then

E[Y] = ∫_0^∞ y · ye^{−y} dy = ∫_0^∞ y² e^{−y} dy = 2. □
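A final numerical sanity check of C, E[X], E[Y] and the marginal of X. The sampling step uses the marginal f_Y(y) = ye^{−y} just computed (a Gamma(2, 1) density) together with the easily checked fact that, given Y = y, X is uniform on (−y, y); NumPy, the truncation at y = 40 and the grid are arbitrary choices:

    import numpy as np

    # Numerical integration over -y < x < y, y > 0 on a truncated grid
    # (the tail beyond y = 40 is negligible).
    ys = np.linspace(0.0, 40.0, 40_001)
    dy = ys[1] - ys[0]
    inner = 2 * ys * np.exp(-ys)            # inner integral of e^{-y} over (-y, y) is 2y e^{-y}
    print("1/C ~", round(float(np.sum(inner) * dy), 3))             # should be 2, so C = 1/2
    print("E[Y] ~", round(float(np.sum(0.5 * ys * inner) * dy), 3)) # integral of y * y e^{-y} dy = 2

    # Monte Carlo check of E[X] = 0 and of the marginal f_X(x) = (1/2) e^{-|x|}.
    rng = np.random.default_rng(7)
    y = rng.gamma(shape=2.0, scale=1.0, size=1_000_000)   # Y has density y e^{-y}
    x = rng.uniform(-y, y)                                # X given Y = y is uniform on (-y, y)
    print("E[X] ~", round(float(np.mean(x)), 3))
    for t in (0.5, 1.0, 2.0):
        print(f"P(X > {t}) ~ {np.mean(x > t):.4f}   theory {0.5 * np.exp(-t):.4f}")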
