Exercises

Chapter 1

Probabilities

1.1 Conditional probability


In a group of 50 people,

• 30 persons speak English

• 20 persons speak German

• 5 persons speak both English and German.

What is the probability that a person picked at random speaks English, given that the same person also speaks German?
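
A minimal numerical check, using only the counts from the statement (a Python sketch; the variable names are illustrative):

    # P(English | German) = P(English and German) / P(German)
    total, german, both = 50, 20, 5
    p_e_given_g = (both / total) / (german / total)
    print(p_e_given_g)  # ≈ 0.25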

1.2 Total probability theorem


A group of people consists of

• 40% persons of French nationality

• 60% persons of German nationality.

We assume that no one holds dual nationality. We also know that

• 70% of the French are chess players

• 20% of the Germans are chess players.

If a person picked at random can play chess, what is the probability that this person has German nationality?
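
The answer can be sanity-checked with the total probability theorem and Bayes' rule (a Python sketch using the percentages from the statement):

    p_fr, p_de = 0.4, 0.6                  # nationality proportions
    p_chess_fr, p_chess_de = 0.7, 0.2      # chess players within each nationality

    p_chess = p_chess_fr * p_fr + p_chess_de * p_de   # total probability theorem
    p_de_given_chess = p_chess_de * p_de / p_chess    # Bayes' rule
    print(p_de_given_chess)  # ≈ 0.3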

1.3 Independence of two events


A factory produces products in two independent phases. During phase 1, a defect of type A occurs with probability 2%. During phase 2, a defect of type B occurs with probability 8%. What is the probability that a product picked at random exhibits:
a) both defects
b) exactly one of the two defects
c) neither defect
d) at least one of the two defects?
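
Independence makes all four probabilities products of the marginal probabilities; a minimal Python check:

    p_a, p_b = 0.02, 0.08   # independent defect probabilities

    print(p_a * p_b)                            # a) both defects
    print(p_a * (1 - p_b) + (1 - p_a) * p_b)    # b) exactly one defect
    print((1 - p_a) * (1 - p_b))                # c) no defect
    print(1 - (1 - p_a) * (1 - p_b))            # d) at least one defect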

1.4 Expectation and variance of discrete r.v.s
A player has the choice between two strategies:

1. two independent rolls of a fair six-sided die, each with equally likely outcomes in {1, 2, 3, 4, 5, 6}

2. a single roll of a fair twelve-sided die, with equally likely outcomes in {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}.

The player with the highest number of points wins.


a) The player chooses the strategy with the highest expected number of points. Which strategy is it?
b) Calculate the variance of the number of points obtained with each strategy.
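
Because every outcome is equally likely, both answers can be verified by direct enumeration (a sketch in Python with NumPy):

    import itertools
    import numpy as np

    # Strategy 1: all 36 equally likely sums of two six-sided dice
    s1 = np.array([a + b for a, b in itertools.product(range(1, 7), repeat=2)])
    # Strategy 2: the 12 equally likely outcomes of one twelve-sided die
    s2 = np.arange(1, 13)

    for name, outcomes in (("two d6", s1), ("one d12", s2)):
        # np.var over equally likely outcomes is exactly the r.v. variance
        print(name, "mean:", outcomes.mean(), "variance:", outcomes.var())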

1.5 Binomial and Poisson p.m.f.s


You go to a party with 500 guests. What is the probability that exactly one other guest has the same birthday as you? Calculate this probability exactly, and also approximately using the Poisson p.m.f. For simplicity, exclude birthdays on February 29.
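
A sketch of both computations (Python; this assumes the 499 other guests have independent, uniformly distributed birthdays):

    from math import comb, exp

    n, p = 499, 1 / 365                            # 499 other guests
    exact = comb(n, 1) * p * (1 - p) ** (n - 1)    # binomial p.m.f. at k = 1
    lam = n * p                                    # Poisson parameter for the approximation
    approx = lam * exp(-lam)
    print(exact, approx)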

1.6 Joint and marginal p.m.f.s


Consider a r.v. X (resp. Y) whose range is {0, 1, 2} (resp. {0, 1, 2, 3}). The joint p.m.f. of X and Y has the form pX,Y(x, y) = (2x + y)/42, for 0 ≤ x ≤ 2, 0 ≤ y ≤ 3.
a) Find the marginal p.m.f.s of X and Y.
b) Find the probability of the event {X = 2, Y = 1}.
c) Find the probability of the event {X ≥ 1, Y ≤ 2}.
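
All three parts reduce to finite sums over the 12 points of the range; a Python sketch with exact fractions:

    from fractions import Fraction

    p = {(x, y): Fraction(2 * x + y, 42) for x in range(3) for y in range(4)}
    assert sum(p.values()) == 1                  # sanity check: the p.m.f. sums to one

    p_x = {x: sum(p[x, y] for y in range(4)) for x in range(3)}   # marginal of X
    p_y = {y: sum(p[x, y] for x in range(3)) for y in range(4)}   # marginal of Y
    print(p_x, p_y)                                               # a)
    print(p[2, 1])                                                # b)
    print(sum(p[x, y] for x in (1, 2) for y in (0, 1, 2)))        # c)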

1.7 Gaussian r.v.


The height of the students in your class is Gaussian distributed with mean 1.7 m and standard deviation 5 cm. Hint: Φ(1) ≈ 0.8413 and Φ(2) ≈ 0.9772.
a) What is the probability that the height of a student is larger than 1.8 m?
b) What is the probability that the height of a student is strictly smaller than 1.6 m?
c) What is the probability that the height of a student is in the interval [1.65, 1.75] m?
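
The hinted values of Φ can be reproduced with the error function (a Python sketch; heights are standardized as z = (h − 1.7)/0.05):

    from math import erf, sqrt

    def phi(z):  # standard normal c.d.f.
        return 0.5 * (1 + erf(z / sqrt(2)))

    mu, sigma = 1.70, 0.05
    print(1 - phi((1.80 - mu) / sigma))                          # a) = 1 − Φ(2)
    print(phi((1.60 - mu) / sigma))                              # b) = Φ(−2)
    print(phi((1.75 - mu) / sigma) - phi((1.65 - mu) / sigma))   # c) = 2Φ(1) − 1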

1.8 Joint p.d.f., c.d.f. and change of variable


Two continuous r.v.s X and Y have the following joint p.d.f.

fX,Y(x, y) = xy/96, if 0 < x < 4 and 1 < y < 5,
             0,     otherwise.

a) Show that X and Y are independent.


b) Calculate the joint c.d.f. of X and Y .
c) Using the change of variable U = X + Y, V = X, calculate P (X + Y < 3).
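
Part c) can be sanity-checked by simulation before doing the change of variable (a Monte Carlo sketch with NumPy, using rejection sampling from the joint p.d.f.):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10**6
    x = rng.uniform(0, 4, n)
    y = rng.uniform(1, 5, n)
    # Accept (x, y) with probability f(x, y) / f_max, where f_max = 4 * 5 / 96
    keep = rng.uniform(0, 20 / 96, n) < x * y / 96
    print(np.mean(x[keep] + y[keep] < 3))   # compare with the analytic answer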

1.9 Central limit theorem
In the game defined in Exercise 1.4,

1. Player 1 uses strategy 1

2. Player 2 uses strategy 2.

Using an approximation based on the central limit theorem, show that after n > 30 independent games, the probability that the number of points collected by player 1 is larger than the number of points collected by player 2 exceeds 50%.
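
A numerical illustration of the CLT approximation (a Python sketch; the per-game moments are enumerated exactly as in Exercise 1.4):

    import itertools
    from math import erf, sqrt

    import numpy as np

    def phi(z):  # standard normal c.d.f.
        return 0.5 * (1 + erf(z / sqrt(2)))

    s1 = np.array([a + b for a, b in itertools.product(range(1, 7), repeat=2)])
    s2 = np.arange(1, 13)
    mean_d = s1.mean() - s2.mean()   # mean of the per-game point difference
    var_d = s1.var() + s2.var()      # variances add for independent strategies

    for n in (30, 50, 100):
        # CLT: total difference over n games ≈ N(n * mean_d, n * var_d)
        print(n, 1 - phi(-n * mean_d / sqrt(n * var_d)))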

1.10 Polling
In an election,

• p is the actual proportion of people in favor of candidate A.

• 1 − p is the actual proportion of people in favor of candidate B.

We choose a sample of size n from the total population. Define the r.v.

Xi = 1, if the i-th person in the sample is in favor of candidate A,
     0, otherwise.

Assuming that the sample is chosen uniformly at random in the total population, the r.v. Sn = X1 + · · · + Xn ∼ B(n, p). Let us define the empirical estimator of p as the r.v. Pn = Sn/n.
a) Assuming n > 30, np > 5 and n(1 − p) > 5, use a Gaussian approximation of the binomial distribution to show that

P( |Pn − p| ≤ t √(p(1 − p)/n) ) = 2Φ(t) − 1.

b) Three days before the election, the pollster obtains, for n = 2500, an observed value of Pn equal to p2500 = 0.48. Give a 95% confidence interval for the unknown value of p. Hint: Φ(2) ≈ 0.9772. Can candidate A still reasonably hope to get elected?
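
A sketch of the confidence-interval computation in b) (Python; note that plugging the observed p̂ into the unknown standard deviation is itself an approximation):

    from math import sqrt

    n, p_hat, t = 2500, 0.48, 2                     # 2Φ(2) − 1 ≈ 0.95
    half_width = t * sqrt(p_hat * (1 - p_hat) / n)
    print(p_hat - half_width, p_hat + half_width)   # ≈ [0.46, 0.50]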

Chapter 2

Random processes

2.1 Poisson process


Arrival times of customers in a store are modeled by a Poisson process with rate λ = 10 customers per minute.
a) Let M be the number of customers arriving between 9:00 and 9:10. What is the probability distribution of M?
b) Let N be the number of customers arriving between 9:10 and 9:15. What is the probability distribution of N?
c) Calculate the expectation and the variance of the number of arrivals between 9:00 and 9:15.
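
A simulation sketch with NumPy (it relies on counts over disjoint intervals of a Poisson process being independent Poisson r.v.s with mean λ times the interval length):

    import numpy as np

    rng = np.random.default_rng(0)
    lam, trials = 10, 10**5

    m = rng.poisson(lam * 10, trials)   # arrivals between 9:00 and 9:10
    n = rng.poisson(lam * 5, trials)    # arrivals between 9:10 and 9:15
    total = m + n                       # arrivals between 9:00 and 9:15
    print(total.mean(), total.var())    # for a Poisson count, mean = variance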

2.2 Poisson interarrival times


Each morning, you decide to pull out of your driveway and make an illegal U-turn instead of driving around the block. Police cars drive by according to a Poisson process of rate λ. You decide to make the U-turn once you see that the road has been clear of police cars for τ time units. Let N be the number of police cars you see before you actually make the U-turn. Find the expected value of N as a function of λ and τ.
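
A direct simulation of the waiting scheme (a NumPy sketch with illustrative values of λ and τ; useful for checking a candidate formula for E[N]):

    import numpy as np

    def cars_seen(lam, tau, rng):
        # Count cars until an interarrival gap of at least tau occurs;
        # by memorylessness the gaps are i.i.d. Exponential(lam).
        n = 0
        while rng.exponential(1 / lam) < tau:
            n += 1
        return n

    rng = np.random.default_rng(0)
    lam, tau = 2.0, 0.5
    print(np.mean([cars_seen(lam, tau, rng) for _ in range(10**5)]))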

2.3 Markov process


Consider a discrete-time Markov process {Xn, n ∈ N}, taking values in the set of states {s1, s2} and having the following state transition diagram: from s1, the chain stays in s1 with probability 1 − b or moves to s2 with probability b; from s2, it stays in s2 with probability 1 − r or moves to s1 with probability r. The parameters are b = 0.1 and r = 0.4.
a) Build the transition probability matrix P and show that the Markov chain is regular.
b) Find the steady-state probability vector, π.
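
A numerical check of both parts (a NumPy sketch; the steady-state vector is the left eigenvector of P for eigenvalue 1, normalized to sum to one):

    import numpy as np

    b, r = 0.1, 0.4
    P = np.array([[1 - b, b],
                  [r, 1 - r]])     # transition probability matrix

    # All entries of P are strictly positive, so the chain is regular.
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    pi /= pi.sum()                 # solves pi P = pi with pi summing to one
    print(pi)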

2.4 Queueing
Packets arrive at a telecommunication network, where they are stored in a buffer and then
transmitted. The storage capacity of the buffer is m packets:

• if m packets are already present, any newly arriving packets are discarded

• otherwise, a newly arriving packet is stored in the buffer.

At each discrete time instant, exactly one of the following occurs:

• a new packet arrives, with probability b > 0

• one packet in the buffer completes transmission, with probability d, where 0 < d ≠ b

• neither of the two previous events occurs.

a) Show that the number of packets in the buffer can be modeled by a birth-death process, having the following state transition diagram. Is the corresponding Markov process regular?

[State transition diagram: states 0, 1, …, m − 1, m; from each state k < m, an arrival moves the chain to k + 1 with probability b; from each state k > 0, a departure moves it to k − 1 with probability d; the self-loop probabilities are 1 − b at state 0, 1 − b − d at the interior states, and 1 − d at state m.]
b) Find the steady-state probabilities of the number of packets in the buffer, π = [π0, π1, . . . , πm], as a function of ρ = b/d.
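
A sketch for testing a candidate steady-state vector numerically (Python with NumPy; the parameter values are illustrative):

    import numpy as np

    b, d, m = 0.2, 0.3, 5      # illustrative values with b, d > 0 and b != d
    rho = b / d

    # Candidate from detailed balance: pi_k proportional to rho**k
    pi = rho ** np.arange(m + 1)
    pi /= pi.sum()

    # Build the birth-death transition matrix and verify pi P = pi
    P = np.zeros((m + 1, m + 1))
    for k in range(m + 1):
        if k < m:
            P[k, k + 1] = b            # arrival stored in the buffer
        if k > 0:
            P[k, k - 1] = d            # one packet transmitted
        P[k, k] = 1 - P[k].sum()       # remaining probability: self-loop
    print(np.allclose(pi @ P, pi))     # True for the correct pi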

Chapter 3

Estimation theory

3.1 Estimation of the mean and variance of a random variable
Consider a random process formed by a collection of i.i.d. random variables X = (X1 , X2 , . . . , XN )T ,
such that, for all i,

E[Xi] = m,   V[Xi] = σ²,
and the corresponding observation set is x = (x1 , x2 , . . . , xN )T . We are interested in the natural
estimators of the mean and variance

M̂ = (1/N) Σ_{i=1}^{N} Xi

V̂ = (1/N) Σ_{i=1}^{N} (Xi − M̂)².

a) Calculate the expectation and the variance of the mean estimator.


b) Show that the proposed variance estimator is biased and find a remedy.
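
The bias in b) is easy to observe empirically (a NumPy sketch; the Gaussian choice is only illustrative, since the result depends only on the first two moments):

    import numpy as np

    rng = np.random.default_rng(0)
    m, sigma, N, trials = 1.0, 2.0, 10, 10**5

    x = rng.normal(m, sigma, size=(trials, N))
    m_hat = x.mean(axis=1)
    v_hat = ((x - m_hat[:, None]) ** 2).mean(axis=1)   # divides by N

    print(m_hat.mean(), m_hat.var())   # compare with E[M-hat] and V[M-hat]
    print(v_hat.mean())                # systematically below sigma**2 = 4
    print(v_hat.mean() * N / (N - 1))  # Bessel's correction as a remedy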

3.2 Estimation of the parameter of a Bernoulli r.v.


We consider the outcomes of N independent coin tosses, where p is the probability of a head
and 1 − p the probability of a tail. Define the r.v. Xi associated with the i-th coin toss such that

Xi = 1, for a head,
     0, for a tail.

Consider the following estimator of p

P̂ = (1/N) Σ_{i=1}^{N} Xi.

Show that P̂ is unbiased and that its variance decreases with N .
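
A quick empirical confirmation (a NumPy sketch; since ΣXi is binomial, samples of P̂ can be drawn directly):

    import numpy as np

    rng = np.random.default_rng(0)
    p, trials = 0.3, 10**5

    for N in (10, 100, 1000):
        p_hat = rng.binomial(N, p, trials) / N   # samples of P-hat
        print(N, p_hat.mean(), p_hat.var())      # mean stays near p; variance shrinks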

3.3 MVUE for the mean of Gaussian r.v.s
Consider a random process formed by a collection of i.i.d. random variables X = (X1 , X2 , . . . , XN )T ,
such that Xi ∼ N(m, σ²), ∀i, and the corresponding observation set is x = (x1, x2, . . . , xN)T.
a) Assuming m is an unknown but deterministic constant, find the joint p.d.f. of X parameterized by m, denoted by fX(x; m). Show that fX(x; m) satisfies the regularity condition and calculate the CRLB.
b) Use the second part of the CRLB theorem to find the MVUE for m.
c) Show that the maximum-likelihood estimate of m, m̂ML , coincides with the estimate
provided by the MVUE.
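
A simulation sketch for comparing the empirical variance of an estimator against the CRLB computed in a) (NumPy; the parameter values are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    m, sigma, N, trials = 0.5, 1.0, 20, 10**5

    x = rng.normal(m, sigma, size=(trials, N))
    m_hat = x.mean(axis=1)        # candidate estimator (also the ML estimate here)
    print(m_hat.mean())           # unbiasedness check
    print(m_hat.var())            # compare with your CRLB expression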

3.4 BLUE for the mean of Gaussian r.v.s


Consider a random process formed by a collection of i.i.d. random variables X = (X1 , X2 , . . . , XN )T ,
such that Xi ∼ N(m, σ²), ∀i, and the corresponding observation set is x = (x1, x2, . . . , xN)T.
Assume a linear observation model of the form

X = ms + W,

where s = (1, . . . , 1)T and W = (W1, W2, . . . , WN)T is a collection of i.i.d. random variables such that Wi ∼ N(0, σ²), ∀i. Find the BLUE for m, M̂BLUE, and show that it coincides with the MVUE obtained in Exercise 3.3.
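
A numerical sketch of the Gauss-Markov form of the BLUE, (sᵀC⁻¹X)/(sᵀC⁻¹s), for this model (NumPy; with C = σ²I it should reduce to the sample mean):

    import numpy as np

    rng = np.random.default_rng(0)
    m, sigma, N = 0.5, 1.0, 20

    s = np.ones(N)                        # observation model X = m s + W
    C = sigma**2 * np.eye(N)              # covariance matrix of W
    x = m * s + rng.normal(0, sigma, N)

    Ci = np.linalg.inv(C)
    m_blue = (s @ Ci @ x) / (s @ Ci @ s)  # Gauss-Markov form of the BLUE
    print(m_blue, x.mean())               # the two numbers coincide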

3.5 Bayesian estimation of the parameter of an exponential p.d.f.
Consider a random process formed by a collection of i.i.d. random variables X = (X1 , X2 , . . . , XN )T ,
such that Xi ∼ E(θ), ∀i, and the corresponding observation set is x = (x1 , x2 , . . . , xN )T . We
assign the following prior distribution for θ, considered as a random variable Θ:
fΘ(θ) = λ exp(−λθ), if θ ≥ 0,
        0,           otherwise,

for some λ > 0.
a) Find the MAP estimate of θ, θ̂MAP .
b) Let θ̂ML be the maximum-likelihood estimate of θ. Show that as λ → 0, θ̂MAP → θ̂ML .
Give a theoretical interpretation.
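
The MAP estimate can also be located numerically, which makes the λ → 0 limit visible (a Python sketch with a simple grid search; the values are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    theta_true, N = 2.0, 50
    x = rng.exponential(1 / theta_true, N)   # E(theta): density theta * exp(-theta * x)

    thetas = np.linspace(1e-3, 10, 100_000)
    for lam in (2.0, 0.1, 1e-4):             # flatter and flatter prior
        log_post = N * np.log(thetas) - thetas * x.sum() - lam * thetas
        print(lam, thetas[np.argmax(log_post)])   # numerical MAP estimate
    print(N / x.sum())                            # ML estimate, for comparison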

3.6 Bayesian estimation of the mean of Gaussian r.v.s


Consider a random process formed by a collection of i.i.d. random variables X = (X1 , X2 , . . . , XN )T ,
such that Xi ∼ N(m, σ²), ∀i, and the corresponding observation set is x = (x1, x2, . . . , xN)T.
We assign the following prior distribution for m, considered as a random variable M :

M ∼ N(m0, σ0²).

a) Find the MAP estimate of m, m̂MAP .

b) Find the MMSE estimate of m, m̂MMSE.
c) What is the value of m̂MAP when σ0² → 0? Give an interpretation.
d) What is the value of m̂MAP when σ0² → +∞? Give a theoretical interpretation.
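
A sketch using the standard conjugate-Gaussian result (NumPy; the posterior of M is again Gaussian, so the MAP and MMSE estimates both equal the posterior mean):

    import numpy as np

    rng = np.random.default_rng(0)
    m_true, sigma, N = 1.5, 1.0, 20
    m0, sigma0 = 0.0, 2.0                    # prior parameters
    x = rng.normal(m_true, sigma, N)

    post_mean = (N * sigma0**2 * x.mean() + sigma**2 * m0) / (N * sigma0**2 + sigma**2)
    print(post_mean)
    # sigma0 -> 0 pins the estimate to m0; sigma0 -> infinity recovers the sample mean.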
