
Jacobs University Bremen

Keivan Mallahi-Karai

Due: November 4, 2015


Assignment 6

ESM3a: Advanced Linear Algebra and Stochastic Processes

(6.1) Three points M, N, and P are randomly chosen on a circle centered at O. Find the probability of the event that
(a) O is inside the triangle MNP.
(b) O is on one of the sides of the triangle MNP.
(c) O is outside the triangle MNP.

Solution. Let us denote the events by $E_{in}$, $E_{on}$, and $E_{out}$. Without loss of generality we assume that M is located at the point (1, 0), and we denote by $0 \le x \le 1$ and $0 \le y \le 1$ the angles at which N and P lie from M, where the full circle corresponds to 1. The condition for O to lie inside the triangle MNP (equivalently, for every arc cut out by the three points to be shorter than a semicircle) is
\[ |x - y| < 1/2, \qquad \max(x, y) > 1/2, \qquad \min(x, y) < 1/2. \]
Drawing this region as a subset of $[0,1] \times [0,1]$, one can see that it is a union of two right triangles, each of area 1/8. This implies that
\[ P[E_{in}] = \frac{1}{4}. \]
On the other hand, the center O is on one of the sides if $x = 1/2$, or $y = 1/2$, or $|x - y| = 1/2$. Each of these equations describes a line or a union of two lines, which has zero area. This shows that $P[E_{on}] = 0$. By the law of total probability, we have
\[ P[E_{out}] = 1 - P[E_{in}] - P[E_{on}] = \frac{3}{4}. \]
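As a sanity check (this simulation is an addition, not part of the original solution), one can estimate $P[E_{in}]$ by Monte Carlo, using the arc condition derived above:

```python
import random

def center_inside(x, y):
    # With M at angle 0 and N, P at fractions x, y of the full circle,
    # O lies inside triangle MNP iff every arc is shorter than a semicircle.
    return abs(x - y) < 0.5 and max(x, y) > 0.5 and min(x, y) < 0.5

trials = 10**6
hits = sum(center_inside(random.random(), random.random()) for _ in range(trials))
print(hits / trials)  # should be close to 0.25
```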

(6.2) For events A, B, we write $A \perp B$ if A and B are independent.
(a) Show that if $A \perp B$ then $A \perp B^c$.
(b) Show that if $A \perp B$ then $A^c \perp B^c$.
(c) Is it true that if $A \perp B$ and $A \perp C$ then $A \perp B \cap C$?
(d) Is it true that if $A \perp B$ and $A \perp C$ then $A \perp B \cup C$?

Solution. For (a), we compute
\[ P[A \cap B^c] = P[A \setminus (A \cap B)] = P[A] - P[A \cap B] = P[A](1 - P[B]) = P[A]\,P[B^c]. \]
(b) follows from a repeated application of (a): $A \perp B$ implies $A \perp B^c$, which is equivalent to $B^c \perp A$, and this in turn implies $B^c \perp A^c$.
(c) and (d) are both false. Take $\Omega = \{00, 01, 10, 11\}$, where each point in the sample space has probability 1/4. Take $A = \{00, 01\}$, $B = \{00, 11\}$, $C = \{01, 11\}$. Clearly A is independent of both B and C, but $B \cap C = \{11\}$, while $A \cap (B \cap C) = \emptyset$, so
\[ P[A \cap (B \cap C)] = 0 \ne \frac{1}{8} = P[A]\,P[B \cap C], \]
which shows that A is not independent of $B \cap C$. The same example is also a counterexample for part (d), since $P[A \cap (B \cup C)] = P[A] = 1/2$, while $P[A]\,P[B \cup C] = 3/8$.
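The counterexample can also be verified by exhaustive enumeration; the following sketch (added here) checks each independence claim in exact arithmetic:

```python
from fractions import Fraction

omega = ["00", "01", "10", "11"]
prob = {w: Fraction(1, 4) for w in omega}

def P(event):
    return sum(prob[w] for w in event)

def indep(E, F):
    return P(E & F) == P(E) * P(F)

A = {"00", "01"}; B = {"00", "11"}; C = {"01", "11"}
print(indep(A, B), indep(A, C))   # True True
print(indep(A, B & C))            # False: A is not independent of B ∩ C
print(indep(A, B | C))            # False: A is not independent of B ∪ C
```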
(6.3) A coin which lands heads with probability $0 < p < 1$ has been flipped n times. Let $r_n$ denote the probability of an even number of heads.
(a) Show that $r_1 = 1 - p$ and $r_2 = p^2 + (1-p)^2$.
(b) By conditioning on the outcome of the first trial, show that
\[ r_n = p(1 - r_{n-1}) + (1-p)\,r_{n-1}. \]
(c) Deduce that
\[ r_n = \frac{1}{2}\left(1 + (1-2p)^n\right). \]
(d) Use part (c) to show that, independent of the value of p, we have
\[ \lim_{n \to \infty} r_n = \frac{1}{2}. \]

Solution. Let us denote by $X_i$ the outcome of the i-th throw of the coin. Then
\[ r_1 = P[X_1 = T] = 1 - p \]
and
\[ r_2 = P[X_1 = X_2 = H] + P[X_1 = X_2 = T] = p^2 + (1-p)^2. \]
Let us denote by $E_n$ the event that there is an even number of heads in the first n throws, and by $A_n$ the event that the last (n-th) throw is H. By conditioning on $A_n$, we have
\[ r_n = P[E_n \mid A_n]\,P[A_n] + P[E_n \mid A_n^c]\,P[A_n^c] = p\,P[E_{n-1}^c] + (1-p)\,P[E_{n-1}] = p(1 - r_{n-1}) + (1-p)\,r_{n-1}. \]
The closed form in (c) follows by induction on n. The base case holds since $\frac{1}{2}(1 + (1-2p)) = 1 - p = r_1$. If the statement is true for some n, then
\[ r_{n+1} = p\left(1 - \frac{1}{2}\left(1 + (1-2p)^n\right)\right) + (1-p)\,\frac{1}{2}\left(1 + (1-2p)^n\right) = \frac{1}{2}\left(1 + (1-2p)^{n+1}\right). \]
It is now clear that since $0 \le |1 - 2p| < 1$, we have $(1-2p)^n \to 0$, and hence
\[ \lim_{n\to\infty} r_n = \lim_{n\to\infty} \frac{1}{2}\left(1 + (1-2p)^n\right) = \frac{1}{2}. \]
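A quick numerical comparison (added; the function names are my own) of the recurrence from part (b) against the closed form from part (c):

```python
def r_closed(n, p):
    # Closed form from part (c).
    return 0.5 * (1 + (1 - 2 * p) ** n)

def r_recurrence(n, p):
    # Iterate the recurrence from part (b), starting from r_1 = 1 - p.
    r = 1 - p
    for _ in range(n - 1):
        r = p * (1 - r) + (1 - p) * r
    return r

for p in (0.3, 0.7):
    for n in (1, 2, 5, 50):
        assert abs(r_closed(n, p) - r_recurrence(n, p)) < 1e-12
print("closed form matches the recurrence")
```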

(6.4) An r-element subset A of the set $\{1, 2, \dots, n\}$ is randomly chosen. Let X denote the largest element of A. Show that the probability mass function of X is given by
\[ p(k) := P[X = k] = \begin{cases} \binom{k-1}{r-1} \Big/ \binom{n}{r} & \text{if } 1 \le k \le n \\ 0 & \text{otherwise.} \end{cases} \]
Solution. It is clear that X can only take values $1, 2, \dots, n$. Moreover, $X = k$ iff A contains k and the rest of the elements of A are at most $k - 1$. Hence there are $\binom{k-1}{r-1}$ options for A. Since there are $\binom{n}{r}$ options for choosing an r-element subset of the set $\{1, 2, \dots, n\}$, we are done.
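For small parameters the pmf can be confirmed by enumerating all r-element subsets; a brief added sketch, with the arbitrary choices n = 8, r = 3:

```python
from itertools import combinations
from math import comb

n, r = 8, 3
subsets = list(combinations(range(1, n + 1), r))
for k in range(1, n + 1):
    empirical = sum(max(s) == k for s in subsets) / len(subsets)
    assert abs(empirical - comb(k - 1, r - 1) / comb(n, r)) < 1e-12
print("pmf verified for n = 8, r = 3")
```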

(6.5) Let X and Y be Bernoulli random variables, with parameters p and q, respectively. Note that the random variable $Z = XY$ also takes only the values 0 and 1, and is hence a Bernoulli random variable with some parameter r. Show that
\[ p + q - 1 \le r \le \min(p, q). \]
Hint: Use the fact that $Z = 0$ when $X = 0$ or $Y = 0$.

Solution. Note that $\{X = 0\} \subseteq \{Z = 0\}$. This shows that
\[ 1 - p = P[X = 0] \le P[Z = 0] = 1 - r, \]
implying that $r \le p$. Similarly, $r \le q$, proving that $r \le \min(p, q)$. For the other inequality, note that
\[ 1 - r = P[Z = 0] = P[X = 0] + P[Y = 0] - P[X = Y = 0] \le (1 - p) + (1 - q), \]
which rearranges to $r \ge p + q - 1$. This establishes the other inequality.

Jacobs University Bremen


Keivan Mallahi-Karai

Due: November 11, 2015


Assignment 7

ESM3a: Advanced Linear Algebra and Stochastic Processes

(7.1) Let X be a random variable with the density function
\[ f(x) = \frac{\lambda}{2}\, e^{-\lambda |x|}, \]
where $\lambda > 0$. Find $P[a < X < b]$ for $a < b \in \mathbb{R}$. You may need to distinguish several cases.

Solution. Case 1: $a \ge 0$. It follows that $b > 0$ as well, and therefore
\[ P[a < X < b] = \int_a^b \frac{\lambda}{2} e^{-\lambda x}\,dx = -\frac{1}{2} e^{-\lambda x}\Big|_a^b = \frac{1}{2}\left(e^{-\lambda a} - e^{-\lambda b}\right). \]
Case 2: $b \le 0$. It follows that $a < 0$ as well, and therefore
\[ P[a < X < b] = \int_a^b \frac{\lambda}{2} e^{\lambda x}\,dx = \frac{1}{2} e^{\lambda x}\Big|_a^b = \frac{1}{2}\left(e^{\lambda b} - e^{\lambda a}\right). \]
Case 3: $a \le 0 \le b$. In this case we break the integral at 0 and proceed as follows:
\[ P[a < X < b] = \int_a^b \frac{\lambda}{2} e^{-\lambda |x|}\,dx = \int_a^0 \frac{\lambda}{2} e^{\lambda x}\,dx + \int_0^b \frac{\lambda}{2} e^{-\lambda x}\,dx = 1 - \frac{1}{2}\left(e^{\lambda a} + e^{-\lambda b}\right). \]
(7.2) Suppose X has Poisson distribution with parameter $\lambda$. Find $E\left[\frac{1}{1+X}\right]$. The answer should be in a closed form.

Solution. By definition we have
\[ E\left[\frac{1}{1+X}\right] = e^{-\lambda} \sum_{k=0}^{\infty} \frac{\lambda^k}{k!}\cdot\frac{1}{k+1} = \frac{e^{-\lambda}}{\lambda} \sum_{k=0}^{\infty} \frac{\lambda^{k+1}}{(k+1)!} = \frac{e^{-\lambda}}{\lambda}\left(e^{\lambda} - 1\right) = \frac{1 - e^{-\lambda}}{\lambda}. \]

(7.3) Suppose X has exponential distribution with parameter $\lambda$.
(a) Find the density of $Y = X^2$ and $Z = |X - 1|$.
(b) Compute $E[Y]$.

Solution. Note that
\[ F_Y(t) = P[Y \le t] = P\left[X \le \sqrt{t}\right] = \begin{cases} 1 - e^{-\lambda \sqrt{t}} & \text{if } t > 0 \\ 0 & \text{otherwise.} \end{cases} \]
Differentiating this, we have
\[ f_Y(t) = \begin{cases} \dfrac{\lambda}{2\sqrt{t}}\, e^{-\lambda \sqrt{t}} & \text{if } t > 0 \\ 0 & \text{otherwise.} \end{cases} \]
Similarly, we have
\[ F_Z(t) = P[Z \le t] = P[|X - 1| \le t] = P[1 - t \le X \le 1 + t] = \begin{cases} 1 - e^{-\lambda(t+1)} & \text{if } t > 1 \\ e^{-\lambda(1-t)} - e^{-\lambda(t+1)} & \text{if } 0 \le t \le 1. \end{cases} \]
The density function can now be computed by differentiation:
\[ f_Z(t) = \begin{cases} \lambda e^{-\lambda(t+1)} & \text{if } t > 1 \\ \lambda\left(e^{-\lambda(1-t)} + e^{-\lambda(t+1)}\right) & \text{if } 0 < t < 1. \end{cases} \]
Finally, we have
\[ E[Y] = E\left[X^2\right] = \int_0^{\infty} t^2 \lambda e^{-\lambda t}\,dt = \frac{2}{\lambda^2}. \]
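Both closed forms lend themselves to a quick Monte Carlo sanity check; the sketch below is an addition (the parameter $\lambda = 1.7$ is arbitrary) and uses NumPy's random generator:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, N = 1.7, 10**6

x_pois = rng.poisson(lam, N)
print(np.mean(1 / (1 + x_pois)), (1 - np.exp(-lam)) / lam)  # (7.2): both ~0.48

x_exp = rng.exponential(1 / lam, N)  # NumPy parametrizes by the scale 1/lambda
print(np.mean(x_exp**2), 2 / lam**2)  # (7.3)(b): both ~0.69
```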

(7.4) A coin is tossed until both sides appear at least once. Let X be the number of times the coin is thrown.
(a) Show that
\[ P[X \ge j] = \begin{cases} 1/2^{j-2} & \text{if } j = 2, 3, \dots \\ 1 & \text{if } j = 1. \end{cases} \]
(b) Compute $E[X]$.

Solution.
(a) It is clear that at least two throws are needed, hence $P[X \ge 1] = P[X \ge 2] = 1$. For $j \ge 2$, we have $X \ge j$ iff the results of the first $j - 1$ throws are all the same. Hence
\[ P[X \ge j] = 2 \cdot \frac{1}{2^{j-1}} = \frac{1}{2^{j-2}}. \]
(b) Using the formula discussed in class for random variables that take only non-negative integer values, we have
\[ E[X] = \sum_{j=1}^{\infty} P[X \ge j] = 1 + \sum_{j=0}^{\infty} \frac{1}{2^j} = 3. \]
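A short simulation (added) confirming $E[X] = 3$:

```python
import random

def throws_until_both_sides():
    # Toss a fair coin until both sides have appeared; return the total count.
    first = random.randint(0, 1)
    n = 1
    while random.randint(0, 1) == first:
        n += 1
    return n + 1  # the last toss is the first one that differs

trials = 10**5
print(sum(throws_until_both_sides() for _ in range(trials)) / trials)  # ~3.0
```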

(7.5) n distinct balls are placed in a box. m balls are drawn with replacement. Let N be the number of distinct balls that have been drawn. Compute $E[N]$.

Solution. For $1 \le i \le n$, let us denote by $N_i$ the Bernoulli random variable which is 1 when ball i has been drawn in at least one of the m rounds. Clearly
\[ N = \sum_{i=1}^{n} N_i. \]
Since the balls are replaced, $N_i = 0$ precisely when ball i is not drawn in any of the m rounds. This implies
\[ P[N_i = 1] = 1 - \left(\frac{n-1}{n}\right)^m. \]
Since $N_i$ is a Bernoulli random variable, we have $E[N_i] = P[N_i = 1]$. Hence
\[ E[N] = E\left[\sum_{i=1}^{n} N_i\right] = \sum_{i=1}^{n} E[N_i] = n\left(1 - \left(\frac{n-1}{n}\right)^m\right). \]
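A simulation sketch (added; the values n = 10, m = 7 are arbitrary) comparing the empirical mean against the formula:

```python
import random

def distinct_drawn(n, m):
    # Draw m balls with replacement from n distinct balls; count distinct draws.
    return len({random.randrange(n) for _ in range(m)})

n, m, trials = 10, 7, 10**5
empirical = sum(distinct_drawn(n, m) for _ in range(trials)) / trials
print(empirical, n * (1 - ((n - 1) / n) ** m))  # both ~5.22
```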

Jacobs University Bremen


Keivan Mallahi-Karai

Due: November 18, 2015


Assignment 8

ESM3a: Advanced Linear Algebra and Stochastic Processes

(8.1) For $\lambda > 1$, suppose that X has the density function given by
\[ f_X(t) = \begin{cases} \lambda e^{-\lambda t} & \text{if } t \ge 0 \\ 0 & \text{otherwise.} \end{cases} \]
Compute $E\left[e^X\right]$.

Solution. Setting $h(x) = e^x$, we have
\[ E\left[e^X\right] = \int_0^{\infty} e^t\, \lambda e^{-\lambda t}\,dt = \lambda \int_0^{\infty} e^{-(\lambda - 1)t}\,dt = \frac{\lambda}{\lambda - 1}. \]
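A Monte Carlo check of $E[e^X] = \lambda/(\lambda - 1)$ (added; $\lambda = 2.5$ is an arbitrary choice satisfying $\lambda > 1$):

```python
import math, random

lam, trials = 2.5, 10**6
est = sum(math.exp(random.expovariate(lam)) for _ in range(trials)) / trials
print(est, lam / (lam - 1))  # both ~1.667
```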

(8.2) Let X and Y be independent random variables with uniform distribution in the interval [0, 1]. Let $Z = XY$.
(a) Find the joint distribution function of X and Y.
(b) Find the joint distribution function of X and Z.
(c) Find the joint density function of X and Z.

(8.3) Let $n \ge 2$ be an integer and let the joint probability mass function of the discrete random variables X and Y be given by
\[ p_{X,Y}(x, y) = \begin{cases} k(x + y) & \text{if } 1 \le x, y \le n \\ 0 & \text{otherwise.} \end{cases} \]
(a) Determine the value of the constant k.
(b) Determine the marginal probability mass functions of X and Y.
(c) Find $P[X \le Y]$.
Hint: For (c), you can simplify the calculations by observing that $P[X \le Y] = P[Y \le X]$.

Solution. We know that $\sum_{x,y} p(x, y) = 1$. This implies that
\[ k \sum_{x=1}^{n} \sum_{y=1}^{n} (x + y) = 1. \]
Simplifying this leads to $k n^2 (n + 1) = 1$, which gives $k = \frac{1}{n^2(n+1)}$.
For part (b), write
\[ p_X(x) = k \sum_{y=1}^{n} (x + y) = k\left(nx + \frac{n(n+1)}{2}\right) = \frac{2x + n + 1}{2n(n+1)}. \]
Since $p(x, y) = p(y, x)$, we likewise have
\[ p_Y(y) = \frac{2y + n + 1}{2n(n+1)}. \]
For part (c), note that $P[X < Y] + P[X > Y] + P[X = Y] = 1$. By symmetry we have $P[X < Y] = P[X > Y]$, hence
\[ P[X < Y] = \frac{1}{2}\left(1 - P[X = Y]\right). \]
Moreover,
\[ P[X = Y] = 2k \sum_{x=1}^{n} x = k\,n(n+1) = \frac{1}{n}, \]
so
\[ P[X \le Y] = P[X < Y] + P[X = Y] = \frac{1}{2}\left(1 - \frac{1}{n}\right) + \frac{1}{n} = \frac{n+1}{2n}. \]
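All three parts can be brute-force checked in exact arithmetic for a small n; the following sketch is an addition:

```python
from fractions import Fraction

n = 5
k = Fraction(1, n * n * (n + 1))
p = {(x, y): k * (x + y) for x in range(1, n + 1) for y in range(1, n + 1)}

assert sum(p.values()) == 1  # part (a): the value of k is correct
for x in range(1, n + 1):    # part (b): marginal pmf of X
    marginal = sum(p[(x, y)] for y in range(1, n + 1))
    assert marginal == Fraction(2 * x + n + 1, 2 * n * (n + 1))
p_le = sum(v for (x, y), v in p.items() if x <= y)
assert p_le == Fraction(n + 1, 2 * n)  # part (c)
print("all checks pass for n =", n)
```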

(8.4) Suppose X and Y are independent random variables with exponential distributions with parameters $\lambda$ and $\mu$, respectively. Set $M = \min(X, Y)$ and $D = |X - Y| = \max(X, Y) - \min(X, Y)$. Compute the probability $P[D > a, M > b]$ as a function of a and b.

Solution. Note that $M > 0$ and $D \ge 0$. The joint density function of X, Y is given by
\[ f(x, y) = \lambda\mu \exp(-(\lambda x + \mu y)), \qquad x, y > 0. \]
We will discuss the case $a > 0$ and $b > 0$; the cases where at least one of a, b is negative are easier. The region described by $D > a$, $M > b$ is
\[ x, y > b, \qquad |x - y| > a. \]
Once the region is drawn, one can see that it splits into the two parts $\{x > b,\ y > x + a\}$ and $\{y > b,\ x > y + a\}$, so that
\[ P[D > a, M > b] = \int_b^{\infty}\!\!\int_{x+a}^{\infty} \lambda\mu\, e^{-(\lambda x + \mu y)}\,dy\,dx + \int_b^{\infty}\!\!\int_{y+a}^{\infty} \lambda\mu\, e^{-(\lambda x + \mu y)}\,dx\,dy = \frac{\lambda e^{-\mu a} + \mu e^{-\lambda a}}{\lambda + \mu}\; e^{-(\lambda+\mu)b}. \]
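A Monte Carlo check of the final formula (added; the parameter values are arbitrary):

```python
import math, random

lam, mu, a, b = 1.3, 0.7, 0.5, 0.4
trials = 10**6
hits = 0
for _ in range(trials):
    x, y = random.expovariate(lam), random.expovariate(mu)
    if abs(x - y) > a and min(x, y) > b:
        hits += 1
formula = ((lam * math.exp(-mu * a) + mu * math.exp(-lam * a)) / (lam + mu)
           * math.exp(-(lam + mu) * b))
print(hits / trials, formula)  # both ~0.288
```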

(8.5) n people board an elevator on the ground floor of a building with k floors. Each leaves the elevator at one of the floors 1 to k, each chosen randomly and independently of the others.
(a) For $1 \le j \le k$, let $X_j$ be the Bernoulli random variable which takes value 1 exactly when the elevator stops at floor j. Show that
\[ P[X_j = 1] = 1 - \left(\frac{k-1}{k}\right)^n. \]
(b) Use part (a) to find the expected number of floors at which the elevator stops.

Solution. For part (a), note that the elevator does not stop at floor j if all n passengers choose a floor different from floor j. Since there are $k - 1$ such floors, and each passenger decides independently, we have
\[ P[X_j = 1] = 1 - P[X_j = 0] = 1 - \left(\frac{k-1}{k}\right)^n. \]
For part (b), let X denote the number of floors at which the elevator stops. It is clear that
\[ X = \sum_{j=1}^{k} X_j, \]
as each j with $X_j = 1$ contributes 1 to the sum above. Using the linearity of expectation, we have
\[ E[X] = \sum_{j=1}^{k} E[X_j] = k\left(1 - \left(\frac{k-1}{k}\right)^n\right). \]

Jacobs University Bremen


Keivan Mallahi-Karai

Due: November 30, 2015


Assignment 9

ESM3a: Advanced Linear Algebra and Stochastic Processes

(9.1) The premium of an insurance company is 180. Using statistical analysis, the insurance company has found out that claim costs have an average of 80 and a standard deviation of 50. Assume that 50 percent of the customers with an insurance policy will file a claim by the end of the year. At least how many policies should the insurance company sell in order for the probability of loss to be less than 0.001?

Solution. Let n be the number of policies sold, let $X_j$ be the amount of the claim from policy j, $1 \le j \le n$, and set $X = X_1 + \dots + X_n$. Then
\[ E[X] = 80n, \qquad \sqrt{\operatorname{Var}[X]} = 50\sqrt{n}. \]
By Chebyshev's inequality, this implies that
\[ P[X > 180n] = P[X - 80n > 100n] \le P[|X - 80n| > 100n] \le \frac{2500n}{10000n^2} = \frac{1}{4n} < \frac{1}{1000}, \]
which gives $n > 250$.
(9.2) This problem provides some mathematical justification for democratic decision-making. Suppose that a group of $2n + 1$ people is trying to reach a decision using majority vote (to avoid ties, we have assumed the number of voters is odd). There are two choices, one supposedly right and the other supposedly wrong. The voters do not know the correct choice, but we will assume that each voter has probability p of picking the right choice and probability $1 - p$ of picking the wrong choice. Moreover, suppose that the individuals vote independently. In this problem we show that if $p = \frac{1}{2} + \varepsilon$ with $\varepsilon > 0$, then the probability that the majority vote leads to the correct choice tends to 1 as n tends to infinity. For instance, if each individual is reasonable enough to make the right choice with probability 0.50001, then the majority vote will be correct with probability at least 0.99 as soon as the number of voters is large enough.
(a) For $1 \le i \le 2n + 1$, let $X_i$ denote a Bernoulli random variable with parameter $p = 1/2 + \varepsilon$ that takes value 1 exactly when voter i makes the correct decision, and let Y be the Bernoulli random variable with value 1 when the majority vote is correct. Show that $Y = 1$ if and only if $X_1 + \dots + X_{2n+1} \ge n + 1$.
(b) Use the WLLN to show that
\[ \lim_{n\to\infty} P[X_1 + \dots + X_{2n+1} \ge n + 1] = 1. \]
(c) What do you think are the shortcomings of applying this to real-world situations? This part does not have an officially correct answer, so feel free to speculate.

Solution.
(a) By definition, $Y = 1$ iff the majority vote is correct. This happens exactly when at least $n + 1$ of the voters vote correctly, which in turn is equivalent to $S_{2n+1} = X_1 + \dots + X_{2n+1} \ge n + 1$.
(b) Assume that $p = \frac{1}{2} + \varepsilon$. Let $A_n$ be the event that $Y = 0$, which is equivalent to $S_{2n+1} \le n$. On this event,
\[ \frac{S_{2n+1} - (2n+1)p}{2n+1} \le \frac{n - (2n+1)(1/2 + \varepsilon)}{2n+1} = \frac{-1/2 - (2n+1)\varepsilon}{2n+1} < -\varepsilon. \]
This shows that
\[ A_n = \{Y = 0\} \subseteq B_n := \left\{ \left|\frac{S_{2n+1}}{2n+1} - p\right| > \varepsilon \right\}. \]
By the WLLN, we have $\lim_{n\to\infty} P[B_n] = 0$. This implies that $\lim_{n\to\infty} P[A_n] = 0$.
(c) It's your turn!
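The convergence in part (b) can be watched numerically; this added sketch (p = 0.52 is an arbitrary choice of edge) estimates the probability that the majority is correct for growing electorates:

```python
import random

def majority_correct(n_voters, p, trials=20000):
    # Estimate the probability that more than half of n_voters choose correctly.
    wins = 0
    for _ in range(trials):
        correct = sum(random.random() < p for _ in range(n_voters))
        wins += correct >= n_voters // 2 + 1
    return wins / trials

p = 0.52  # epsilon = 0.02
for n_voters in (11, 101, 1001, 10001):
    print(n_voters, majority_correct(n_voters, p))
# The printed probabilities climb toward 1 as the electorate grows.
```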
(9.3) (a) Assume that $X_n$ is a discrete random variable with the uniform distribution over the set $\left\{0, \frac{1}{n}, \frac{2}{n}, \dots, \frac{n-1}{n}\right\}$, i.e.,
\[ P\left[X_n = \frac{k}{n}\right] = \frac{1}{n}, \qquad k = 0, 1, \dots, n - 1. \]
Show that the moment generating function of $X_n$ is given by
\[ M_{X_n}(t) = \frac{e^t - 1}{n\left(e^{t/n} - 1\right)}. \]
(b) Let X have a uniform distribution over [0, 1]. Show that
\[ M_X(t) = \frac{e^t - 1}{t}. \]
(c) Show that as $n \to \infty$, we have $M_{X_n}(t) \to M_X(t)$.

Solution. Using the geometric sum, we have
\[ M_{X_n}(t) = E\left[e^{tX_n}\right] = \sum_{k=0}^{n-1} \frac{1}{n}\, e^{kt/n} = \frac{e^t - 1}{n\left(e^{t/n} - 1\right)}. \]
For (b), we have
\[ M_X(t) = \int_0^1 e^{tx}\,dx = \frac{e^{tx}}{t}\Big|_0^1 = \frac{e^t - 1}{t}. \]
For (c), we use the Taylor expansion
\[ e^y = 1 + y + \frac{y^2}{2} + \dots \]
to get
\[ \lim_{n\to\infty} \frac{e^t - 1}{n\left(e^{t/n} - 1\right)} = \lim_{n\to\infty} \frac{e^t - 1}{n\left(t/n + t^2/2n^2 + \dots\right)} = \lim_{n\to\infty} \frac{e^t - 1}{t + t^2/2n + \dots} = \frac{e^t - 1}{t}. \]
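A numerical illustration of the convergence in part (c) (added; the evaluation point t = 1.5 is arbitrary):

```python
import math

def M_Xn(t, n):
    # MGF of the discrete uniform distribution on {0, 1/n, ..., (n-1)/n}.
    return (math.exp(t) - 1) / (n * (math.exp(t / n) - 1))

def M_X(t):
    # MGF of the continuous uniform distribution on [0, 1].
    return (math.exp(t) - 1) / t

t = 1.5
for n in (10, 100, 1000, 10000):
    print(n, M_Xn(t, n))  # approaches the value below as n grows
print(M_X(t))
```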

(9.4) Suppose X and Y are independent and that X and X + Y have the same distribution. Show that $P[Y = 0] = 1$.
Hint: Use the fact that X and X + Y have the same moment generating function.

Solution. We have
\[ M_X(t) = M_{X+Y}(t) = M_X(t)\,M_Y(t), \]
which implies that $M_Y(t) = 1$. Note that the constant random variable $Y_0 = 0$ satisfies $M_{Y_0}(t) = 1$. Using the uniqueness theorem, we obtain $P[Y = 0] = 1$.
(9.5) The moment generating function of a random variable X is given by
\[ M_X(t) = (1 - 2t)^{-2}. \]
Find $E[X]$ and $\operatorname{Var}[X]$.

Solution. We know that $E[X] = M_X'(0)$ and $E\left[X^2\right] = M_X''(0)$. It is easy to see that
\[ M_X'(0) = (-2)(-2)(1 - 2t)^{-3}\Big|_{t=0} = 4. \]
Similarly,
\[ M_X''(0) = (-2)(-2)(-3)(-2)(1 - 2t)^{-4}\Big|_{t=0} = 24. \]
Hence $E[X] = 4$ and $\operatorname{Var}[X] = E\left[X^2\right] - E[X]^2 = 24 - 16 = 8$.

Jacobs University Bremen


Keivan Mallahi-Karai

Due: December 7, 2015


Assignment 10

ESM3a: Advanced Linear Algebra and Stochastic Processes

(10.1) There are 100 students in a probability class. Since each lecture hall has room for only 70 students, the students have to be divided into two groups for the exam. The professor suggests the following scheme: each student uses a fair coin to decide which lecture hall to go to. Under the (not so safe!) assumption that the coin outcomes are independent, use the CLT to find the approximate value of the probability that neither of the lecture halls is overcrowded.

Solution. Let $X_1, X_2, \dots, X_n$, where $n := 100$, be independent random variables with $X_i = 1$ if student i goes to lecture hall A (say) and $X_i = 0$ if student i goes to lecture hall B. Then $S_{100} = X_1 + \dots + X_{100}$ is the number of students who choose lecture hall A, and $100 - S_{100}$ is the number of students who go to lecture hall B. We are interested in $P[30 \le S_{100} \le 70]$.
Note that $\mu = E[X_i] = \frac{1}{2}$ and $\sigma^2 = \operatorname{Var}[X_i] = \frac{1}{4}$. Hence, by the CLT,
\[ P[30 \le S_{100} \le 70] = P\left[-4 \le \frac{S_{100} - 50}{\sqrt{100\cdot\frac{1}{4}}} \le 4\right] \approx \Phi(4) - \Phi(-4) \approx 0.99994. \]
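The normal approximation can be compared with the exact binomial probability; this check is an addition:

```python
from math import comb, erf, sqrt

# Exact probability that a Binomial(100, 1/2) falls in [30, 70].
exact = sum(comb(100, k) for k in range(30, 71)) / 2**100
Phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF
print(exact, Phi(4) - Phi(-4))  # both extremely close to 1
```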

(10.2) A fair die is tossed repeatedly. Let $X_n$ be the number of distinct numbers that have been rolled in the first n rounds.
(a) Find the transition matrix of this Markov chain.
(b) Find all absorbing states of this chain.

Solution. Note that $p_{ij} = 0$ if $j < i$ or $j > i + 1$. If the number of distinct values seen so far is i, then it stays i if the next roll shows one of those i values, which happens with probability $i/6$, and it becomes $i + 1$ otherwise, with probability $(6 - i)/6$. Hence
\[ P = \begin{pmatrix} 1/6 & 5/6 & 0 & 0 & 0 & 0 \\ 0 & 2/6 & 4/6 & 0 & 0 & 0 \\ 0 & 0 & 3/6 & 3/6 & 0 & 0 \\ 0 & 0 & 0 & 4/6 & 2/6 & 0 \\ 0 & 0 & 0 & 0 & 5/6 & 1/6 \\ 0 & 0 & 0 & 0 & 0 & 1 \end{pmatrix}. \]
The only absorbing state is 6, since $p_{66} = 1$.
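The transition matrix can also be built and checked programmatically (added sketch):

```python
import numpy as np

P = np.zeros((6, 6))
for i in range(1, 7):              # state i = number of distinct values seen
    P[i - 1, i - 1] = i / 6        # next roll repeats one of the i seen values
    if i < 6:
        P[i - 1, i] = (6 - i) / 6  # next roll shows a new value
assert np.allclose(P.sum(axis=1), 1.0)  # a valid stochastic matrix
print(P[5, 5] == 1.0)  # True: state 6 is absorbing
```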

(10.3) A Markov chain is called doubly stochastic if the sum of the entries of each column of the transition matrix P is 1. Show that if such a Markov chain is ergodic, then the stationary distribution is $\pi = (1/n, 1/n, \dots, 1/n)$, where n is the number of states of the chain.

Solution. Let $\pi = \left(\frac{1}{n}, \dots, \frac{1}{n}\right)$. Since there is a unique stationary distribution, it suffices to show that $\pi P = \pi$. To see this, note that since each column adds up to 1, we have
\[ (\pi P)_j = \sum_{k=1}^{n} \pi_k P_{kj} = \frac{1}{n} \sum_{k=1}^{n} P_{kj} = \frac{1}{n}, \]
which proves the claim.

(10.4) Consider a Markov chain with the state space $S = \{1, 2\}$ and the transition probabilities given by
\[ p_{11} = p, \qquad p_{12} = 1 - p, \qquad p_{21} = 0, \qquad p_{22} = 1, \]
where $0 < p < 1$. Compute $p_{ij}^{(n)}$.

Solution. The transition matrix is clearly given by
\[ P = \begin{pmatrix} p & 1 - p \\ 0 & 1 \end{pmatrix}. \]
It is now easy to prove by induction that
\[ P^n = \begin{pmatrix} p^n & 1 - p^n \\ 0 & 1 \end{pmatrix}. \]
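A quick check of the claimed formula for $P^n$ (added; p = 0.6 and n = 8 are arbitrary):

```python
import numpy as np

p, n = 0.6, 8
P = np.array([[p, 1.0 - p],
              [0.0, 1.0]])
claimed = np.array([[p**n, 1.0 - p**n],
                    [0.0, 1.0]])
print(np.allclose(np.linalg.matrix_power(P, n), claimed))  # True
```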
