
MA533 Assignment 1

July 20, 2017

Instructions.

1. The assignments are not compulsory, but you are encouraged to solve them to improve
your understanding.

2. If you want me to look at your answers, submit them to me in writing. Proofs
should be reasonably formal, but no longer than one page (i.e. two sides). Try to write
clearly and concisely. In exams, marks will be awarded for how clearly the proof is written.

3. Try to solve the problems on your own.

4. Even if you do discuss with others, you should write the proof down by yourself.

1. Assume that Ω is the space of permutations on n letters, with the uniform measure.
For σ ∈ Sₙ, let Y(σ) denote the number of fixed points of σ, i.e. Y(σ) = |{i | σ(i) = i}|.
Using linearity of expectation, compute the expected value EY. Using the same idea, can
you compute E[Y²]?
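The two expectations in Problem 1 can be sanity-checked numerically before you prove them. The sketch below (function names are my own, not part of the assignment) enumerates all permutations for a small n and averages Y and Y²; it only verifies the values, it is not a proof.

```python
import math
from itertools import permutations

def num_fixed_points(perm):
    """Y(sigma): count indices i with sigma(i) = i."""
    return sum(1 for i, p in enumerate(perm) if p == i)

def exact_moments(n):
    """Exact E[Y] and E[Y^2] by averaging over all n! permutations (small n only)."""
    total = math.factorial(n)
    ey = sum(num_fixed_points(p) for p in permutations(range(n))) / total
    ey2 = sum(num_fixed_points(p) ** 2 for p in permutations(range(n))) / total
    return ey, ey2

ey, ey2 = exact_moments(6)
print(ey, ey2)  # 1.0 2.0 -- consistent with E[Y] = 1 and E[Y^2] = 2 (for n >= 2)
```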

2. (From class) Let E₁, …, Eₙ be events in a finite probability space Ω. Show that E₁, …, Eₙ
are mutually independent if and only if the following condition holds:
For all A₁, …, Aₙ with Aᵢ ∈ {Eᵢ, Eᵢᶜ}, P[A₁ ∩ ⋯ ∩ Aₙ] = P[A₁] ⋯ P[Aₙ].
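As a concrete illustration of the condition in Problem 2 (the setup below is my own example, not part of the assignment), take three fair coin flips and let Eᵢ be the event that flip i comes up heads; these events are mutually independent, and the check confirms that all eight choices of Aᵢ ∈ {Eᵢ, Eᵢᶜ} factor:

```python
import itertools
from fractions import Fraction

# Sample space: three fair coin flips, uniform measure.
omega = list(itertools.product((0, 1), repeat=3))

def prob(A):
    return Fraction(len(A), len(omega))

# E_i = "flip i came up heads"; these events are mutually independent.
E = [frozenset(w for w in omega if w[i] == 1) for i in range(3)]

def comp(A):
    return frozenset(omega) - A

# Verify the factorization for every choice A_i in {E_i, E_i^c}.
ok = True
for signs in itertools.product((False, True), repeat=3):
    As = [comp(E[i]) if flip else E[i] for i, flip in enumerate(signs)]
    lhs = prob(frozenset.intersection(*As))
    rhs = prob(As[0]) * prob(As[1]) * prob(As[2])
    ok = ok and (lhs == rhs)
print(ok)  # True: all 8 sign patterns factor
```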

3. (From class) Let E₁, …, Eₙ be events in a finite probability space Ω. Show that E₁, …, Eₙ
are mutually independent if and only if the indicator random variables 1_{E₁}, …, 1_{Eₙ} are
mutually independent.

4. (From class) Let X₁, …, Xₙ be mutually independent random variables in a finite prob-
ability space Ω. Show that for any A₁, …, Aₙ ⊆ ℝ, we have

P[X₁ ∈ A₁, …, Xₙ ∈ Aₙ] = P[X₁ ∈ A₁] ⋯ P[Xₙ ∈ Aₙ].

5. (From class) Let X₁, Y₁, …, Xₙ, Yₙ be mutually independent random variables in a finite
probability space Ω. For each i ∈ [n], let Zᵢ be the random variable fᵢ(Xᵢ, Yᵢ), where
fᵢ : ℝ × ℝ → ℝ is any function.
Show that Z₁, …, Zₙ are mutually independent.

6. Let X, Y be random variables that only take values in the set {−1, 1}. Show that the
following are equivalent:

(a) P[X = 1] = P[X = −1] = 1/2, P[Y = 1] = P[Y = −1] = 1/2, and X, Y are mutually
independent.
(b) EX = EY = E(XY) = 0.

Can you formulate and prove a version of this for n random variables X₁, …, Xₙ taking
values only in {−1, 1}?
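Before proving the equivalence in Problem 6, it can be stress-tested numerically. The sketch below (entirely my own construction, with hypothetical names) checks the two conditions, within floating-point tolerance, on many joint distributions over {−1, 1}², including the degenerate case X = Y where EX = EY = 0 but E(XY) = 1:

```python
import itertools
import random

vals = (-1, 1)
pairs = list(itertools.product(vals, vals))  # support of (X, Y)
TOL = 1e-9

def cond_a(p):
    """Uniform marginals and independence, up to tolerance. p maps (x, y) -> prob."""
    px = {x: sum(p[(x, y)] for y in vals) for x in vals}  # marginal of X
    py = {y: sum(p[(x, y)] for x in vals) for y in vals}  # marginal of Y
    uniform = all(abs(m[v] - 0.5) < TOL for m in (px, py) for v in vals)
    indep = all(abs(p[(x, y)] - px[x] * py[y]) < TOL for (x, y) in pairs)
    return uniform and indep

def cond_b(p):
    """EX = EY = E(XY) = 0, up to tolerance."""
    ex = sum(x * p[(x, y)] for (x, y) in pairs)
    ey = sum(y * p[(x, y)] for (x, y) in pairs)
    exy = sum(x * y * p[(x, y)] for (x, y) in pairs)
    return max(abs(ex), abs(ey), abs(exy)) < TOL

def random_pmf(rng):
    w = [rng.random() for _ in pairs]
    s = sum(w)
    return {pq: wi / s for pq, wi in zip(pairs, w)}

rng = random.Random(0)
tests = [random_pmf(rng) for _ in range(1000)]
tests.append({pq: 0.25 for pq in pairs})                                # uniform, independent
tests.append({(-1, -1): 0.5, (1, 1): 0.5, (-1, 1): 0.0, (1, -1): 0.0})  # X = Y
agree = all(cond_a(p) == cond_b(p) for p in tests)
print(agree)  # True
```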
