
18.S096 Problem Set 2, Fall 2013

Probability Theory and Stochastic Process

Due date: 10/3/2013


Collaboration on homework is encouraged, but you should think through the problems yourself
before discussing them with other people. You must write your solution in your own
words. Make sure to list all your collaborators.

Part A
Part A has problems that straightforwardly follow from the definition. Use this part as an
opportunity to get used to the concepts and definitions.

Problem A-1. (Exponential distribution) A continuous random variable X is said to have
an exponential distribution with parameter λ if its probability density function is given by

f_X(x) = λ e^{−λx}   (x ≥ 0).

(a) Compute the moment generating function of X.


(b) Compute the mean and variance of the exponential distribution using the moment generating
function.
(c) For a fixed real t > 0, compute P(X > t).
(d) Prove the memoryless property:

P(X > s + t | X > s) = P(X > t)   ∀ t, s > 0.

(e) Let X_1, X_2, · · · , X_n be i.i.d. exponential random variables with parameter λ. Prove that
min{X_1, X_2, · · · , X_n} has an exponential distribution.
(f) When you enter the bank, you find that all three tellers are busy serving other customers,
and there are no other customers in queue. Assume that the service times for you and for
each of the customers being served are independent identically distributed exponential random
variables. What is the probability that you will be the last to leave among the four customers?
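
A quick numerical sanity check (not part of the assignment) can help build intuition for parts (d) and (e) before writing the proofs. Below is a minimal Python sketch assuming numpy is available; the rate lam = 0.5 and the thresholds s, t are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    lam, n = 0.5, 10**6
    x = rng.exponential(scale=1/lam, size=n)     # Exp(lambda) samples

    # (d) memoryless property: P(X > s+t | X > s) should match P(X > t)
    s, t = 1.0, 2.0
    print(np.mean(x[x > s] > s + t), np.mean(x > t), np.exp(-lam*t))

    # (e) the minimum of k i.i.d. Exp(lambda) variables should behave like Exp(k*lambda)
    k = 3
    mins = rng.exponential(scale=1/lam, size=(n, k)).min(axis=1)
    print(mins.mean(), 1/(k*lam))                # sample mean vs 1/(k*lambda)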

Problem A-2. (Poisson distribution) A discrete random variable X is said to have a Poisson
distribution with parameter λ if its probability mass function is given by

f_X(k) = λ^k e^{−λ} / k!   (k = 0, 1, 2, · · ·).
(a) Compute the moment generating function of X.
(b) Compute the mean and variance of the Poisson distribution using the moment generating
function.
(c) Prove that the sum of independent Poisson random variables has a Poisson distribution.
(Hint: use the moment generating function.)
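
As with Problem A-1, a short simulation (not required for the solution) makes part (c) concrete: the empirical distribution of a sum of independent Poisson samples should match a single Poisson with the summed parameter. This is a minimal sketch assuming numpy; the parameters lam1, lam2 and the value t are arbitrary.

    import numpy as np

    rng = np.random.default_rng(1)
    lam1, lam2, n = 2.0, 3.0, 10**6
    s = rng.poisson(lam1, n) + rng.poisson(lam2, n)   # sum of independent Poissons
    ref = rng.poisson(lam1 + lam2, n)                 # Poisson(lam1 + lam2)

    # compare empirical pmfs on a few values of k
    for k in range(8):
        print(k, np.mean(s == k), np.mean(ref == k))

    # the moment generating functions should agree as well
    t = 0.3
    print(np.mean(np.exp(t*s)), np.exp((lam1 + lam2)*(np.exp(t) - 1)))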

Problem A-3. Determine which of the following processes are Markov processes.

(a) Simple random walk.
(b) The process X_t = |S_t|, where S_t is a simple random walk.

(c) X_0 = 0 and X_{t+1} = X_t + (Y_t − 1/λ) for t ≥ 0, where Y_t are i.i.d. random variables with
exponential distribution of parameter λ.
(d) X_0 = 0 and X_{t+1} = X_t + Z_t Z_{t−1} · · · Z_0 for t ≥ 0, where Z_t are i.i.d. random variables with
log-normal distribution.
(e) X_0 = 0 and X_{t+1} = X_t + W_t + W_{t−1} + · · · + W_0 for t ≥ 0, where W_t are i.i.d. random
variables with normal distribution.

Problem A-4. (a) Construct an example of a finite Markov chain with non-unique stationary
distributions.
(b) A Markov chain with transition probabilities p_{ij} is called doubly stochastic if each column of
the transition matrix also sums to one (the rows of a transition matrix always sum to one), i.e.,

∑_{i=1}^{m} p_{ij} = 1,   j = 1, 2, · · · , m.

Prove that π_j = 1/m gives the stationary distribution of a doubly stochastic process over a
state space S of size m.
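
The claim in (b) is easy to check numerically for a small example before proving it in general. The 3x3 matrix below is an arbitrary doubly stochastic matrix chosen for illustration; this is a sketch assuming numpy, not the proof.

    import numpy as np

    # an arbitrary 3x3 doubly stochastic matrix: rows and columns both sum to one
    P = np.array([[0.5, 0.3, 0.2],
                  [0.2, 0.5, 0.3],
                  [0.3, 0.2, 0.5]])
    print(P.sum(axis=1), P.sum(axis=0))   # both should print all ones

    pi = np.full(3, 1/3)                  # candidate stationary distribution pi_j = 1/m
    print(pi @ P)                         # should again be [1/3, 1/3, 1/3]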

Part B
Part B has more elaborate problems. Many of the problems in Part B cover important topics
that we did not have enough time to cover in lecture. Thus understanding the content is as
important as solving the problem. Try to think through the content of the problem while
solving it.

Problem B-1. Let X_1, X_2, · · · be i.i.d. exponential random variables with parameter λ, and
define τ = max{t ≥ 0 : X_1 + · · · + X_t ≤ 1}. Hence τ measures the number of times an event
with 'exponential waiting time' occurred during an interval of length 1.
(a) Compute P(τ = 0).
(b) Compute P(τ = n) using the distribution of S_n = X_1 + X_2 + · · · + X_n, whose cumulative
distribution function is

F_{S_n}(x) = 1 − ∑_{k=0}^{n−1} (1/k!) e^{−λx} (λx)^k   (x > 0).

Conclude that τ has a Poisson distribution.
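
The conclusion can also be checked by simulation: count how many exponential waiting times fit into an interval of length 1 and compare the empirical distribution with the Poisson(λ) pmf. This is a minimal sketch assuming numpy; lam = 4.0 and the number of trials are arbitrary choices.

    import numpy as np
    from math import exp, factorial

    rng = np.random.default_rng(2)
    lam, trials = 4.0, 10**5

    def tau(rng, lam):
        # number of exponential waiting times that fit into an interval of length 1
        total, count = 0.0, 0
        while True:
            total += rng.exponential(1/lam)
            if total > 1:
                return count
            count += 1

    samples = np.array([tau(rng, lam) for _ in range(trials)])
    for k in range(8):
        print(k, np.mean(samples == k), exp(-lam) * lam**k / factorial(k))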

Problem B-2. (a) Verify that the probability density function of the normal distribution

φ(x) = (1/(σ√(2π))) e^{−(x−µ)^2/(2σ^2)}   (−∞ < x < ∞)

is indeed a probability density function, i.e. ∫_{−∞}^{∞} φ(x) dx = 1 and φ(x) ≥ 0 for all
−∞ < x < ∞. (Hint: compute (∫_{−∞}^{∞} φ(x) dx)^2 using polar coordinates.)
(b) Compute the expectation and variance of the log-normal distribution. (Hint: it suffices to
compute the case when µ = 0. You can use part (a) in order to simplify the computation.)
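
Part (a) can be sanity-checked numerically before attempting the polar-coordinates argument, and the same sketch gives Monte Carlo estimates of the log-normal mean and variance to compare against the answer to (b). This assumes numpy and scipy; the values mu = 0.3 and sigma = 1.7 are arbitrary.

    import numpy as np
    from scipy.integrate import quad

    mu, sigma = 0.3, 1.7
    phi = lambda x: np.exp(-(x - mu)**2 / (2*sigma**2)) / (sigma * np.sqrt(2*np.pi))
    print(quad(phi, -np.inf, np.inf)[0])      # numerical integral, should be close to 1

    # Monte Carlo estimates of the log-normal expectation and variance
    rng = np.random.default_rng(3)
    y = np.exp(rng.normal(mu, sigma, 10**6))  # log-normal samples
    print(y.mean(), y.var())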

Problem B-3. A family of pdfs or pmfs is called an exponential family with parameter θ (θ
is a vector) if it can be expressed as

f(x|θ) = h(x) c(θ) exp( ∑_{i=1}^{k} w_i(θ) t_i(x) ),

where c(θ) ≥ 0 and w_1(θ), · · · , w_k(θ) are real-valued functions of θ, and h(x) ≥ 0 and
t_1(x), · · · , t_k(x) are real-valued functions that do not depend on θ.
Prove that the log-normal distribution, the Poisson distribution, and the exponential distribution
are exponential families.

Problem B-4. It is widely believed that investing in the stock market over a long period of
time reduces the risk. The following example, taken from M. Kritzman, Puzzles of Finance,
Chapter 3 (see Kritzman and Rich, Beware of Dogma: The Truth about Time Diversification),
argues against this idea.

Consider a theoretical stock whose annual return has a log-normal distribution with parameters
µ = ln(1.1) and σ = ln(1.2). Assume that the return of each year is independent of the other
years. For this theoretical stock, the fraction of wealth lost with 0.1% chance when invested
over T years is 40.5%, 58.43%, 61.48%, and 73.92% for T = 1, 5, 10, 20, respectively.
The author then concludes that:

These results reveal that if risk is construed as annualized variability, then time
diminishes risk. ... However, if the magnitude of potential loss defines risk, then
it increases with time.

(a) Compute the fraction of wealth lost with 0.1% chance when invested over T years, as a
function h(T) of T (the function may involve the cumulative distribution function of the normal
distribution).
(b) Find the value of T that maximizes the function h(T) computed in (a). Is it true that
h(T) is an increasing function of time?
(c) Criticize the argument above, using the computations of steps (a) and (b).
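
For part (a), a Monte Carlo estimate of the 0.1% worst cumulative return is a useful cross-check on the closed-form answer. The sketch below assumes numpy; the sample size is an arbitrary choice, and the resulting percentages need not match the quoted figures exactly, since those depend on the precise convention used in the source.

    import numpy as np

    rng = np.random.default_rng(4)
    mu, sigma, n = np.log(1.1), np.log(1.2), 2 * 10**5

    for T in (1, 5, 10, 20):
        # cumulative growth over T independent years, each with log-normal return
        log_growth = rng.normal(mu, sigma, (n, T)).sum(axis=1)
        q = np.quantile(np.exp(log_growth), 0.001)   # 0.1% worst outcome
        print(T, 1 - q)                              # fraction of wealth lost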

Problem B-5. A person walks along a straight line and, at each time period, takes a step
to the right with probability b, and a step to the left with probability 1 − b. The person
starts in one of the positions 1, 2, · · · , m, but if he reaches position 0 (or position m + 1), his
step is instantly reflected back to position 1 (or position m, respectively). Equivalently, we
may assume that when the person is in position 1 or m, he will stay in that position with
probability 1 − b or b, respectively.
(a) Model this problem as a Markov chain and describe the transition matrix.
(b) Find the stationary distribution.
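
Part (b) can be checked numerically for small m: build the transition matrix from part (a) and iterate it until the distribution stabilizes. This is a sketch assuming numpy; m = 5 and b = 0.7 are arbitrary choices, and the iteration only confirms a candidate answer rather than proving it.

    import numpy as np

    def transition_matrix(m, b):
        # states 1..m (indexed 0..m-1): step right w.p. b, left w.p. 1-b,
        # with the reflection at the boundaries described in the problem
        P = np.zeros((m, m))
        for i in range(m):
            right = i + 1 if i + 1 < m else i   # at position m, stay with prob. b
            left = i - 1 if i - 1 >= 0 else i   # at position 1, stay with prob. 1-b
            P[i, right] += b
            P[i, left] += 1 - b
        return P

    m, b = 5, 0.7
    P = transition_matrix(m, b)
    pi = np.full(m, 1/m)
    for _ in range(10**4):                      # power iteration toward the stationary law
        pi = pi @ P
    print(pi)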

MIT OpenCourseWare
http://ocw.mit.edu

18.S096 Mathematical Applications in Financial Industry


Fall 2013

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
