The document provides solutions to a series of problems related to stochastic processes, including calculations of joint and marginal probability distributions, conditional expectations, and Markov chains. It details the derivation of probability density functions and expectations based on given conditions and examples. Additionally, it discusses the application of the Chapman-Kolmogorov equation in analyzing a Markov chain with defined states and transition probabilities.

Uploaded by

xiangrong li
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
5 views6 pages

Solution 1

The document provides solutions to a series of problems related to stochastic processes, including calculations of joint and marginal probability distributions, conditional expectations, and Markov chains. It details the derivation of probability density functions and expectations based on given conditions and examples. Additionally, it discusses the application of the Chapman-Kolmogorov equation in analyzing a Markov chain with defined states and transition probabilities.

Uploaded by

xiangrong li
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 6

Stochastic Processes

Solutions to Example Sheet 1

1. (a) The joint distribution function is

        F(a, b) = P(X ≤ a, Y ≤ b) = Σ_{x ≤ a, y ≤ b} p(x, y).

   (b) The marginal p.d.f.s are

        pX(x) = Σ_{y ∈ RY} p(x, y),        pY(y) = Σ_{x ∈ RX} p(x, y).

   (c) The conditional p.d.f. is

        pX|Y(x|y) = P(X = x | Y = y)
                  = P(X = x, Y = y) / P(Y = y)
                  = p(x, y) / pY(y)
                  = p(x, y) / Σ_{a ∈ RX} p(a, y).

   (d) The conditional expectation given Y = y is

        E(X | Y = y) = Σ_{x ∈ RX} x pX|Y(x|y)
                     = Σ_{x ∈ RX} x p(x, y) / Σ_{a ∈ RX} p(a, y).

   (e) The conditional expectation E[X|Y] is a random variable such that

        E[X|Y](ω) = E[X | Y = y] = Σ_{x ∈ RX} x p(x, y) / Σ_{a ∈ RX} p(a, y)

      if Y(ω) = y.
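The formulas in Question 1 translate directly into code. Below is a minimal Python sketch, assuming a joint p.m.f. stored as a dictionary mapping pairs (x, y) to probabilities; the function names are illustrative, not from the text.

```python
def marginal_X(p):
    """pX(x) = sum over y of p(x, y)."""
    pX = {}
    for (x, y), prob in p.items():
        pX[x] = pX.get(x, 0.0) + prob
    return pX

def conditional_X_given_Y(p, y):
    """pX|Y(x|y) = p(x, y) / sum over a of p(a, y)."""
    pY_y = sum(prob for (a, b), prob in p.items() if b == y)
    return {x: prob / pY_y for (x, b), prob in p.items() if b == y}

def cond_expectation(p, y):
    """E[X | Y = y] = sum over x of x * pX|Y(x|y)."""
    return sum(x * q for x, q in conditional_X_given_Y(p, y).items())

# Toy joint table as a quick check: pX|Y(1|1) = 1/3, pX|Y(2|1) = 2/3,
# so E[X | Y = 1] = 1/3 + 4/3 = 5/3.
p_toy = {(1, 1): 0.25, (1, 2): 0.25, (2, 1): 0.5}
assert abs(cond_expectation(p_toy, 1) - 5/3) < 1e-9
```

The dictionary representation mirrors the summation formulas one-for-one: each sum over a range RX or RY becomes a sum over the keys of the table.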
2. Following the formulae given in Question 1,

(a)
        F(1, 1) = p(1, 1) = 1/9,
        F(1, 2) = p(1, 1) + p(1, 2) = 1/9 + 1/9 = 2/9,
        F(3, 3) = Σ_{x, y ≤ 3} p(x, y) = 1.

(b) The p.d.f. of X is

        pX(1) = p(1, 1) + p(1, 2) + p(1, 3) = 1/9 + 1/9 + 0 = 2/9,
        pX(2) = p(2, 1) + p(2, 2) + p(2, 3) = 1/3 + 0 + 1/6 = 1/2,
        pX(3) = p(3, 1) + p(3, 2) + p(3, 3) = 1/9 + 1/18 + 1/9 = 5/18.

    Similarly the p.d.f. of Y is

        pY(1) = p(1, 1) + p(2, 1) + p(3, 1) = 1/9 + 1/3 + 1/9 = 5/9,
        pY(2) = p(1, 2) + p(2, 2) + p(3, 2) = 1/9 + 0 + 1/18 = 1/6,
        pY(3) = p(1, 3) + p(2, 3) + p(3, 3) = 0 + 1/6 + 1/9 = 5/18.

(c) Using the results obtained in (b), the conditional probability density function is given by

        pX|Y(1|1) = p(1, 1)/pY(1) = (1/9)/(5/9) = 1/5,
        pX|Y(2|1) = p(2, 1)/pY(1) = (1/3)/(5/9) = 3/5,
        pX|Y(3|1) = p(3, 1)/pY(1) = (1/9)/(5/9) = 1/5,
        pX|Y(1|2) = p(1, 2)/pY(2) = (1/9)/(1/6) = 2/3,
        pX|Y(2|2) = p(2, 2)/pY(2) = 0/(1/6) = 0,
        pX|Y(3|2) = p(3, 2)/pY(2) = (1/18)/(1/6) = 1/3,
        pX|Y(1|3) = p(1, 3)/pY(3) = 0/(5/18) = 0,
        pX|Y(2|3) = p(2, 3)/pY(3) = (1/6)/(5/18) = 3/5,
        pX|Y(3|3) = p(3, 3)/pY(3) = (1/9)/(5/18) = 2/5.

(d) Using the conditional probability density function calculated in (c) and the definition of conditional expectation, we have

        E[X | Y = 1] = Σ_{x=1}^{3} x pX|Y(x|1) = 1 × 1/5 + 2 × 3/5 + 3 × 1/5 = 10/5 = 2,
        E[X | Y = 2] = Σ_{x=1}^{3} x pX|Y(x|2) = 1 × 2/3 + 2 × 0 + 3 × 1/3 = 5/3,
        E[X | Y = 3] = Σ_{x=1}^{3} x pX|Y(x|3) = 1 × 0 + 2 × 3/5 + 3 × 2/5 = 12/5.
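The arithmetic in parts (a)-(d) can be checked mechanically. A short Python sketch using exact `Fraction` arithmetic, with the joint p.m.f. reconstructed from the sums above:

```python
from fractions import Fraction as F

# Joint p.m.f. of Question 2, keys are (x, y).
p = {(1, 1): F(1, 9), (1, 2): F(1, 9),  (1, 3): F(0),
     (2, 1): F(1, 3), (2, 2): F(0),     (2, 3): F(1, 6),
     (3, 1): F(1, 9), (3, 2): F(1, 18), (3, 3): F(1, 9)}

# (a) F(3, 3) sums the whole table.
assert sum(p.values()) == 1

# (b) Marginals agree with the values computed above.
pX = {x: sum(p[(x, y)] for y in (1, 2, 3)) for x in (1, 2, 3)}
pY = {y: sum(p[(x, y)] for x in (1, 2, 3)) for y in (1, 2, 3)}
assert pX == {1: F(2, 9), 2: F(1, 2), 3: F(5, 18)}
assert pY == {1: F(5, 9), 2: F(1, 6), 3: F(5, 18)}

# (d) Conditional expectations E[X | Y = y].
EX_given = {y: sum(x * p[(x, y)] for x in (1, 2, 3)) / pY[y]
            for y in (1, 2, 3)}
assert EX_given == {1: F(2), 2: F(5, 3), 3: F(12, 5)}
```

Exact rationals avoid the rounding noise that floating point would introduce when comparing against fractions like 5/18.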

3. By the definition of conditional expectation and the calculations in 2(d),

        E[X|Y](ω) = 2      if Y(ω) = 1,
                    5/3    if Y(ω) = 2,
                    12/5   if Y(ω) = 3.

   So by the definition of expectation,

        E[E[X|Y]] = 2 × P(Y = 1) + 5/3 × P(Y = 2) + 12/5 × P(Y = 3)
                  = 2 × 5/9 + 5/3 × 1/6 + 12/5 × 5/18
                  = 37/18.

   It is easy to see from 2(b) that

        E(X) = Σ_{x=1}^{3} x pX(x) = 1 × 2/9 + 2 × 1/2 + 3 × 5/18 = 37/18.

   The above calculations show that E[E[X|Y]] = E(X).
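The identity verified here is an instance of the tower property. A short exact-arithmetic check of the numbers above, reusing the marginals and conditional expectations from Question 2:

```python
from fractions import Fraction as F

pY = {1: F(5, 9), 2: F(1, 6), 3: F(5, 18)}
EX_given = {1: F(2), 2: F(5, 3), 3: F(12, 5)}

# E[E[X|Y]] = sum over y of E[X | Y = y] * P(Y = y)
tower = sum(EX_given[y] * pY[y] for y in pY)

# E(X) computed directly from the marginal of X
pX = {1: F(2, 9), 2: F(1, 2), 3: F(5, 18)}
EX = sum(x * pX[x] for x in pX)

assert tower == EX == F(37, 18)
```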

4. Since Y is a constant function, Y = y0 for some constant y0; let S be the sample space. Then {Y = y0} = S, so for x ∈ RX

        P(X = x | Y = y0) = P(X = x | S) = P(X = x).

   Therefore for any ω ∈ S,

        E[X|Y](ω) = E[X | Y = y0]
                  = Σ_{x ∈ RX} x P(X = x | Y = y0)
                  = Σ_{x ∈ RX} x P(X = x)
                  = E(X).
5. We need 2³ = 8 states to analyse the system using a Markov chain. Each state records the weather on three consecutive days; for example,

        RNN = it rained on the first day,
              it did not rain on the second day,
              and it did not rain on the third day.

   Define the 8 states as follows:

        0 = RRR,
        1 = RRN,
        2 = RNR,
        3 = RNN,
        4 = NRR,
        5 = NRN,
        6 = NNR,
        7 = NNN,

   and the transition probabilities

        Pij = P[Xn+1 = j | Xn = i],    i, j ∈ {0, 1, 2, ..., 7}.

   Then Xn is a Markov chain.

6. We use the notation of Question 5. Careful calculations lead to

        [ 0.8  0.2  0    0    0    0    0    0   ]
        [ 0    0    0.4  0.6  0    0    0    0   ]
        [ 0    0    0    0    0.6  0.4  0    0   ]
    P = [ 0    0    0    0    0    0    0.4  0.6 ]
        [ 0.6  0.4  0    0    0    0    0    0   ]
        [ 0    0    0.4  0.6  0    0    0    0   ]
        [ 0    0    0    0    0.6  0.4  0    0   ]
        [ 0    0    0    0    0    0    0.2  0.8 ]
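A structural sanity check of this matrix can be automated. Under the (assumed, purely for checking) encoding R = 0, N = 1, each state index reads as a binary triple (RRR = 0, ..., NNN = 7), and shifting the three-day window means a transition from state i can only reach column 2i mod 8 (rain on the new day) or 2i mod 8 + 1 (no rain). A Python sketch:

```python
# Transition matrix P from Question 6, rows/columns indexed 0..7.
P = [
    [0.8, 0.2, 0,   0,   0,   0,   0,   0  ],
    [0,   0,   0.4, 0.6, 0,   0,   0,   0  ],
    [0,   0,   0,   0,   0.6, 0.4, 0,   0  ],
    [0,   0,   0,   0,   0,   0,   0.4, 0.6],
    [0.6, 0.4, 0,   0,   0,   0,   0,   0  ],
    [0,   0,   0.4, 0.6, 0,   0,   0,   0  ],
    [0,   0,   0,   0,   0.6, 0.4, 0,   0  ],
    [0,   0,   0,   0,   0,   0,   0.2, 0.8],
]

for i, row in enumerate(P):
    # Each row must be a probability distribution.
    assert abs(sum(row) - 1.0) < 1e-12
    for j, q in enumerate(row):
        # Nonzero entries may appear only at the two legal successor states.
        if q > 0:
            assert j in (2 * i % 8, 2 * i % 8 + 1)
```

The check confirms that every row is stochastic and that the sparsity pattern matches the sliding three-day window.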

7. It is obvious that P^(1) = P. Assume that for some k ≥ 1,

        P^(k) = [ 1/2 + (2p − 1)^k / 2    1/2 − (2p − 1)^k / 2 ]
                [ 1/2 − (2p − 1)^k / 2    1/2 + (2p − 1)^k / 2 ].

   Then the Chapman–Kolmogorov equation leads to

        P^(k+1) = P^(k) P^(1)
                = [ 1/2 + (2p − 1)^k / 2    1/2 − (2p − 1)^k / 2 ] [ p        1 − p ]
                  [ 1/2 − (2p − 1)^k / 2    1/2 + (2p − 1)^k / 2 ] [ 1 − p    p     ].

   For instance, the (1, 1) entry is

        (1/2 + (2p − 1)^k / 2) p + (1/2 − (2p − 1)^k / 2)(1 − p)
            = 1/2 + (2p − 1)^k (p − (1 − p)) / 2
            = 1/2 + (2p − 1)^{k+1} / 2,

   and the other three entries are computed similarly, giving

        P^(k+1) = [ 1/2 + (2p − 1)^{k+1} / 2    1/2 − (2p − 1)^{k+1} / 2 ]
                  [ 1/2 − (2p − 1)^{k+1} / 2    1/2 + (2p − 1)^{k+1} / 2 ].

   By the induction principle, we have for any n ≥ 1

        P^(n) = [ 1/2 + (2p − 1)^n / 2    1/2 − (2p − 1)^n / 2 ]
                [ 1/2 − (2p − 1)^n / 2    1/2 + (2p − 1)^n / 2 ].
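The closed form can be checked against direct matrix multiplication. A Python sketch for an arbitrarily chosen p = 0.7 (any 0 < p < 1 would serve equally well):

```python
p = 0.7
P = [[p, 1 - p], [1 - p, p]]

def matmul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

Pn = [[1.0, 0.0], [0.0, 1.0]]   # identity = P^0
for n in range(1, 11):
    Pn = matmul(Pn, P)          # Pn is now P^n
    closed = [[0.5 + 0.5 * (2*p - 1)**n, 0.5 - 0.5 * (2*p - 1)**n],
              [0.5 - 0.5 * (2*p - 1)**n, 0.5 + 0.5 * (2*p - 1)**n]]
    assert all(abs(Pn[i][j] - closed[i][j]) < 1e-12
               for i in range(2) for j in range(2))
```

Since |2p − 1| < 1, the check also illustrates convergence of P^(n) to the matrix with all entries 1/2 as n grows.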

DUAN/NKU
