final_practice_2023
Thus for z ≥ 0, X | {Z = z} ∼ Unif[0, z]. We have W = X − Y = 2X − Z.
Therefore,
F_{W|Z}(w|z) = P{W ≤ w | Z = z}
             = P{2X − Z ≤ w | Z = z}
             = P{X ≤ (z + w)/2 | Z = z}
             = { 0,             if w < −z,
                 (z + w)/(2z),  if −z ≤ w ≤ z,
                 1,             if w > z.
Thus,
f_{W|Z}(w|z) = { 1/(2z),  if |w| ≤ z,
                 0,       otherwise,
which leads us to conclude that
f_{Z,W}(z, w) = f_{W|Z}(w|z) f_Z(z) = { (1/2) e^{−z},  if |w| ≤ z,
                                        0,             otherwise.
(c) We have
E[Z|X] = E[X + Y | X]
       = X + E[Y |X]
       = X + E[Y ]
       = X + 1,
and, by the symmetry of X and Y,
E[X|Z] = Z/2.
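The setup of this problem is only partially visible in this excerpt; the computations above are consistent with X and Y being i.i.d. Exp(1), Z = X + Y, and W = X − Y. Under that assumption, a minimal Monte Carlo sanity check of the two conditional expectations:

```python
import numpy as np

# Monte Carlo sanity check, assuming X, Y i.i.d. Exp(1), Z = X + Y.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.exponential(1.0, n)
y = rng.exponential(1.0, n)
z = x + y

# E[Z | X = x0] should be close to x0 + 1 (check in a thin slice around x0).
x0 = 1.5
print(z[np.abs(x - x0) < 0.05].mean(), x0 + 1.0)   # both ~2.5

# E[X | Z = z0] should be close to z0 / 2.
z0 = 3.0
print(x[np.abs(z - z0) < 0.05].mean(), z0 / 2.0)   # both ~1.5
```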
2. Let X ∼ Exp(1) and Y = min{X, 1}.
(a) Find E[Y ].
(b) Find the minimum MSE of estimating X given Y.
Solution:
(a) We have
E[Y ] = E[min{X, 1}]
      = ∫_0^∞ min{x, 1} e^{−x} dx
      = ∫_0^1 x e^{−x} dx + ∫_1^∞ e^{−x} dx
      = [−x e^{−x} − e^{−x}]_0^1 + e^{−1}
      = 1 − e^{−1}.
(b) Given Y = y < 1 we have X = y, so Var(X | Y = y) = 0 for y < 1. Given Y = 1,
the memoryless property implies X − 1 | {Y = 1} ∼ Exp(1), so Var(X | Y = 1) = 1
and P{Y = 1} = P{X ≥ 1} = e^{−1}. We therefore have
MSE = E[Var(X|Y )]
    = Var(X | Y = 1) P{Y = 1}
    = e^{−1}.
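Both answers are easy to sanity-check numerically. The sketch below assumes X ∼ Exp(1) and Y = min{X, 1} as above; the MMSE estimate E[X|Y ] equals Y on {Y < 1} and 1 + E[Exp(1)] = 2 on {Y = 1} by memorylessness:

```python
import numpy as np

# Check E[Y] = 1 - e^{-1} and MSE = e^{-1} for X ~ Exp(1), Y = min{X, 1}.
rng = np.random.default_rng(0)
x = rng.exponential(1.0, 1_000_000)
y = np.minimum(x, 1.0)
print(y.mean(), 1 - np.exp(-1))               # both ~0.632

# MMSE estimate: E[X|Y] = Y on {Y < 1}, and 1 + E[Exp(1)] = 2 on {Y = 1}.
xhat = np.where(y < 1.0, y, 2.0)
print(np.mean((x - xhat) ** 2), np.exp(-1))   # both ~0.368
```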
3. Sampled Wiener process. Let {W (t), t ≥ 0} be the standard Brownian motion. For
n = 1, 2, . . . , let
X_n = n · W(1/n).
(a) Find the mean and autocorrelation functions of {Xn }.
(b) Is {Xn } WSS? Justify your answer.
(c) Is {Xn } Markov? Justify your answer.
(d) Is {Xn } independent increment? Justify your answer.
(e) Is {Xn } Gaussian? Justify your answer.
(f) For n = 1, 2, . . . , let S_n = X_n/n. Find the limit
lim_{n→∞} S_n
in probability.
Solution:
(a) We have
E[X_n] = n E[W(1/n)] = 0.
For m, n ∈ N and m ≥ n, we have
E[X_m X_n] = mn E[W(1/m) W(1/n)]
           = mn · min{1/m, 1/n}
           = mn · (1/m)
           = n.
Thus in general,
E[X_m X_n] = min{m, n}.
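A short simulation can confirm E[X_m X_n] = min{m, n}; the sketch below picks the arbitrary pair m = 5, n = 2 and builds W(1/n) from W(1/m) plus an independent increment:

```python
import numpy as np

# Check E[X_m X_n] = min{m, n} for X_n = n * W(1/n), with m = 5, n = 2.
# Since 1/m < 1/n, W(1/n) = W(1/m) + an independent N(0, 1/n - 1/m) increment.
rng = np.random.default_rng(0)
trials = 1_000_000
m, n = 5, 2
w_1m = rng.normal(0.0, np.sqrt(1 / m), trials)                  # W(1/m)
w_1n = w_1m + rng.normal(0.0, np.sqrt(1 / n - 1 / m), trials)   # W(1/n)
print(np.mean((m * w_1m) * (n * w_1n)), min(m, n))              # both ~2
```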
(b) No. The autocorrelation function E[X_m X_n] = min{m, n} depends on m and n individually, not only on the difference m − n, so {X_n} is not WSS.
(c) Yes. Clearly, {Xn } is a Gaussian process (see the solution to part (e)) with
mean and autocorrelation functions as found in part (a). Therefore, for integers
m_1 < m_2 ≤ m_3 < m_4 , we have
E[(X_{m_2} − X_{m_1})(X_{m_4} − X_{m_3})]
  = E[X_{m_2} X_{m_4}] + E[X_{m_1} X_{m_3}] − E[X_{m_2} X_{m_3}] − E[X_{m_1} X_{m_4}]
  = min{m_2, m_4} + min{m_1, m_3} − min{m_2, m_3} − min{m_1, m_4}
  = m_2 + m_1 − m_2 − m_1
  = 0
  = E[X_{m_2} − X_{m_1}] E[X_{m_4} − X_{m_3}].
Therefore, since (X_{m_2} − X_{m_1}) and (X_{m_4} − X_{m_3}) are jointly Gaussian and
uncorrelated, they are independent. Now, for positive integers n_1 < n_2 < · · · < n_k
for some k, (X_{n_1}, X_{n_2} − X_{n_1}, . . . , X_{n_k} − X_{n_{k−1}}), being a linear
transformation of a Gaussian random vector, is itself Gaussian. Moreover, from what
we just showed, its components are pairwise uncorrelated, hence all independent.
This implies that {X_n} is independent-increment, which in turn implies Markovity.
(d) Yes. See the solution to part (c).
(e) Yes. For integers n_1 , n_2 , . . . , n_k for any k, we have
(X_{n_1}, X_{n_2}, . . . , X_{n_k})^T = diag(n_1 , n_2 , . . . , n_k) (W(1/n_1), W(1/n_2), . . . , W(1/n_k))^T ,
which is a linear transformation of a jointly Gaussian vector (since {W(t)} is a
Gaussian process) and is therefore Gaussian. Hence {X_n} is a Gaussian process.
(f) We have S_n = X_n/n = W(1/n), so E[S_n] = 0 and Var(S_n) = 1/n. By Chebyshev's
inequality, for any ε > 0,
P{|S_n| > ε} ≤ Var(S_n)/ε^2 = 1/(n ε^2) → 0 as n → ∞.
Therefore S_n → 0 in probability.
4. Poisson process. Let {N (t), t ≥ 0} be a Poisson process with arrival rate λ > 0. Let
s ≤ t.
(a) Find the conditional pmf of N (t) given N (s).
(b) Find E[N (t)|N (s)] and its pmf.
(c) Find the conditional pmf of N (s) given N (t).
(d) Find E[N (s)|N (t)] and its pmf.
Solution:
(a) Assume 0 ≤ n_s ≤ n_t . By the independent increment property of the Poisson
process, we get
P{N(t) = n_t | N(s) = n_s} = P{N(t) − N(s) = n_t − n_s | N(s) = n_s}
                           = P{N(t) − N(s) = n_t − n_s}
                           = e^{−λ(t−s)} (λ(t − s))^{n_t − n_s} / (n_t − n_s)!
for n_s = 0, 1, . . . and n_t = n_s , n_s + 1, . . .. Thus,
N(t) | {N(s) = n_s} ∼ n_s + Poisson(λ(t − s)).
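This conditional law is easy to verify empirically by simulating the two independent increments N(s) and N(t) − N(s); the parameter values below (λ = 2, s = 1, t = 3, n_s = 2) are arbitrary choices for illustration:

```python
import numpy as np

# Empirical check that N(t) | {N(s) = ns} ~ ns + Poisson(lam * (t - s)),
# built from the two independent increments N(s) and N(t) - N(s).
rng = np.random.default_rng(0)
lam, s, t, ns = 2.0, 1.0, 3.0, 2      # arbitrary illustrative values
trials = 1_000_000
n_s = rng.poisson(lam * s, trials)                # N(s)
n_t = n_s + rng.poisson(lam * (t - s), trials)    # N(t)
cond = n_t[n_s == ns]
print(cond.mean(), ns + lam * (t - s))            # both ~6
print(cond.var(), lam * (t - s))                  # both ~4
```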
(b) From part (a), it immediately follows that
E[N (t)|N (s)] = N (s) + λ(t − s).
Therefore, the pmf of E[N(t)|N(s)] is
p_{E[N(t)|N(s)]}(x) = { e^{−λs} (λs)^k / k!,  if x = k + λ(t − s), k = 0, 1, . . . ,
                        0,                    otherwise.
(c) From part (a), the joint pmf of (N(t), N(s)), for 0 ≤ n_s ≤ n_t , is
P{N(t) = n_t , N(s) = n_s} = P{N(s) = n_s} P{N(t) = n_t | N(s) = n_s}
  = e^{−λs} (λs)^{n_s}/n_s! · e^{−λ(t−s)} (λ(t − s))^{n_t − n_s}/(n_t − n_s)!
  = e^{−λt} λ^{n_t} s^{n_s} (t − s)^{n_t − n_s} / (n_s! (n_t − n_s)!).
Therefore, the conditional pmf of N(s) | {N(t) = n_t} is, for 0 ≤ n_s ≤ n_t ,
P{N(s) = n_s | N(t) = n_t} = P{N(s) = n_s , N(t) = n_t} / P{N(t) = n_t}
  = [e^{−λt} λ^{n_t} s^{n_s} (t − s)^{n_t − n_s} / (n_s! (n_t − n_s)!)] / [e^{−λt} (λt)^{n_t} / n_t!]
  = (n_t choose n_s) (s/t)^{n_s} (1 − s/t)^{n_t − n_s}.
Hence,
N(s) | {N(t) = n_t} ∼ Binom(n_t , s/t).
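The same simulation idea checks the binomial conditional law, comparing the conditional mean and variance of N(s) given N(t) = n_t against those of Binom(n_t , s/t); again the parameter values are arbitrary:

```python
import numpy as np

# Empirical check that N(s) | {N(t) = nt} ~ Binom(nt, s/t).
rng = np.random.default_rng(0)
lam, s, t, nt = 2.0, 1.0, 3.0, 6      # arbitrary illustrative values
trials = 2_000_000
n_s = rng.poisson(lam * s, trials)
n_t = n_s + rng.poisson(lam * (t - s), trials)
cond = n_s[n_t == nt]
p = s / t
print(cond.mean(), nt * p)            # both ~2
print(cond.var(), nt * p * (1 - p))   # both ~4/3
```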
(d) From part (c), it immediately follows that
E[N(s)|N(t)] = (s/t) N(t),
and its pmf is
p_{E[N(s)|N(t)]}(x) = { e^{−λt} (λt)^k / k!,  if x = (s/t) k, k = 0, 1, . . . ,
                        0,                    otherwise.
5. Let X_n = (1/2) X_{n−1} + Z_n for n ≥ 1, where X_0 ∼ N(0, σ^2) and {Z_n} are
i.i.d. N(0, 1) independent of X_0 , and let Y_n = X_n + V_n , where {V_n} is white
Gaussian noise with unit variance, independent of {X_n}.
(a) Find the variance σ^2 such that {X_n} and {Y_n} are jointly WSS.
Solution:
(a) If {X_n} is WSS, then Var(X_n) = Var(X_0) = σ^2 for all n ≥ 0. From the recursive
relation, we get
Var(X_n) = (1/4) Var(X_{n−1}) + Var(Z_n) = (1/4) σ^2 + 1,
which implies σ^2 = 4/3.
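A simulation confirms that σ^2 = 4/3 is the fixed point of this variance recursion, assuming (as above) that the Z_n are i.i.d. with unit variance, taken Gaussian here:

```python
import numpy as np

# Check that sigma^2 = 4/3 is the stationary variance of X_n = X_{n-1}/2 + Z_n,
# with Z_n taken i.i.d. N(0, 1) (unit variance, as the recursion above assumes).
rng = np.random.default_rng(0)
paths = 200_000
x = rng.normal(0.0, np.sqrt(4 / 3), paths)   # start at the candidate stationary law
for _ in range(50):
    x = 0.5 * x + rng.normal(0.0, 1.0, paths)
print(x.var(), 4 / 3)                        # variance stays ~4/3
```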
(b) First, note that for n ≥ 0,
X_{m+n} = (1/2) X_{m+n−1} + Z_{m+n}
        = (1/4) X_{m+n−2} + (1/2) Z_{m+n−1} + Z_{m+n}
        = . . .
        = (1/2^n) X_m + (1/2^{n−1}) Z_{m+1} + · · · + (1/2) Z_{m+n−1} + Z_{m+n}.
Hence, since the Z terms are zero mean and independent of X_m , it follows that for n ≥ 0,
R_X(n) = E[X_{m+n} X_m] = 2^{−n} E[X_m^2] = (4/3) 2^{−n},
and by symmetry,
R_X(n) = (4/3) 2^{−|n|}.
Now we can find the autocorrelation function of {Y_n} easily. Since {V_n} is
independent of {X_n} and both are zero mean, the cross terms vanish:
R_Y(n) = E[Y_{m+n} Y_m]
       = E[(X_{m+n} + V_{m+n})(X_m + V_m)]
       = E[X_{m+n} X_m] + E[X_{m+n} V_m] + E[V_{m+n} X_m] + E[V_{m+n} V_m]
       = R_X(n) + δ(n)
       = (4/3) 2^{−|n|} + δ(n).
Here δ(n) denotes the Kronecker delta function, that is,
δ(n) = { 1,  if n = 0,
         0,  otherwise.
(d) Since X_n and Y_n are jointly Gaussian, we can find the conditional expectation
E[X_n |Y_n ], which is the MMSE estimate of X_n given Y_n , as follows:
E[X_n |Y_n ] = E[X_n ] + (Cov(X_n , Y_n )/Var(Y_n )) (Y_n − E[Y_n ])
            = (R_{XY}(0)/R_Y(0)) Y_n
            = (4/7) Y_n .
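The coefficient 4/7 is also the least-squares slope of X_n on Y_n , which a simulation recovers directly; the sketch assumes Z_n and V_n are i.i.d. N(0, 1), consistent with the correlation functions above:

```python
import numpy as np

# The coefficient 4/7 is the least-squares slope of X_n on Y_n. Simulate the
# stationary chain (Z_n, V_n i.i.d. N(0,1) assumed) and recover it.
rng = np.random.default_rng(0)
paths = 1_000_000
x = rng.normal(0.0, np.sqrt(4 / 3), paths)
for _ in range(50):                             # run the recursion in stationarity
    x = 0.5 * x + rng.normal(0.0, 1.0, paths)
y = x + rng.normal(0.0, 1.0, paths)             # Y_n = X_n + V_n
print(np.mean(x * y) / np.mean(y * y), 4 / 7)   # both ~0.571
```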
(e) As in part (d), the MMSE estimate of X_n given (Y_n , Y_{n−1}) is
E[X_n |Y_n , Y_{n−1}]
  = E[X_n ] + Σ_{X_n,(Y_n,Y_{n−1})} Σ_{(Y_n,Y_{n−1})}^{−1} ((Y_n , Y_{n−1})^T − E[(Y_n , Y_{n−1})^T])
  = [R_{XY}(0)  R_{XY}(1)] [R_Y(0)  R_Y(1); R_Y(1)  R_Y(0)]^{−1} [Y_n ; Y_{n−1}]
  = [4/3  2/3] [7/3  2/3; 2/3  7/3]^{−1} [Y_n ; Y_{n−1}]
  = [8/15  2/15] [Y_n ; Y_{n−1}]
  = (8/15) Y_n + (2/15) Y_{n−1}.
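The coefficient vector [8/15, 2/15] is the solution of the normal equations R_Y a = R_{XY}, which can be checked numerically in one line:

```python
import numpy as np

# Solve the normal equations R_Y a = R_XY for the coefficients in part (e).
R_XY = np.array([4 / 3, 2 / 3])
R_Y = np.array([[7 / 3, 2 / 3],
                [2 / 3, 7 / 3]])
print(np.linalg.solve(R_Y, R_XY))   # [8/15, 2/15] ~ [0.5333, 0.1333]
```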
(f) Since (X_n , Y_n ) are jointly WSS and R_{XY} and R_Y are even functions, the
conditional expectation E[X_n |Y_n , Y_{n+1}] has the same form as E[X_n |Y_n , Y_{n−1}]:
E[X_n |Y_n , Y_{n+1}] = (8/15) Y_n + (2/15) Y_{n+1}.
6. Random-delay mixture. Let {X(t)}, −∞ < t < ∞, be a zero-mean wide-sense station-
ary process with autocorrelation function RX (τ ) = e−|τ | . Let
Y (t) = X(t − U ),
where U is a random delay, independent of {X(t)}. Suppose that U ∼ Bern(1/2), that
is,
Y(t) = { X(t),      with probability 1/2,
         X(t − 1),  with probability 1/2.
Solution: Since {X(t)} is zero mean, E[Y(t)] = E[X(t − U)] = 0. Conditioning on U
and using the wide-sense stationarity of {X(t)},
R_Y(t_1 , t_2) = E[X(t_1 − U) X(t_2 − U)] = E[R_X(t_1 − t_2)] = R_X(t_1 − t_2) = e^{−|t_1 − t_2|}.
Note that {Y(t)} is WSS with R_Y(t_1 , t_2) = R_X(t_1 − t_2) for any random delay U.
In fact, when {X(t)} is strictly stationary, {Y(t)} has the same distribution as
{X(t)}, not only the same first and second moments.
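This identity can be checked numerically. A minimal sketch: use a stationary Gauss-Markov (Ornstein-Uhlenbeck) process sampled on a fine grid, which has exactly the autocorrelation R_X(τ) = e^{−|τ|} assumed in the problem, and estimate R_Y at lag τ = 0.5:

```python
import numpy as np

# Estimate R_Y(0.5) for Y(t) = X(t - U), U ~ Bern(1/2), using a stationary
# Gauss-Markov (OU) process on a grid, which has R_X(tau) = e^{-|tau|}.
rng = np.random.default_rng(0)
h, steps, paths = 0.1, 40, 200_000           # grid step, horizon, sample paths
rho = np.exp(-h)                             # one-step correlation e^{-h}
x = rng.normal(0.0, 1.0, (paths, steps))
for k in range(1, steps):
    x[:, k] = rho * x[:, k - 1] + np.sqrt(1 - rho**2) * x[:, k]
u = rng.integers(0, 2, paths) * 10           # delay U in {0, 1}, i.e. {0, 10} steps
idx = np.arange(paths)
y1, y2 = x[idx, 25 - u], x[idx, 30 - u]      # Y(2.5) and Y(3.0), lag tau = 0.5
print(np.mean(y1 * y2), np.exp(-0.5))        # both ~0.6065
```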