EE321 InfoTheoryCoding HomeWork3

This document is a homework assignment from an electrical engineering course on information theory and coding. It contains two questions about entropy rates of stochastic processes. Question 1 asks about a stochastic process that randomly selects between two Bernoulli processes and determines properties like stationarity, independence, and entropy rate. Question 2 considers a Markov chain associated with a Bernoulli process that counts the number of 1s in each run, and asks to calculate the entropy rates of the original and transformed processes.


Focus Group: Electrical Engineering

Indian Institute of Technology Jodhpur

EE 321: Information Theory and Coding


2014-15 Second Semester (December 2014 - April 2015)

Homework 3
Entropy Rates of a Stochastic Process

Question 3.1
Suppose we observe one of two stochastic processes but don't know which. What is the entropy rate?
Specifically, let X11, X12, X13, ... be a Bernoulli process with parameter p1 and let X21, X22, X23, ...
be a Bernoulli process with parameter p2. Let

    θ = 1 with probability 1/2,
        2 with probability 1/2,                                  (1)

and let Yi = Xθi, i = 1, 2, ..., be the stochastic process observed. Thus Y observes the process {X1i}
or {X2i}. Eventually, Y will know which.
1) Is {Yi } stationary?
2) Is {Yi } an i.i.d. process?
3) What is the entropy rate H of {Yi }?
4) Does −(1/n) log p(Y1, Y2, ..., Yn) → H?
5) Is there a code that achieves an expected per-symbol description length (1/n) E[Ln] → H?
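The setup of Question 3.1 can be simulated directly, which may help build intuition for the stationarity and i.i.d. questions: the coin θ is flipped once, and every subsequent symbol comes from the same Bernoulli source. A minimal sketch (the parameter values p1 = 0.3, p2 = 0.8, the seed, and the function name are illustrative choices, not part of the assignment):

```python
import random

def sample_Y(n, p1=0.3, p2=0.8, seed=0):
    """Simulate the process {Yi} of Question 3.1: flip a single fair
    coin to pick theta in {1, 2}, then emit n symbols from the
    Bernoulli(p_theta) process. Parameter values are illustrative."""
    rng = random.Random(seed)
    theta = 1 if rng.random() < 0.5 else 2
    p = p1 if theta == 1 else p2
    # Every symbol is drawn with the SAME p: conditioned on theta,
    # the sequence is i.i.d.; unconditionally it is a mixture.
    return [1 if rng.random() < p else 0 for _ in range(n)]
```

Running the sketch for several seeds shows long sample paths that each look like a single Bernoulli source, which is the phenomenon behind "Eventually, Y will know which."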

Question 3.2
Let {Xi} be Bernoulli(p). Consider the associated Markov chain {Yi}, i = 1, ..., n, where
Yi = (the number of 1s in the current run of 1s). For example, if X^n = 101110..., we have Y^n = 101230....
1) Find the entropy rate of X^n.
2) Find the entropy rate of Y^n.
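The transform defining Y^n from X^n can be stated in a few lines of code, which makes the worked example in the problem easy to check. A minimal sketch (the function name `run_count` is ours, not from the assignment):

```python
def run_count(x):
    """Map a 0/1 sequence X^n to Y^n (Question 3.2): Y_i is the length
    of the run of consecutive 1s ending at position i, and 0 if X_i = 0."""
    y, run = [], 0
    for b in x:
        run = run + 1 if b == 1 else 0  # a 0 resets the run counter
        y.append(run)
    return y

# Example from the problem statement:
# run_count([1, 0, 1, 1, 1, 0]) -> [1, 0, 1, 2, 3, 0], i.e. X = 101110 maps to Y = 101230
```

Note that Y^n is a deterministic, invertible function of X^n (each Yi, together with its predecessor, recovers Xi), which is the key observation when comparing the two entropy rates.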
