
EE514A Information Theory I

Winter 2012

University of Washington
Dept. of Electrical Engineering

Homework 0: DUE Monday Jan 9th, 11:45pm Electronically


Prof: J. Bilmes <[email protected]>
TA: K. Mohan <[email protected]>

Thursday, Jan 5th 2012

All homework is due electronically via the link https://catalyst.uw.edu/collectit/dropbox/karna/19164. Note that the due dates and times might be in the evening. Please submit a PDF file. Doing your homework by hand and then converting it to a PDF file (by, say, taking high-quality photos with a digital camera and converting those to a PDF) is fine; there are many JPG-to-PDF converters on the web. Some of the problems below will require that you look at some of the lecture slides.
Note: The following problems are based on concepts in probability and convexity, including random variables, conditional probability, Jensen's inequality, and basic concentration inequalities. Lecture 0 covers these concepts with examples.

Problem 1. Uncertain Seattle Weather (4 points) Let's assume that we have at hand historical data regarding Seattle's weather patterns: specifically, the days when it is cloudy and the days when it is clear. Assume that the given data consists of 60 days each from Autumn, Winter, Spring, and Summer, sampled uniformly at random from the historical record, along with the observed weather on each of these days. The data shows that it was cloudy on 30 of the days in Autumn, clear on 10 of the days in Winter, cloudy on 40 of the days in Spring, and cloudy on 10 of the days in Summer.
Problem 1(a) We want to infer probabilities from the observed data. Begin by describing the sample space. Next, define the random variables associated with the problem. What probabilities can be inferred from the given data? Also describe why your probability estimates make sense.
Problem 1(b) Based on your estimated probabilities, compute the following: given that it was cloudy on a particular day, what is the probability that it was a wintry day in Seattle? Also compute the probability of it being spring.
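As an optional sanity check on the counting mechanics (not a substitute for the reasoning asked for in 1(a)), here is a minimal Python sketch of the natural empirical estimates from the counts above; the variable names and the Bayes'-rule reduction are ours:

    # Empirical estimates from the stated counts: 60 sampled days per season.
    # Winter was clear on 10 of its 60 days, hence cloudy on 50.
    cloudy = {"autumn": 30, "winter": 50, "spring": 40, "summer": 10}
    total_days = 60 * 4
    total_cloudy = sum(cloudy.values())   # 130

    p_cloudy = total_cloudy / total_days  # P(cloudy)

    # Bayes' rule: P(season | cloudy) = P(cloudy | season) P(season) / P(cloudy).
    # With equal samples per season, P(season) = 1/4, so this reduces to
    # (cloudy days in that season) / (total cloudy days).
    p_winter_given_cloudy = cloudy["winter"] / total_cloudy
    p_spring_given_cloudy = cloudy["spring"] / total_cloudy
    print(p_cloudy, p_winter_given_cloudy, p_spring_given_cloudy)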
Problem 1(c) Now assume that we are also given historical data concerning clear/cloudy skies in Death Valley. Of the 60 sampled days, 1 was observed to be cloudy. Compute the probability that it was cloudy in both Seattle and Death Valley, given that it was a wintry day in Seattle. Specify any assumptions made in your computation.
Problem 1(d) Finally, assume that of 50 years sampled at random, there were 20 years with more than 200 but fewer than 220 cloudy days, and 30 years with more than 220 but fewer than 240 cloudy days. In any given year, what is an estimate of the expected number of cloudy days in Seattle?

Problem 2. Applications of Jensen's Inequality (4 points)

Jensen's inequality follows from convexity and has many applications in information theory, statistics, economics, and in proving mathematical inequalities. Solve any two of the three problems below using Jensen's inequality.
Problem 2(a) Exchange Rate For a non-negative random variable $T$, show that
$$E[1/T] \ge 1/E[T].$$
The above inequality has interpretations in economics when $T$ is the exchange rate.
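The proof must go through Jensen's inequality (note that $1/t$ is convex for $t > 0$), but a quick Monte Carlo check can build intuition. In this sketch the choice $T \sim \text{Uniform}(0.5, 2)$ is purely illustrative:

    import random

    # Monte Carlo check that E[1/T] >= 1/E[T] for a positive random variable.
    random.seed(0)
    samples = [random.uniform(0.5, 2.0) for _ in range(100_000)]
    mean_T = sum(samples) / len(samples)
    mean_inv_T = sum(1.0 / t for t in samples) / len(samples)
    assert mean_inv_T >= 1.0 / mean_T
    print(mean_inv_T, 1.0 / mean_T)  # roughly 0.92 vs 0.80 for this choice of T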

Problem 2(b) GM-HM Inequality For any $a_1, a_2, \ldots, a_n > 0$, show that:
$$\left( \prod_{i=1}^{n} a_i \right)^{1/n} \ge \frac{n}{\sum_{i=1}^{n} 1/a_i}.$$
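A numeric spot check (not a proof) on an arbitrary positive vector; the numbers chosen are ours:

    import math

    # Quick check that the geometric mean dominates the harmonic mean.
    a = [1.0, 2.0, 3.0, 4.0, 5.0]     # any positive numbers will do
    n = len(a)
    gm = math.prod(a) ** (1.0 / n)    # geometric mean, ~2.61 here
    hm = n / sum(1.0 / x for x in a)  # harmonic mean,  ~2.19 here
    assert gm >= hm
    print(gm, hm)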

Problem 2(c) Under-estimator of Standard Deviation Let $X_1, X_2, \ldots, X_n$ denote a sample that is drawn identically and independently (i.i.d.) from an underlying distribution. Let $\bar{X}$ denote the sample mean. Let $\hat{\sigma} = \sqrt{\sum_{i=1}^{n} (X_i - \bar{X})^2 / n}$ denote the sample standard deviation. Show that $E[\hat{\sigma}] \le \sigma$, the population standard deviation. That is, the sample standard deviation in expectation is always an under-estimator of the population standard deviation.
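Again as an optional sanity check, a simulation makes the bias visible. The Gaussian sampling distribution here is our illustrative assumption; the claim itself holds for any distribution with finite variance:

    import random

    # Monte Carlo check that the (1/n)-normalized sample standard deviation
    # underestimates the population standard deviation in expectation.
    # Illustrative assumption: X_i ~ Normal(0, 1), so the population std is 1.
    random.seed(0)
    n, trials = 10, 20_000
    total = 0.0
    for _ in range(trials):
        xs = [random.gauss(0.0, 1.0) for _ in range(n)]
        xbar = sum(xs) / n
        total += (sum((x - xbar) ** 2 for x in xs) / n) ** 0.5
    print(total / trials)  # noticeably below 1.0, the population std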

Problem 3. Concentration Inequalities (2 points)


Concentration inequalities are fundamental to showing many information-theoretic results. Some of the popular concentration inequalities include Markov's inequality, Chebyshev's inequality, and Chernoff-Hoeffding bounds. Here we look at applications of the first two inequalities. Answer any one of the following (1 bonus point for solving both).
Problem 3(a) Markov's Inequality for Probabilities Let $p(x)$ be a probability mass function. Prove, for all $d \ge 0$, that
$$\Pr\{p(X) \le d\} \, \log\frac{1}{d} \le H(X).$$
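A numeric spot check for one pmf and one value of $d$ (both our illustrative choices; the proof must hold for all $d \ge 0$):

    import math

    # Check  Pr{p(X) <= d} * log(1/d) <= H(X)  for a single pmf and d.
    pmf = [0.5, 0.25, 0.125, 0.125]
    H = -sum(q * math.log2(q) for q in pmf)                   # entropy, 1.75 bits
    d = 0.2
    lhs = sum(q for q in pmf if q <= d) * math.log2(1.0 / d)  # ~0.58
    assert lhs <= H
    print(lhs, H)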

Problem 3(b) Indicator Concentration Let $I_j$, $j = 1, 2, \ldots, n$ denote $n$ i.i.d. indicator random variables, i.e. $P(I_j = 1) = p$, $P(I_j = 0) = 1 - p$ for all $j$. Show that $p - s \le \sum_{j=1}^{n} I_j / n \le p + s$ with probability at least $1 - \frac{p - p^2}{n s^2}$. In particular, show that as $n \to \infty$, $\sum_{j=1}^{n} I_j / n \to p$.
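A Monte Carlo check that the guaranteed lower bound is respected; the parameter values are our illustrative choices:

    import random

    # Empirical check: the event |sum_j I_j / n - p| <= s should occur with
    # frequency at least 1 - (p - p^2) / (n s^2).
    random.seed(0)
    p, s, n, trials = 0.3, 0.05, 500, 10_000
    hits = 0
    for _ in range(trials):
        mean = sum(random.random() < p for _ in range(n)) / n
        hits += abs(mean - p) <= s
    bound = 1.0 - (p - p * p) / (n * s * s)
    print(hits / trials, bound)  # empirical frequency vs. guaranteed lower bound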
