Unexpected Expectations
Evan Chen, Andrew Critch
June 2015
Sometimes the easiest way to solve a problem is to define some random variables whose
expectations make things more intuitive to think and write about. When the problem
doesn’t involve any randomness at its outset, folks call this a “probabilistic method”
because you introduce probability where it wasn’t invited. But actually this method is
cool even when it’s not the only probabilistic thing going on, so we’ll look at some of
those applications, too.
Note from the second author: I’ve made very few changes to Evan’s original version of
these notes; all the credit for assembling this excellent collection of problems goes to him!
function. But to avoid confusion one needs to remember that the variables are functions,
either implicitly or explicitly.
The conditional probability of X given Y is defined as
P(X = x | Y = y) = P(X = x and Y = y) / P(Y = y).
We say that X and Y are independent if P(X = x | Y = y) = P(X = x) for every x and y, i.e., finding out the value of Y tells you nothing about the distribution of X, and by symmetry, conversely. In our dice example, you can check that this is the case.
interested in the expected value, then according to our definition we should go through
all n! permutations, count up the total number of fixed points, and then divide by n! to
get the average. Since we want E[S] = 1, we expect to see a total of n! fixed points.
Let us first illustrate the case n = 4, calling the people W, X, Y, Z.
W X Y Z Σ
1 W X Y Z 4
2 W X Z Y 2
3 W Y X Z 2
4 W Y Z X 1
5 W Z X Y 1
6 W Z Y X 2
7 X W Y Z 2
8 X W Z Y 0
9 X Y W Z 1
10 X Y Z W 0
11 X Z W Y 0
12 X Z Y W 1
13 Y W X Z 1
14 Y W Z X 0
15 Y X W Z 2
16 Y X Z W 1
17 Y Z W X 0
18 Y Z X W 0
19 Z W X Y 0
20 Z W Y X 1
21 Z X W Y 1
22 Z X Y W 2
23 Z Y W X 0
24 Z Y X W 0
Σ 6 6 6 6 24
We’ve listed all 4! = 24 permutations, and indeed we see that there are a total of 24 fixed points. Unfortunately, if we look at the rightmost column, there doesn’t seem to be a pattern, and it seems hard to prove that the total is n! for other values of n.
However, suppose that rather than adding by rows, we add by columns. Now there is a very clear pattern: each column contains a total of 6 fixed points. Indeed, the six fixed points in the W column correspond to the 3! = 6 permutations of the remaining letters X, Y, Z; similarly, the six fixed points in the X column correspond to the 3! = 6 permutations of the remaining letters W, Y, Z.
This generalizes very nicely: if we have n letters, then each letter appears as a fixed
point (n − 1)! times. Thus the expected value is
E[S] = (1/n!) · ((n − 1)! + (n − 1)! + · · · + (n − 1)!) = (1/n!) · n · (n − 1)! = 1,
where the sum has n terms.
Cute, right? Now let’s bring out the artillery.
For each i, let S_i be the indicator variable which equals 1 if the ith person gets their own name tag and 0 otherwise. Obviously,
S = S_1 + S_2 + · · · + S_n.
Moreover, it is easy to see that E[S_i] = P(S_i = 1) = 1/n for each i: if we look at any particular person, the probability they get their own name tag is simply 1/n. Therefore,
E[S] = E[S_1] + E[S_2] + · · · + E[S_n] = 1/n + 1/n + · · · + 1/n (n terms) = 1.
Now that was a lot easier! By working in the context of expected value, we get a framework where the “double-counting” idea is basically automatic. In other words, linearity of expectation lets us focus only on small, local components when computing an expected value, without having to think about all the interactions between cases and quantities that would otherwise distract us.
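If you want to see this numerically, here is a quick Monte Carlo sketch in Python. It assumes the setup above (n people receiving a uniformly random permutation of their own name tags); the values of n and the trial count are arbitrary choices.

    import random

    def average_fixed_points(n, trials=100_000):
        """Estimate E[S], the average number of people who get their own name tag."""
        total = 0
        for _ in range(trials):
            tags = list(range(n))
            random.shuffle(tags)          # hand out the name tags uniformly at random
            total += sum(1 for person, tag in enumerate(tags) if person == tag)
        return total / trials

    for n in (4, 10, 50):
        print(n, average_fixed_points(n))  # every estimate should be close to 1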
We seek E[X_1 + X_2 + · · · + X_{2006}]. Note that any particular baby has probability (1/2)^2 = 1/4 of being unpoked (if both its neighbors miss). Hence E[X_i] = 1/4 for each i, and
E[X_1 + X_2 + · · · + X_{2006}] = E[X_1] + E[X_2] + · · · + E[X_{2006}] = 2006 · (1/4) = 1003/2.
Seriously, this should feel like cheating.
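Assuming the usual reading of this problem (2006 babies sit in a circle, each pokes one of its two neighbors uniformly at random, and X_i indicates that baby i is poked by nobody), a quick simulation confirms the answer:

    import random

    def average_unpoked(n=2006, trials=2000):
        """Each baby pokes a uniformly random neighbor; count the babies nobody pokes."""
        total = 0
        for _ in range(trials):
            poked = [False] * n
            for i in range(n):
                poked[(i + random.choice((-1, 1))) % n] = True   # poke left or right
            total += poked.count(False)
        return total / trials

    print(average_unpoked())  # should be close to 2006/4 = 501.5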
X ⊥⊥ Y  ⇒  E[Y | X] = E[Y]  ⇒  E[XY] = E[X] E[Y],
where (exercise) each of these implications is strict, i.e., the reverse does not hold.
It’s reasonably common to forget that these implications are not reversible, so it’s a good exercise to come up with counterexamples illustrating this. But while you’re thinking about that, don’t forget: unlike when multiplying two random variables, linearity of expectation does not require independence!
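For instance, one standard counterexample for the last implication is to take X uniform on {−1, 0, 1} and Y = X^2: then E[XY] = E[X]E[Y] = 0, yet E[Y | X = 0] = 0 while E[Y] = 2/3. A short exact check of exactly this made-up example in Python:

    from fractions import Fraction

    # X is uniform on {-1, 0, 1} and Y = X^2, so each (x, y) pair has probability 1/3.
    outcomes = [(-1, 1), (0, 0), (1, 1)]
    p = Fraction(1, 3)

    E_X  = sum(p * x for x, _ in outcomes)
    E_Y  = sum(p * y for _, y in outcomes)
    E_XY = sum(p * x * y for x, y in outcomes)

    print(E_XY == E_X * E_Y)   # True: E[XY] = E[X]E[Y]
    print(E_Y)                 # 2/3, but E[Y | X = 0] = 0, so E[Y | X] != E[Y]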
3 Direct Existence Proofs
Figure 1: The case n = 4. There are n^2 − n + 1 = 13 edges, and the matching is highlighted in green.
This problem doesn’t “feel” like it should be very hard. After all, there are only n^2 possible edges in total, so having n^2 − n + 1 of them means that practically all edges are present.
So let’s be really careless and just randomly pair off one set of points with the other,
regardless of whether there is actually an edge present. We call the score of such a pairing
the number of pairs which are actually connected by an edge. We wish to show that
some pairing has score n, as this will be the desired perfect matching.
So what’s the expected value of a random pairing? Number the pairs 1, 2, . . ., n and define X_i to be 1 if the ith pair is connected by an edge, and 0 otherwise.
Then the score of the configuration is X = X_1 + X_2 + · · · + X_n. Given any red point and any blue point, the probability they are connected by an edge is at least (n^2 − n + 1)/n^2. This means that E[X_i] ≥ (n^2 − n + 1)/n^2, so
E[X] = E[X_1] + · · · + E[X_n] ≥ n · (n^2 − n + 1)/n^2 = n − 1 + 1/n > n − 1.
Since the score is always an integer, some pairing must have score at least n, which is exactly what we wanted.
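Here is a small numerical sketch of this argument. The graph below is an arbitrary stand-in with n red points, n blue points, and n^2 − n + 1 edges chosen at random, matching the setup of Figure 1; only the average matters.

    import itertools, random

    n = 6
    all_pairs = list(itertools.product(range(n), range(n)))   # (red, blue) pairs
    edges = set(random.sample(all_pairs, n * n - n + 1))      # any graph with n^2 - n + 1 edges

    def score(pairing):
        """Number of pairs of the pairing that are actually edges."""
        return sum((red, blue) in edges for red, blue in enumerate(pairing))

    trials = 20_000
    avg = sum(score(random.sample(range(n), n)) for _ in range(trials)) / trials
    print(avg, "> n - 1 =", n - 1)   # average exceeds n - 1, so some pairing scores n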
Proof (of Lemma 3.2). Write \binom{n}{k} < n^k/k! and then use calculus to prove that k! ≥ e(k/e)^k. Specifically,
ln 1 + ln 2 + · · · + ln k ≥ \int_{x=1}^{k} ln x dx = k ln k − k + 1,
and exponentiating gives k! ≥ e(k/e)^k.
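As a purely numeric sanity check of these bounds (the ranges below are arbitrary):

    import math

    for k in range(2, 15):                       # k = 1 gives equality, so start at 2
        assert math.factorial(k) >= math.e * (k / math.e) ** k

    # The bound on binomial coefficients that this yields: C(n, k) <= (1/e) * (e*n/k)^k.
    for n in range(3, 40):
        for k in range(2, n):
            assert math.comb(n, k) <= (math.e * n / k) ** k / math.e

    print("all checks passed")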
Algebra isn’t much fun, but at least it’s easy. Let’s get back to the combinatorics.
Example 3.3 (Ramsey Numbers). Let n and k be integers with n ≤ 2^{k/2} and k ≥ 3. Then it is possible to color the edges of the complete graph on n vertices in two colors with the following property: one cannot find k vertices for which the \binom{k}{2} edges among them are monochromatic.
Remark. In the language of Ramsey numbers, this says that R(k, k) > 2^{k/2}.
Solution. Again we just randomly color the edges and hope for the best. We use a coin flip to determine the color of each of the \binom{n}{2} edges. Let’s call a collection of k vertices bad if all \binom{k}{2} edges among them are the same color. The probability that any particular collection is bad is
(1/2)^{\binom{k}{2} − 1}.
The number of collections is \binom{n}{k}, so the expected number of bad collections is
E[number of bad collections] = \binom{n}{k} / 2^{\binom{k}{2} − 1}.
We just want to show this is less than 1. You can check this fairly easily using Lemma 3.2;
in fact, we have a lot of room to spare.
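To see how much room, here is a purely illustrative check that \binom{n}{k} / 2^{\binom{k}{2} − 1} < 1 when n = ⌊2^{k/2}⌋ and k ≥ 3:

    import math

    for k in range(3, 31):
        n = int(2 ** (k / 2))                 # the largest n allowed by the hypothesis
        expected_bad = math.comb(n, k) / 2 ** (math.comb(k, 2) - 1)
        assert expected_bad < 1
        print(k, n, expected_bad)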
Problem 3.4. Show that one can construct a (round-robin) tournament outcome with
more than 1000 people such that for any set of 1000 people, some contestant outside that
set beats all of them.
Problem 3.5 (BAMO 2004). Consider a set of n real numbers, not all zero, with sum zero. Prove that one can label the numbers as a_1, a_2, . . ., a_n such that
a_1 a_2 + a_2 a_3 + · · · + a_n a_1 < 0.
Problem 3.6 (Russia 1996). In the Duma there are 1600 delegates, who have formed 16,000 committees of 80 people each. Prove that one can find two committees having no fewer than four common members.
4 Heavy Machinery
Here are some really nice ideas used in modern theory. Unfortunately I couldn’t find
many olympiad problems that used them. If you know of any, please let me know!
4.1 Alteration
In previous arguments we often proved a result by showing E[bad] < 1. A second method
is to select some things, find the expected value of the number of “bad” situations, and
subtract that off. An example will make this clear.
Example 4.1 (Weak Turán). A graph G has n vertices and average degree d. Prove that it is possible to select an independent set of size at least n/(2d).
Proof. Rather than selecting n/(2d) vertices randomly and hoping the number of edges among them is less than 1, we’ll instead select each vertex with probability p. (We will pick a good choice of p later.) That means the expected number of vertices we take is np. Now there are (1/2)nd edges, so the expected number of “bad” situations (i.e. an edge in which both endpoints are taken) is (1/2)nd · p^2.
Now we can just get rid of all the bad situations. For each bad edge, delete one of its endpoints arbitrarily (possibly with overlap). This costs us at most (1/2)nd · p^2 vertices, so the expected number of vertices left is
np − (1/2)ndp^2 = np(1 − (1/2)dp).
It seems like a good choice is p = 1/d, which gives an expected value of n/(2d), as desired.
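A minimal simulation of the alteration argument; the random graph below is an arbitrary choice, and only the averages matter.

    import random

    def alteration_run(n, edges, p):
        """Sample each vertex with probability p, then delete one endpoint of every bad edge."""
        chosen = {v for v in range(n) if random.random() < p}
        for u, v in edges:
            if u in chosen and v in chosen:   # a bad edge survived the sampling
                chosen.discard(u)
        return len(chosen)                    # what remains is an independent set

    n, m = 200, 2000                          # n vertices and m edges, so d = 2m/n = 20
    edges = random.sample([(u, v) for u in range(n) for v in range(u + 1, n)], m)
    d = 2 * m / n
    p = 1 / d

    trials = 2000
    avg = sum(alteration_run(n, edges, p) for _ in range(trials)) / trials
    print(avg, ">=", n / (2 * d))             # average independent-set size vs. n/(2d) = 5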
Theorem 4.4 (Lovász Local Lemma). Consider several events, each occurring with probability at most p, and such that each event is independent of all the others except at most d of them.[4] Then if
epd ≤ 1,
the probability that no events occur is positive. (Here e = 2.71828 . . . is Euler’s number.)
Note that we don’t use the number of events, only the number of dependencies.
As the name implies, the local lemma is useful in situations where, in a random construction, events do not depend on each other very much. The following Russian problem is such an example.
Example 4.5 (Russia 2006). At a tourist camp, each person has at least 50 and at
most 100 friends among the other persons at the camp. Show that one can hand out a
T-shirt to every person such that the T-shirts have (at most) 1331 different colors, and
any person has 20 friends whose T-shirts all have pairwise different colors.
Solution. Give each person a random T-shirt. For each person P , we consider the event
E(P ) meaning “P ’s neighbors have at most 19 colors of shirts”. We wish to use the
Local Lemma to prove that there is a nonzero probability that no events occur.
If we have two people A and B who are neither friends nor have a mutual friend (in graph-theoretic language, the distance between them is at least three), then the events E(A) and E(B) do not depend on each other at all. So any given E(P) depends only on friends, and friends of friends. Because any P has at most 100 friends, and each of these friends has at most 99 friends other than P, E(P) depends on at most 100 + 100 · 99 = 100^2 other events. Hence in the lemma we can set d = 100^2.
For a given person, look at their k neighbors, where 50 ≤ k ≤ 100. The probability that there are at most 19 colors among the neighbors is clearly at most
\binom{C}{19} · (19/C)^k,
where C denotes the number of available colors. To estimate the binomial coefficient, we can again use our silly Lemma 3.2 to get that this is at most
(1/e)(eC/19)^{19} · (19/C)^k = e^{18} · (19/C)^{k−19} ≤ e^{18} · (19/C)^{31}.
Thus, we can put p = e^{18} (19/C)^{31}. Thus the Lemma implies we are done as long as
e^{19} (19/C)^{31} · 100^2 ≤ 1.
It turns out that C = 48 is the smallest number of colors for which this inequality holds. Establishing the inequality when C = 1331 just amounts to some rough estimation with the e’s.
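A quick numeric check of this last inequality (purely illustrative): with C = 1331 the left-hand side is astronomically small, and the smallest C for which it holds is indeed 48.

    import math

    def lhs(C):
        return math.e ** 19 * (19 / C) ** 31 * 100 ** 2

    print(lhs(1331))                                          # far below 1
    print(min(C for C in range(20, 2000) if lhs(C) <= 1))     # prints 48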
[4] More precisely, if we denote the events by binary variables X_1, X_2, . . ., then we require that for each i, there is a set D_i of size at most d + 1 containing X_i such that X_i is independent of the variables in the complement D_i^c. In other words, measuring the values of all the variables in D_i^c together will tell you nothing about the distribution of X_i.
5 More Practice Problems
Problem 5.3 (Shortlist 1999 C4). Let A be a set of N distinct residues (mod N^2). Prove that there exists a set B of N residues (mod N^2) such that A + B = {a + b | a ∈ A, b ∈ B} contains at least half of all the residues (mod N^2).
Problem 5.4 (Iran TST 2008/6). Suppose 799 teams participate in a round-robin
tournament. Prove that one can find two disjoint groups A and B of seven teams each
such that all teams in A defeated all teams in B.
Problem 5.5 (Caro-Wei Theorem). Consider a graph G with vertex set V. Prove that one can find an independent set with size at least
\sum_{v ∈ V} 1/(\deg v + 1).
Remark. Note that, by applying Jensen’s inequality, our independent set has size at least n/(d + 1), where d is the average degree. This result is called Turán’s Theorem (or the complement thereof).
Problem 5.6 (USAMO 2012/6). For integer n ≥ 2, let x_1, x_2, . . ., x_n be real numbers satisfying x_1 + x_2 + · · · + x_n = 0 and x_1^2 + x_2^2 + · · · + x_n^2 = 1. For each subset A ⊆ {1, 2, . . ., n}, define
S_A = \sum_{i ∈ A} x_i.
(If A is the empty set, then S_A = 0.) Prove that for any positive number λ, the number of sets A satisfying S_A ≥ λ is at most 2^{n−3}/λ^2.
Problem 5.7 (Online Math Open, Ray Li). Kevin has 2^n − 1 cookies, each labeled with a unique nonempty subset of {1, 2, . . ., n}. Each day, he chooses one cookie uniformly at random out of the cookies not yet eaten. Then, he eats that cookie, and all remaining cookies that are labeled with a subset of that cookie’s label. Compute the expected value of the number of days that Kevin eats a cookie before all cookies are gone.
Problem 5.8. Let n be a positive integer. Let a_k denote the number of permutations of n elements with exactly k fixed points. Compute
a_1 + 4a_2 + 9a_3 + · · · + n^2 a_n.
Problem 5.9 (Russia 1999). In a certain school, every boy likes at least one girl. Prove
that we can find a set S of at least half the students in the school such that each boy in
S likes an odd number of girls in S.
Problem 5.10. Let n be a positive integer. Suppose 11n points are arranged in a circle,
colored with one of n colors, so that each color appears exactly 11 times. Prove that one
can select a point of every color such that no two are adjacent.
References
[1] pythag011 at http://www.aops.com/Forum/viewtopic.php?f=133&t=481300
[3] Problem 6 talk (c > 1) by Po-Shen Loh, USA leader, at the IMO 2014.
Thanks to all the sources above. Other nice reads that I went through while preparing
this, but eventually did not use:
1. Alon and Spencer’s The Probabilistic Method. The first four chapters are here:
http://cs.nyu.edu/cs/faculty/spencer/nogabook/.
2. A MathCamp lecture that gets the girth-chromatic number result:
http://math.ucsb.edu/~padraic/mathcamp_2010/class_graph_theory_probabilistic/
lecture2_girth_chromatic.pdf
6 Unexpected Expectations: Solution Sketches
2.7 Answer: 360. Pick a, b, c randomly and compute E[0.\overline{abc}]. Then multiply by |S|.
2.8 8p = 4 · (p + p^2 + p^3 + · · · ).
2(p + 10p^2 + · · · + 10^4 p^5) = 2p · ((10p)^5 − 1)/(10p − 1) = (6/5) · (7775/5) = 6 · 311 = 1866.
3.4 Suppose there are n people, and decide each game outcome with a coin flip. Let U
be the set of “unbeaten” subsets S of size 1000, i.e. such that nobody outside S beats all
of S.
E[|U|] = \sum_{|S|=1000} P(S is unbeaten)
       = \sum_{|S|=1000} P(∀ t ∈ S^c ∃ s ∈ S : s beats t)
       = \sum_{|S|=1000} \prod_{t ∈ S^c} (1 − 2^{−1000})
       = \binom{n}{1000} · (1 − 2^{−1000})^{n−1000},
which is less than 1 for very large n (exponentials eventually dominate polynomials).
Hence for large n, sometimes |U | = 0, as needed.
3.5 Choose the ordering uniformly randomly. Then, with the convention n + 1 = 1,
E[a_i a_{i+1}] = E[a_i · E[a_{i+1} | a_i]] = E[a_i · (−a_i/(n − 1))] = −E[a_i^2/(n − 1)] < 0
since they are not all zero, hence the expectation of the given sum is strictly negative,
and so the sum itself is sometimes negative.
3.6 Let n_i be the number of committees which the ith delegate is in. Pick two distinct committees A and B randomly, so
E[|A ∩ B|] = \sum_i n_i(n_i − 1)/(16,000 · 15,999).
Letting f(n) = n(n − 1), by Jensen’s inequality and the fact that the average delegate is on 16,000 · 80/1600 = 800 committees, we get
E[|A ∩ B|] ≥ 1600 · 800 · 799/(16,000 · 15,999) ≈ 3.995 > 3,
so some pair of committees has at least four common members.
5.1 Pick the two contestants, 1 and 2, randomly. Let X_i be the indicator that both contestants miss problem i, so each E[X_i] < (80/200)^2 = 4/25, and their expected number of both-missed problems is less than 24/25 < 1 . . .
5.2 Select each of the ε_i randomly with a coin flip. Let LHS be the left-hand side of the desired inequality. Since |z|^2 = z \bar{z} for any z,
LHS^2 = \sum_k |z_k|^2 + \sum_{i<j} ε_i ε_j (z_i \bar{z}_j + \bar{z}_i z_j),
and since ε_i and ε_j are independent, E[ε_i ε_j] = 0, so E[LHS^2] = \sum_k |z_k|^2 = 1, hence LHS^2 ≤ 1 sometimes, as needed.
5.3 Select the elements of B = {b_1, . . ., b_N} uniformly at random (we’ll even allow repetitions, for simplicity). For each r (mod N^2) and each i ∈ {1, . . ., N},
P(r ∉ A + b_i) = P(b_i ∉ r − A) = (N^2 − N)/N^2 = 1 − 1/N,
so P(r ∉ A + B) = (1 − 1/N)^N < 1/e < 1/2. Thus E[|A + B|] > (1 − 1/e) N^2 . . .
5.4 Let D_k be the set of teams which defeat the kth team (here 1 ≤ k ≤ 799), and let d_k = |D_k|. Select A = {a_1, . . ., a_7} randomly, so P(A ⊆ D_k) = \binom{d_k}{7} / \binom{799}{7}. Letting N be the number of teams dominated by A,
E[N] = \sum_k \binom{d_k}{7} / \binom{799}{7}.
The function \binom{x}{7} / \binom{799}{7} is convex, and the average value of d_k is 798/2 = 399, so by Jensen’s inequality,
E[N] ≥ 799 · \binom{399}{7} / \binom{799}{7} ≈ 6.03 > 6,
hence sometimes N ≥ 7.
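The key estimate is easy to confirm numerically:

    from math import comb

    print(799 * comb(399, 7) / comb(799, 7))   # about 6.03, so E[N] > 6 and some outcome has N >= 7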
5.5 A fairly natural approach is to use a greedy algorithm: randomly choose a vertex,
append it to W , remove it and its neighbors from G, repeat until nothing is left, and then
W will be an independent set. One can prove by induction on |G| that E[|W |] satisfies
the given bound.
A simpler proof is to randomly order the vertices v_1, . . ., v_n of G, and take W to be the subset of those v_i which occur before all their neighbors. Then
E[|W|] = \sum_i E[ind(v_i ∈ W)] = \sum_i 1/(\deg(v_i) + 1).
5.6 Since S_A = −S_{A^c}, we have P(S_A > λ) = P(S_{A^c} < −λ) = P(S_A < −λ). Thus P(S_A > λ) = (1/2) P(S_A^2 > λ^2), and S_A^2 is always nonnegative, so we can apply the Markov inequality to it. For a uniformly random subset A,
E[S_A^2] = \sum_i P(i ∈ A) x_i^2 + \sum_{i<j} P(i, j ∈ A) · 2 x_i x_j = (1/2) \sum_i x_i^2 + (1/4) \sum_{i<j} 2 x_i x_j.
Since 0 = S_{[n]}^2 = 1 + \sum_{i<j} 2 x_i x_j, we have
E[S_A^2] = (1/2)(1) + (1/4)(−1) = 1/4,
hence by the Markov inequality, P(S_A^2 > λ^2) ≤ 1/(4λ^2), as needed.
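A brute-force check of E[S_A^2] = 1/4 over all subsets, for an arbitrary choice of numbers satisfying the two constraints:

    import math, random
    from itertools import combinations

    n = 8
    x = [random.gauss(0, 1) for _ in range(n)]
    mu = sum(x) / n
    x = [t - mu for t in x]                            # force x_1 + ... + x_n = 0
    norm = math.sqrt(sum(t * t for t in x))
    x = [t / norm for t in x]                          # force x_1^2 + ... + x_n^2 = 1

    subsets = [A for r in range(n + 1) for A in combinations(range(n), r)]
    mean_square = sum(sum(x[i] for i in A) ** 2 for A in subsets) / len(subsets)
    print(mean_square)                                 # 0.25 up to rounding error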
5.7 The number of days equals the number of times a cookie is chosen (rather than merely eliminated). Let C be the set of cookies chosen by the process and S = {1, . . ., n}, so
E[#days] = E[|C|] = \sum_{∅ ≠ A ⊆ S} E[ind(A ∈ C)] = \sum_{∅ ≠ A ⊆ S} P(A ∈ C).
Cookie A is chosen, rather than eaten earlier, exactly when it is the first cookie picked among the 2^{n−|A|} cookies whose labels contain A, and each of these is equally likely to be first; hence P(A ∈ C) = 2^{|A|−n} and the sum comes out to (3^n − 1)/2^n.
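A quick simulation for small n agrees with the closed form (3^n − 1)/2^n:

    import random
    from itertools import combinations

    def days(n):
        """Run Kevin's process once and return how many cookies he actively chooses."""
        cookies = {A for r in range(1, n + 1) for A in combinations(range(n), r)}
        count = 0
        while cookies:
            chosen = random.choice(list(cookies))
            count += 1
            cookies = {A for A in cookies if not set(A) <= set(chosen)}
        return count

    n, trials = 4, 20_000
    print(sum(days(n) for _ in range(trials)) / trials, (3 ** n - 1) / 2 ** n)   # both about 5.0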
5.8 For a random permutation let X be the number of fixed points, so the required expression is exactly n! · E[X^2]. We already know E[X] = 1 from Example 2.1, and by a similar argument, the expected number of pairs of fixed points in a random permutation is
E[\binom{X}{2}] = \sum_{i<j} P(i, j both fixed) = \binom{n}{2} · (1/n) · (1/(n − 1)) = 1/2.
Since X^2 = X + 2\binom{X}{2}, this gives E[X^2] = 1 + 2 · (1/2) = 2, and so the answer is 2 · n! (for n ≥ 2).
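And a brute-force check of the resulting identity \sum_k k^2 a_k = 2 · n! for a few small n:

    from itertools import permutations
    from math import factorial

    def weighted_sum(n):
        total = 0
        for perm in permutations(range(n)):
            fixed = sum(1 for i, v in enumerate(perm) if i == v)
            total += fixed * fixed
        return total

    for n in range(2, 7):
        print(n, weighted_sum(n), 2 * factorial(n))   # the last two columns agree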
5.9 Let L_b be the set of girls liked by a given boy b, and let B and G be the sets of chosen boys and girls. For fixed G, WLOG B is the set of all boys who like an odd number of girls in G, so the challenge is to choose G. Doing so uniformly at random means each girl has probability 50% of being included, and for each boy b,
P(b ∈ B) = 1/2,
because a uniformly random subset of the nonempty set L_b is 50% likely to have odd size. Hence E[|B ∪ G|] is exactly half the total number of students, so some choice of G gives at least half, as needed.
5.10 Label the points s_1, . . ., s_{11n} = s_0 in order around the circle. Choose one point of each color randomly to form a set A, and consider the indicators B_i = ind(s_i, s_{i+1} ∈ A) of the “bad” events where an adjacent pair occurs in A. For each i, p = P(B_i) equals 11^{−2} if s_i and s_{i+1} have different colors, or 0 if they have the same color, so p ≤ 11^{−2}. Unfortunately, the resulting bound E[\sum_i B_i] ≤ 11n · 11^{−2} = n/11 is only below 1 when n < 11, so for larger n this alone is not strong enough to show that sometimes all the B_i = 0.
However, each B_i is independent of B_j for all j except when s_j or s_{j+1} has the same color as s_i or s_{i+1}, so we can try to apply the Lovász Local Lemma. There are 21 other pairs (s_j, s_{j+1}) sharing a color with s_i, and at most another 21 pairs sharing a color with s_{i+1}, so B_i ⊥⊥ B_j for all but at most d = 42 values of j. Now,
epd = e · (42/121) < (28/10) · (42/121) < (30 · 40)/1210 < 1,
so by the LLL there is a positive probability that all the B_i = 0, as needed.