Probabilistic Methods in Combinatorics
Po-Shen Loh
June 2009
Warm-up
3. (IMO Shortlist 2006/C3) Let S be a finite set of points in the plane such that no three of them are on
a line. For each convex polygon P whose vertices are in S, let a(P ) be the number of vertices of P ,
and let b(P ) be the number of points of S which are outside P . Prove that for every real number x:
    Σ_P x^{a(P)} (1 − x)^{b(P)} = 1,
where the sum is taken over all convex polygons with vertices in S. Important: a line segment, a
point, and the empty set are considered to be convex polygons of 2, 1, and 0 vertices, respectively.
Solution: Randomly color the points black and white, with each point independently receiving black with probability x. For each convex polygon P, let E_P be the event that all vertices on the perimeter of P are black, and all points of S in P's exterior are white. These events are mutually exclusive, so the LHS is the probability that some E_P holds. But some E_P always holds: take P to be the convex hull of all of the black points.
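Since the identity holds for every x, it can be checked by brute force on a small random point set. The following sketch (sample data and helper routines chosen purely for illustration) enumerates all subsets of S in convex position, including the empty set, single points, and segments, and sums their terms for a fixed x:

```python
import itertools
import random

def cross(o, a, b):
    # positive iff the turn o -> a -> b is counterclockwise
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(pts):
    # Andrew's monotone chain; returns hull vertices counterclockwise
    pts = sorted(pts)
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def strictly_inside(hull, q):
    # with no three points collinear, q lies inside iff it is strictly to
    # the left of every directed edge of the counterclockwise hull
    m = len(hull)
    return m >= 3 and all(cross(hull[i], hull[(i + 1) % m], q) > 0
                          for i in range(m))

def polygon_sum(S, x):
    total = 0.0
    for k in range(len(S) + 1):
        for sub in itertools.combinations(S, k):
            hull = convex_hull(list(sub))
            if len(hull) != k:
                continue          # subset is not in convex position
            outside = sum(1 for q in S
                          if q not in sub and not strictly_inside(hull, q))
            total += x**k * (1 - x)**outside   # x^a(P) * (1-x)^b(P)
    return total

random.seed(1)
S = [(random.random(), random.random()) for _ in range(6)]
assert abs(polygon_sum(S, 0.3) - 1.0) < 1e-9
```

Random floating-point coordinates are almost surely in general position, matching the no-three-collinear hypothesis.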
Linearity of expectation
Definition. Let X be a random variable which takes values in some finite set S. Then the expected value
of X, denoted E [X], is:
    E[X] = Σ_{x∈S} x · P[X = x].
Use the following exercises to get used to the concept of expected value.
1. What is the expected number of heads in 101 tosses of a fair coin? Prove this formally from the
definition.
Solution: Write out the expectation and use the identity i · C(101, i) = 101 · C(100, i − 1):

    E[X] = Σ_{i=0}^{101} i · C(101, i) / 2^{101}
         = (101 / 2^{101}) Σ_{i=1}^{101} C(100, i − 1)
         = (101 / 2^{101}) Σ_{j=0}^{100} C(100, j)
         = 2^{−101} · 101 · 2^{100} = 101/2.
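The sum can also be evaluated directly in exact rational arithmetic (a quick illustrative check, not part of the solution):

```python
from fractions import Fraction
from math import comb

n = 101
# E[#heads] = sum_i i * C(n, i) / 2^n, evaluated exactly
expectation = sum(Fraction(i * comb(n, i), 2**n) for i in range(n + 1))
assert expectation == Fraction(n, 2)
```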
2. What is the expected number of heads in 1 toss of a coin which lands heads with probability 1/10?
3. Can you calculate the expected number of heads in 101 tosses of a coin which lands heads with
probability 1/10?
One of the most useful facts about the expected value is called linearity of expectation.
Proposition. For any two (not necessarily independent) random variables X and Y, we have E[X + Y] = E[X] + E[Y].
Proof. This is essentially a consequence of the commutativity of addition. Following the definition:

    E[X + Y] = Σ_{x,y} (x + y) · P[X = x, Y = y]
             = Σ_{x,y} x · P[X = x, Y = y] + Σ_{x,y} y · P[X = x, Y = y]
             = Σ_x x (Σ_y P[X = x, Y = y]) + Σ_y y (Σ_x P[X = x, Y = y])
             = Σ_x x · P[X = x] + Σ_y y · P[Y = y]
             = E[X] + E[Y].
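That independence is never used can be verified exactly on a tiny probability space with two deliberately dependent variables (an illustrative sketch):

```python
import itertools
from fractions import Fraction

# sample space: three fair coin flips; X and Y are deliberately dependent
outcomes = list(itertools.product([0, 1], repeat=3))
prob = Fraction(1, len(outcomes))      # each outcome has probability 1/8

def E(f):
    return sum(prob * f(w) for w in outcomes)

X = lambda w: sum(w)                   # total number of heads
Y = lambda w: w[0]                     # indicator that the first flip is heads

assert E(lambda w: X(w) + Y(w)) == E(X) + E(Y)   # linearity despite dependence
assert E(X) == Fraction(3, 2) and E(Y) == Fraction(1, 2)
```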
This answers Exercise 3 above: the number of heads is the sum of 101 random variables, each of which has expectation 1/10, so by linearity the answer is 101/10.
Remark. You may use the following well-known number-theoretic fact: if x is irrational, then the fractional parts of the multiples of x are equidistributed over the interval [0, 1]. In particular, if we choose n uniformly from {1, . . . , N}, then E[{xn}] → 1/2 as N → ∞.
Solution: Suppose none of a, b, c are integers. Dividing both sides by n and taking the limit gives a + b = c, so the fractional parts satisfy

    {an} + {bn} = {cn}.    (1)

If x is irrational, then {xn} is equidistributed over the interval [0, 1]; in particular, if we choose n uniformly from {1, . . . , N}, then E[{xn}] → 1/2 as N → ∞. On the other hand, if x is rational with reduced form p/q, then {xn} has expectation tending to (q − 1)/(2q) = 1/2 − 1/(2q), which lies in the interval [1/4, 1/2). Conclusion: for any noninteger x, E[{xn}] → t for some t ∈ [1/4, 1/2].

Taking the expectation of equation (1) and passing to the limit, we see that the only way to have equality is if E[{an}] and E[{bn}] both tend to 1/4 while E[{cn}] → 1/2. But the only way to get the limit 1/4 is when a and b are rational (with denominator 2), and the only way to get the full limit 1/2 is when c is irrational. Yet our first deduction was that a + b = c, and two rationals cannot sum to an irrational. Contradiction.
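The value (q − 1)/(2q) for rational x can be confirmed exactly: as n runs over a full period of q consecutive integers, {xn} takes each of the values 0, 1/q, . . . , (q − 1)/q exactly once. A small check with sample values:

```python
from fractions import Fraction
from math import floor

def mean_frac_part(x, N):
    # average of the fractional parts {x*n} for n = 1, ..., N
    return sum(x * n - floor(x * n) for n in range(1, N + 1)) / N

q = 7
x = Fraction(3, q)                     # reduced form p/q with q = 7
# averaging over whole periods gives exactly (q-1)/(2q)
assert mean_frac_part(x, 10 * q) == Fraction(q - 1, 2 * q)
```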
4. (MOP Test 2007/7/1) In an n × n array, each of the numbers 1, 2, . . . , n appears exactly n times. Show that there is a row or a column in the array with at least √n distinct numbers.
Solution: Choose a random row or column (2n choices), and let X be the number of distinct entries in it. Now X = Σ I_i, where each I_i is the indicator variable of i appearing (possibly more than once) in our random row or column. Clearly, each E[I_i] = P[I_i ≥ 1]. To lower-bound this, observe that the worst case is when all n appearances of i are in some √n × √n submatrix, which gives P[I_i ≥ 1] ≥ 2√n/(2n) = 1/√n. By linearity, E[X] ≥ n · (1/√n) = √n, so some row or column has at least √n distinct entries.
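The bound is tight: partition the array into √n × √n blocks and fill each block with a single number, so every row and column meets exactly √n distinct values. A check of this construction for n = 9 (illustrative):

```python
from collections import Counter
from math import isqrt

n = 9
s = isqrt(n)                   # sqrt(n) = 3
# number s*(r//s) + (c//s) + 1 fills the s-by-s block containing cell (r, c)
A = [[s * (r // s) + (c // s) + 1 for c in range(n)] for r in range(n)]

counts = Counter(v for row in A for v in row)
assert all(counts[i] == n for i in range(1, n + 1))   # each number appears n times

distinct = [len(set(row)) for row in A] + [len(set(col)) for col in zip(*A)]
assert max(distinct) == s      # every row and column has exactly sqrt(n) distinct entries
```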
These are interesting results from research mathematics (as opposed to Olympiad mathematics) that have
very elegant probabilistic proofs.
1. (Erdős, 1965) A set S is called sum-free if there is no triple of (not necessarily distinct) elements x, y, z ∈ S satisfying x + y = z. Prove that every set A of nonzero integers contains a subset S ⊆ A of size |S| > |A|/3 which is sum-free.
Solution: Choose a prime p of the form 3k + 2 such that p is greater than twice the maximum absolute value of any element in A. Let C = {k + 1, . . . , 2k + 1}, which is sum-free modulo p. Then pick a uniformly random x ∈ {1, . . . , p − 1} and let B be the set obtained by multiplying each element of A by x, modulo p. For each element, the probability of mapping into C is |C|/(p − 1) > 1/3. So the expected number of elements mapping into C is > |A|/3, and we can take S to be those elements.
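The construction can be run concretely: since the expected size exceeds |A|/3, some multiplier x must achieve it. An illustrative sketch with a sample set A and the prime p = 41 = 3 · 13 + 2 (sample values chosen for the example):

```python
A = [1, 2, 4, 5, 7, 11, 13, 16]     # sample set of nonzero integers
p, k = 41, 13                        # prime p = 3k + 2 with p > 2 * max |a|
C = set(range(k + 1, 2 * k + 2))     # middle third {k+1, ..., 2k+1}

def is_sum_free(S):
    S = set(S)
    return all(x + y not in S for x in S for y in S)

# C is sum-free modulo p
assert all((x + y) % p not in C for x in C for y in C)

# take the multiplier x sending the most elements of A into C
S, x = max((([a for a in A if (a * x) % p in C], x) for x in range(1, p)),
           key=lambda t: len(t[0]))
assert len(S) > len(A) / 3           # guaranteed, since the average exceeds |A|/3
assert is_sum_free(S)                # inherited from C being sum-free mod p
```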
2. (Turán's Theorem, numerical bound) Let G be a graph with n vertices and average degree d. Prove that one can find a set S of pairwise non-adjacent vertices of size at least n/(d + 1).
Remark. Such a set is called an independent set, and the result is tight because a disjoint union of copies of K_{d+1} has no independent set larger than n/(d + 1).
Solution: Start with a random ordering of the vertices. If a vertex precedes all of its neighbors according to the ordering, then take it for S. Clearly this is an independent set. Let I_v be the indicator of the event that vertex v is chosen. Then E[I_v] = 1/(d_v + 1), since v is equally likely to occupy any position among itself and its d_v neighbors. By linearity of expectation, E[|S|] = Σ_v 1/(d_v + 1), and by convexity this is at least n/(d + 1).
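For a small graph one can average over all vertex orderings and confirm E[|S|] = Σ_v 1/(d_v + 1) ≥ n/(d + 1) exactly (an illustrative check on a sample 6-vertex graph):

```python
from fractions import Fraction
from itertools import permutations

n = 6
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (0, 3)]  # sample graph
adj = [set() for _ in range(n)]
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def first_vertices(order):
    # vertices that precede all of their neighbors in the given ordering
    pos = {v: i for i, v in enumerate(order)}
    return [v for v in range(n) if all(pos[v] < pos[u] for u in adj[v])]

sizes = []
for order in permutations(range(n)):
    S = first_vertices(order)
    assert all(u not in adj[v] for v in S for u in S)   # S is always independent
    sizes.append(len(S))

avg = Fraction(sum(sizes), len(sizes))                  # exact E[|S|]
assert avg == sum(Fraction(1, len(adj[v]) + 1) for v in range(n))
d = Fraction(2 * len(edges), n)                         # average degree
assert avg >= n / (d + 1)                               # convexity bound
```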
3. (Component of a research problem of Tao Jiang's master's student) Let G be a graph on n vertices whose vertices have been properly colored. Nothing is assumed about the number of colors used, except for the local fact that for every vertex v, the number of different colors which appear on v's neighbors is at most M. Prove that there is a set S of pairwise non-adjacent vertices of size at least n/(M + 1).
Solution: (Proof due to Benny Sudakov.) Start with a random ordering of the color classes. If any vertex's color class precedes all of its neighbors' color classes according to the ordering, then take the vertex for S. Again this is clearly an independent set. A proper coloring has the property that any v is colored differently from all of its neighbors, so at most M + 1 colors in total appear on v and its neighbors. Thus the probability that v's color is earliest in the permutation is at least 1/(M + 1). Finish by linearity of expectation.
4. (Crossing Lemma) No matter how you draw a graph with V vertices and E edges in the plane, there will be at least E³/(64V²) pairs of crossing edges, as long as E ≥ 4V.
Solution: Since planar graphs have E ≤ 3V − 6, we automatically find that the crossing number is always at least E − (3V − 6) > E − 3V. Now take a drawing with, say, t crossings, and keep each vertex independently with probability p. In expectation, V goes down to pV, E goes down to p²E, and the number of crossings goes down to p⁴t. The sampled drawing must satisfy the bound above, so

    p⁴t > p²E − 3pV.

Substituting p = 4V/E (which is at most 1 because E ≥ 4V) gives t > E/p² − 3V/p³ = E³/(16V²) − 3E³/(64V²) = E³/(64V²), the desired result.
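The final substitution is routine algebra, which can be verified exactly for sample values of V and E (illustrative):

```python
from fractions import Fraction

V, E = 10, 50                           # sample values with E >= 4V
p = Fraction(4 * V, E)                  # the substitution p = 4V/E (here p <= 1)
lower_bound = E / p**2 - 3 * V / p**3   # rearranged from p^4 t > p^2 E - 3 p V
assert lower_bound == Fraction(E**3, 64 * V**2)
```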
Real problems one does not expect to prove, but are probably true
Many of these come from the excellent book titled The Probabilistic Method, by Noga Alon and Joel Spencer.
1. (Sperner's Lemma) Prove that the maximum antichain in 2^[n] has size C(n, ⌊n/2⌋), the binomial coefficient "n choose ⌊n/2⌋". That is, show that if F is a collection of subsets of {1, . . . , n} such that no two distinct sets A, B ∈ F satisfy A ⊆ B, then |F| ≤ C(n, ⌊n/2⌋).
Solution: Note that this is a special case of one of the first problems in this handout.

The idea is given as a hint in Alon–Spencer. Take a uniformly random permutation σ and let the random variable X = #{k : {σ(1), . . . , σ(k)} ∈ F}. Consider E[X].

By definition of F, X is bounded by 1, and the events {σ(1), . . . , σ(k)} ∈ F are disjoint for distinct k. Let N_k be the number of subsets of size k in F.
    E[X] = Σ_{k=1}^{n} P[{σ(1), . . . , σ(k)} ∈ F]
         = Σ_{k=1}^{n} N_k / C(n, k)
         ≥ Σ_{k=1}^{n} N_k / C(n, ⌊n/2⌋).

Since E[X] ≤ 1, this shows that |F| = Σ_k N_k is bounded by C(n, ⌊n/2⌋).
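For small n the statement can be confirmed by exhaustive search over families of subsets (an illustrative check for n = 4, with subsets encoded as bitmasks):

```python
from itertools import combinations
from math import comb

n = 4
subsets = range(1 << n)        # bitmask encoding of all subsets of {1, ..., n}

def is_antichain(F):
    # no member contains another: a is a subset of b iff a & b == a
    return all(a & b != a and a & b != b for a, b in combinations(F, 2))

m = comb(n, n // 2)            # = 6 for n = 4
middle = [s for s in subsets if bin(s).count("1") == n // 2]
assert len(middle) == m and is_antichain(middle)   # the middle layer is optimal
# exhaustively: no antichain of size m + 1 exists
assert not any(is_antichain(F) for F in combinations(subsets, m + 1))
```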
2. (Corollary: Littlewood–Offord Lemma) Let x_1, . . . , x_n be nonzero real numbers, not necessarily distinct. Suppose that c_1, . . . , c_n are independent random variables, each of which is ±1 with equal probability. Prove that P[c_1 x_1 + · · · + c_n x_n = 0] = O(1/√n).
Solution: First, we may assume all x_i > 0 by flipping the sign of any negative one; since the c_i are ±1 with equal probability, this makes no difference.

Consider any combination of signs (c_i) which makes Σ c_i x_i = 0, and identify it with a subset of [n] based on which indices have c_i = +1. Let F be the collection of all such subsets of [n]. This is an antichain, because if we had some S strictly contained in a larger T, then the sum Σ c_i x_i corresponding to T would be strictly larger, since all x_i > 0.

By Sperner's Lemma, we conclude that the number of sign vectors (c_i) making a zero sum is at most C(n, ⌊n/2⌋), and so the desired probability is at most C(n, ⌊n/2⌋)/2^n, which is of order 1/√n.
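The bound can be confirmed by enumerating all 2^n sign vectors for a small sample (illustrative, with sample values chosen so that zero sums actually occur):

```python
from fractions import Fraction
from itertools import product
from math import comb

xs = [1, 1, 2, 3, 3, 4]        # sample nonzero values, repeats allowed
n = len(xs)
zeros = sum(1 for signs in product([-1, 1], repeat=n)
            if sum(c * x for c, x in zip(signs, xs)) == 0)
assert zeros > 0                                         # e.g. +3+4-1-1-2-3 = 0
assert Fraction(zeros, 2**n) <= Fraction(comb(n, n // 2), 2**n)   # Sperner bound
```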
3. (Erdős–Ko–Rado Theorem) Let n ≥ 2k be positive integers, and let C be a collection of pairwise intersecting k-element subsets of {1, . . . , n}, i.e., every A, B ∈ C has A ∩ B ≠ ∅. Prove that |C| ≤ C(n − 1, k − 1).
Remark. This bound corresponds to the construction which takes all k-element subsets that contain the element 1, say.
Solution: Pick a random k-set A from 2^[n] by first selecting a uniformly random permutation σ ∈ S_n and then picking a uniformly random index i ∈ [n]; define A = {σ(i), . . . , σ(i + k − 1)}, with indices after n wrapping around, of course. Since A is uniform over all k-sets, it suffices to show that P[A ∈ C] ≤ k/n, because then |C| = P[A ∈ C] · C(n, k) ≤ (k/n) · C(n, k) = C(n − 1, k − 1).

Let us show that conditioned on any fixed σ, we have P[A ∈ C | σ] ≤ k/n, which will finish our problem. But this is equivalent to the statement that C can contain at most k of the n wrapping intervals {σ(i), . . . , σ(i + k − 1)}, which is easy to show using n ≥ 2k.
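For small parameters the theorem can be confirmed exhaustively (an illustrative check with n = 5, k = 2):

```python
from itertools import combinations
from math import comb

n, k = 5, 2
ksets = [frozenset(c) for c in combinations(range(1, n + 1), k)]

def intersecting(F):
    # pairwise intersecting: every two members share an element
    return all(a & b for a, b in combinations(F, 2))

bound = comb(n - 1, k - 1)                    # = 4
star = [s for s in ksets if 1 in s]           # all k-sets containing element 1
assert len(star) == bound and intersecting(star)
# exhaustively: no intersecting family of size bound + 1 exists
assert not any(intersecting(F) for F in combinations(ksets, bound + 1))
```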
4. (Chip–Liar game, Alon–Spencer Theorem 14.2.1) Paul and Carole play a game on a board with positions labeled {0, 1, . . . , k}. Initially, n stones are at position k. Paul and Carole play for r rounds, where each round has the following structure: Paul names a subset S of the stones on the board, and then Carole either moves all stones in S one position to the left, or moves all stones in S^c one position to the left. Any stone that is moved leftwards from position 0 is discarded. If the number of stones on the board becomes 1 or 0, Carole loses. Prove that if n · Σ_{i=0}^{k} C(r, i) · 2^{−r} > 1, then Carole has a winning strategy.
Solution: This is a perfect-information game, so one of the players has a winning strategy. Suppose it is Paul, and let him play it; Carole responds by playing uniformly at random. If Paul indeed has a winning strategy, then it must in particular beat this random strategy. Calculate the expected number of stones left on the board after r rounds: by linearity, this is n times the probability that a given chip survives for r rounds. Under random play, each chip moves left Bin(r, 1/2) many times, so the expected number of surviving chips is

    n · P[Bin(r, 1/2) ≤ k],

which we are given to be greater than 1. Hence some play sequence leaves Carole with at least 2 stones, contradicting the existence of Paul's winning strategy.
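The winning criterion n · P[Bin(r, 1/2) ≤ k] > 1 is easy to evaluate exactly; an illustrative computation with sample values r = 10, k = 3:

```python
from fractions import Fraction
from math import comb

def survival_prob(r, k):
    # P[Bin(r, 1/2) <= k]: a chip starting at position k is moved left
    # at most k times during r rounds of uniformly random play
    return Fraction(sum(comb(r, i) for i in range(k + 1)), 2**r)

r, k = 10, 3
p = survival_prob(r, k)
assert p == Fraction(11, 64)             # (1 + 10 + 45 + 120) / 1024
# smallest n meeting the criterion n * p > 1 from the problem statement
n_min = next(n for n in range(1, 1000) if n * p > 1)
assert n_min == 6
```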
5. (Bollobás, 1965) Let A_1, . . . , A_n and B_1, . . . , B_n be distinct subsets of ℕ such that every |A_i| = r and every |B_i| = s, each A_i ∩ B_i = ∅, and A_i ∩ B_j ≠ ∅ whenever i ≠ j. Prove that n ≤ (r + s)^{r+s} / (r^r s^s).
Solution: We can argue directly as follows. Let X := ∪_{i=1}^{n} (A_i ∪ B_i) be the base set. Define p := r/(r + s), and consider a coin which has one side saying A and one side saying B, with the A-side appearing with probability p. For each element of X, independently flip the coin. This defines a mapping f : X → {A, B}. Define the family of events {E_i}_{i=1}^{n} by having E_i occur when all elements x ∈ A_i have f(x) = A, and all elements y ∈ B_i have f(y) = B.

By definition of the family of sets, it is impossible for E_i and E_j to occur simultaneously if i ≠ j, because in particular there would exist some element in either A_i ∩ B_j or A_j ∩ B_i, and it could be labeled neither A nor B. So, just as in the previous problem, consider the probability of any E_i occurring. Trivially, P(E_i) = p^r (1 − p)^s for every i. Since the events are disjoint, the total probability is n p^r (1 − p)^s. Yet all probabilities are bounded by 1, so n ≤ p^{−r} (1 − p)^{−s} = (r + s)^{r+s} / (r^r s^s), which is exactly the desired bound. (In fact, this choice of p is optimal.)
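The closing algebra, including the optimality of p = r/(r + s), can be checked exactly with sample values of r and s (illustrative):

```python
from fractions import Fraction

r, s = 3, 2
p = Fraction(r, r + s)
bound = p**-r * (1 - p)**-s
assert bound == Fraction((r + s)**(r + s), r**r * s**s)   # = 3125/108 here
# p = r/(r+s) minimizes p^(-r) * (1-p)^(-s); a few other choices do worse
for q in (Fraction(1, 2), Fraction(2, 5), Fraction(7, 10)):
    assert q**-r * (1 - q)**-s >= bound
```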
References
[1] N. Alon and J. Spencer, The Probabilistic Method, 2nd ed., Wiley, New York, 2000.