Approximating NP-hard Problems: Efficient Algorithms and Their Limits

This document summarizes approximation algorithms and their limits for NP-hard problems. It discusses: 1) Approximation algorithms, which aim to find solutions within some factor of the optimal (e.g., half as good as optimal); many early algorithms relied on linear programming, and in 1994 Goemans and Williamson introduced a semidefinite programming based algorithm for Max Cut. 2) Constraint satisfaction problems such as Max 3-SAT; in 2002, Khot introduced the Unique Games Conjecture, which asserts that certain constraint satisfaction problems are hard to approximate within some factor unless P = NP. 3) Assuming the Unique Games Conjecture, Raghavendra showed that for every constraint satisfaction problem, the simplest semidefinite programming relaxation gives the best approximation computable efficiently.


Approximating NP-hard Problems

Efficient Algorithms and their Limits

Prasad Raghavendra
University of Washington
Seattle
Combinatorial Optimization Problems

[Figure: a collection of NP-hard problems — Set Cover, Max 3-SAT, Steiner Tree,
Vertex Cover, MultiCut, Max Cut, Label Cover, Max Di-Cut, Sparsest Cut,
Multiway Cut, Max 2-SAT, Metric TSP, Max 4-SAT]

Max 3-SAT
(x1 ∨ ¬x2 ∨ x3)(x2 ∨ ¬x3 ∨ x5)(¬x2 ∨ x3 ∨ x5)(¬x5 ∨ x4 ∨ x1)
Find an assignment that satisfies the maximum number of clauses.
Approximation Algorithms

Can we find a solution that is, say, half as good as the optimum?

An algorithm A is an α-approximation for a problem if for every instance I,
A(I) ≥ α ∙ OPT(I)

--Vast Literature--
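As a concrete illustration of the question above (not from the slides): for Max Cut, a uniformly random cut is already a ½-approximation in expectation, since every edge is cut with probability ½. A minimal sketch, with a hypothetical edge-list representation:

```python
import random

def random_cut_value(edges, weights, num_vertices):
    """Assign each vertex to a side uniformly at random and return the
    total weight of edges crossing the cut. In expectation this is half
    of the total edge weight, hence at least half of the optimum cut."""
    side = [random.choice([0, 1]) for _ in range(num_vertices)]
    return sum(w for (u, v), w in zip(edges, weights) if side[u] != side[v])

# Toy example (a triangle with unit weights): the optimum cuts 2 edges.
edges = [(0, 1), (1, 2), (2, 0)]
weights = [1.0, 1.0, 1.0]
print(random_cut_value(edges, weights, num_vertices=3))
```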
The Tools

Until 1994,
a majority of approximation algorithms directly or
indirectly relied on Linear Programming.

In 1994,
a Semidefinite Programming based algorithm for Max Cut
[Goemans-Williamson]

Semidefinite Programming is a generalization of Linear Programming.

Semidefinite Programming is one of the most powerful
tools in approximation algorithms.
Constraint Satisfaction Problems

Max 3-SAT
(x1 ∨ ¬x2 ∨ x3)(x2 ∨ ¬x3 ∨ x5)(¬x2 ∨ x3 ∨ x5)(¬x5 ∨ x4 ∨ x1)
Find an assignment that satisfies the maximum number of clauses.

Variables: {x1, x2, x3, x4, x5}
Finite Domain: {0,1}
Constraints: clauses

The kinds of constraints permitted define different CSPs.
Approximability of CSPs

Gap for Max Cut: best algorithm = 0.878, hardness = 0.941

[Figure: CSPs arranged on a 0-to-1 approximability scale, with an NP-hard
region — Unique Games, MAX k-CSP, MAX 3-CSP, MAX 3-AND, MAX 3-MAJ,
MAX E2 LIN3, MAX 3 DI-CUT, MAX 4-SAT, MAX DI-CUT, MAX 3-SAT, MAX CUT,
MAX 2-SAT, MAX Horn SAT, MAX k-CUT]

ALGORITHMS: [Charikar-Makarychev-Makarychev 06], [Goemans-Williamson],
[Charikar-Wirth], [Lewin-Livnat-Zwick], [Charikar-Makarychev-Makarychev 07],
[Hast], [Frieze-Jerrum], [Karloff-Zwick], [Zwick SODA 98], [Zwick STOC 98],
[Zwick 99], [Halperin-Zwick 01], [Goemans-Williamson 01], [Goemans 01],
[Feige-Goemans], [Matuura-Matsui], [Trevisan-Sudan-Sorkin-Williamson]
Unique Games

Towards bridging this gap, in 2002, Subhash Khot introduced the
Unique Games Conjecture. [Khot 02] [KKMO]

Given linear equations of the form Xi – Xk = cik (mod p):
  x – y = 11 (mod 17)
  x – z = 13 (mod 17)
  ….
  z – w = 15 (mod 17)
Satisfy the maximum number of equations.

Unique Games Conjecture
For every ε > 0, for large enough p:
given a 1-ε (99%) satisfiable system, it is NP-hard to satisfy
an ε (1%) fraction of the equations.
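A minimal sketch (illustrative variable names and values, not from the slides) of such a system of two-variable linear equations mod p, and of checking what fraction of equations a given labelling satisfies:

```python
# A toy Unique Games instance: equations x_i - x_k = c_ik (mod p).
p = 17
equations = [            # (i, k, c) encodes  x_i - x_k = c (mod p)
    ("x", "y", 11),
    ("x", "z", 13),
    ("z", "w", 15),
]

def satisfied_fraction(labelling, equations, p):
    """Fraction of equations satisfied by a labelling with values in {0, ..., p-1}."""
    ok = sum((labelling[i] - labelling[k]) % p == c for i, k, c in equations)
    return ok / len(equations)

# This labelling happens to satisfy all three equations.
print(satisfied_fraction({"x": 12, "y": 1, "z": 16, "w": 1}, equations, p))
```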
Unique Games Conjecture
A notorious open problem.

Algorithm                                     On (1-ε) satisfiable instances
[Khot 02]                                     1 − O(p² ε^(1/5) log(1/ε))
[Trevisan]                                    1 − O((ε log n)^(1/3))
[Gupta-Talwar]                                1 − O(ε log n)
[Charikar-Makarychev-Makarychev]              p^(−ε/(2−ε))
[Chlamtac-Makarychev-Makarychev]              1 − O(ε (log n log p)^(1/2))
[Arora-Khot-Kolla-Steurer-Tulsiani-Vishnoi]   1 − O(ε (log(1/ε))^(1/2))

Hardness Results:
No constant factor approximation for unique games. [Feige-Reichman]
Assuming UGC

UGC Hardness Results: [Khot-Kindler-Mossel-O'Donnell], [Austrin 06],
[Austrin 07], [Khot-O'Donnell], [O'Donnell-Wu], [Samorodnitsky-Trevisan]

For Max Cut and Max 2-SAT:
Unique Games based hardness = approximation obtained by semidefinite programming!

[Figure: the 0-to-1 approximability scale again — Unique Games, MAX k-CSP,
MAX 3-CSP, MAX 3-AND, MAX 3-MAJ, MAX E2 LIN3, MAX 3 DI-CUT, MAX 4-SAT,
MAX DI-CUT, MAX 3-SAT, MAX CUT, MAX 2-SAT, MAX Horn SAT, MAX k-CUT — with
UGC-hard and NP-hard regions marked]
The Connection

How general a CSP? A generic CSP instance can specify, for example,
10% 3-SAT clauses, 70% cut constraints, and 20% 2-SAT constraints.

How simple an SDP?

Theorem [Raghavendra 08]:
Assuming the Unique Games Conjecture, for every CSP,
"the simplest semidefinite programs give the best
approximation computable efficiently."

Theorem [Raghavendra 08]:
A generic algorithm that is optimal for every CSP under UGC
(at least as good as all known algorithms);
it takes near-linear time in the size of the CSP
(techniques from [Arora-Kale]).

[Figure: a GENERIC ALGORITHM covering the whole 0-to-1 scale of CSPs, from
Unique Games and MAX k-CSP down to MAX CUT, MAX 2-SAT, MAX Horn SAT, MAX k-CUT]
3-Way Cut

3-Way Cut: "Separate the 3 terminals while cutting the
minimum number of edges."
A generalization of the classic s-t cut problem.

[Figure: a weighted graph with terminals A, B, C and edge weights
10, 15, 7, 3, 1, 1, with one 3-way cut highlighted]

[Karger-Klein-Stein-Thorup-Young]
A 12/11 factor approximation algorithm for 3-Way Cut.
Graph Labelling Problems

Generalizations of 3-Way Cut:
• k-Way Cut
• 0-Extension
• the class of Metric Labelling Problems

ALGORITHMS: [Calinescu-Karloff-Rabani 98], [Chekuri-Khanna-Naor-Zosin],
[Calinescu-Karloff-Rabani 01], [Gupta-Tardos], [Karger-Klein-Stein-Thorup-Young],
[Karzanov 98], [Karzanov 99], [Kleinberg-Tardos]

Theorem [Manokaran-Naor-Raghavendra-Schwartz]:
Assuming the Unique Games Conjecture,
the "earthmover linear program" gives the best
approximation for every graph labelling problem.
Ranking Teams?

Rank the teams so that the results of the maximum number
of games agree with the ranking.

Maximum Acyclic Subgraph
"Given a directed graph, order the vertices to
maximize the number of forward edges."

• Best known approximation algorithm:
  "Output a random ordering!"
Result

Theorem [Guruswami-Manokaran-Raghavendra]:
Assuming the Unique Games Conjecture, for Maximum Acyclic Subgraph,
no efficient algorithm outputs an ordering better than a random one.

More generally,

Theorem [Guruswami-Manokaran-Raghavendra]:
Assuming the Unique Games Conjecture, for every Ordering CSP,
a simple SDP relaxation gives the best approximation.
The UG Barrier

Problems at the barrier (UGC-hard):
• Constraint Satisfaction Problems
• Graph Labelling Problems
• Ordering CSPs
• Kernel Clustering Problems
• Grothendieck Problem

If UGC is true,
then the simplest SDPs give the best approximation possible.

If UGC is false,
hopefully a new algorithmic technique will arise.
Even if UGC is false

• Generic approximation algorithm for CSPs:
  at least as good as all known algorithms for CSPs.

• SDP lower bounds:
  for problems like Maximum Acyclic Subgraph and Multiway Cut.

• Computing approximation ratios:
  an algorithm to compute the value of the approximation ratio
  obtained by a certain SDP.
An Interesting Aside

Grothendieck's Inequality (1953)
There exists a constant KG such that, for all matrices (aij),

  max over unit vectors ui, vj of  Σij aij ⟨ui, vj⟩
    ≤ KG ∙ max over signs εi, δj ∈ {-1,1} of  Σij aij εi δj

1.67 < KG < 1.78 [Krivine]

In computer science terminology:
Grothendieck constant = the approximation ratio given by the
semidefinite relaxation for the Bipartite Quadratic Programming Problem.

An algorithm to compute the Grothendieck constant [Raghavendra-Steurer 09]
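As a small illustration of the bipartite quadratic programming problem mentioned above (not part of the slides): the integral quantity is the maximum of Σij aij εi δj over signs, and the Grothendieck constant bounds how much the vector (SDP) relaxation can exceed it. A brute-force sketch of the integral value for a tiny matrix:

```python
import itertools
import numpy as np

def bipartite_qp_value(A):
    """Brute-force max over sign vectors eps in {-1,1}^m, delta in {-1,1}^n
    of sum_ij A[i, j] * eps[i] * delta[j] (feasible only for tiny matrices)."""
    m, n = A.shape
    best = -np.inf
    for eps in itertools.product([-1, 1], repeat=m):
        # For a fixed eps, the optimal delta is the sign of A^T eps,
        # so the value is the sum of absolute row sums.
        row = np.asarray(eps) @ A
        best = max(best, np.sum(np.abs(row)))
    return best

A = np.array([[1.0, -2.0], [3.0, 0.5]])
print(bipartite_qp_value(A))
```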
SEMIDEFINITE PROGRAMMING

Max Cut
Input: a weighted graph G
Find: a cut with maximum number/weight of crossing edges

[Figure: a weighted graph with edge weights 10, 15, 7, 3, 1, 1; the objective
is the fraction (weight) of crossing edges]
Max Cut SDP

Quadratic Program                            Semidefinite Program
Variables: x1, x2, …, xn                     Variables: v1, v2, …, vn
xi = 1 or -1                                 |vi|² = 1

Maximize ¼ Σ_{(i,j)∈E} w_ij (x_i − x_j)²     Maximize ¼ Σ_{(i,j)∈E} w_ij |v_i − v_j|²

Relax all the xi to be unit vectors instead of {1,-1}.
All products are replaced by inner products of vectors.
MaxCut Rounding

Cut the sphere by a random hyperplane,
and output the induced graph cut.

A 0.878 approximation for the problem.
[Goemans-Williamson]

[Figure: unit vectors v1, …, v5 on the sphere, split by a random hyperplane]
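A compact sketch of the two steps just described, assuming the cvxpy package (with an SDP-capable solver such as SCS) and numpy are available; the graph data and variable names are illustrative, and the SDP is the basic relaxation from the previous slide:

```python
import numpy as np
import cvxpy as cp

# Toy weighted graph as a symmetric weight matrix W (W[i, j] = w_ij).
W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
n = W.shape[0]

# Max Cut SDP: maximize 1/4 * sum over edges of w_ij * |v_i - v_j|^2 over unit
# vectors, written in terms of the Gram matrix X with X_ij = <v_i, v_j>.
X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0, cp.diag(X) == 1]
objective = cp.Maximize(0.25 * cp.sum(cp.multiply(W, 1 - X)))
prob = cp.Problem(objective, constraints)
prob.solve()

# Recover vectors v_i from X (rows of V), clipping tiny negative eigenvalues.
eigvals, eigvecs = np.linalg.eigh(X.value)
V = eigvecs @ np.diag(np.sqrt(np.clip(eigvals, 0, None)))

# Hyperplane rounding: cut the sphere by a random hyperplane with normal g.
g = np.random.randn(n)
signs = np.where(V @ g >= 0, 1.0, -1.0)
cut_value = 0.25 * np.sum(W * (1 - np.outer(signs, signs)))
print("SDP value:", prob.value, " rounded cut value:", cut_value)
```

In expectation the rounded cut is at least 0.878 times the SDP value, which is the Goemans-Williamson guarantee quoted on the slide.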
The Simplest Relaxation for MaxCut

Max Cut SDP:
Embed the graph on the N-dimensional unit ball,
maximizing ¼ (average squared length of the edges).

In the integral solution, all the vi are 1 or -1. Thus they satisfy
additional constraints,
for example: (vi – vj)² + (vj – vk)² ≥ (vi – vk)²
Assuming UGC, no additional constraint helps!

Building on the work of [Khot-Vishnoi],
[Raghavendra-Steurer 09]:
adding all valid constraints on at most 2^O((log log n)^(1/4)) variables
to the simple SDP does not disprove the Unique Games Conjecture.

Possibility:
Adding a simple constraint on every 5 variables
yields a better approximation for MaxCut,
breaches the UG barrier, and disproves the Unique Games Conjecture!

[Figure: the problem classes at the barrier — Constraint Satisfaction Problems,
Metric Labelling Problems, Ordering Constraint Satisfaction Problems,
Kernel Clustering Problems, Grothendieck Problem]
So far:
• the Unique Games barrier
• the Semidefinite Programming technique (MaxCut example)

Coming up:
• a generic algorithm for CSPs
• a hardness result for MaxCut
Generic Algorithm for CSPs

Semidefinite Program for CSPs
(x1 ∨ ¬x2 ∨ x3)(x2 ∨ ¬x3 ∨ x5)(¬x2 ∨ x3 ∨ x5)(¬x5 ∨ x4 ∨ x1)

Variables:
For each variable xa: vectors {V(a,0), V(a,1)}
  (xa = 1 ⇒ V(a,0) = 0, V(a,1) = 1;   xa = 0 ⇒ V(a,0) = 1, V(a,1) = 0)
For each clause P = (xa ∨ xb ∨ xc): scalar variables
  μ(P,000), μ(P,001), μ(P,010), μ(P,100), μ(P,011), μ(P,110), μ(P,101), μ(P,111)
  (e.g., if xa = 0, xb = 1, xc = 1, then μ(P,011) = 1 and every other μ(P,∙) = 0)

Constraints:
For each clause P:  0 ≤ μ(P,α) ≤ 1  and  Σ_α μ(P,α) = 1
For each pair xa, xb in P, consistency between the vector and LP variables:
  V(a,0) ∙ V(b,0) = μ(P,000) + μ(P,001)
  V(a,0) ∙ V(b,1) = μ(P,010) + μ(P,011)
  V(a,1) ∙ V(b,0) = μ(P,100) + μ(P,101)

Objective Function:
Maximize  Σ_{clauses P}  Σ_{assignments α ∈ {0,1}³}  P(α) μ(P,α)
Semidefinite Relaxation for a CSP Instance ℑ

SDP solution for ℑ:
for every constraint φ in ℑ,
 - a local distribution μφ over assignments to the variables of φ;
for every variable xi in ℑ,
 - vectors vi,1, …, vi,q.

Example of a local distribution for φ = 3XOR(x3, x4, x7):
  x3 x4 x7 | μφ
   0  0  0 | 0.1
   0  0  1 | 0.01
   0  1  0 | 0
   …
   1  1  1 | 0.6

Constraints: the first and second moments of the local distributions are
consistent with the vectors (the inner products match the second moments,
and likewise for the first moments) and form a PSD matrix.

SDP objective:
maximize  Σ_{constraints φ}  Σ_α  φ(α) μφ(α)
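A small numeric sketch (illustrative values only, not from the slides) of what the consistency constraints mean: the first and second moments of a local distribution μφ over assignments must agree with the corresponding inner products of the SDP vectors.

```python
# Hypothetical local distribution mu_phi over assignments to (x3, x4, x7),
# written as {assignment: probability}; values are illustrative.
mu = {(0, 0, 0): 0.1, (0, 0, 1): 0.01, (0, 1, 0): 0.0, (1, 1, 1): 0.6,
      (1, 0, 0): 0.09, (1, 0, 1): 0.1, (0, 1, 1): 0.05, (1, 1, 0): 0.05}
assert abs(sum(mu.values()) - 1.0) < 1e-9

def first_moment(mu, i):
    """Pr[x_i = 1] under the local distribution."""
    return sum(p for a, p in mu.items() if a[i] == 1)

def second_moment(mu, i, j):
    """Pr[x_i = 1 and x_j = 1] under the local distribution."""
    return sum(p for a, p in mu.items() if a[i] == 1 and a[j] == 1)

# Consistency demands, e.g., <v_{3,1}, v_{4,1}> equals Pr[x3 = 1, x4 = 1]
# and |v_{3,1}|^2 equals Pr[x3 = 1], for the vectors attached to x3 and x4.
print(first_moment(mu, 0), second_moment(mu, 0, 1))
```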
Rounding Scheme
[Raghavendra-Steurer]

STEP 1: Dimension Reduction
• Project the SDP solution along, say, 100 random directions:
  map each vector V → V' = (V∙G1, V∙G2, …, V∙G100).

STEP 2: Discretization
• Pick an ε-net for the 100-dimensional sphere.
• Move every vertex to the nearest point in the ε-net
  (constant number of dimensions; a finite model: a graph on the ε-net points).

STEP 3: Brute Force
• Find a solution to the new instance by brute force.
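A minimal sketch of Step 1 (dimension reduction), assuming the SDP vectors are given as rows of a numpy array; the 1/√k scaling is an added assumption so that inner products are approximately preserved in expectation:

```python
import numpy as np

def project_sdp_vectors(V, num_directions=100, seed=0):
    """Map each SDP vector V[i] to (V[i].G1, ..., V[i].Gk) for k random
    Gaussian directions G_j, as in Step 1 of the rounding scheme.
    The 1/sqrt(k) scaling keeps inner products approximately preserved."""
    rng = np.random.default_rng(seed)
    n, d = V.shape
    G = rng.standard_normal((d, num_directions))
    return V @ G / np.sqrt(num_directions)

# Toy usage with 5 random unit vectors in 1000 dimensions.
V = np.random.randn(5, 1000)
V /= np.linalg.norm(V, axis=1, keepdims=True)
print(project_sdp_vectors(V).shape)   # (5, 100)
```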
HARDNESS RESULT FOR MAXCUT
The Goal

Theorem [Raghavendra 08]:
Assuming the Unique Games Conjecture, for MaxCut,
"the simple semidefinite program gives the best
approximation computable efficiently."
UG Hardness

HARD INSTANCE: Suppose for an instance G,
  the SDP value = C,
  the actual MaxCut value = S.

Then, assuming UGC, on instances with MaxCut = C,
it is NP-hard to find a cut better than S.
Dimension Reduction

Max Cut SDP:
Embed the graph on the N-dimensional unit ball,
maximizing ¼ (average squared length of the edges).

Project to a random 1/ε²-dimensional (constant-dimensional) space.

New SDP Value = Old SDP Value ± ε
Making the Instance Harder

SDP Value = average squared length of an edge.

Transformations:
• Rotation does not change the SDP value.
• The union of two rotations has the same SDP value.

Sphere Graph H: the union of all possible rotations of G.

SDP Value (Graph G) = SDP Value (Sphere Graph H)
Making the Instance Harder

MaxCut (H) = S
MaxCut (G) ≥ S:
pick a random rotation of G and read off the cut induced on it.

Thus,
MaxCut (H) ≤ MaxCut (G)
SDP Value (G) = SDP Value (H)
Hypercube Graph

100-dimensional hypercube: {-1,1}^100

For each edge e of the SDP solution, connect every pair of vertices
in the hypercube separated by the length of e.

Generating edges of expected squared length d:
1) Start with a random x ∈ {-1,1}^100.
2) Generate y by flipping each bit of x with probability d/4.
Output (x, y).
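A minimal sketch of the edge-generation step just described; the normalization by √100 (so hypercube points lie on the unit sphere) is an added assumption under which the expected squared length works out to d:

```python
import numpy as np

def sample_hypercube_edge(d, n=100, rng=np.random.default_rng()):
    """Sample one edge (x, y) of the hypercube graph with expected squared
    length d: start from a random x in {-1,1}^n and flip each bit
    independently with probability d/4.
    Points are scaled by 1/sqrt(n) so they sit on the unit sphere."""
    x = rng.choice([-1.0, 1.0], size=n)
    flips = rng.random(n) < d / 4.0
    y = np.where(flips, -x, x)
    return x / np.sqrt(n), y / np.sqrt(n)

x, y = sample_hypercube_edge(d=0.4)
print(np.sum((x - y) ** 2))   # roughly 0.4 on average over many samples
```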
Dichotomy of Cuts

A cut gives a function F on the hypercube:
F : {-1,1}^100 -> {-1,1}

Dictator Cuts: F(x) = xi

Cuts Far From Dictators:
(the influence of each coordinate on the function F is small)
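A small sketch of what "influence of a coordinate" means for a cut function F on the hypercube (the standard definition, not spelled out on the slide): the probability that flipping coordinate i changes F. A dictator F(x) = xi has influence 1 on coordinate i, while the majority function has influence only about 1/√n on every coordinate.

```python
import numpy as np

def influence(F, i, n, samples=20000, rng=np.random.default_rng(0)):
    """Estimate Inf_i(F) = Pr_x[F(x) != F(x with bit i flipped)]
    for F : {-1,1}^n -> {-1,1}, by sampling."""
    X = rng.choice([-1, 1], size=(samples, n))
    Xflip = X.copy()
    Xflip[:, i] = -Xflip[:, i]
    return np.mean(F(X) != F(Xflip))

n = 15
dictator = lambda X: X[:, 0]                  # F(x) = x_1
majority = lambda X: np.sign(X.sum(axis=1))   # far from every dictator (n odd)
print(influence(dictator, 0, n), influence(majority, 0, n))
```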
Dictator Cuts

For each edge e = (u,v) of G, connect every pair of vertices
in the 100-dimensional hypercube separated by the length of e.

Pick an edge e = (u,v) and consider all hypercube edges (X,Y) corresponding to e:
  fraction of such edges cut by a dictator
    = fraction of bits in which X and Y differ
    = |u − v|²/4.
So the fraction of edges cut by a dictator = ¼ (average squared distance).

Value of Dictator Cuts = SDP Value (G)
Cuts far from Dictators

Intuition:
Sphere graph: uniform over all directions.
Hypercube graph: the axes are special directions.

If a cut does not respect the axes, then it should not distinguish
between the Sphere and Hypercube graphs.
The Invariance Principle

Central Limit Theorem
"The sum of a large number of {-1,1} random variables
has a similar distribution to
the sum of a large number of Gaussian random variables."

Invariance Principle for Low Degree Polynomials
[Rotar] [Mossel-O'Donnell-Oleszkiewicz] [Mossel 2008]

"If a low degree polynomial F has no influential coordinate,
then F({-1,1}^n) and F(Gaussian) have similar distributions."
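A tiny sampling demo (illustrative, not part of the slides) of the central-limit-theorem special case stated above: the normalized sum, a degree-1 polynomial with no influential coordinate, has nearly the same distribution under ±1 inputs and Gaussian inputs.

```python
import numpy as np

rng = np.random.default_rng(1)
n, samples = 200, 50000

# Degree-1 polynomial F(x) = (x_1 + ... + x_n) / sqrt(n): no influential coordinate.
boolean_inputs = rng.choice([-1.0, 1.0], size=(samples, n))
gaussian_inputs = rng.standard_normal((samples, n))

F_bool = boolean_inputs.sum(axis=1) / np.sqrt(n)
F_gauss = gaussian_inputs.sum(axis=1) / np.sqrt(n)

# The two distributions are close; compare a few quantiles.
qs = [0.1, 0.25, 0.5, 0.75, 0.9]
print(np.quantile(F_bool, qs))
print(np.quantile(F_gauss, qs))
```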
Hypercube vs Sphere

F : {-1,1}^100 -> {-1,1} is a cut far from every dictator.
P : sphere -> nearly {-1,1} is the multilinear extension of F.

By the Invariance Principle,
the MaxCut value of F on the hypercube ≈ the MaxCut value of P
on the sphere graph H.
Hypercube Graph as a Dictatorship Test
[Bellare-Goldreich-Sudan]

Completeness:
Value of Dictator Cuts = SDP Value (G)

Soundness:
Value of cuts far from dictators
  ≤ MaxCut (Sphere Graph H)
  ≤ MaxCut (G)
UG Hardness

[KKMO]: A dictatorship test with completeness C and soundness S gives:
"On instances with value C, it is NP-hard to output a solution
of value S, assuming UGC."

In our case,
Completeness = SDP Value (G)
Soundness = MaxCut (G)

Can't get a better approximation than the SDP, assuming UGC!
FUTURE WORK

Understanding Unique Games

"Unique Games Conjecture is false → new algorithms?"
[Reverse reduction from MaxCut/CSPs to Unique Games]

"Stronger SDP relaxations → better approximations?"
equivalently,
"Can stronger SDP relaxations disprove the UGC?"

Unique Games and the expansion of small sets in graphs?
Beyond CSPs

Semidefinite Programming or UG hardness results
for problems beyond CSPs.
Examples: 1) Metric Travelling Salesman Problem, 2) Minimum Steiner Tree.

Beyond Approximability

Dichotomy Conjecture: "Every CSP is polynomial time solvable or NP-hard."
[Kun-Szegedy] Techniques from approximation could be useful here.
1) When do local propagation algorithms work?
2) When do SDPs work?
Thank You
Dictatorship Test

Given a function F : {-1,1}^R -> {-1,1}:
• Toss random coins
• Make a few queries to F
• Output either ACCEPT or REJECT

If F is a dictator function, F(x1, …, xR) = xi:
  Pr[ACCEPT] = Completeness
If F is far from every dictator function (no influential coordinate):
  Pr[ACCEPT] = Soundness
A Dictatorship Test for MaxCut

A dictatorship test is a graph G on the hypercube {-1,1}^100.
A cut gives a function F on the hypercube.

Completeness: the value of dictator cuts, F(x) = xi.
Soundness: the maximum value attained by a cut far from a dictator.
Connections

SDP Gap Instance (SDP = 0.9, OPT = 0.7)
  → Dictatorship Test (Completeness = 0.9, Soundness = 0.7)
      [Khot-Vishnoi] [This Work] — for Sparsest Cut, Max Cut
  → UG Hardness (0.9 vs 0.7)
      [Khot-Kindler-Mossel-O'Donnell]

All these conversions hold for very general classes of problems.
General Boolean 2-CSPs
In the integral solution, vi = 1 or -1 and v0 = 1.
Objective: total payoff. Constraints include the triangle inequality.

2-CSPs over {0, …, q-1}
Objective: total payoff.

Arbitrary k-ary GCSP
• The SDP is similar to the one used by the [Karloff-Zwick] Max 3-SAT algorithm.
• It is weaker than k rounds of the Lasserre / LS+ hierarchies.
Key Lemma

DICT_G: any CSP instance G gives a dictatorship test on functions
F : {-1,1}^n -> {-1,1}, where
  1) the tests of the verifier are the same as the constraints in the instance G;
  2) Completeness = SDP(G).

Round_F: any function F : {-1,1}^n -> {-1,1} gives a rounding scheme
on CSP instances G.

If F is far from a dictator,
  Round_F (G) ≈ DICT_G (F)
Key Lemma: Through An Example

SDP (for the triangle on vertices 1, 2, 3):
Variables: v1, v2, v3 with |v1|² = |v2|² = |v3|² = 1
Maximize  (1/3) [ ¼ |v1 − v2|² + ¼ |v2 − v3|² + ¼ |v3 − v1|² ]
c = SDP Value;  v1, v2, v3 = SDP Vectors

Local Random Variables

Fix an edge e = (1,2).
There exist random variables a1, a2 taking values in {-1,1} such that:
  E[a1 a2] = v1 ∙ v2
  E[a1²] = |v1|²,  E[a2²] = |v2|²

For every edge, there is a local distribution over integral solutions
such that all the moments of order at most 2 match the inner products.

A12, A23, A31 = Local Distributions
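A concrete sketch (a standard construction, not from the slides) of such a pair of {-1,1} random variables for unit vectors v1, v2: a correlated two-point distribution with E[a1 a2] = v1 ∙ v2 and E[a1²] = E[a2²] = 1.

```python
import numpy as np

def sample_local_pair(rho, size, rng=np.random.default_rng(2)):
    """Sample (a1, a2) in {-1,1}^2 with E[a1^2] = E[a2^2] = 1 and
    E[a1*a2] = rho, where rho = <v1, v2> for unit vectors v1, v2.
    Equal signs are drawn with probability (1 + rho)/2."""
    a1 = rng.choice([-1.0, 1.0], size=size)
    equal = rng.random(size) < (1 + rho) / 2
    a2 = np.where(equal, a1, -a1)
    return a1, a2

a1, a2 = sample_local_pair(rho=-0.5, size=200000)
print(np.mean(a1 * a2), np.mean(a1 ** 2), np.mean(a2 ** 2))  # ~ -0.5, 1, 1
```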
Analysis

Max Cut instance: the triangle on vertices 1, 2, 3.
Input function: F : {-1,1}^R -> {-1,1}

Pick an edge (i,j).
Generate ai, aj in {-1,1}^R as follows:
  the k-th coordinates (aik, ajk) come from the distribution Aij.
Add noise to ai, aj.
Accept if F(ai) ≠ F(aj).

Pr[Accept] = (1/3) [ ¼ E_{A12}[(F(a1) − F(a2))²] + ¼ E_{A23}[(F(a2) − F(a3))²]
                     + ¼ E_{A31}[(F(a3) − F(a1))²] ]

A12, A23, A31 = Local Distributions
Completeness

Input function is a dictator: F(x) = x1

Pr[Accept] = (1/3) [ ¼ E_{A12}[(a11 − a21)²] + ¼ E_{A23}[(a21 − a31)²]
                     + ¼ E_{A31}[(a31 − a11)²] ]

Suppose (a1, a2) is sampled from A12; then
  E[a11 a21] = v1 ∙ v2,  E[a11²] = |v1|²,  E[a21²] = |v2|²
so
  E_{A12}[(a11 − a21)²] = |v1 − v2|²

Summing up, Pr[Accept] = SDP Value (v1, v2, v3)
c = SDP Value;  v1, v2, v3 = SDP Vectors

Global Random Variables

g = a random Gaussian vector
(each coordinate generated by an i.i.d. normal variable)

b1 = v1 ∙ g
b2 = v2 ∙ g
b3 = v3 ∙ g

E[b1 b2] = v1 ∙ v2,  E[b2 b3] = v2 ∙ v3,  E[b3 b1] = v3 ∙ v1
E[b1²] = |v1|²,  E[b2²] = |v2|²,  E[b3²] = |v3|²

There is a global distribution B = (b1, b2, b3) over real numbers such that
all the moments of order at most 2 match the inner products.
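A quick numeric check (illustrative vectors, not from the slides) that the Gaussian construction b_i = v_i ∙ g reproduces the same second moments:

```python
import numpy as np

rng = np.random.default_rng(3)
# Illustrative SDP vectors v1, v2, v3 (unit vectors in R^3), one per row.
V = np.array([[1.0, 0.0, 0.0],
              [0.5, np.sqrt(0.75), 0.0],
              [0.0, 0.0, 1.0]])

samples = 200000
G = rng.standard_normal((samples, V.shape[1]))   # rows = random Gaussian vectors g
B = G @ V.T                                      # B[:, i] = v_i . g

print(np.cov(B, rowvar=False, bias=True))        # ~ Gram matrix of v1, v2, v3
print(V @ V.T)
```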
Rounding with Polynomials

Input polynomial: F(x1, x2, …, xR)

Generate
  b1 = (b11, b12, …, b1R)
  b2 = (b21, b22, …, b2R)
  b3 = (b31, b32, …, b3R)
with each coordinate (b1t, b2t, b3t) drawn according to the global distribution B.

Compute F(b1), F(b2), F(b3).
Round F(b1), F(b2), F(b3) to {-1,1}, and output the rounded solution.

Value obtained ≈ (1/3) [ ¼ E_B[(F(b1) − F(b2))²] + ¼ E_B[(F(b2) − F(b3))²]
                         + ¼ E_B[(F(b3) − F(b1))²] ]
Invariance

Suppose F is far from every dictator. Since A12 and B have the same
first two moments, F(a1), F(a2) has nearly the same distribution as F(b1), F(b2):

  ¼ E_{A12}[(F(a1) − F(a2))²]  ≈  ¼ E_B[(F(b1) − F(b2))²]

• F(b1), F(b2) are close to {-1,1}.

Rounding Scheme
(For Boolean CSPs)

The rounding scheme was discovered by reversing the soundness analysis.
This fact was independently observed by Yi Wu.
SDP Rounding Schemes

For any CSP, it is enough to do the following:

SDP Vectors (v1, v2, …, vn)
  → Random Projection
      (instead of one random projection, pick sufficiently many projections)
  → Projections (y1, y2, …, yn)
  → Process the projections
      (use a multilinear polynomial P to process the projections)
  → Assignment
Rounding By Polynomial P(y1, …, yR)

Roughly: sample R random directions.
Formally: sample R independent vectors w(1), w(2), …, w(R),
each with i.i.d. Gaussian components.

Roughly: project each vi along all the directions.
Formally: Yi(j) = v0 ∙ vi + (1-ε)(vi − (v0 ∙ vi) v0) ∙ w(j)

Roughly: compute P on the projections.
Formally: xi = P(Yi(1), Yi(2), …, Yi(R))

Roughly: round the output of P.
Formally: if xi > 1, set xi = 1; if xi < -1, set xi = -1;
if xi ∈ [-1,1], set xi = 1 with probability (1+xi)/2
and -1 with probability (1-xi)/2.

R is a constant parameter.
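A minimal sketch of the per-variable pipeline in the table above; the polynomial P, the vectors, and ε are placeholders standing in for the objects on the slide, so this is an illustration of the scheme rather than the exact implementation.

```python
import numpy as np

def round_by_polynomial(v0, V, P, R=50, eps=0.1, rng=np.random.default_rng(4)):
    """Rounding by a multilinear polynomial P, following the table above.
    v0: the special unit vector; V: rows are the SDP vectors v_i;
    P: a callable mapping an R-dimensional projection vector to a number."""
    n, d = V.shape
    W = rng.standard_normal((R, d))                  # w^(1), ..., w^(R)
    out = np.empty(n)
    for i in range(n):
        c = v0 @ V[i]                                # v0 . v_i
        perp = V[i] - c * v0                         # component of v_i orthogonal to v0
        Y = c + (1 - eps) * (W @ perp)               # Y_i^(j) = v0.v_i + (1-eps)(v_i - (v0.v_i)v0).w^(j)
        x = np.clip(P(Y), -1.0, 1.0)                 # truncate the polynomial's output to [-1, 1]
        out[i] = 1.0 if rng.random() < (1 + x) / 2 else -1.0
    return out

# Toy usage: P is a simple multilinear polynomial in the projections.
P = lambda Y: Y.mean()
V = np.random.randn(4, 8); V /= np.linalg.norm(V, axis=1, keepdims=True)
v0 = np.zeros(8); v0[0] = 1.0
print(round_by_polynomial(v0, V, P))
```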
Algorithm

Solve SDP(III) to obtain vectors (v1, v2, …, vn).
Smoothen the SDP solution (v1, v2, …, vn).
For all multilinear polynomials P(y1, y2, …, yR):
  round using P(y1, y2, …, yR).
Output the best solution obtained.
Discretization

"For all multilinear polynomials P(y1, y2, …, yR)" means:
- all multilinear polynomials with coefficients bounded within [-1,1];
- discretize the set of all such multilinear polynomials.
There are at most a constant number of such polynomials.
Smoothening SDP Vectors
Let u1 ,u2 .. un denote the SDP vectors
corresponding to the following distribution
over integral solutions:
``Assign each variable uniformly and
independently at random”

Substitute
vi* ∙ vj* = (1-ε) (vi ∙ vj) + ε (ui∙ uj)
Simplest SDP for MaxCut

Semidefinite Program: a linear program over the inner products of the vectors.

Variables: v1, v2, …, vn with |vi|² = 1
Maximize  ¼ Σ_{(i,j)∈E} w_ij (v_i − v_j)²

In the integral solution, all the vi are 1 or -1.
Thus they satisfy additional constraints.
Example constraint: (vi – vj)² + (vj – vk)² ≥ (vi – vk)²
Thank You