SP14 CS188 Lecture 5 - CSPs II
[These slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley. All CS188 materials are available at http://ai.berkeley.edu.]
Today
Efficient Solution of CSPs
Local Search
Reminder: CSPs
CSPs:
Variables
Domains
Constraints
Goals:
Backtracking Search
Improving Backtracking
General-purpose ideas give huge gains in speed
but it's all still NP-hard
[Figure: Australia map-coloring constraint graph (SA, Q, NSW, ...)]
Remember:
Arc consistency detects failure earlier than forward checking
Delete from the tail!
Important: If X loses a value, neighbors of X need to be rechecked!
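A minimal arc-consistency (AC-3 style) sketch in Python; the dict-based CSP representation and names like revise/ac3 are illustrative assumptions, not from the slides. It deletes unsupported values from the tail domain and re-queues arcs pointing at the tail whenever a deletion happens.

from collections import deque

def revise(domains, constraints, tail, head):
    # Remove tail values with no consistent head value ("delete from the tail").
    ok = constraints[(tail, head)]
    removed = [x for x in domains[tail] if not any(ok(x, y) for y in domains[head])]
    for x in removed:
        domains[tail].remove(x)
    return bool(removed)

def ac3(domains, constraints):
    # Enforce consistency on every arc; returns False if some domain empties,
    # which detects failure earlier than forward checking.
    queue = deque(constraints)                 # arcs are (tail, head) pairs
    while queue:
        tail, head = queue.popleft()
        if revise(domains, constraints, tail, head):
            if not domains[tail]:
                return False
            # If tail lost a value, arcs into tail must be rechecked.
            queue.extend((t, h) for (t, h) in constraints if h == tail and t != head)
    return True

# Tiny map-coloring example: adjacent regions must get different colors.
doms = {v: {'r', 'g', 'b'} for v in ('WA', 'NT', 'SA')}
cons = {(a, b): (lambda x, y: x != y) for a in doms for b in doms if a != b}
print(ac3(doms, cons), doms)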
What went wrong here?
K-Consistency
Increasing degrees of consistency
1-Consistency (Node Consistency): Each single node's
domain has a value which meets that node's unary
constraints
2-Consistency (Arc Consistency): For each pair of nodes,
any consistent assignment to one can be extended to
the other
K-Consistency: For each k nodes, any consistent
assignment to k-1 can be extended to the kth node.
Strong K-Consistency
Strong k-consistency: also k-1, k-2, ..., 1 consistent
Claim: strong n-consistency means we can solve without
backtracking!
Why?
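Why: fix an ordering X1, ..., Xn; 1-consistency gives a legal value for X1, and strong k-consistency guarantees that any consistent assignment to the first k-1 variables extends to Xk, so a single left-to-right pass never has to back up. A minimal sketch under that assumption (the consistent(assignment, var, value) helper and CSP representation are illustrative, not from the slides):

def solve_backtrack_free(variables, domains, consistent):
    # Assumes strong n-consistency: a compatible value for the next variable
    # is guaranteed to exist, so no backtracking is ever needed.
    assignment = {}
    for var in variables:
        assignment[var] = next(v for v in domains[var]
                               if consistent(assignment, var, v))
    return assignment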
Structure
Problem Structure
Extreme case: independent subproblems
Tree-Structured CSPs
Theorem: if the constraint graph has no loops, the CSP can be solved
in O(n d^2) time
Compare to general CSPs, where worst-case time is O(d^n)
Tree-Structured CSPs
Algorithm for tree-structured CSPs:
Order: choose a root variable, order variables so that parents precede children
Remove backward: for i = n down to 2, make the arc Parent(Xi) → Xi consistent
Assign forward: for i = 1 to n, assign Xi consistently with Parent(Xi)
Tree-Structured CSPs
Claim 1: After backward pass, all root-to-leaf arcs are consistent
Proof: Each X→Y was made consistent at one point and Y's domain
could not have been reduced thereafter (because Y's children were
processed before Y)
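A sketch of the tree-CSP algorithm in Python, assuming variables come in a topological order with a parent map and a pairwise compatibility test ok(parent_value, child_value); names are illustrative. The backward pass makes every parent→child arc consistent (children processed before parents, as in the proof above), then the forward pass assigns root-to-leaf with no backtracking, for O(n d^2) total work.

def solve_tree_csp(order, parent, domains, ok):
    # Backward pass: prune each parent's domain against its child.
    for child in reversed(order[1:]):
        p = parent[child]
        domains[p] = [a for a in domains[p]
                      if any(ok(a, b) for b in domains[child])]
        if not domains[p]:
            return None                      # no solution exists
    # Forward pass: assign root-to-leaf, consistent with the parent's value.
    assignment = {order[0]: next(iter(domains[order[0]]))}
    for child in order[1:]:
        assignment[child] = next(b for b in domains[child]
                                 if ok(assignment[parent[child]], b))
    return assignment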
Improving Structure
Cutset Conditioning
Choose a cutset (e.g., SA in the map-coloring graph)
Instantiate the cutset (all possible ways)
Compute residual CSP for each assignment
Solve the residual CSPs (tree structured)
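A hedged sketch of cutset conditioning; the constraint representation ({(a, b): ok} with arcs stored in both directions) and the solve_tree callable standing in for a tree-CSP solver (e.g., the sketch above) are assumptions, not from the slides. It enumerates every instantiation of the cutset, conditions the remaining variables on it, and returns the first residual solution found.

from itertools import product

def cutset_conditioning(cutset, domains, constraints, solve_tree):
    for values in product(*(domains[v] for v in cutset)):   # d^c instantiations
        cut = dict(zip(cutset, values))
        # Skip instantiations that already violate a constraint inside the cutset.
        if any(not ok(cut[a], cut[b]) for (a, b), ok in constraints.items()
               if a in cut and b in cut):
            continue
        # Residual CSP: prune non-cutset domains against the fixed cutset values.
        residual = {v: [x for x in domains[v]
                        if all(ok(x, cut[u]) for (w, u), ok in constraints.items()
                               if w == v and u in cut)]
                    for v in domains if v not in cut}
        solution = solve_tree(residual, constraints)          # tree structured
        if solution is not None:
            return {**cut, **solution}
    return None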
Cutset Quiz
Find the smallest cutset for the graph below.
Tree Decomposition*
Idea: create a tree-structured graph of mega-variables
Each mega-variable encodes part of the original CSP
Subproblems overlap to ensure consistent solutions
[Figure: tree decomposition of the map-coloring CSP into mega-variables M1-M4, each encoding a triple of the original variables (e.g., one over WA, SA, NT and one over NT, SA, Q); neighboring mega-variables share variables and must agree on them. Example mega-variable domains: {(WA=r,SA=g,NT=b), (WA=b,SA=r,NT=g), ...} and {(NT=r,SA=g,Q=b), (NT=b,SA=g,Q=r), ...}; agreement constraint Agree: (M1,M2) ∈ {((WA=g,SA=g,NT=g), (NT=g,SA=g,Q=g)), ...}]
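A hedged sketch of how the mega-variable domains and the agreement constraint could be built (the dict-based representation and names are illustrative assumptions): each mega-variable's domain is the set of internally consistent tuples over its variables, and two mega-variable values agree when they match on all shared variables.

from itertools import product

def mega_domain(vars_, domains, constraints):
    # All assignments to this mega-variable that satisfy every constraint
    # whose scope lies inside it.
    tuples = []
    for vals in product(*(domains[v] for v in vars_)):
        assign = dict(zip(vars_, vals))
        if all(ok(assign[a], assign[b]) for (a, b), ok in constraints.items()
               if a in assign and b in assign):
            tuples.append(assign)
    return tuples

def agree(t1, t2):
    # Agreement constraint: shared variables must take the same values.
    return all(t1[v] == t2[v] for v in t1.keys() & t2.keys())

doms = {v: ['r', 'g', 'b'] for v in ('WA', 'NT', 'SA', 'Q')}
cons = {p: (lambda x, y: x != y)
        for p in [('WA', 'NT'), ('WA', 'SA'), ('NT', 'SA'), ('NT', 'Q'), ('SA', 'Q')]}
M1 = mega_domain(['WA', 'SA', 'NT'], doms, cons)
M2 = mega_domain(['NT', 'SA', 'Q'], doms, cons)
print(sum(agree(a, b) for a in M1 for b in M2), "agreeing (M1, M2) pairs")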
Iterative Improvement
Example: 4-Queens
Performance of Min-Conflicts
Given random initial state, can solve n-queens in almost constant
time for arbitrary n with high probability (e.g., n = 10,000,000)!
The same appears to be true for any randomly-generated CSP
except in a narrow range of the ratio R = (number of constraints) / (number of variables)
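A minimal min-conflicts sketch for n-queens (representation: one queen per column, board[c] is its row; names are illustrative, not from the slides): start from a random complete assignment, then repeatedly pick a conflicted column and move its queen to a row with the fewest conflicts.

import random

def conflicts(board, col, row):
    # Number of other queens attacking square (row, col).
    return sum(1 for c, r in enumerate(board)
               if c != col and (r == row or abs(r - row) == abs(c - col)))

def min_conflicts(n, max_steps=100_000):
    board = [random.randrange(n) for _ in range(n)]     # random initial state
    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(board, c, board[c]) > 0]
        if not conflicted:
            return board                                # solution found
        col = random.choice(conflicted)
        board[col] = min(range(n), key=lambda r: conflicts(board, col, r))
    return None

print(min_conflicts(8))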
Summary: CSPs
CSPs are a special kind of search problem:
States are partial assignments
Goal test defined by constraints
Basic solution: backtracking search
Speed-ups:
Ordering
Filtering
Structure
Local Search
Tree search keeps unexplored alternatives on the fringe (ensures
completeness)
Local search: improve a single option until you can't make it better (no
fringe!)
New successor function: local changes
Generally much faster and more memory efficient (but incomplete and suboptimal)
Hill Climbing
Simple, general idea:
Start wherever
Repeat: move to the best neighboring state
If no neighbors better than current, quit
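A generic hill-climbing sketch; the neighbors and value callables stand in for the problem-specific successor function and objective and are assumptions, not from the slides.

def hill_climb(start, neighbors, value):
    current = start
    while True:
        best = max(neighbors(current), key=value, default=None)
        if best is None or value(best) <= value(current):
            return current          # no neighbor is better: quit
        current = best              # move to the best neighboring state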
Simulated Annealing
Idea: Escape local maxima by allowing downhill
moves
But make them rarer as time goes on
Simulated Annealing
Theoretical guarantee:
Stationary distribution: p(x) ∝ e^(E(x)/kT)
If T decreased slowly enough, will converge to the optimal state!
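A simulated-annealing sketch under the same kind of assumptions (neighbor, value, and schedule are illustrative callables): uphill moves are always accepted, downhill moves are accepted with probability e^(ΔE/T), which becomes rarer as the temperature T is lowered.

import math, random

def simulated_annealing(start, neighbor, value, schedule, steps=100_000):
    current = start
    for t in range(steps):
        T = schedule(t)                       # temperature, decreasing over time
        if T <= 0:
            return current
        nxt = neighbor(current)
        delta = value(nxt) - value(current)
        # Accept uphill moves always; downhill with probability e^(delta/T).
        if delta > 0 or random.random() < math.exp(delta / T):
            current = nxt
    return current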
Genetic Algorithms
Example: N-Queens