Iterative Deepening: CPSC 322 - Search 6

The document discusses various search algorithms and their properties. It begins with an overview of iterative deepening, which runs depth-first search with an iteratively increasing depth limit to find solutions. Iterative deepening depth-first search (IDS) has the same O(b^m) time complexity as breadth-first search, but only O(mb) space complexity, like depth-first search. Iterative deepening A* (IDA*) operates similarly but uses the f-value as the depth limit, making it optimal under the same conditions as A*. Both IDS and IDA* are complete and optimal search methods.


Iterative Deepening

CPSC 322 Search 6


Textbook 3.7.3
January 24, 2011
Lecture Overview
Recap from last week
Iterative Deepening
Search with Costs
Sometimes there are costs associated with arcs.
In this setting we often don't just want to find any solution;
we usually want to find the solution that minimizes cost.
Def.: The cost of a path is the sum of the costs of its arcs:

cost(⟨n_0, ..., n_k⟩) = Σ_{i=1}^{k} cost(⟨n_{i-1}, n_i⟩)
Def.: A search algorithm is optimal if
when it finds a solution, it is the best one:
it has the lowest path cost
Lowest-Cost-First Search (LCFS)
Expands the path with the lowest cost on the frontier.
The frontier is implemented as a priority queue ordered by path cost.
How does this differ from Dijkstra's algorithm?
- The two algorithms are very similar
- But Dijkstra's algorithm
  - works with nodes, not with paths
  - stores one cost value per node (infeasible for infinite/very large graphs)
  - checks for cycles
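The LCFS scheme above can be sketched in Python. The graph encoding (a dict from node to (successor, arc cost) pairs) and the function names here are illustrative assumptions, not from the slides:

```python
import heapq

def lowest_cost_first_search(start, goal, neighbors):
    """LCFS: repeatedly expand the path with the lowest cost on the frontier.

    `neighbors(n)` is assumed to yield (successor, arc_cost) pairs.
    """
    frontier = [(0, [start])]          # priority queue ordered by path cost
    while frontier:
        cost, path = heapq.heappop(frontier)
        node = path[-1]
        if node == goal:
            return cost, path
        for succ, arc_cost in neighbors(node):
            if succ not in path:       # avoid cycles within a path
                heapq.heappush(frontier, (cost + arc_cost, path + [succ]))
    return None                        # frontier exhausted: no solution

# Illustrative toy graph, not from the slides:
graph = {'a': [('b', 1), ('c', 4)], 'b': [('c', 1), ('g', 5)],
         'c': [('g', 1)], 'g': []}
```

Unlike Dijkstra, nothing is stored per node: the frontier holds whole paths, so the same node may appear at the end of several frontier paths with different costs.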
Heuristic search
Def.:
A search heuristic h(n) is an estimate of the cost of the optimal
(cheapest) path from node n to a goal node.
[Figure: three paths ending in nodes n1, n2, n3, each labelled with its heuristic estimate h(n1), h(n2), h(n3) of the cost to the goal.]
Best-First Search
Expands the path with the lowest h value on the frontier.
The frontier is implemented as a priority queue ordered by h.
Greedy: expands the path that appears to lead to the goal quickest
- Can get trapped
- Can yield arbitrarily poor solutions
- But with a perfect heuristic, it moves straight to the goal
A*
Expands the path with the lowest cost + h value on the frontier.
The frontier is implemented as a priority queue ordered by
f(p) = cost(p) + h(p)
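A* differs from LCFS only in the priority used to order the frontier. A minimal sketch, assuming a graph encoded as a dict from node to (successor, arc cost) pairs and a heuristic given as a function (both illustrative assumptions):

```python
import heapq

def a_star(start, goal, neighbors, h):
    """A*: expand the path p with the lowest f(p) = cost(p) + h(end of p).

    `neighbors(n)` yields (successor, arc_cost) pairs; `h(n)` is the
    heuristic estimate of the cost from n to a goal.
    """
    frontier = [(h(start), 0, [start])]   # (f-value, cost so far, path)
    while frontier:
        f, cost, path = heapq.heappop(frontier)
        node = path[-1]
        if node == goal:
            return cost, path
        for succ, arc_cost in neighbors(node):
            if succ not in path:          # avoid cycles within a path
                g = cost + arc_cost
                heapq.heappush(frontier, (g + h(succ), g, path + [succ]))
    return None
```

With h(n) = 0 for all n this degenerates to LCFS; with an admissible h it keeps the optimality guarantee discussed below.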
Admissibility of a heuristic
Def.: Let c(n) denote the cost of the optimal path from node n to any
goal node. A search heuristic h(n) is called admissible if h(n) ≤ c(n)
for all nodes n, i.e. if for all nodes it is an underestimate of the
cost to any goal.
E.g. Euclidean distance in routing networks
General construction of heuristics: relax the problem,
i.e. ignore some constraints
- Can only make the problem easier
- Saw lots of examples on Wednesday:
  routing network, grid world, 8-puzzle, Infinite Mario

Admissibility of A*
A* is complete (finds a solution, if one exists) and
optimal (finds the optimal path to a goal) if:
- the branching factor is finite
- arc costs are > 0
- h is admissible
This property of A* is called admissibility of A*.
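On a small explicit graph, admissibility can be sanity-checked directly: compute the true optimal cost c(n) to the goal for every node (e.g. with Dijkstra on the reversed graph) and compare it against h(n). This checker is a sketch; the graph encoding (dict from node to (successor, arc cost) pairs) is an assumption:

```python
import heapq

def optimal_costs_to_goal(graph, goal):
    """Dijkstra on the reversed graph: true cost c(n) from each node to goal."""
    reversed_graph = {n: [] for n in graph}
    for n, arcs in graph.items():
        for succ, cost in arcs:
            reversed_graph[succ].append((n, cost))
    dist = {goal: 0}
    frontier = [(0, goal)]
    while frontier:
        d, node = heapq.heappop(frontier)
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry
        for pred, cost in reversed_graph[node]:
            if d + cost < dist.get(pred, float("inf")):
                dist[pred] = d + cost
                heapq.heappush(frontier, (d + cost, pred))
    return dist

def is_admissible(graph, goal, h):
    """h is admissible iff h(n) <= c(n) for every node that can reach the goal."""
    c = optimal_costs_to_goal(graph, goal)
    return all(h(n) <= c[n] for n in c)
```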
Why is A* admissible: complete
If there is a solution, A* finds it:
- f_min := cost of the optimal solution path s (unknown but finite)
- Lemmas for prefixes pr of s (exercise: prove at home)
  - Each has cost f(pr) ≤ f_min (due to admissibility)
  - There is always one such pr on the frontier (prove by induction)
- A* only expands paths with f(p) ≤ f_min
  - It expands paths p with minimal f(p)
  - There is always a pr on the frontier, with f(pr) ≤ f_min
  - It terminates when expanding s
- The number of paths p with cost f(p) ≤ f_min is finite
  - Let c_min > 0 be the minimal cost of any arc
  - k := f_min / c_min. All paths with length > k have cost > f_min
  - There are only b^k paths of length ≤ k. Finite b ⇒ finitely many such paths
Why is A* admissible: optimal
Proof by contradiction.
Assume (for contradiction): the first solution s' that A* expands is
suboptimal, i.e. cost(s') > f_min.
- Since s' is a goal, h(s') = 0, and f(s') = cost(s') > f_min
- A* selected s', so all other paths p on the frontier had
  f(p) ≥ f(s') > f_min
- But we know that some prefix pr of the optimal solution path s is on
  the frontier, with f(pr) ≤ f_min
- Contradiction!
Summary: some prefix of the optimal solution is expanded before a
suboptimal solution would be expanded.
Learning Goals for last week
- Select the most appropriate algorithms for specific problems
  - Depth-First Search vs. Breadth-First Search
    vs. Least-Cost-First Search vs. Best-First Search vs. A*
- Define/read/write/trace/debug different search algorithms
  - With/without cost
  - Informed/uninformed
- Construct heuristic functions for specific search problems
- Formally prove A* optimality
  - Define optimal efficiency
Learning Goals for last week, continued
Apply basic properties of search algorithms:
completeness, optimality, time and space complexity
                                Complete             Optimal              Time     Space
DFS                             N (Y if no cycles)   N                    O(b^m)   O(mb)
BFS                             Y                    Y                    O(b^m)   O(b^m)
LCFS (arc costs available)      Y (costs > 0)        Y (costs ≥ 0)        O(b^m)   O(b^m)
Best First (h available)        N                    N                    O(b^m)   O(b^m)
A* (arc costs + h available)    Y (costs > 0,        Y (costs ≥ 0,        O(b^m)   O(b^m)
                                 h admissible)        h admissible)
Lecture Overview
Recap from last week
Iterative Deepening
Want the low space complexity of DFS, but the completeness and
optimality of BFS.
Key Idea: re-compute elements of the frontier rather than saving them.
Iterative Deepening DFS (short: IDS): Motivation
[Figure: run depth-first search with depth bound 1, then depth bound 2, then depth bound 3, ...]
Iterative Deepening DFS (IDS) in a Nutshell
Use DFS to look for solutions at depth 1, then 2, then 3, etc.
- For depth bound D, ignore any paths with longer length
- This is depth-bounded depth-first search
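The nutshell description can be sketched as a pair of functions. The graph encoding (dict from node to successor list) and the max_depth cap, which keeps the sketch from looping forever when there is no solution, are illustrative assumptions:

```python
def depth_bounded_dfs(path, goal, neighbors, bound):
    """DFS that ignores any path with more than `bound` arcs."""
    node = path[-1]
    if node == goal:
        return path
    if len(path) - 1 >= bound:            # path already has `bound` arcs
        return None
    for succ in neighbors(node):
        result = depth_bounded_dfs(path + [succ], goal, neighbors, bound)
        if result is not None:
            return result
    return None

def iterative_deepening_search(start, goal, neighbors, max_depth=50):
    """IDS: run depth-bounded DFS with bound 0, 1, 2, 3, ..."""
    for bound in range(max_depth + 1):
        result = depth_bounded_dfs([start], goal, neighbors, bound)
        if result is not None:
            return result
    return None
```

Only the current path is ever stored, which is what gives IDS its DFS-like space complexity.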
(Time) Complexity of IDS

Depth   Total # paths    # times created by   # times created   Total # paths
        at that level    BFS (or DFS)         by IDS            for IDS
1       b                1                    m                 m·b
2       b^2              1                    m-1               (m-1)·b^2
...     ...              ...                  ...               ...
m-1     b^(m-1)          1                    2                 2·b^(m-1)
m       b^m              1                    1                 b^m
That sounds wasteful!
Let's analyze the time complexity.
For a solution at depth m with branching factor b, the total # of
paths generated is:

  b^m + 2·b^(m-1) + 3·b^(m-2) + ... + m·b
= b^m · (1·b^0 + 2·b^(-1) + 3·b^(-2) + ... + m·b^(1-m))
= b^m · Σ_{i=1}^{m} i·b^(1-i)

(Time) Complexity of IDS
Geometric progression, for |r| < 1:  Σ_{i=0}^{∞} r^i = 1/(1-r)
Differentiating with respect to r:   Σ_{i=1}^{∞} i·r^(i-1) = d/dr [1/(1-r)] = 1/(1-r)^2
With r = 1/b (note b^(1-i) = (1/b)^(i-1)):

b^m · Σ_{i=1}^{m} i·b^(1-i) ≤ b^m · Σ_{i=1}^{∞} i·(1/b)^(i-1)
                            = b^m · (1/(1 - 1/b))^2
                            = b^m · (b/(b-1))^2
                            ∈ O(b^m)
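The bound can be sanity-checked numerically: the total number of paths IDS generates is Σ_{i=1}^{m} i·b^(m+1-i), and its ratio to b^m stays below (b/(b-1))^2. A small sketch (the function name is illustrative):

```python
def ids_total_paths(b, m):
    """Total # of paths IDS generates: b^m + 2*b^(m-1) + ... + m*b."""
    return sum(i * b ** (m + 1 - i) for i in range(1, m + 1))

# The ratio to b^m stays below (b/(b-1))^2, matching the derivation.
b, m = 2, 20
ratio = ids_total_paths(b, m) / b ** m
bound = (b / (b - 1)) ** 2
```

For b = 2 the constant factor is at most 4, i.e. IDS generates at most about four times as many paths as a single depth-m DFS.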
Further Analysis of Iterative Deepening DFS (IDS)
Space complexity: O(mb)
- DFS scheme: it only explores one branch at a time
Complete? Yes
- There are only finitely many paths up to depth m, and it doesn't
  explore longer paths before exhausting the shorter ones
Optimal? Yes
- Proof by contradiction
Search methods so far

                                Complete             Optimal              Time     Space
DFS                             N (Y if no cycles)   N                    O(b^m)   O(mb)
BFS                             Y                    Y                    O(b^m)   O(b^m)
IDS                             Y                    Y                    O(b^m)   O(mb)
LCFS (arc costs available)      Y (costs > 0)        Y (costs ≥ 0)        O(b^m)   O(b^m)
Best First (h available)        N                    N                    O(b^m)   O(b^m)
A* (arc costs + h available)    Y (costs > 0,        Y (costs ≥ 0,        O(b^m)   O(b^m)
                                 h admissible)        h admissible)
(Heuristic) Iterative Deepening: IDA*
Like Iterative Deepening DFS
- But the depth bound is measured in terms of the f value
- If you don't find a solution at a given bound, increase the bound:
  to the minimum of the f-values that exceeded the previous bound
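This scheme can be sketched as a bounded DFS that reports the smallest f-value it had to prune, which becomes the next bound. The graph encoding (dict from node to (successor, arc cost) pairs) is an illustrative assumption:

```python
import math

def ida_star(start, goal, neighbors, h):
    """IDA*: depth-bounded DFS where the bound is on f = cost + h.

    When a bound fails, it is raised to the minimum f-value that
    exceeded it. `neighbors(n)` yields (successor, arc_cost) pairs.
    """
    def search(path, cost, bound):
        node = path[-1]
        f = cost + h(node)
        if f > bound:
            return f, None                # report the exceeding f-value
        if node == goal:
            return f, path
        next_bound = math.inf
        for succ, arc_cost in neighbors(node):
            if succ not in path:          # avoid cycles within a path
                t, found = search(path + [succ], cost + arc_cost, bound)
                if found is not None:
                    return t, found
                next_bound = min(next_bound, t)
        return next_bound, None

    bound = h(start)
    while True:
        bound, found = search([start], 0, bound)
        if found is not None:
            return found
        if bound == math.inf:             # no f-value exceeded the bound
            return None                   # ... so there is no solution
```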
Analysis of Iterative Deepening A* (IDA*)
Complete and optimal? Yes, under the same conditions as A*:
- h is admissible
- all arc costs are > 0
- finite branching factor
Time complexity: O(b^m)
Space complexity: O(mb)
- Same argument as for Iterative Deepening DFS
Examples and Clarifications
On the white board
Search methods so far

                                Complete               Optimal              Time     Space
DFS                             N (Y if no cycles)     N                    O(b^m)   O(mb)
BFS                             Y                      Y                    O(b^m)   O(b^m)
IDS                             Y                      Y                    O(b^m)   O(mb)
LCFS (arc costs available)      Y (costs > 0)          Y (costs ≥ 0)        O(b^m)   O(b^m)
Best First (h available)        N                      N                    O(b^m)   O(b^m)
A* (arc costs + h available)    Y (costs > 0,          Y (costs ≥ 0,        O(b^m)   O(b^m)
                                 h admissible)          h admissible)
IDA*                            Y (same cond. as A*)   Y                    O(b^m)   O(mb)
Branch & Bound                  Y (same cond. as A*)   Y                    O(b^m)   O(mb)
Learning Goals for today's class
- Define/read/write/trace/debug different search algorithms
  - New: Iterative Deepening, Iterative Deepening A*, Branch & Bound
- Apply basic properties of search algorithms:
  completeness, optimality, time and space complexity

Announcements:
- New practice exercises are out: see WebCT
  - Heuristic search
  - Branch & Bound
  - Please use these! (Only takes 5 min. if you understood things)
- Assignment 1 is out: see WebCT
