ICS 171 HW # 2 Solutions (nverma@ics.uci.edu)

This document provides solutions to homework problems related to optimal pathfinding algorithms. Problem 1 discusses the optimality of depth-first search, breadth-first search, iterative deepening, and other algorithms under different cost assumptions. Problem 2 analyzes heuristic functions for admissibility on a sample state space. Problem 3 defines heuristics to test optimality of best-first search. Later problems involve analyzing properties of A*, IDA*, and applying algorithms like hill climbing to example state spaces.

Uploaded by Kshitij Goyal
© Attribution Non-Commercial (BY-NC)

ICS 171 HW # 2 SOLUTIONS

Solutions have been prepared by Naval Verma. For reporting mistakes and for queries, please write to nverma@ics.uci.edu.

Problem 1. The following questions ask about the optimality of the algorithms we discussed in class. If you answer no, give a counterexample, i.e., an example of a state space where the algorithm fails to be optimal. If you answer yes, provide the space and time complexity of the corresponding algorithm. Assume for now that all operators have the same cost.

(a) Will depth-first search always be optimal? Why?
ANS: No. Consider the state space of Figure 1: S has a direct edge to goal G2 and an edge to A, which leads to goal G1. With unit costs, G2 is the optimal goal, but DFS, exploring the branch through A first, finds G1.

[Figure 1: start state S with one edge to goal G2 and a path S-A-G1 to goal G1.]

(b) Will breadth-first search always be optimal? Why?
ANS: Yes. BFS finds the shortest path from the start state to every state the first time it visits that state. Time = O(b^d). Space = O(b^d), where d is the depth of the goal.

(c) Will iterative deepening always be optimal? Why?
ANS: Yes. It also finds the shortest path from the start state to every state the first time it visits that state.

Now assume that operators can have different costs.

(d) Will breadth-first search always be optimal? Why?
ANS: No. In Figure 1, assume cost(S,A) = cost(A,G1) = 1 and cost(S,G2) = 10. BFS will find G2 (cost 10), although the path through A to G1 costs only 2.

(e) Will iterative deepening always be optimal? Why?
ANS: No. Same counterexample as above.

(f) Will hill climbing always be optimal? Why?
ANS: No. Refer to Figure 1 and assume the same costs as in (d); in addition, assume h(S) = 2, h(A) = 1, h(G1) = h(G2) = 0. Hill climbing moves from S to G2, the neighbor with the smallest h, which is not the optimal goal.

(g) Will uniform-cost search always be optimal? Why?
ANS: Yes. The cost of every opened path is less than the cost of the first path ending in a goal. Since the cost of a path can only increase as we add nodes, it cannot happen that we open a suboptimal goal first. For a detailed proof, refer to R&N. Time = O(b^d). Space = O(b^d).

(h) Will A* always be optimal? Why?
ANS: Yes. For a proof, refer to R&N. Time and space in the worst case, with bad heuristics, can be O(b^d).

(i) Will IDA* always be optimal? Why?
ANS: Yes. Time = O(b^d). Space = O(b * f*/c), where f* is the cost of the optimal solution and c is the smallest operator cost, because the maximum depth you may expand to is f*/c. It is OK to write space = O(bd).
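The contrast between (d) and (g) can be sketched in code. The graph below is a hypothetical encoding of the Figure 1 counterexample with the costs assumed in (d): BFS returns the shallow but expensive goal G2, while uniform-cost search pops paths by total cost and returns the optimal goal G1.

```python
import heapq
from collections import deque

# Hypothetical encoding of the Figure 1 counterexample from (d):
# cost(S,A) = cost(A,G1) = 1, cost(S,G2) = 10; G1 and G2 are goals.
graph = {"S": [("A", 1), ("G2", 10)], "A": [("G1", 1)], "G1": [], "G2": []}
goals = {"G1", "G2"}

def bfs(start):
    """BFS by number of edges: returns the first goal reached."""
    frontier = deque([(start, 0)])
    visited = {start}
    while frontier:
        node, cost = frontier.popleft()
        if node in goals:
            return node, cost
        for nbr, c in graph[node]:
            if nbr not in visited:
                visited.add(nbr)
                frontier.append((nbr, cost + c))

def ucs(start):
    """Uniform-cost search: pops the cheapest path first, so the
    first goal popped is reached by an optimal path."""
    frontier = [(0, start)]
    closed = set()
    while frontier:
        cost, node = heapq.heappop(frontier)
        if node in closed:
            continue
        closed.add(node)
        if node in goals:
            return node, cost
        for nbr, c in graph[node]:
            heapq.heappush(frontier, (cost + c, nbr))

print(bfs("S"))  # ('G2', 10): shallowest goal, suboptimal cost
print(ucs("S"))  # ('G1', 2): optimal cost
```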
Problem 2. Consider the state space in Figure 1 below. Provide an analysis of whether the following heuristic functions are admissible for this state space. Your proof should clearly show why.

     h1   h2   h3   h4
S     0    2    3    4
A     0    3    4    0
B     0    5    3    0
G     0    0    0    0

ANS: h*(S) = 3, h*(A) = 4, h*(B) = 4, h*(G) = 0. A heuristic is admissible iff h(n) <= h*(n) for every state n, so Admissible = {h1, h3}; h2 overestimates at B (5 > 4) and h4 overestimates at S (4 > 3).
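The admissibility check above can be verified mechanically. The sketch below takes the h* values from the answer and the heuristic table from the problem statement (read with states as rows and heuristics as columns, which matches the stated answer) and tests h(n) <= h*(n) at every state.

```python
# h* values as given in the answer; heuristic tables from the problem.
h_star = {"S": 3, "A": 4, "B": 4, "G": 0}
heuristics = {
    "h1": {"S": 0, "A": 0, "B": 0, "G": 0},
    "h2": {"S": 2, "A": 3, "B": 5, "G": 0},
    "h3": {"S": 3, "A": 4, "B": 3, "G": 0},
    "h4": {"S": 4, "A": 0, "B": 0, "G": 0},
}

def admissible(h):
    """h is admissible iff it never overestimates the true cost-to-go."""
    return all(h[n] <= h_star[n] for n in h_star)

print(sorted(name for name, h in heuristics.items() if admissible(h)))
# -> ['h1', 'h3'], matching the answer above
```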

[Figure 1: state space for Problem 2 with states S, A, B, G and operator costs on the links; the edge costs were garbled in extraction.]

Problem 3. Consider the state space in Figure 2 below, where the operator costs are shown for each pair of connected states. Define an admissible heuristic value for each of S, A, B, C, and G, such that no two values taken by H coincide, and:

Heuristic H1: best-first search finds the optimal path using this heuristic.
Heuristic H2: best-first search does not find the optimal path using this heuristic.

Please fill in your answers below. Note that this example shows that even an admissible heuristic does not guarantee that best-first search will find the optimal solution.

ANS: Sample solution; your solution may differ.

     H1   H2
S    30   30
A    15   20
B    20    1
C    10   10
G     0    0

[Figure 2: state space for Problem 3 with states S, A, B, C, G and operator costs on the links; the edge costs were garbled in extraction.]
Figure 2.

Problem 4. Are the following assertions true? Explain your answer in one or two sentences.

(a) For the A* algorithm, if h1(node) is always less than h2(node), where both h1 and h2 are admissible heuristics, then A* with h1 will find the goal at least as fast as A* with h2.
ANS: FALSE; it is A* with h2, the larger heuristic, that is at least as fast. A* expands all paths with f-cost less than the f-cost of G, where G is the optimal goal state. Note that the f-cost of G is the same whichever admissible heuristic we use: it is the cost of the optimal path; call it f0. Consider any path opened by (A*, h2); its f-cost, say f2, satisfies f2 < f0. The same path in (A*, h1) has f-cost f1 < f2 (since f = g + h, g is the same in both cases, and h1 < h2), so f1 < f2 < f0 and the path is opened in (A*, h1) too. We have shown that every path opened by (A*, h2) is also opened by (A*, h1). Hence (A*, h1) opens at least as many paths as (A*, h2) and is therefore at least as slow.

(b) The best possible heuristic for A* search is to have h(node) = the true (minimum) path cost of going from start S to goal G via that node, i.e., h will be admissible AND A* will expand the minimum possible number of nodes before hitting the goal.
ANS: If the question had said h(n) = the minimum cost of going from n to G (usually denoted h*(n)), the answer would be TRUE: as shown above, the larger h is the better, but admissibility requires h <= h*, so the best possible heuristic is h* itself. (Remember, though, that computing better heuristic values is more expensive than computing rough ones.) The heuristic as actually stated, the cost of the whole S-to-G path through the node, is not admissible except in a few trivial cases, so the assertion is FALSE as written.

(c) If heuristic h is not admissible, then IDA* will use exponential space.
ANS: FALSE.
IDA* models the fringe as a stack and hence, like the iterative deepening algorithm, always uses linear space, even when the heuristic is inadmissible and optimality is not guaranteed. (Note: as discussed in class, there exist two different definitions of the A* and IDA* algorithms. Our main working definition was that both use admissible heuristics. This question asks about the algorithm that works exactly like IDA* but with an inadmissible heuristic; the latter is not guaranteed to be optimal but is complete in fairly general circumstances.)

(d) If h1 and h2 are two admissible heuristics, then (h1+h2)/2 is admissible.
ANS: TRUE. More generally, for any 0 <= a <= 1, a*h1 + (1-a)*h2 is admissible.

(e) If h1 is an admissible heuristic, then 2*h1 is admissible.
ANS: FALSE. Suppose h(n) = h*(n) > 0; then 2*h(n) > h*(n).

(f) If h1 and h2 are two admissible heuristics, then max(h1, h2) is admissible.
ANS: TRUE. At any node n, max(h1, h2)(n) is either h1(n) or h2(n); in either case it is at most h*(n).

(g) If h1 and h2 are two admissible heuristics, A* search with heuristic H = max(h1, h2) will find the goal at least as fast as with heuristic F = (h1+h2)/2.
ANS: TRUE. F <= H everywhere, and by the argument in (a) the larger admissible heuristic is at least as fast.
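Parts (d), (e), and (f) can be illustrated numerically by combining the two admissible heuristics from Problem 2 (h1 and h3) against the h* values given there:

```python
# h* and the two admissible heuristics from Problem 2.
h_star = {"S": 3, "A": 4, "B": 4, "G": 0}
h1 = {"S": 0, "A": 0, "B": 0, "G": 0}
h3 = {"S": 3, "A": 4, "B": 3, "G": 0}

avg = {n: (h1[n] + h3[n]) / 2 for n in h_star}   # (d): average of admissibles
hmax = {n: max(h1[n], h3[n]) for n in h_star}    # (f): pointwise max
dbl = {n: 2 * h3[n] for n in h_star}             # (e): doubling can overestimate

print(all(avg[n] <= h_star[n] for n in h_star))   # True  -> admissible
print(all(hmax[n] <= h_star[n] for n in h_star))  # True  -> admissible
print(all(dbl[n] <= h_star[n] for n in h_star))   # False -> 2*h3(S)=6 > 3
```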


(h) If h1 and h2 are two admissible heuristics and a, b are two real numbers such that a > 0, b > 0, and a + b = 1, then IDA* is guaranteed to find the optimal solution with heuristic H = a*h1 + b*h2.
ANS: TRUE. H is admissible (by the same argument as in (d)), and IDA* is optimal with an admissible heuristic.

Problem 5. For the state space below, the path costs are shown on the links. The start state is S and the goal state is G. All search algorithms are assumed to obey the following rules:
- when a node is expanded in the search tree, the parent of the node is not added as a child;
- when multiple nodes have the same cost on the queue, they are placed in the priority queue in alphabetical order (i.e., A goes in front of B, etc.).
Otherwise, the search algorithms behave as defined in the pseudocode description in class.

[Figure: state space for Problem 5 with states S, A, B, C, G and path costs on the links; the costs were garbled in extraction.]
Suppose that the following heuristic is defined for the state space above: h(S) = 2; h(A) = 2; h(B) = 2; h(G) = h(C) = 0. Write down the sequence of nodes as they will be expanded by:

(a) A* search. ANS: S A B C G
(b) Iterative Deepening A* search. ANS: S; S A; S A B; S A B C G
(c) Best-first search. ANS: S A G
(d) Hill climbing (greedy) search. ANS: S A G

Note that you only need to mention the nodes that are popped from the priority queue, not the whole fringe. You should assume that on each iteration IDA* expands all nodes whose cost is less than or equal to the current cost limit.

Problem 6. Suppose that you are doing a project that involves optimization in a high-dimensional space (dimension on the order of several hundred). For example, such problems are typical when training neural networks. Based on the material presented in the lectures, argue which optimization algorithms are suitable for the problem at hand and which are not. Your write-up should be at most one page in length, with clear indication of which algorithms you are talking about and the reasons why you think they are or are not suitable. Please try to be as precise as possible.

Answer: You should have argued that, given the huge dimensionality, iterative improvement algorithms such as simulated annealing and gradient descent are the only feasible way to approach the problem. Using grids to localize the minimum/maximum is either infeasible (if the grid step is small) or will have little effect (if the grid step is large).

Solving the system of equations obtained by setting all the partial derivatives to 0 will also, in general, be computationally expensive.
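The scaling argument above can be sketched concretely. The toy objective below is hypothetical (a separable quadratic, not a neural network loss), but it shows why gradient descent remains feasible in hundreds of dimensions: each step costs O(dim) work, whereas a grid with k points per axis would need k**dim evaluations.

```python
import random

random.seed(0)
dim = 300  # hundreds of dimensions, as in the problem statement
target = [random.uniform(-1, 1) for _ in range(dim)]

def grad(x):
    # Gradient of f(x) = sum_i (x_i - t_i)^2, which is minimized at x = target.
    return [2 * (xi - ti) for xi, ti in zip(x, target)]

# Plain gradient descent: 200 steps, each costing only O(dim) arithmetic.
x = [0.0] * dim
for _ in range(200):
    g = grad(x)
    x = [xi - 0.1 * gi for xi, gi in zip(x, g)]

err = sum((xi - ti) ** 2 for xi, ti in zip(x, target))
print(err < 1e-6)  # True: converges despite the high dimension
```

Each coordinate's error shrinks by a constant factor (0.8) per step here, so 200 iterations suffice; by contrast, even a coarse 2-points-per-axis grid would require 2**300 evaluations.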
