Lab10 Solution Strategy: Challenging Problem
We begin by observing that this 3x3 board can have at most 9! = 362,880 arrangements of
tiles (treating the empty space as a tile too).
The given solution, perhaps counterintuitively, starts by attempting to solve a tougher
problem: we calculate the minimum number of moves for ALL possible arrangements,
instead of just the one asked of us.
The rest of the solution can be broken into two parts: the logic behind it, and the code
required to implement it.
The logic:
Say we have a hypothetical orientation (1) which takes a minimum of 10 moves to solve.
Take another orientation (2) which differs from orientation (1) by exactly 1 move. We can
say that orientation (2) will take either 9, 10 or 11 moves to solve.
Proof:
Note that the number of moves needed to go from an orientation to the solved position is
the same as the number needed to go from the solved position back to that orientation,
since every move can be reversed.
Assume it takes 8 or fewer moves to go from the solved position to orientation (2). We
could then add 1 extra move to reach orientation (1), solving it in 9 or fewer moves. But
this contradicts the fact that orientation (1) takes a minimum of 10 moves to solve.
Likewise, orientation (2) cannot take 12 or more moves to solve: we can move from
orientation (2) to orientation (1) in 1 move and then solve orientation (1) in 10 moves, so
orientation (2) takes at most 10 + 1 = 11 moves.
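This argument generalises beyond the 10-move example. Writing d(x) for the minimum number of moves needed to solve an orientation x (a notation introduced here purely for illustration, not taken from the lab code), the claim proved above can be stated compactly:

```latex
% If orientations x_1 and x_2 differ by exactly one move, then
\[
  \lvert d(x_1) - d(x_2) \rvert \le 1 .
\]
```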
Building upon this logic, we start with the solved position, which takes 0 moves to solve.
From it we find all the positions which take 1 move to solve, use those to find all the
positions which take 2 moves to solve, and so on.
We keep going until we reach the pattern provided to us as input; a sketch of this
expansion is given below.
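The following is a minimal sketch of this layer-by-layer expansion, written in Python. It stores distances in a dictionary keyed by the board tuple, whereas the actual lab solution stores them in a flat array indexed by each permutation's position (described in the next paragraph); the names SOLVED, NEIGHBOURS and min_moves_for_all are invented here for illustration.

```python
from collections import deque

SOLVED = (1, 2, 3, 4, 5, 6, 7, 8, 0)   # 0 stands for the empty space

# For each cell index 0..8 on the 3x3 board (row-major order), the cells
# the empty space can swap with (its up/down/left/right neighbours).
NEIGHBOURS = {
    0: (1, 3),    1: (0, 2, 4),    2: (1, 5),
    3: (0, 4, 6), 4: (1, 3, 5, 7), 5: (2, 4, 8),
    6: (3, 7),    7: (4, 6, 8),    8: (5, 7),
}

def min_moves_for_all():
    """Expand outwards from the solved board: the k-th layer of this
    breadth-first expansion contains exactly the orientations that need
    k moves, which is the bottom-up idea described above."""
    dist = {SOLVED: 0}            # orientation -> minimum number of moves
    queue = deque([SOLVED])
    while queue:
        state = queue.popleft()
        blank = state.index(0)
        for cell in NEIGHBOURS[blank]:
            # Slide the tile at `cell` into the empty space.
            board = list(state)
            board[blank], board[cell] = board[cell], board[blank]
            nxt = tuple(board)
            if nxt not in dist:   # first visit => shortest distance found
                dist[nxt] = dist[state] + 1
                queue.append(nxt)
    return dist

if __name__ == "__main__":
    dist = min_moves_for_all()
    print(len(dist))                           # 181440 reachable orientations
    print(dist[(1, 2, 3, 4, 5, 6, 0, 7, 8)])   # 2 moves for this orientation
```

Since only half of the 9! arrangements can be reached by legal moves, the dictionary ends up holding 9!/2 = 181,440 entries rather than 362,880.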
In the solution provided, each orientation is given a position in the array: the index that
permutation would have if all 9! permutations were arranged in ascending order.
While the code in the functions (especially orientation to position, or position to
orientation) might seem foreign or intimidating, I would recommend attempting to
recreate the functions with your own logic, in a way you are more comfortable with; one
possible version is sketched below.
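As one possible version, here is a sketch in Python of the two conversions, assuming the tiles are numbered 1 to 8 with 0 for the empty space and that positions are 0-based lexicographic ranks; the names orientation_to_position and position_to_orientation mirror the wording above, but the exact implementation in the lab code may differ.

```python
from math import factorial

def orientation_to_position(tiles):
    """0-based lexicographic rank of a permutation of 0..8: the index
    this orientation would have if all 9! permutations were listed in
    ascending order."""
    rank = 0
    remaining = list(range(9))            # symbols not yet placed, sorted
    for i, tile in enumerate(tiles):
        smaller = remaining.index(tile)   # unused symbols smaller than tile
        rank += smaller * factorial(8 - i)
        remaining.remove(tile)
    return rank

def position_to_orientation(rank):
    """Inverse of orientation_to_position: rebuild the permutation from
    its 0-based lexicographic rank."""
    remaining = list(range(9))
    tiles = []
    for i in range(9):
        f = factorial(8 - i)
        tiles.append(remaining.pop(rank // f))
        rank %= f
    return tuple(tiles)

if __name__ == "__main__":
    board = (1, 2, 3, 4, 5, 6, 7, 8, 0)
    position = orientation_to_position(board)
    assert position_to_orientation(position) == board   # round-trip check
    print(position)
```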
Note:
1. This solution is a very brute-force method, where we essentially go through all
possible moves in an attempt to find our answer. There exist many optimisations which
could be applied to speed the solution up and reduce the space it takes.
2. The solution can be treated as an example of a concept called dynamic
programming (more specifically, a bottom-up approach in dynamic programming).
Interested students can find many resources on this topic online.
3. This solution is also similar to Breadth First Search (BFS), albeit not exactly the
textbook version. A powerful optimisation can be made using the A* (A-star) technique,
with the sum of the tiles' Manhattan distances from their goal cells as the heuristic;
however, the code required for this becomes significantly more complicated. A small
sketch of such a heuristic is given after these notes.
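For the curious, below is a minimal sketch of that Manhattan-distance heuristic in Python; A* itself is not shown, and the function name manhattan_heuristic and the goal layout (tiles 1 to 8 followed by the empty space) are assumptions made here for illustration, not part of the lab code.

```python
def manhattan_heuristic(tiles, goal=(1, 2, 3, 4, 5, 6, 7, 8, 0)):
    """Sum of the row and column distances of every tile from its goal
    cell, ignoring the empty space.  Each move changes one tile's
    distance by exactly 1, so this never overestimates the true number
    of moves, which is what A* requires of its heuristic."""
    goal_cell = {tile: i for i, tile in enumerate(goal)}
    total = 0
    for i, tile in enumerate(tiles):
        if tile == 0:                      # skip the empty space
            continue
        g = goal_cell[tile]
        total += abs(i // 3 - g // 3) + abs(i % 3 - g % 3)
    return total

# Example: an orientation two moves away from solved scores 2.
print(manhattan_heuristic((1, 2, 3, 4, 5, 6, 0, 7, 8)))   # prints 2
```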