Optimization-Report
FINAL PRODUCT
USING EXACT AND HEURISTIC METHODS
TO SOLVE THE OPTIMIZATION PROBLEM
1. Overview of our problem
- There are n passengers 1, 2, 3, …, n and a bus located at point 0 that must transport all of them while satisfying the constraints of the problem. We are given a distance matrix that gives the travel distance between any two points.
These are the constraints of the problem:
- Passenger i wants to travel from point i to point i+n.
- The maximum capacity of the bus is k (at any time, there are at most k passengers on the bus).
The objective of this problem is to find the shortest route for the bus that serves all n passengers, satisfies all the constraints, and returns to point 0.
The input of our problem:
- Line 1: n and k.
- Line i+1 (i = 1, 2, …, 2n+1): the (i−1)-th row of the distance matrix (rows and columns are indexed 0, 1, 2, ..., 2n).
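For illustration, this input can be read in Python roughly as follows (a minimal sketch; the name read_input and the variables n, k, distances are ours and may differ from the actual code):

import sys

def read_input():
    # Read n, k and the (2n+1) x (2n+1) distance matrix from standard input.
    data = sys.stdin.read().split()
    it = iter(data)
    n, k = int(next(it)), int(next(it))
    distances = [[int(next(it)) for _ in range(2 * n + 1)]
                 for _ in range(2 * n + 1)]
    return n, k, distances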
2.1. Exact methods
2.1.1. Branch and Bound
- The main state variables used in the search are:
+, load: current vehicle load.
+, current_cost: cumulative travel cost.
+, start: the start time, saved to measure execution time.
+, min_cost: best known solution value (initially infinity).
+, best_path: stores the best path found so far.
+, min_dis: the smallest distance between any two distinct points in the distance matrix; it is used to compute a lower bound on the cost during the search.
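A minimal sketch of how this state could be initialised in Python (the arrays x and visited are inferred from the rest of the report; the exact names and setup may differ from the original code):

import time

n, k, distances = read_input()          # read_input from the sketch above

x = [0] * (2 * n + 1)                   # x[i]: the i-th point on the route, x[0] = 0 (depot)
visited = [False] * (2 * n + 1)         # whether each point has already been visited
visited[0] = True                       # the bus starts at the depot
load = 0                                # current vehicle load
current_cost = 0                        # cumulative travel cost
start = time.time()                     # start time, to measure execution time
min_cost = float('inf')                 # best known solution (initially infinity)
best_path = []                          # stores the best path found so far
# smallest distance between two distinct points, used for the lower bound
min_dis = min(distances[i][j]
              for i in range(2 * n + 1)
              for j in range(2 * n + 1) if i != j)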
The is_valid_move function is critical for ensuring that the bus does not exceed its capacity and that passengers are only dropped off after they have been picked up.
- Pickup condition: if the current point v is a pickup point (v ∈ [1, n]), it can be visited only if the number of passengers on the bus does not exceed the limit k and the passenger has not yet been picked up.
- Drop-off condition: if the current point v is a drop-off point (v ∈ [n+1, 2n]), it can be visited only if the passenger has already been picked up and the drop-off point has not yet been visited.
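A possible implementation of this check, with the state passed in explicitly (the report's code may keep these variables as globals):

def is_valid_move(v, visited, load, capacity, n):
    # A point can be visited at most once.
    if visited[v]:
        return False
    if 1 <= v <= n:
        # Pickup point: there must still be room on the bus.
        return load < capacity
    # Drop-off point (v in [n+1, 2n]): the corresponding pickup point v - n
    # must already have been visited, i.e. the passenger is on the bus.
    return visited[v - n]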
Try(k) Function
- Try(k) is the main recursive function that searches for the optimal route. At each step it tries all feasible next points to find the route with the lowest cost.
- Check for the base case:
+, if k == 2*n + 1: this condition checks whether all passengers have been picked up and dropped off. If k equals 2n + 1, all points have been visited.
+, total_cost = current_cost + distances[x[k-1]][0]: this calculates the total cost by adding to the current cost (current_cost) the distance from the last point (the passenger just dropped off) back to the starting point 0. This step determines whether the current route is the best one found so far.
- Check the lower bound: we only continue if the current cost plus the minimum possible cost of the remaining steps is less than the previously stored minimum cost.
- Recursive call:
+, Try(k + 1): if the above condition is satisfied, make a recursive call to continue the search from the next step.
- Backtracking: after exploring the possibility of moving to point v, we backtrack to explore other potential routes (a full sketch of the function is given below):
+, if v <= n: point v was a pickup point, so decrease the load by 1.
+, else: point v was a drop-off point, so increase the load by 1.
+, current_cost -= distances[x[k - 1]][x[k]]: undo the cost of the last move.
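Putting the pieces together, a compact sketch of the Try function, reusing the state and the is_valid_move check sketched above (the lower bound shown is one simple variant based on min_dis and may differ slightly from the original code):

def Try(step):
    global load, current_cost, min_cost, best_path
    # Base case: all 2n pickup/drop-off points have been placed on the route.
    if step == 2 * n + 1:
        total_cost = current_cost + distances[x[step - 1]][0]  # return to point 0
        if total_cost < min_cost:
            min_cost = total_cost
            best_path = x[:]
        return
    for v in range(1, 2 * n + 1):
        if not is_valid_move(v, visited, load, k, n):
            continue
        # Lower bound: at least min_dis for each leg still to travel after reaching v.
        remaining = (2 * n + 1 - step) * min_dis
        if current_cost + distances[x[step - 1]][v] + remaining >= min_cost:
            continue
        # Move to v.
        x[step] = v
        visited[v] = True
        current_cost += distances[x[step - 1]][v]
        load += 1 if v <= n else -1
        Try(step + 1)
        # Backtrack: undo the move.
        load -= 1 if v <= n else -1
        current_cost -= distances[x[step - 1]][v]
        visited[v] = False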
2.1.2. Constraint Programming
- In this method, we first model the problem and then use the OR-Tools library to solve it.
The following is how we model the problem and implement the code:
- After that, we model each constraint and implement the code for it:
Constraint 1: each point is visited from exactly one point, and from each point the bus moves to exactly one point.
Constraint 2: a passenger is picked up (the load increases) when the bus moves to a pickup point.
Constraint 4: the bus cannot return to point 0 from a pickup point and cannot move to a drop-off point from the starting point.
Constraint 5: the index (position on the route) of a pickup point is smaller than the index of its corresponding drop-off point.
Constraint 6: flow constraint
- Finally, we define the objective function, which depends on the declared variables, as follows:
- After modeling, all the variables and constraints, together with the objective function, have been added to the solver. Now we use OR-Tools to solve the model and obtain the solution:
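For illustration, a model along the lines described above could be written with OR-Tools CP-SAT roughly as follows. This is our reconstruction, assuming arc variables x[i][j], position variables pos[i] and load variables load[i]; the actual model in the report may differ in details:

from ortools.sat.python import cp_model

def solve_cp(n, k, distances):
    N = 2 * n + 1                                     # points 0 .. 2n
    model = cp_model.CpModel()

    # x[i][j] = 1 if the bus travels directly from point i to point j.
    x = [[model.NewBoolVar(f'x_{i}_{j}') if i != j else None for j in range(N)]
         for i in range(N)]
    # pos[i]: position of point i on the route; load[i]: load when leaving point i.
    pos = [model.NewIntVar(0, N - 1, f'pos_{i}') for i in range(N)]
    load = [model.NewIntVar(0, k, f'load_{i}') for i in range(N)]

    # Constraint 1: every point is left exactly once and entered exactly once.
    for i in range(N):
        model.Add(sum(x[i][j] for j in range(N) if j != i) == 1)
        model.Add(sum(x[j][i] for j in range(N) if j != i) == 1)

    model.Add(pos[0] == 0)
    model.Add(load[0] == 0)
    for i in range(N):
        for j in range(1, N):
            if i == j:
                continue
            # Flow constraint: consecutive points on the route get consecutive positions.
            model.Add(pos[j] == pos[i] + 1).OnlyEnforceIf(x[i][j])
            # Constraint 2: the load goes up by 1 at a pickup and down by 1 at a drop-off.
            delta = 1 if 1 <= j <= n else -1
            model.Add(load[j] == load[i] + delta).OnlyEnforceIf(x[i][j])

    for i in range(1, n + 1):
        # Constraint 4: no arc from a pickup point back to the depot,
        # and no arc from the depot to a drop-off point.
        model.Add(x[i][0] == 0)
        model.Add(x[0][i + n] == 0)
        # Constraint 5: a passenger is picked up before being dropped off.
        model.Add(pos[i] < pos[i + n])

    # Objective: minimise the total travelled distance.
    model.Minimize(sum(distances[i][j] * x[i][j]
                       for i in range(N) for j in range(N) if i != j))

    solver = cp_model.CpSolver()
    status = solver.Solve(model)
    if status in (cp_model.OPTIMAL, cp_model.FEASIBLE):
        return solver.ObjectiveValue()
    return None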
2.1.3. Integer Programming
- With the same approach as Constraint Programming, we use OR-Tools to solve the problem, but we must model it in a different way.
- The constraints:
- And finally, the objective function:
We add our model to the SAT solver in the OR-Tools library to solve it and obtain the solution.
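As an illustration of an integer-programming style of modeling, the same arc model can be linearised with big-M constraints and handed to OR-Tools' linear solver wrapper (pywraplp with SCIP). This is a generic sketch, not the exact model or solver interface used in the report, whose constraints are not reproduced here:

from ortools.linear_solver import pywraplp

def solve_ip(n, k, distances):
    N = 2 * n + 1
    solver = pywraplp.Solver.CreateSolver('SCIP')

    # y[i][j] = 1 if the bus travels directly from point i to point j.
    y = [[solver.BoolVar(f'y_{i}_{j}') for j in range(N)] for i in range(N)]
    # u[i]: position of point i on the route; l[i]: load when leaving point i.
    u = [solver.IntVar(0, N - 1, f'u_{i}') for i in range(N)]
    l = [solver.IntVar(0, k, f'l_{i}') for i in range(N)]

    for i in range(N):
        solver.Add(y[i][i] == 0)
        solver.Add(sum(y[i][j] for j in range(N)) == 1)   # leave every point exactly once
        solver.Add(sum(y[j][i] for j in range(N)) == 1)   # enter every point exactly once

    solver.Add(u[0] == 0)
    solver.Add(l[0] == 0)
    for i in range(N):
        for j in range(1, N):
            if i == j:
                continue
            delta = 1 if j <= n else -1
            # Big-M linearisation of: if y[i][j] = 1 then u[j] = u[i] + 1 and l[j] = l[i] + delta.
            solver.Add(u[j] >= u[i] + 1 - N * (1 - y[i][j]))
            solver.Add(u[j] <= u[i] + 1 + N * (1 - y[i][j]))
            solver.Add(l[j] >= l[i] + delta - (k + 1) * (1 - y[i][j]))
            solver.Add(l[j] <= l[i] + delta + (k + 1) * (1 - y[i][j]))

    for i in range(1, n + 1):
        solver.Add(u[i] + 1 <= u[i + n])    # pickup before the corresponding drop-off
        solver.Add(y[i][0] == 0)            # no return to the depot from a pickup point
        solver.Add(y[0][i + n] == 0)        # no arc from the depot to a drop-off point

    solver.Minimize(sum(distances[i][j] * y[i][j]
                        for i in range(N) for j in range(N) if i != j))

    if solver.Solve() == pywraplp.Solver.OPTIMAL:
        return solver.Objective().Value()
    return None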
2.2. Heuristic methods
- Unlike the exact methods, these methods rely on heuristic rules instead of enumerating all configurations, so we can find a feasible solution at a reasonable computational cost.
2.2.1. Greedy Algorithms
- If all nodes are visited (k == 2*n + 1), calculate the total cost (including the return to the depot).
- If this cost is smaller than min_cost, we update min_cost and save the path.
- Next, we find the next move v with the minimum distance from the current position x[k-1], ensuring that the move is valid according to the is_valid_move function.
- Then we update x[k] (the next node in the path), mark it as visited, update the cost, and adjust the load.
- We call greedy(k+1) to proceed to the next node until all nodes have been visited (see the sketch after this list).
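A compact sketch of this greedy procedure, reusing the state and the is_valid_move check from the Branch and Bound section (names are ours and may differ from the original code):

def greedy(step):
    global load, current_cost, min_cost, best_path
    # All nodes placed: close the tour by returning to the depot and record the cost.
    if step == 2 * n + 1:
        total_cost = current_cost + distances[x[step - 1]][0]
        if total_cost < min_cost:
            min_cost = total_cost
            best_path = x[:]
        return
    # Choose the valid move with the smallest distance from the current position.
    v = min((w for w in range(1, 2 * n + 1) if is_valid_move(w, visited, load, k, n)),
            key=lambda w: distances[x[step - 1]][w])
    x[step] = v
    visited[v] = True
    current_cost += distances[x[step - 1]][v]
    load += 1 if v <= n else -1
    greedy(step + 1)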
2.2.2. Local Search
- Local search is an optimization algorithm commonly used to find approximate solutions to combinatorial optimization problems. Instead of searching the entire solution space, it starts from an initial solution and iteratively improves it by making small changes.
This is our strategy:
- First, we generate a random solution and check whether it is a valid route; if not, we generate again until we obtain a valid random initial route:
- The function to check whether the randomly generated route is valid (both steps are sketched below):
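A minimal sketch of both steps (the function names random_route, is_valid_route and initial_route are ours):

import random

def random_route(n):
    # A route starts at the depot and visits the 2n pickup/drop-off points in random order.
    points = list(range(1, 2 * n + 1))
    random.shuffle(points)
    return [0] + points

def is_valid_route(route, n, capacity):
    picked = set()
    load = 0
    for v in route[1:]:
        if 1 <= v <= n:                  # pickup point
            load += 1
            if load > capacity:
                return False             # capacity exceeded
            picked.add(v)
        else:                            # drop-off point
            if (v - n) not in picked:
                return False             # dropped off before being picked up
            load -= 1
    return True

def initial_route(n, capacity):
    # Keep generating random routes until a feasible one is found.
    route = random_route(n)
    while not is_valid_route(route, n, capacity):
        route = random_route(n)
    return route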
- After obtaining an initial route that satisfies all constraints, we try to improve this solution by generating alternative feasible routes through small modifications of the current route. Two primary moves are employed: 2-opt swap and relocation.
- The total distance of the current route is calculated. For each newly generated route, its total distance is computed and compared to the current route's total distance. If a shorter total distance is found, the new route becomes the best route, and the shortest total distance is updated accordingly.
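A sketch of this improvement loop; random_neighbor applies one of the two moves (2-opt swap or relocation) and is sketched after the headings below, and max_iters is an illustrative stopping criterion, not necessarily the one used in the report:

def route_length(route, distances):
    # Total distance of the route, including the return to the depot.
    total = sum(distances[route[i]][route[i + 1]] for i in range(len(route) - 1))
    return total + distances[route[-1]][route[0]]

def local_search(route, distances, n, capacity, max_iters=10000):
    best = route[:]
    best_len = route_length(best, distances)
    for _ in range(max_iters):
        candidate = random_neighbor(best)              # 2-opt swap or relocation
        if not is_valid_route(candidate, n, capacity):
            continue                                   # keep only feasible neighbours
        cand_len = route_length(candidate, distances)
        if cand_len < best_len:                        # accept only improving moves
            best, best_len = candidate, cand_len
    return best, best_len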
2-Opt Swap:
Relocation:
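Illustrative implementations of the two moves; here the segment boundaries and insertion positions are chosen at random, whereas the original code may enumerate them differently:

import random

def two_opt_swap(route):
    # Reverse a random segment of the route (the depot at index 0 stays in place).
    i, j = sorted(random.sample(range(1, len(route)), 2))
    return route[:i] + route[i:j + 1][::-1] + route[j + 1:]

def relocation(route):
    # Remove one point and reinsert it at another (random) position.
    i = random.randrange(1, len(route))
    rest = route[:i] + route[i + 1:]
    j = random.randrange(1, len(rest) + 1)
    return rest[:j] + [route[i]] + rest[j:]

def random_neighbor(route):
    # Apply one of the two moves, chosen at random.
    return two_opt_swap(route) if random.random() < 0.5 else relocation(route)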
2.2.3. Metaheuristic
2.2.3.1. Simulated Annealing
- For the metaheuristic method, we use simulated annealing to try to escape local optima.
- Simulated annealing has several parameters:
- We can tune these parameters to get better results (at the cost of much more running time), or to get results much faster, although the result is then not always as good.
- After that, we accept the new route if its distance is smaller, OR, if its distance is larger, it may still be accepted probabilistically:
- At each iteration we decrease the temperature, and when the current temperature is small enough we stop the algorithm (a sketch is given below):
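A sketch of the annealing loop, reusing route_length, is_valid_route and random_neighbor from the previous sketches; the parameter values shown (initial_temp, cooling_rate, min_temp) are placeholders, not the ones used in the report:

import math
import random

def simulated_annealing(route, distances, n, capacity,
                        initial_temp=1000.0, cooling_rate=0.995, min_temp=1e-3):
    current = route[:]
    current_len = route_length(current, distances)
    best, best_len = current[:], current_len
    temp = initial_temp
    while temp > min_temp:
        candidate = random_neighbor(current)
        if is_valid_route(candidate, n, capacity):
            cand_len = route_length(candidate, distances)
            delta = cand_len - current_len
            # Accept improving moves, and worse moves with probability exp(-delta / temp).
            if delta < 0 or random.random() < math.exp(-delta / temp):
                current, current_len = candidate, cand_len
                if current_len < best_len:
                    best, best_len = current[:], current_len
        temp *= cooling_rate                           # cool down at every iteration
    return best, best_len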
3. Result and conclusion
Test (n, k) | Branch and Bound | Integer Programming | Constraint Programming | Greedy | Local Search | Simulated Annealing
5, 3 | 37 | 37 | 37 | 49 | 45 | 41
8, 4 | 63 | 63 | 63 | 97 | 75 | 77
10, 6 | - | 38 | 38 | 41 | 41 | 42
15, 6 | - | 64 | 64 | 93 | 98 | 98
40, 12 | - | - | - | 174 | 211 | 236
100, 40 | - | - | - | 144 | 204 | 230
500, 40 | - | - | - | 6552 | 9228 | 17928
1000, 40 | - | - | - | 12176 | - | 41808
Table of values: best route length found by each method (the first three columns are exact methods, the last three are heuristic methods; a dash means the method did not return a result for that test).
From the table of values and the graph of the running time of each algorithm on different test sizes, we draw some conclusions:
- Comparison between the exact methods:
+, With small test cases such as n = 15, Integer Programming and Constraint Programming can give us the optimal value, while Branch and Bound can only run for n = 8 or smaller.
- Comparison between the heuristic methods:
+, Local Search and Simulated Annealing depend on the random initial route, so the result changes from run to run; if the initial route is good enough, the results of these algorithms are better than Greedy.
+, But on large test cases, these algorithms take a long time to execute and the result is not as good as the Greedy algorithm, so if we want an algorithm that is fast and "good" enough, the Greedy algorithm is always a good choice.
4. Work assignment
Branch and Bound code: Ly
IP, CP code: Linh
Greedy code: Hiếu
Local Search, Simulated Annealing code: Trung Anh
Testing: Trung Anh
Presentation:
Slide: Ly
Report: Hiếu, Trung Anh, Linh, Ly