
ADVCOMP 2010 : The Fourth International Conference on Advanced Engineering Computing and Applications in Sciences

A Novel Local Search Algorithm for Knapsack Problem

Mostafa Memariani
Department of Electrical Engineering, Ferdowsi University of Mashhad, Mashhad, Iran
E-mail: [email protected]

Kambiz Shojaee Ghandeshtani
Department of Electrical Engineering, University of Tehran, Tehran, Iran
E-mail: [email protected]

Ahmad Madadi
Department of Computer Engineering, Amirkabir University of Technology, Tehran, Iran
E-mail: [email protected]

Mohammad Mohsen Neshati
Department of Electrical Engineering, Ferdowsi University of Mashhad, Mashhad, Iran
E-mail: [email protected]

Abstract— The knapsack problem is an integer program generally called the "multidimensional knapsack" and is known to be NP-hard. This paper introduces a new idea for solving the one-dimensional knapsack problem: defining a "weight-value index", "sorting" and a "smart local search" together form a new algorithm. The algorithm is mathematically formulated and has been run on 5 sample one-dimensional knapsack problems; in most of them the result is close to the optimum. The results show that, compared with other methods recently published in this field, and despite its simplicity, the method is functional enough to obtain good results on the tested items.

Keywords-Artificial intelligence; NP-hard; Knapsack problem; Combinatorial optimization.

I. INTRODUCTION

The knapsack problem is an integer program generally called the "multidimensional knapsack" and is known to be NP-hard [1]. The one-dimensional knapsack problem with a "constant weight group" is a special form of the multidimensional knapsack. More precise evolutionary algorithms have been studied for the one-dimensional knapsack than for the multidimensional one, and most of the research concerns the one-dimensional problem. For further information about the knapsack problem and the different exact algorithms, please refer to [2]-[4].

The problem is named "knapsack" because of its similarity to the decision faced by a mountain climber packing a knapsack: the person must choose the optimal combination of accessories so that, subject to the knapsack's capacity, the selected items have the most value (profit). Problems of this kind belong to the family of combinatorial optimization problems.

For many years, exact methods such as branch and bound were used to solve the knapsack problem [22]. In recent years, with the development of intelligent optimization and evolutionary algorithms, harder problems have become solvable; besides reducing the time needed to reach results close to the optimum, these methods have increased the accuracy of solving the knapsack problem. Evolutionary algorithms, and more specifically decoder-based evolutionary algorithms, are therefore widely used for the knapsack problem [5], [6]. Their advantage over the more traditional direct representation of the problem is their ability to always generate, and therefore evolve over, feasible candidate solutions, thus focusing the search on a smaller, more constrained search space.

Many researchers have worked on evolutionary methods for knapsack problems. Among them we can name modern methods such as tabu search [7], [8], genetic algorithms [9], [10] and simulated annealing [11], [12], which in most cases show good results. In recent years, genetic algorithms have proven to be the best method for solving large knapsack problems and, in general, 0-1 integer programming problems [13], [14].

The knapsack problem appears repeatedly in different processing models, such as processor allocation in distributed systems [15], manufacturing in-sourcing [16], asset-backed securitization [17], combinatorial auctions [18], computer systems design [19], resource allocation [20], set packing [21], cargo loading [22], project selection [23], cutting stock [24] and capital budgeting (where each project has a profit and consumes units of resource, and the goal is to determine a subset of projects such that the total profit is maximized and all resource constraints are satisfied) [25].

Another type of knapsack is the Quadratic Knapsack Problem (QKP) [26], in which an object's value density is the sum of all the values associated with it divided by its weight. It can be used in finance [27], VLSI design [28] and location problems [29].

In the second part of this paper we describe the knapsack problem; in the third part the proposed algorithm is introduced; in the fourth part the algorithm simulation and a comparison of results are presented; and we conclude in the final part.
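As a point of reference for the exact methods discussed above ([2]-[4], branch and bound [22]), the one-dimensional 0-1 knapsack can be solved optimally by the textbook dynamic program over capacities. The following is an illustrative Python sketch of that standard technique, not code taken from those references:

```python
def knapsack_exact(values, weights, capacity):
    """Exact 0-1 knapsack via dynamic programming over capacities.

    Runs in O(n * capacity) time and space, which is why exact
    methods become expensive on large or hard instances.
    """
    best = [0] * (capacity + 1)  # best[c] = max value achievable with capacity c
    for v, w in zip(values, weights):
        # Iterate capacities downward so each item is used at most once.
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Small example: capacity 10; the optimum combination has total value 11.
print(knapsack_exact([6, 5, 6], [5, 4, 6], 10))
```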

Copyright (c) IARIA, 2010 ISBN: 978-1-61208-101-4 77



II. PROBLEM DESCRIPTION

Suppose that some items are available and each item 'i' has a weight 'wi' and a value 'vi'. In the knapsack problem, the weight restriction requires the total weight of the selected items to be less than the knapsack capacity. The goal is to find a subset of items that has the greatest total value while also satisfying the capacity constraint.

Defining these concepts mathematically, we have:

    max { Σ(i=1..n) vi·xi : Σ(i=1..n) wi·xi ≤ b, xi = 0 or 1, i = 1,...,n }    (1)

In formula (1), 'n', 'vi' and 'wi' are the number of items, the value of item 'i' and the weight of item 'i', respectively; 'b' is the knapsack capacity and xi is the algorithm's decision variable: xi is 1 if the item is chosen and 0 otherwise.

As formula (1) shows, the goal is to maximize the objective function subject to the given constraints. In the next section, the proposed algorithm for solving the knapsack problem is introduced.

III. PROPOSED ALGORITHM

The presented method for solving the knapsack problem is based on statistical operations on the data combined with artificial intelligence methods. We have a set of paired weight and value data, each pair giving the weight and the value of one item. The first stage of the method assigns each item a new coefficient that combines its value and weight; with the help of this new index, the chance of selecting an item is defined. With enough iterations, changing the selection method based on the weight-value index in a converging evolutionary process, the proposed algorithm provides results close to the optimum. The stages of processing the data to reach a result truly close to the optimum are as follows:

• Since the goal of the knapsack problem is to maximize the sum of values while satisfying the weight constraint of the knapsack, we convert the two item attributes into one by using the general form (Value^p1 / Weight^p2), where p1 and p2 are the powers of the values and weights, respectively. Their best values differ depending on the number of items and their dispersion; by scanning the powers of values and weights in this combined index, and computing the sum of selected item values until the weight constraint is met, we can obtain the best powers of the formula at the beginning of the algorithm. This quantity is the "weight-value index" of the items.

• The next step is sorting the items by their weight-value index and generating an initial result that is close to the optimum. In this selection, items with a higher weight-value index are prioritized, and selection continues until the knapsack capacity is full.

• Because the method used in the first stage to generate the weight-value index is not exact, errors can exist in the second stage as well. It is important to know that the probability of error in selecting items by the proposed weight-value priority increases as we get closer to the final stages, when the knapsack is being filled with items of lower weight-value index. Therefore in this stage, which is the main part of the algorithm, we replace items with similar weight-value index in the final stages of selection, gradually widening the search boundary, and study the different results to achieve the best one.

In this intelligent search algorithm, some modifications and corrections beyond the previous stages yielded better results. For instance, we can bound the minimum number of selected items by dividing the knapsack capacity by the highest item weight; this narrows the range of weight-value indices to items whose probability of belonging to the optimal answer is very high.

The main foundation of the method has been introduced above in three steps; the step-by-step description of the algorithm follows.

IV. ALGORITHM FORMULA

s1- Determine the optimal powers for the weight-value index by scanning p1 and p2 from 0 to 2 in steps of 0.1, selecting items by weight-value-index priority until the knapsack is completely full, and keeping the powers whose total selected value is higher than for the other scanned powers.

s2- Randomly search around the powers selected in s1 within a radius boundary of α = 0.5.

s3- Sort and select items based on the weight-value index of s2 until the knapsack capacity is filled, recording the sequence of accepted items l1 and rejected items l2.

s4- Fix the items from the value vector of s3 that lie above the mean plus standard deviation of the weight-vector elements as selected items, and randomly replace the remaining selected items of s3 (and rejected items as well) around the last selection of s3, within a radius of 0.1, over l1 and l2.

s5- Check the minimum-selection rule: the minimum number of selected items equals the maximum capacity of the knapsack divided by the highest item weight; increase the length of the accepted-item sequence (l1) until this rule is satisfied.

s6- Compare the answer of the current selection with the best achieved result and replace the previous one if the new answer is better.

s7- α = 0.5 + α

s8- Reduce the radius boundary of the optimal power index by a coefficient of 0.9.

s9- Repeat s1 to s9 until α = 1 and the radius boundary has reached its interval limit.
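Steps s1-s9 above can be sketched in code. The following Python rendering is our own simplified reading of the algorithm: it implements the power scan (s1), the greedy fill by weight-value index (s3) and the shrinking-radius random search around the best powers (s2, s6-s8), while the replacement and minimum-selection rules (s4, s5) are omitted for brevity. All function names are ours, and the original implementation was in MATLAB:

```python
import random

def greedy_fill(values, weights, capacity, p1, p2):
    """Sort items by the weight-value index v**p1 / w**p2 and fill greedily (s3)."""
    order = sorted(range(len(values)),
                   key=lambda i: values[i] ** p1 / weights[i] ** p2,
                   reverse=True)
    chosen = [0] * len(values)
    total_w = total_v = 0
    for i in order:
        if total_w + weights[i] <= capacity:
            chosen[i] = 1
            total_w += weights[i]
            total_v += values[i]
    return total_v, chosen

def wekp_sketch(values, weights, capacity, iters=200, seed=0):
    """Scan the powers p1, p2 over [0, 2] in steps of 0.1 (s1), then search
    randomly around the best pair with a shrinking radius (s2, s6-s8)."""
    rng = random.Random(seed)
    grid = [round(0.1 * k, 1) for k in range(21)]
    best_v, best_x, best_p = -1, None, (1.0, 1.0)
    for p1 in grid:                       # s1: exhaustive power scan
        for p2 in grid:
            v, x = greedy_fill(values, weights, capacity, p1, p2)
            if v > best_v:
                best_v, best_x, best_p = v, x, (p1, p2)
    radius = 0.5                          # s2: initial radius boundary
    for _ in range(iters):
        p1 = max(0.0, best_p[0] + rng.uniform(-radius, radius))
        p2 = max(0.0, best_p[1] + rng.uniform(-radius, radius))
        v, x = greedy_fill(values, weights, capacity, p1, p2)
        if v > best_v:                    # s6: keep the better answer
            best_v, best_x, best_p = v, x, (p1, p2)
        radius *= 0.9                     # s8: shrink the search radius
    return best_v, best_x
```

With p1 = p2 = 1, greedy_fill reduces to the classical value/weight-density greedy heuristic; the power scan and local search are what distinguish the method described above from that baseline.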




V. RESULTS AND COMPARISON

In this part, the results of running the algorithm on the data sets given in [30]-[32] are analyzed. Five sample problems are proposed in [30]-[32] for testing the algorithm. In [30], samples e2, e3 and e5 have been solved with different methods; in [31], samples e1 to e4 and in [32], samples e1 and e2 have been studied. The samples e1 to e5 have 10, 20, 50, 100 and 100 objects, respectively. Clearly, samples with a greater number of objects are more complicated than samples with fewer objects, and they are more difficult and time-consuming to solve.

In Table 1, the best results obtained in the relevant papers are compared with the results of our proposed algorithm, called Wise Experiencing Knapsack Problem (WEKP). As Table 1 shows, the proposed algorithm produces acceptable answers.

The algorithm introduced in [30] improved the results of the greedy and simple evolutionary algorithms by 0.9 and 1.9 percent relative to the best answer. The algorithm of [31], which combines greedy and genetic algorithms, improved the results of problems e1-e4 by 0.7 and 0.2 percent. The algorithm of [32], an enhanced form of ACO, likewise shows a 0.2 percent improvement on problems e1 and e2. After simulation, the proposed algorithm improves the results by 0.16 percent with respect to [30], 0.05 percent with respect to [31] and 0.9 percent with respect to [32].

In Table 1 we can also see that for the third sample problem we achieved a result that had not been reached in the other papers.

In Table 2, the best, average and worst answers over 20 runs of every sample are given. Also, the sequence of the best obtained result for every sample is given as a string of 0s and 1s, where 0 means the ith object is not selected and 1 stands for selecting the ith object. As Table 2 illustrates, even the average of the responses is very close to the optimum, and these responses are obtained in an acceptable time. The mean running time for each problem, measured on a Pentium 4 with a 1.8 GHz processor and 512 MB of RAM running MATLAB 7.7, is also reported.

VI. CONCLUSION

This paper introduced a novel idea for solving the one-dimensional knapsack problem by defining a weight-value index and sorting; as a consequence, a new algorithm was proposed. The algorithm is mathematically formulated and has been run on 5 one-dimensional knapsack samples; in most of them the answers are near the optimum. The results show that this method, compared with the recent works published in this field, and despite its simplicity, is functional enough to achieve acceptable results on the tested problems.

ACKNOWLEDGMENT

This work is supported by Nano-Age Technology Group in Mashhad, Iran (www.nanoage.ir).

REFERENCES
[1] M. Garey and D. Johnson, "Computers and Intractability: A Guide to the Theory of NP-Completeness," W. H. Freeman, San Francisco, 1979.
[2] S. Martello and P. Toth, "Knapsack Problems: Algorithms and Computer Implementations," Wiley, New York, 1990.
[3] S. Martello, D. Pisinger, and P. Toth, "New trends in exact algorithms for the 0-1 knapsack problem," European Journal of Operational Research, vol. 123, no. 2, pp. 325-336, 1999.
[4] D. Pisinger, "Contributed research articles: a minimal algorithm for the bounded knapsack problem," ORSA Journal on Computing, vol. 12, no. 1, pp. 75-84, 2000.
[5] J. Gottlieb, "Permutation-based evolutionary algorithms for multidimensional knapsack problems," Proc. of ACM Symp. on Applied Computing, 2000.
[6] G. R. Raidl, "An improved genetic algorithm for the multiconstrained 0-1 knapsack problem," Proc. of 1998 IEEE Congress on Evolutionary Computation, pp. 207-211, 1998.
[7] F. Glover and G. A. Kochenberger, "Critical event tabu search for multidimensional knapsack problems," Kluwer Academic Publishers, pp. 407-427, 1996.
[8] S. Hanafi and A. Fréville, "An efficient tabu search approach for the 0-1 multidimensional knapsack problem," European Journal of Operational Research, vol. 106, pp. 659-675, 1998.
[9] P. Chu and J. Beasley, "A genetic algorithm for the multiconstrained knapsack problem," Journal of Heuristics, vol. 4, pp. 63-86, 1998.
[10] G. R. Raidl, "Weight-codings in a genetic algorithm for the multiconstraint knapsack problem," Proc. of 1999 IEEE Congress on Evolutionary Computation, pp. 596-603, 1999.
[11] C. Reeves, "Modern Heuristic Techniques for Combinatorial Problems," McGraw-Hill Book Company Europe, 1995.
[12] A. Drexl, "A simulated annealing approach to the multiconstraint zero-one knapsack problem," Computing, vol. 40, pp. 1-8, 1988.
[13] Y. Sun and Z. Wang, "The genetic algorithm for 0-1 programming with linear constraints," Proc. of the 1st ICEC'94, Orlando, FL, edited by D. B. Fogel, pp. 559-564, 1994.
[14] R. Hinterding, "Mapping, order-independent genes and the knapsack problem," Proc. of the 1st IEEE International Conference on Evolutionary Computation 1994, Orlando, FL, edited by D. B. Fogel, pp. 13-17, 1994.
[15] B. Gavish and H. Pirkul, "Allocation of databases and processors in a distributed computing system," Management of Distributed Data Processing, vol. 31, pp. 215-231, 1982.
[16] N. S. Cherbaka, R. D. Meller, and K. P. Ellis, "Multidimensional knapsack problems and their application to solving manufacturing insourcing problems," Proc. of the Annual Industrial Engineering Research Conference, Houston, TX, May 16-19, 2004.
[17] R. Mansini and M. Speranza, "A multidimensional knapsack model for the asset-backed securitization," Journal of the Operational Research Society, vol. 53, pp. 822-832, 2002.
[18] S. DeVries and R. Vohra, "Combinatorial auctions: a survey," Northwestern University Technical Report, Evanston, IL, 2000.
[19] C. Ferreira, M. Grötschel, S. Kiefl, C. Krispenz, A. Martin, and R. Weismantel, "Some integer programs arising in the design of mainframe computers," ZOR Methods and Models of Operations Research, vol. 38, no. 1, pp. 77-110, 1993.
[20] E. Johnson, M. Kostreva, and U. Suhl, "Solving 0-1 integer programming problems arising from large scale planning models," Operations Research, vol. 33, pp. 805-819, 1985.




[21] G. Fox and G. Scudder, "A heuristic with tie breaking for certain 0-1 integer programming models," Naval Research Logistics, vol. 32, no. 4, pp. 613-623, 1985.
[22] W. Shih, "A branch and bound method for the multiconstraint zero-one knapsack problems," Journal of the Operations Research Society, vol. 30, pp. 369-378, 1979.
[23] C. Peterson, "Computational experience with variants of the Balas algorithm applied to the selection of research and development projects," Management Science, vol. 13, pp. 736-750, 1967.
[24] P. Gilmore and R. Gomory, "The theory and computation of knapsack functions," Operations Research, vol. 14, pp. 1045-1074, 1966.
[25] J. Lorie and L. Savage, "Three problems in capital rationing," Journal of Business, vol. 28, pp. 229-239, 1955.
[26] B. A. Julstrom, "Greedy, genetic, and greedy genetic algorithms for the quadratic knapsack problem," GECCO'05, Washington, DC, USA, pp. 607-614, June 25-29, 2005.
[27] D. L. Laughhunn, "Quadratic binary programming with applications to capital budgeting problems," Operations Research, vol. 18, pp. 454-461, 1970.
[28] C. E. Ferreira, A. Martin, C. C. de Souza, R. Weismantel, and L. A. Wolsey, "Formulations and valid inequalities for node capacitated graph partitioning," Mathematical Programming, vol. 74, pp. 247-266, 1996.
[29] J. Rhys, "A selection problem of shared fixed costs and network flows," Management Science, vol. 17, pp. 200-207, 1970.
[30] K. Li, Y. Jia, W. Zhang, and Y. Xie, "A new method for solving 0-1 knapsack problem based on evolutionary algorithm with schema replaced," Proc. of the IEEE International Conference on Automation and Logistics, Qingdao, China, pp. 2569-2571, Sep. 2008.
[31] Y. Shao, H. Xu, and W. Yin, "Solve zero-one knapsack problem by greedy GA," IEEE 2009 International Workshop on Intelligent Systems and Applications.
[32] P. Zhao, P. Zhao, and X. Zhang, "A new ant colony optimization for the knapsack problem," 7th International Conference on Computer-Aided Industrial Design and Conceptual Design (CAIDCD '06), Nov. 17-19, 2006.

TABLE 1. COMPARATIVE RESULTS BY OTHER HEURISTIC METHODS

                 [30]                                        [31]                                        [32]                WEKP
     Greedy     Simple        Evolutionary      Greedy     Standard      Greedy        Basic   Improved   Proposed
     algorithm  evolutionary  algorithm with    algorithm  genetic       genetic       ACO     ACO        method
                algorithm     schema replace    (GA)       algorithm     algorithm                        (best
                                                           (SGA)         (GGA)                            result)
e1   -          -             -                 295        295           295           292     295        295
e2   1023       1042          1042              1024       1037          1042          1022    1024       1042
e3   3095       3077          3103              3077       3103          3112          -       -          3119
e4   -          -             -                 5372       5365          5375          -       -          5372
e5   26380      25848         26559             -          -             -             -       -          26553

TABLE 2. BEST RESULTS FOR FIVE SAMPLE PROBLEMS

                  Example 1   Example 2   Example 3   Example 4   Example 5
No. of objects    10          20          50          100         100
Best              295         1042        3119        5372        26553
Mean (20 runs)    295         1040.5      3106        5367.4      26553
Worst             295         1037        3115.1      5360        26553
Mean time (s)     7.1         22.8        13.08       23.07       45.33
Best chromosome   (0/1 selection strings of length equal to the number of objects; not legible in this copy)
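The chromosome encoding used in Table 2 (a 0/1 string whose ith bit selects the ith object) can be decoded as follows; the three-item instance here is invented for illustration and is not one of e1-e5:

```python
def decode(chromosome, values, weights):
    """Return (total value, total weight) of a 0/1 selection string."""
    picked = [i for i, bit in enumerate(chromosome) if bit == "1"]
    return (sum(values[i] for i in picked),
            sum(weights[i] for i in picked))

value, weight = decode("101", [10, 20, 30], [1, 2, 3])
# Selects objects 1 and 3: value 40, weight 4.
```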

