1 Introduction
The bin packing problem belongs to the family of cutting and packing (C&P) problems
[1]. It occupies an important place in the literature of C&P problems, as it can be
encountered in several fields, such as logistics [2], industry [3], and Cloud computing
[4]. In this paper, we consider the bin packing problem of rectangles, called the two-
dimensional bin packing problem (2D-BPP). We are given a finite number of small
rectangles (items) and an unlimited number of large, identical rectangles (bins). The aim
is to allocate all items, without overlapping, to a minimum number of bins. Formally,
2 Background
In this section, we briefly review the genetic algorithm and the crow search algorithm.
crow i has a memory $m_i(G)$ at generation $G$, in which it memorizes its best hiding
place. The quality of positions is measured by a fitness or objective function of the
optimization problem.
The CSA starts with initial positions of crows and sets the memory of crows to their
initial positions, and then iteratively updates the position and the memory of crows
according to the following formulas:
$$
x_i(G) =
\begin{cases}
x_i(G-1) + r_i \times FL \times \big( m_j(G-1) - x_i(G-1) \big) & \text{if } r_j \geq AP \\
\text{a random position} & \text{otherwise}
\end{cases}
\tag{1}
$$

$$
m_i(G) =
\begin{cases}
x_i(G) & \text{if } f(x_i(G)) \geq f(m_i(G-1)) \\
m_i(G-1) & \text{otherwise}
\end{cases}
\tag{2}
$$
In fact, at each generation, each crow i randomly selects a crow j from the swarm and
follows it, aiming to steal its food. If crow j detects the presence of crow i, the
former fools the latter, and crow i moves to a random position. We note that $FL$ is
the crow's flight length, $AP$ is the awareness probability, $f(\cdot)$ is the fitness
function, while $r_i$ and $r_j$ are uniformly distributed random numbers in the interval [0, 1].
The CSA terminates when a maximum number of generations is reached; it then picks
out the overall best memory in the swarm as the global optimum of the problem.
For more details, we refer the reader to the original CSA paper [24].
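To make the update rules concrete, Eqs. (1) and (2) can be sketched for a real-valued search space as follows. This is a minimal illustrative reconstruction, not the authors' implementation; the class and method names are ours.

```java
import java.util.Random;

// Sketch of the CSA update rules in Eqs. (1) and (2) for a real-valued
// search space; all names are illustrative, not the authors' code.
public class CsaUpdate {
    static final Random RNG = new Random(42);

    // Eq. (1): crow i follows the memory of a randomly chosen crow j,
    // unless crow j is aware (r_j < AP), in which case i jumps randomly.
    static double[] updatePosition(double[] xi, double[] mj,
                                   double fl, double ap,
                                   double lower, double upper) {
        double[] next = new double[xi.length];
        if (RNG.nextDouble() >= ap) {                    // r_j >= AP: follow crow j
            double ri = RNG.nextDouble();
            for (int d = 0; d < xi.length; d++)
                next[d] = xi[d] + ri * fl * (mj[d] - xi[d]);
        } else {                                         // otherwise: random position
            for (int d = 0; d < xi.length; d++)
                next[d] = lower + RNG.nextDouble() * (upper - lower);
        }
        return next;
    }

    // Eq. (2): keep the better of the new position and the old memory
    // (fitness is maximized, as in Eq. (6)).
    static double[] updateMemory(double[] xi, double[] mi,
                                 java.util.function.ToDoubleFunction<double[]> fitness) {
        return fitness.applyAsDouble(xi) >= fitness.applyAsDouble(mi) ? xi : mi;
    }
}
```

A call such as `updatePosition(xi, mj, 2.0, 0.1, 0.0, 1.0)` either moves crow i toward crow j's memory or jumps to a random point of the box [0, 1] in each dimension.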
For all i, the decision variable $x_{ijk}$ takes the value 1 if the j-th item is packed in the k-th
bin, and 0 otherwise. For the sake of clarity, once an item is allocated to one and
only one bin, it is immediately removed from the list of items. Moreover, the
packing of each item must respect the Bottom-Left strategy, which consists of placing an
item in the lowest and left-most position of a given bin (see [12]).
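For illustration, the Bottom-Left rule can be sketched as a brute-force scan over integer coordinates. The grid-based search and all names are our simplifying assumptions, not the authors' implementation:

```java
import java.util.List;

// Illustrative sketch (not the authors' code) of the Bottom-Left rule from [12]:
// an item goes to the lowest feasible position in the bin, ties broken left-most.
// Integer item and bin dimensions are assumed for simplicity.
public class BottomLeft {
    static final class Rect {
        final int x, y, w, h;
        Rect(int x, int y, int w, int h) { this.x = x; this.y = y; this.w = w; this.h = h; }
        boolean overlaps(Rect o) {
            return x < o.x + o.w && o.x < x + w && y < o.y + o.h && o.y < y + h;
        }
    }

    // Places a w-by-h item into the bin and returns its rectangle,
    // or null if the item does not fit (a new bin must then be opened).
    static Rect place(List<Rect> placed, int binW, int binH, int w, int h) {
        for (int y = 0; y + h <= binH; y++) {          // lowest position first...
            for (int x = 0; x + w <= binW; x++) {      // ...then left-most
                Rect cand = new Rect(x, y, w, h);
                boolean free = true;
                for (Rect r : placed) if (cand.overlaps(r)) { free = false; break; }
                if (free) { placed.add(cand); return cand; }
            }
        }
        return null;
    }
}
```

Real Bottom-Left implementations restrict the scan to a small set of candidate points; the exhaustive loop above only conveys the placement priority.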
The memory matrix $M_i$ has the same dimensions as the position matrix $X_i$ of each
crow i. In the initial stage, the coefficients of the position matrix are generated randomly,
and the position matrix coincides with the memory matrix of each crow i.
$$
f = \frac{1}{\sum_{k=1}^{d} y_k}
\tag{6}
$$

Equation (6) corresponds to the reciprocal of the cost function of the 2D-BPP, which means
that the fewer the bins used to pack all items, the higher the fitness value.
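A direct reading of Eq. (6) in code, assuming $y_k$ is inferred from the number of items packed in bin k (the names are illustrative):

```java
// Sketch of the fitness in Eq. (6): the reciprocal of the number of used bins,
// where y_k = 1 if bin k holds at least one item. Names are illustrative.
public class Fitness {
    // itemsPerBin[k] = number of items packed in bin k
    static double fitness(int[] itemsPerBin) {
        int used = 0;
        for (int count : itemsPerBin)
            if (count > 0) used++;                  // y_k = 1 for a non-empty bin
        return used == 0 ? 0.0 : 1.0 / used;        // fewer bins -> higher fitness
    }
}
```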
position matrix corresponds to a gene of the chromosome, while such a gene is encoded
by a binary vector of n alleles. Figure 1 shows the structure of a chromosome.
Selection Operation. In this stage, each crow i selects a crow j from the population
by using binary tournament selection, which consists of choosing two crows at random
from the population and making them compete. The winner is the crow with the highest
fitness value, and crow i chooses it as the target to follow; that is, the position of
crow i is updated accordingly.
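The binary tournament described above can be sketched as follows (illustrative names; the fitness array is assumed to be indexed by crow):

```java
import java.util.Random;

// Sketch of the binary tournament used in the selection step: two crows are
// drawn at random and the one with the higher fitness wins. Illustrative code.
public class Tournament {
    static int select(double[] fitness, Random rng) {
        int a = rng.nextInt(fitness.length);
        int b = rng.nextInt(fitness.length);
        return fitness[a] >= fitness[b] ? a : b;    // winner = higher fitness
    }
}
```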
Crossover Operation. Crossover is the process of exchanging genes between indi-
viduals in order to create new ones. Its main goal is to further exploit the search space of
solutions. In this stage, we combine the CSA rules with the crossover strategy to
update positions as follows: if $r_j \geq AP_j$, we cross the position of crow i with the
position of the selected crow j; otherwise, the position of crow i is crossed with its memory.
The PMX crossover operator developed by Goldberg and Lingle [26] is applied to
recombine two positions, or a position and a memory. The example in Fig. 2 explains
how we adapt it in our context. Indeed, we choose a bin at random from the first parent
position (for instance, bin 1) and place its content {item 2, item 3} in the
offspring at the same bin ID. The other bins inherit their contents from the second
parent. The duplicate items are then replaced by the items packed in bin 1 of the second
parent, respecting the same order. In our example, items 2 and 3 are packed,
respectively, in bins 3 and 2 of the second parent, so we replace item 2 with item 4
and item 3 with item 5. We note that the sign «» indicates the crossover
operation.
Once the crossover strategy has finished, the feasibility of the offspring positions is
evaluated. If necessary, the above-mentioned repair operator is applied to restore
the feasibility of solutions.
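The adapted PMX step described above can be sketched on an item-to-bin encoding, where `assignment[i]` gives the bin of item i. This is our hedged reconstruction of the Fig. 2 example, not the published code:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the PMX-style crossover on an item->bin encoding: the chosen bin is
// copied from parent 1, the other bins come from parent 2, and duplicates are
// replaced by the items evicted from parent 2's chosen bin, in order.
public class BinPmx {
    static int[] cross(int[] p1, int[] p2, int bin) {
        int[] child = p2.clone();
        List<Integer> displaced = new ArrayList<>(); // p2 bins vacated by duplicates
        List<Integer> fillers = new ArrayList<>();   // p2 items evicted from the chosen bin
        for (int item = 0; item < p1.length; item++) {
            if (p1[item] == bin) {                   // copy the chosen bin from parent 1
                if (p2[item] != bin) displaced.add(p2[item]);
                child[item] = bin;
            } else if (p2[item] == bin) {            // this p2 item loses its slot
                fillers.add(item);
            }
        }
        // evicted p2 items take the vacated slots, respecting the original order
        for (int k = 0; k < fillers.size() && k < displaced.size(); k++)
            child[fillers.get(k)] = displaced.get(k);
        return child;
    }
}
```

On the Fig. 2 data (parent 1 packs items 2 and 3 in bin 1; parent 2 packs them in bins 3 and 2 and holds items 4 and 5 in bin 1), the child keeps items 2 and 3 in bin 1 while item 4 moves to bin 3 and item 5 to bin 2.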
A Crow Search-Based Genetic Algorithm 209
4 Simulation
4.1 Simulation Design
The proposed CSGA is coded in Java on a PC with an Intel Core i5 at 2.5 GHz
and 4.0 GB of RAM, running the 64-bit Windows 7 operating system. The perfor-
mance of CSGA is examined on various test instances of the 2D-BPP available at the
URL [27]. We should note that the considered test data are divided into different
210 S. Laabadi et al.
classes. Ten classes are generated randomly by varying the capacity of the bins as well as
the range of item sizes. The first six classes were introduced by Berkey and Wang
[12], while the last four were proposed by Martello and Vigo [28]. In each class,
identical bins are considered, while the items have different sizes drawn
uniformly from the same interval of values. Each class is also divided into five
subclasses according to the number of available items (n = 20, 40, 60, 80, 100),
and each subclass contains ten problem instances, which gives a total of 50
problem instances per class. During the experiments, four problem classes are
considered (see Table 1). Three classes that seem representative are chosen from the
Berkey and Wang classes, while only one class is selected from the Martello and Vigo
classes to represent the rest. Meanwhile, we considered just three subclasses,
with n = 20, 60, 100, corresponding, respectively, to small, medium and large
problem sizes.
To validate the efficiency of CSGA, the latter is compared against the binary
particle swarm optimization algorithm (BPSO) proposed by Kennedy and Eberhart
[29], and the standard genetic algorithm (GA). We should note that the BPSO has
undergone minor modifications to adapt it to the 2D-BPP context: its encoding
scheme is similar to that of CSGA, and it incorporates the above-mentioned repair
operator in order to meet the capacity constraint of the bins. Moreover, the standard
GA uses the same genetic operators as CSGA (see Subsect. 3.3). The control
parameters of each algorithm are reported in Table 2. That is, Popsize means the
population size, Maxgen indicates the maximum number of generations, and BTS-size
indicates the size of the binary tournament selection. Furthermore, wmin and wmax
denote, respectively, the minimum and maximum inertia weight, while c1 and c2
denote the acceleration coefficients. Finally, Vmin and Vmax refer to the minimum
and maximum velocity of particles. It is worth mentioning that the parameters of
CSGA are set based on many preliminary independent experiments, while the BPSO
parameters are chosen according to the experiments of other authors.
Table 3. Wilcoxon test results on average fitness value of CSGA against other algorithms

Problem class  Number of items  Compared algorithms  R+    R−    p-value  s
Class 1        20               CSGA - GA            55    0     0.005    1
                                CSGA - BPSO          18.5  2.5   0.093    0
               60               CSGA - GA            55    0     0.005    1
                                CSGA - BPSO          27.5  27.5  1.000    0
               100              CSGA - GA            55    0     0.005    1
                                CSGA - BPSO          55    0     0.005    1
Class 3        20               CSGA - GA            55    0     0.004    1
                                CSGA - BPSO          2     13    0.136    0
               60               CSGA - GA            55    0     0.005    1
                                CSGA - BPSO          0     55    0.005    −1
               100              CSGA - GA            55    0     0.005    1
                                CSGA - BPSO          2     53    0.009    −1
Class 5        20               CSGA - GA            55    0     0.005    1
                                CSGA - BPSO          4     41    0.028    −1
               60               CSGA - GA            55    0     0.005    1
                                CSGA - BPSO          0     55    0.005    −1
               100              CSGA - GA            55    0     0.005    1
                                CSGA - BPSO          0     55    0.005    −1
Table 4. Comparative results of CSGA versus GA and BPSO using instances with 40 items

Test instance  LB*   CSGA                  GA                    BPSO
                     Used bins  CPU time   Used bins  CPU time   Used bins  CPU time
Instance1      25    29         0.067      37         2.013      34         4.346
Instance2      32    29         0.066      36         1.916      35         4.383
Instance3      29    29         1.914      35         1.914      34         4.386
Instance4      31    28         0.063      35         2.085      34         4.472
Instance5      27    27         0.064      33         1.993      29         4.382
Instance6      29    29         0.067      36         2.030      35         4.393
Instance7      24    26         0.068      32         1.920      29         4.421
Instance8      26    28         0.063      35         2.006      34         4.676
Instance9      21    25         0.062      33         1.995      30         4.389
Instance10     34    30         0.064      38         2.028      39         4.423
Average        27.8  28         0.250      35         1.990      33.3       4.427
value of the difference between the results of the two algorithms. The value of s in Table 3
gives the statistical result of each pairwise comparison: s = 1 indicates that the first
algorithm is significantly better than the second, while s = −1 indicates that the first
algorithm is significantly worse. Both values 1 and −1 mean that there is a significant
difference, whereas s = 0 indicates that there is no significant difference between the
two compared algorithms. From Table 3, we can observe that CSGA is superior to
the standard GA. However, it is sometimes nearly equivalent to BPSO and sometimes
it is outperformed by BPSO.
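The s indicator in Table 3 is consistent with deriving it from the rank sums and the p-value at a 0.05 significance level; the level itself is our assumption, since the paper does not state it explicitly:

```java
// Hypothetical sketch of the s indicator in Table 3: no significant difference
// when p >= alpha, otherwise the sign follows the larger rank sum.
public class WilcoxonSign {
    static int s(double rPlus, double rMinus, double pValue, double alpha) {
        if (pValue >= alpha) return 0;               // no significant difference
        return rPlus > rMinus ? 1 : -1;              // sign of the winning algorithm
    }
}
```

For instance, the row (R+ = 18.5, R− = 2.5, p = 0.093) yields s = 0, matching the table.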
Tables 4, 5 and 6 report the empirical results of CSGA, GA and BPSO in terms of the
average number of used bins over 30 runs and the average execution time. These tests
are performed on class 9. In each table, the first column indicates the test instances of
Table 5. Comparative results of CSGA versus GA and BPSO using instances with 80 items

Test instance  LB*   CSGA                  GA                    BPSO
                     Used bins  CPU time   Used bins  CPU time   Used bins  CPU time
Instance1      59    60         0.264      70         7.816      65         34.436
Instance2      58    59         0.271      69         7.718      66         39.465
Instance3      57    59         0.252      70         8.001      66         34.008
Instance4      53    60         0.263      69         7.753      65         37.740
Instance5      62    61         0.259      74         7.772      72         32.446
Instance6      62    59         0.251      71         7.576      68         31.868
Instance7      59    59         0.262      70         7.833      66         39.240
Instance8      58    58         0.264      71         7.902      68         32.110
Instance9      49    54         0.261      59         40.232     59         34.454
Instance10     60    61         0.274      74         7.575      71         32.291
Average        57.7  59         0.262      69.7       11.018     66.6       34.806
Table 6. Comparative results of CSGA versus GA and BPSO using instances with 100 items

Test instance  LB*   CSGA                  GA                    BPSO
                     Used bins  CPU time   Used bins  CPU time   Used bins  CPU time
Instance1      71    75         0.377      87         11.755     81         63.273
Instance2      64    69         0.371      86         11.785     81         64.554
Instance3      68    61         0.384      82         11.767     71         65.076
Instance4      78    76         0.372      92         11.872     88         62.890
Instance5      65    73         0.388      84         11.892     76         74.067
Instance6      71    74         0.381      78         64.425     78         73.664
Instance7      66    69         0.380      87         12.084     81         64.983
Instance8      74    73         0.375      90         11.859     86         62.992
Instance9      66    66         0.379      85         12.039     78         73.892
Instance10     72    75         0.384      84         64.165     84         65.849
Average        69.5  71.1       0.379      85.5       22.364     80.4       67.124
class 9. The second column records the benchmark lower bounds of 2D-BPP (see URL
[27]). The last columns represent the obtained average results of the three compared
algorithms.
From Tables 4, 5 and 6, we can see that the number of used bins obtained by
CSGA in each instance is close or equal to the corresponding lower bound, and
sometimes even lower than it. Furthermore, the proposed algorithm yields better
results in less computational time than both BPSO and GA. We note
that the best results are in bold and that computation times are expressed in seconds.
5 Conclusions
In this paper, we proposed a crow search algorithm based on genetic operators for
solving the 2D-BPP. We used various benchmark tests to examine the performance of
the proposed algorithm. In addition, we compared our algorithm with two state-of-the-
art algorithms of the same nature (i.e. bio-inspired algorithms). The experimental
results showed that the proposed approach outperforms the GA regardless of the type
of instances. However, our algorithm is outperformed by BPSO on difficult instances,
while it performs better than BPSO on more realistic instances. In addition, our
algorithm is considerably faster than both comparative algorithms on easy as well as
difficult instances. These results are very encouraging and demonstrate the efficiency
of CSGA. Since our algorithm incorporates the genetic operators into the core idea of
CSA, it can be applied, like the GA, to several optimization problems defined over
discrete as well as continuous search spaces. So, in future research, we hope to exploit
this for the 2D-BPP that takes into account the guillotine constraint and/or allows
rotation of items by 90°, in addition to other important real-world applications coming
from different fields.
References
1. Wäscher, G., Haußner, H., Schumann, H.: An improved typology of cutting and packing
problems. Eur. J. Oper. Res. 183, 1109–1130 (2007)
2. Crainic, T.G., et al.: Bin packing problems with uncertainty on item characteristics: An
application to capacity planning in logistics. Procedia-Soc. Behav. Sci. 111, 654–662 (2014)
3. Wee, T.S., Magazine, M.J.: Assembly line balancing as generalized bin packing. Oper. Res.
Lett. 1, 56–58 (1982)
4. Song, W., et al.: Adaptive resource provisioning for the cloud using online bin packing.
IEEE Trans. Comput. 63, 2647–2660 (2014)
5. Lodi, A., Martello, S., Vigo, D.: Heuristic and metaheuristic approaches for a class of two-
dimensional bin packing problems. INFORMS J. Comput. 11, 345–357 (1999)
6. Epstein, L., Van Stee, R.: Optimal online algorithms for multidimensional packing problems.
SIAM J. Comput. 35, 431–448 (2005)
7. Laabadi, S.: A new algorithm for the Bin-packing problem with fragile objects. In: 3rd
International Conference on Logistics Operations Management, pp. 1–7. IEEE, Fez (2016)
8. Bansal, N., Liu, Z., Sankar, A.: Bin-packing with fragile objects and frequency allocation in
cellular networks. Wirel. Netw. 15, 821–830 (2009)
9. Shakhsi, N.M., Joulaei, F., Razmi, J.: Extending two-dimensional bin packing problem:
consideration of priority for items. J. Ind. Syst. Eng. 3, 72–84 (2009)
10. Hamdi-Dhaoui, K., Labadie, N., Yalaoui, A.: Algorithms for the two dimensional bin
packing problem with partial conflicts. RAIRO-Oper. Res. 46, 41–62 (2012)
11. Khanafer, A., Clautiaux, F., Talbi, E.G.: Tree-decomposition based heuristics for the two-
dimensional bin packing problem with conflicts. Comput. Oper. Res. 39, 54–63 (2012)
12. Berkey, J.O., Wang, P.Y.: Two-dimensional finite bin-packing algorithms. J. Oper. Res. Soc.
38, 423–429 (1987)
13. Monaci, M., Toth, P.: A set-covering-based heuristic approach for bin-packing problems.
INFORMS J. Comput. 18, 71–85 (2006)
14. Baumgartner, L., Schmid, V., Blum, C.: Solving the two-dimensional bin packing problem
with a probabilistic multi-start heuristic. In: Coello, C.A.C. (ed.) LION 2011. LNCS, vol.
6683, pp. 76–90. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-25566-3_6
15. Wong, L.: Heuristic placement routines for two-dimensional rectangular bin packing
problems. Ph.D. thesis, Universiti Putra Malaysia (2009)
16. Parreño, F., et al.: A hybrid GRASP/VND algorithm for two-and three-dimensional bin
packing. Ann. Oper. Res. 179, 203–220 (2010)
17. Lai, K.K., Chan, J.W.: Developing a simulated annealing algorithm for the cutting stock
problem. Comput. Ind. Eng. 32, 115–127 (1997)
18. Gonçalves, J.F., Resende, M.G.: A biased random key genetic algorithm for 2D and 3D bin-
packing problems. Int. J. Prod. Econ. 145, 500–510 (2013)
19. Soke, A., Bingul, Z.: Hybrid genetic algorithm and simulated annealing for two-dimensional
non-guillotine rectangular packing problems. Eng. Appl. Artif. Intell. 19, 557–567 (2006)
20. Blum, C., Schmid, V.: Solving the 2D bin packing problem by means of a hybrid
evolutionary algorithm. Procedia Comput. Sci. 18, 899–908 (2013)
21. Shin, Y.B., Kita, E.: Solving two-dimensional packing problem using particle swarm
optimization. Comput. Assist. Methods Eng. Sci. 19, 241–255 (2017)
22. Liu, D.S., et al.: On solving multiobjective bin packing problems using evolutionary particle
swarm optimization. Eur. J. Oper. Res. 190, 357–382 (2008)
23. Reeves, C., Rowe, J.E.: Genetic Algorithms: Principles and Perspectives: A Guide to GA
Theory. Springer, New York (2002). https://doi.org/10.1007/b101880
24. Askarzadeh, A.: A novel metaheuristic method for solving constrained engineering
optimization problems: crow search algorithm. Comput. Struct. 169, 1–12 (2016)
25. De Souza, R.C.T., et al.: A V-shaped binary crow search algorithm for feature selection. In:
IEEE Congress on Evolutionary Computation, pp. 1–8. IEEE, Rio de Janeiro (2018)
26. Goldberg, D., Lingle, R.: Alleles, loci, and the travelling salesman problem. In: First
International Conference on Genetic Algorithms and their Applications, pp. 154–159.
Lawrence Erlbaum Associates, Hillsdale (1985)
27. http://or.dei.unibo.it/library/two-dimensional-bin-packing-problem
28. Martello, S., Vigo, D.: Exact solution of the two-dimensional finite bin packing problem.
Manag. Sci. 44, 388–399 (1998)
29. Kennedy, J., Eberhart, R.C.: A discrete binary version of the particle swarm algorithm. In:
International Conference on Systems, Man, and Cybernetics. Computational Cybernetics
and Simulation, vol. 5, pp. 4104–4108 (1997)