

A Crow Search-Based Genetic Algorithm
for Solving Two-Dimensional Bin
Packing Problem

Soukaina Laabadi 1, Mohamed Naimi 2, Hassan El Amri 1, and Boujemâa Achchab 2

1 Laboratory of Mathematics and Applications, ENS - Hassan II University, Casablanca, Morocco
[email protected]
2 Laboratory of Analysis, Modeling Systems and Decision Support, ENSA - Hassan I University, Berrechid, Morocco

© Springer Nature Switzerland AG 2019. C. Benzmüller and H. Stuckenschmidt (Eds.): KI 2019, LNAI 11793, pp. 203–215, 2019. https://doi.org/10.1007/978-3-030-30179-8_17

Abstract. The two-dimensional bin packing problem (2D-BPP) consists of packing, without overlapping, a set of rectangular items with different sizes into the smallest number of rectangular containers, called "bins", having identical dimensions. Depending on real-world requirements, the items may either have a fixed orientation or be rotated by 90°; in addition, the packing may or may not be subject to the guillotine cutting constraint. In this article, we consider the two-dimensional bin packing problem with fixed orientation and free cutting. We propose a hybrid approach that combines two bio-inspired algorithms, the crow search algorithm (CSA) and the genetic algorithm (GA), to solve the considered problem. The main idea behind this hybridization is to reach a cooperative synergy between the operators of the two combined algorithms. That is, the CSA is discretized and adapted to the 2D-BPP context, while genetic operators are used to improve the adaptation of individuals (i.e. crows). The average performance of the proposed hybrid approach is evaluated on standard benchmark instances of the considered problem and compared with two other bio-inspired algorithms of closely similar nature, namely the standard genetic algorithm and the binary particle swarm optimization algorithm. The obtained results are very promising.

Keywords: Two-dimensional bin packing problem · Bio-inspired algorithms · Crow search algorithm · Genetic algorithm · Hybridization

1 Introduction

The bin packing problem belongs to the family of cutting and packing (C&P) problems
[1]. It occupies an important place in the literature of C&P problems, as it can be
encountered in several fields, such as logistics [2], industry [3], and Cloud computing
[4]. In this paper, we consider the bin packing problem of rectangles, called the two-dimensional bin packing problem (2D-BPP): we are given a finite number of small rectangles (items) and an unlimited number of large, identical rectangles (bins), and the aim is to allocate all items, without overlapping, to a minimum number of bins. Formally,


we have a set of n items; each item j is characterized by a width $w_j$ and a height $h_j$ $(j = 1, \ldots, n)$, in addition to an unlimited number of bins, each of width W and height H. The goal is to minimize the number of used bins to pack all items while respecting the loading capacity $W \times H$ of each used bin. The items are packed with their edges parallel to those of the bins. In addition, each item must be packed in one and only one bin. We assume, without loss of generality, that all input data are positive integers satisfying $h_j \le H$ and $w_j \le W$ for all $j \in \{1, \ldots, n\}$. No further restriction is present: the items cannot be rotated by 90°, and we do not impose the so-called guillotine cutting constraint, which would require the items to be obtainable through a sequence of edge-to-edge cuts parallel to the edges of the bin.
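To make this statement concrete, the problem can be condensed into the following sketch (our own compact restatement, not taken verbatim from the paper), using the decision variables introduced later in Sect. 3 ($x_{jk} = 1$ if item j is packed in bin k, $y_k = 1$ if bin k is used); note that the area constraint is only a relaxation of the geometric non-overlap requirement, which is handled in this work by the placement strategy of Sect. 3.1:

$$\min \sum_{k=1}^{d} y_k \quad \text{s.t.} \quad \sum_{k=1}^{d} x_{jk} = 1 \;\; \forall j \in \{1,\ldots,n\}, \qquad \sum_{j=1}^{n} w_j\, h_j\, x_{jk} \le W\, H\, y_k \;\; \forall k \in \{1,\ldots,d\}, \qquad x_{jk},\, y_k \in \{0,1\}$$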
Several variants of 2D-BPP exist in the literature. In most cases, they are classified
into four categories according to two criteria: whether or not the items could be rotated
by 90° and whether or not a guillotine cutting is required. The reader is referred to [5]
for more details. The availability of information about items is another aspect to
distinguish two other different categories of 2D-BPP, which are online and offline 2D-
BPP. The online version means that items arrive one by one and we do not have any information about the complete item sequence [6]. The offline version means that all items are known before they are packed; the latter is the standard version of the 2D-BPP. Other
extensions of 2D-BPP depend on the fragility of items [7, 8], priority of items [9], and
compatibility between items. Indeed, in some fields, items may be incompatible, and
have to be separated by a safety distance when they are packed in the same bin [10] or
they must be packed totally in different bins [11].
The 2D-BPP is NP-hard in the strong sense [5] since it is a generalization of the
well-known one-dimensional bin packing problem (1D-BPP). Exact methods cannot
calculate optimal solutions in reasonable runtime for such problems, especially in
large-scale instances. So, heuristics and metaheuristics are the methods of choice.
Traditional heuristic methods for solving the 2D-BPP include the level-oriented heuristics proposed in [12]. Other heuristics dedicated to different variants of the 2D-BPP are introduced in [5]. In [13], the authors have proposed a set-covering-based heuristic approach. Recently, the authors in [14] have designed a multi-start algorithm based on a probabilistic version of Wang and Lee's heuristic [15]. To enhance the search ability, hoping to find high-quality solutions, recent studies have largely focused on metaheuristic approaches. In [5], the authors have incorporated their proposed heuristics into a Tabu search algorithm, which proved to be effective for solving the 2D-BPP.
In [16], a new Greedy Randomized Adaptive Search Procedure (GRASP) is designed,
in which the constructive phase is based on maximal-space heuristic (see [17]), while in
the improvement phase, several new moves are combined with Variable Neighborhood
Descent (VND). Moreover, the authors in [18], have proposed a new random-key
genetic algorithm (RKGA) that uses the maximal-space concept to manage free space
in bins. We note that RKGA is a genetic algorithm in which the chromosomes are
represented as vectors of randomly generated real numbers in the interval [0, 1]. In
[19], the authors have proposed an improved genetic algorithm and a simulated annealing method to solve the 2D-BPP. In [20], the authors have developed an evolutionary algorithm by combining it with Wang and Lee's heuristic. Nevertheless, only a few researchers have been interested in swarm intelligence algorithms to solve the 2D-BPP. An improved version of the particle swarm optimization algorithm (PSO) is proposed in [21], wherein a second global best position of all particles is used in addition to the global and the personal best positions. Otherwise, an evolutionary PSO dealing with the multi-objective 2D-BPP is introduced in [22].
The remainder of the paper is organized as follows. In Sect. 2, we briefly present the principles of CSA and GA. Our proposed CSA based on genetic operators, called CSGA, is described in Sect. 3. The results obtained by the proposed algorithm are presented and analyzed in Sect. 4. Finally, we give some conclusions in Sect. 5.

2 Background

In this section, we will briefly review genetic algorithm and crow search algorithm.

2.1 Genetic Algorithm Principle


GA is a bio-inspired algorithm developed from the natural reproduction process. In
general, GA maintains a population with a certain number of encoded individuals. Each
member is encoded as a chromosome that represents a candidate solution, and it is
composed of genes that correspond to decision variables of the optimization problem.
The quality of chromosomes is evaluated with a fitness function. In fact, GA starts with an initial population of chromosomes and then applies selection, crossover, and mutation operators to improve its population generation by generation until the stopping criterion is satisfied (i.e. a maximum number of generations or a fixed amount of time). For more details about GA, one can refer to [23]. A conventional GA works as follows:
• Generate randomly an initial population of chromosomes.
• Repeat until the stopping criterion is met:
– Execute the selection operation on the whole population. The selected chromosomes (parents) will participate in reproduction.
– Recombine pairs of parents with a probability pc to breed offspring chromosomes, by applying a crossover operator.
– Mutate the offspring chromosomes with a probability pm to avoid the classical problem of GA, that is, premature convergence.
– Replace the population of parents with the generated offspring chromosomes according to a replacement strategy.
• Return the best chromosome found as an approximation of the global optimum of the considered problem.
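As an illustration only, the loop above can be sketched in Java as follows; the chromosome encoding, fitness function and operator choices here are generic placeholders (one-point crossover, bit-flip mutation), not the operators used later for the 2D-BPP:

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.Random;

// Minimal sketch of the conventional GA loop described above.
// The chromosome, fitness and operators are hypothetical placeholders.
class SimpleGA {
    static final int POP_SIZE = 100, MAX_GEN = 100, N = 20;
    static final double PC = 0.95, PM = 0.01;
    static final Random RND = new Random();

    static double fitness(boolean[] c) {                      // placeholder: count of 1-genes
        int ones = 0;
        for (boolean g : c) if (g) ones++;
        return ones;
    }

    static boolean[] tournament(List<boolean[]> pop) {        // selection operation
        boolean[] a = pop.get(RND.nextInt(pop.size())), b = pop.get(RND.nextInt(pop.size()));
        return fitness(a) >= fitness(b) ? a : b;
    }

    public static void main(String[] args) {
        List<boolean[]> pop = new ArrayList<>();               // random initial population
        for (int i = 0; i < POP_SIZE; i++) {
            boolean[] c = new boolean[N];
            for (int j = 0; j < N; j++) c[j] = RND.nextBoolean();
            pop.add(c);
        }
        for (int gen = 0; gen < MAX_GEN; gen++) {              // stopping criterion
            List<boolean[]> next = new ArrayList<>();
            while (next.size() < POP_SIZE) {
                boolean[] c1 = tournament(pop).clone(), c2 = tournament(pop).clone();
                if (RND.nextDouble() < PC) {                   // one-point crossover with probability pc
                    int cut = 1 + RND.nextInt(N - 1);
                    for (int j = cut; j < N; j++) { boolean t = c1[j]; c1[j] = c2[j]; c2[j] = t; }
                }
                for (int j = 0; j < N; j++) {                  // bit-flip mutation with probability pm
                    if (RND.nextDouble() < PM) c1[j] = !c1[j];
                    if (RND.nextDouble() < PM) c2[j] = !c2[j];
                }
                next.add(c1); next.add(c2);
            }
            pop = next;                                        // generational replacement
        }
        boolean[] best = pop.stream().max(Comparator.comparingDouble(SimpleGA::fitness)).get();
        System.out.println("best fitness = " + fitness(best));
    }
}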

2.2 Crow Search Algorithm Principle


CSA is a recent bio-inspired algorithm developed from the crows' behavior of hiding food and finding it later when needed. CSA maintains a swarm of N crows. Each crow is encoded by its position (hiding place) in a d-dimensional environment, $x^i(G) = \left(x^i_1(G), x^i_2(G), \ldots, x^i_d(G)\right)$, where $i \in \{1, \ldots, N\}$ and G is the generation number. Each crow position corresponds to a feasible solution in the search space. Each crow i has a memory $m^i(G)$ at generation G, where it memorizes its best hiding place. The quality of positions is measured by a fitness or objective function of the optimization problem.
The CSA starts with the initial positions of the crows and sets the memory of each crow to its initial position, and then iteratively updates the position and the memory of the crows according to the following formulas:

$$x^i(G) = \begin{cases} x^i(G-1) + r_i \times FL \times \left(m^j(G-1) - x^i(G-1)\right) & \text{if } r_j \ge AP \\ \text{a random position} & \text{otherwise} \end{cases} \qquad (1)$$

$$m^i(G) = \begin{cases} x^i(G) & \text{if } f\left(x^i(G)\right) \ge f\left(m^i(G-1)\right) \\ m^i(G-1) & \text{otherwise} \end{cases} \qquad (2)$$

In fact, at each generation, each crow i randomly selects a crow j from the swarm and follows it, aiming to steal its food. If crow j knows about the presence of crow i, the former fools the latter by moving towards a random position. We note that FL is the crows' flight length, AP is the awareness probability, f(·) is the fitness function, while $r_i$ and $r_j$ are uniformly distributed random numbers in the interval [0, 1].
The CSA terminates when a maximum number of generations is met. Then, it picks
out the overall best memory in the swarm as a global optimum of the problem.
For more details, we invite the readers to refer to the original paper of CSA [24].
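For concreteness, one generation of the original, continuous CSA implementing Eqs. (1) and (2) might look as follows; this is a sketch for a generic maximization problem with a placeholder objective and hypothetical bounds and parameter values, not yet the discrete variant developed in the next section:

import java.util.Random;

// Sketch of one CSA generation following Eqs. (1) and (2), for a continuous
// maximization problem (placeholder fitness, example parameter values).
class CrowSearchStep {
    static final Random RND = new Random();
    static final double FL = 2.0, AP = 0.1, LOW = -5.0, HIGH = 5.0;

    static double fitness(double[] x) {             // placeholder objective (maximum at the origin)
        double s = 0;
        for (double v : x) s -= v * v;
        return s;
    }

    // positions[i] and memories[i] hold the current x^i(G-1) and m^i(G-1)
    static void oneGeneration(double[][] positions, double[][] memories) {
        int n = positions.length, d = positions[0].length;
        for (int i = 0; i < n; i++) {
            int j = RND.nextInt(n);                 // crow i follows a randomly chosen crow j
            double ri = RND.nextDouble(), rj = RND.nextDouble();
            if (rj >= AP) {                         // Eq. (1), first case: move towards m^j
                for (int k = 0; k < d; k++)
                    positions[i][k] += ri * FL * (memories[j][k] - positions[i][k]);
            } else {                                // Eq. (1), second case: random position
                for (int k = 0; k < d; k++)
                    positions[i][k] = LOW + RND.nextDouble() * (HIGH - LOW);
            }
            // Eq. (2): keep the better of the new position and the old memory
            if (fitness(positions[i]) >= fitness(memories[i]))
                memories[i] = positions[i].clone();
        }
    }
}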

3 Crow Search-Based Genetic Algorithm

It is worth mentioning that the CSA cannot be directly applied to binary-valued


optimization problems. To the best of our knowledge, [25] was the first work to deal
with a binary version of CSA, in which the authors have applied a v-shaped bina-
rization technique to resolve the feature selection problem. In this paper, we propose a
hybrid approach to solve 2D-BPP. Our approach combines GA with CSA, and we call
it Crow Search-based Genetic Algorithm (CSGA). That is, we are first motivated by the
smart behavior of crows in CSA and then by the evolutionary behavior of individuals
in GA. Besides, the GA operators can be easily, and even directly, adapted to binary search spaces. In comparison with GA, in the CSGA each member of the population contributes its own information to the crossover operation; so, the crossover probability is not considered and the algorithm does not need a replacement strategy to create the upcoming generations. In comparison with CSA, in the CSGA each crow i selects one of the crows in the swarm as a direction to follow, using a selection operator. In addition, a mutation operator is applied with a mutation probability in order to increase the population diversity. The following subsections elaborate the full process of our proposed CSGA.

3.1 Building Initial Population


The positions of crows in the d-dimensional environment are built by deciding whether or not a given item is allocated to a given bin. So, each position is considered as a potential solution of the problem. It is encoded as an $n \times d$ matrix, where n is the number of items to be packed into bins, and d is regarded as the number of available bins. The representation of each solution i, in a population of N crows, is shown below:

$$X^i = \left(x^i_{jk}\right) \in \{0,1\}^{n \times d}, \quad \text{for } i \in \{1, \ldots, N\} \qquad (3)$$

For all i, the decision variable $x^i_{jk}$ takes the value 1 if the j-th item is packed in the k-th bin, and it is equal to 0 otherwise. For the sake of clarity, once an item is allocated to one and only one bin, it is immediately removed from the list of items to pack. Moreover, the packing of each item must respect the Bottom-Left strategy, which consists of placing an item in the lowest and left-most feasible position of a given bin (see [12]).
The memory matrix $M^i$ has the same dimensions as the position matrix $X^i$ of each crow i. In the initial stage, the position matrix coefficients are generated randomly, and the position matrix coincides with the memory matrix of each crow i.
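The following sketch (our own illustration, with hypothetical class and field names) builds one such random position matrix, assigning each item to exactly one of the d candidate bins and initializing the memory to the position; the geometric Bottom-Left placement and the capacity check of Sect. 3.2 are assumed to be handled elsewhere:

import java.util.Random;

// Sketch of the n x d binary position matrix of Eq. (3):
// x[j][k] = 1 if item j is packed in bin k (exactly one 1 per row).
class CrowPosition {
    final int[][] x;          // position matrix X^i
    final int[][] memory;     // memory matrix M^i (same dimensions)

    CrowPosition(int n, int d, Random rnd) {
        x = new int[n][d];
        for (int j = 0; j < n; j++) {
            x[j][rnd.nextInt(d)] = 1;      // allocate item j to one random bin
        }
        memory = new int[n][d];
        for (int j = 0; j < n; j++)         // initially, memory coincides with position
            memory[j] = x[j].clone();
    }
}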

3.2 Evaluating Feasibility and Quality of Solutions


On one hand, the random process used at the initial stage can generate infeasible solutions. A solution is regarded as infeasible if it violates the capacity constraint (see Eq. (4)) of at least one of the used bins:

$$\sum_{j=1}^{n} w_j\, h_j\, x^i_{jk} \le W\, H\, y_k, \quad \forall k \in \{1, \ldots, d\}, \;\; \text{for } i \in \{1, \ldots, N\} \qquad (4)$$

where

$$y_k = \begin{cases} 1 & \text{if bin } k \text{ is used} \\ 0 & \text{otherwise} \end{cases} \qquad (5)$$

Notice that bin k is considered as used if $\sum_{j=1}^{n} x^i_{jk} \ge 1$, i.e. it contains at least one item, and as unused if $\sum_{j=1}^{n} x^i_{jk} = 0$.
Meanwhile, to restore the feasibility of infeasible solutions, a repair operator is applied; it consists of iteratively removing items from the bins whose capacity is exceeded until the capacity of the corresponding bins is respected, and then inserting the removed items into already used bins that can still accommodate them.
On the other hand, we use the following fitness function to assess the quality of feasible solutions:

$$f = \frac{1}{\sum_{k=1}^{d} y_k} \qquad (6)$$

Eq. (6) corresponds to the reciprocal of the cost function of the 2D-BPP, which means that the fewer the bins used to pack all items, the higher the fitness value.
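A possible implementation of the capacity check of Eq. (4), the repair step and the fitness of Eq. (6) is sketched below; the order in which items are removed and in which receiving bins are tried is our own assumption, since the text does not fix it:

// Sketch of feasibility checking (Eq. (4)), repair, and fitness (Eq. (6))
// for one position matrix x[n][d]; w[j], h[j] are item sizes, W, H the bin sizes.
class Evaluation {
    static int load(int[][] x, int[] w, int[] h, int k) {       // area packed in bin k
        int load = 0;
        for (int j = 0; j < x.length; j++) if (x[j][k] == 1) load += w[j] * h[j];
        return load;
    }

    static void repair(int[][] x, int[] w, int[] h, int W, int H) {
        int n = x.length, d = x[0].length, cap = W * H;
        for (int k = 0; k < d; k++) {
            for (int j = 0; j < n && load(x, w, h, k) > cap; j++) {
                if (x[j][k] == 0) continue;
                x[j][k] = 0;                                     // remove item j from the exceeded bin
                int area = w[j] * h[j], target = -1;
                for (int k2 = 0; k2 < d && target < 0; k2++)     // prefer an already used bin with room
                    if (k2 != k && load(x, w, h, k2) > 0 && load(x, w, h, k2) + area <= cap) target = k2;
                for (int k2 = 0; k2 < d && target < 0; k2++)     // otherwise open an empty bin
                    if (load(x, w, h, k2) == 0) target = k2;
                if (target >= 0) x[j][target] = 1;               // (if none found, the item is left to a later pass)
            }
        }
    }

    static double fitness(int[][] x) {                           // Eq. (6): 1 / number of used bins
        int used = 0;
        for (int k = 0; k < x[0].length; k++)
            for (int[] row : x) if (row[k] == 1) { used++; break; }
        return 1.0 / used;
    }
}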

3.3 Updating Crows Position and Memory


The GA operators and the CSA process are applied in order to update the crows' positions. Henceforth, we consider crow positions as chromosomes. Each column of the position matrix corresponds to a gene of the chromosome, while such a gene is encoded by a binary vector of n alleles. Figure 1 shows the structure of a chromosome.

Fig. 1. Representation of a chromosome with (n = 5) and (d = 4)

Selection Operation. In this stage, each crow i selects a crow j from the population by using a binary tournament selection. In fact, the binary tournament selection consists of choosing two crows at random from the population and making them compete; the winner is the crow with the highest fitness value. So, crow i chooses the winner as the target to follow, and its position is then updated accordingly.
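Such a binary tournament might be implemented as follows (our own minimal sketch; the fitness array is assumed to hold the Eq. (6) values of the current positions):

import java.util.Random;

// Sketch of the binary tournament selection: crow i follows the winner of a
// random pairwise competition (fitness as in Eq. (6)).
class Tournament {
    static int selectCrow(double[] fitness, Random rnd) {
        int a = rnd.nextInt(fitness.length);        // pick two crows at random
        int b = rnd.nextInt(fitness.length);
        return fitness[a] >= fitness[b] ? a : b;    // winner = higher fitness
    }
}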
Crossover Operation. Crossover is the process of exchanging genes between individuals in order to create new ones. Its main goal is to further exploit the search space of solutions. In this stage, we combine the CSA rules with the crossover strategy in order to update positions as follows. If $r_j \ge AP_j$, we cross the position of crow i with the position of the selected crow j. Else, the position of crow i is crossed with its memory. A PMX crossover operator, developed by Goldberg and Lingle [26], is applied to recombine two positions, or even a position and a memory. The example in Fig. 2 explains how we adapt it in our context. Indeed, we choose at random a bin from the first parent position (for instance, bin 1), and we place its content {item 2, item 3} in the offspring at the same bin ID. The other bins inherit their contents from the second parent. The duplicate items are replaced by the items packed in bin 1 of the second parent, respecting the same order. In our example, items 2 and 3 are packed respectively in bins 3 and 2 of the second parent. So, we replace item 2 with item 4 and item 3 with item 5. We note that the sign ⊗ indicates the crossover operation.

Fig. 2. An example of the crossover operation with (n = 5) and (d = 4)

Once the crossover strategy has finished, the feasibility of offspring positions is
evaluated. If it is necessary, the above-mentioned repair operator is applied for
restoring the feasibility of solutions.
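The following sketch illustrates one way to implement this adapted crossover, working on an equivalent item-to-bin assignment vector (assign[j] = bin of item j) rather than on the 0/1 matrix; the handling of donor bins of unequal size, and of items left without a mapped slot, are our own assumptions, such offspring being passed to the repair operator afterwards:

import java.util.ArrayList;
import java.util.List;
import java.util.Random;
import java.util.TreeSet;

// Sketch of the bin-level, PMX-inspired crossover of Fig. 2, on assignment
// vectors p1[j], p2[j] = bin holding item j (our own reading of the example).
class BinCrossover {
    static int[] cross(int[] p1, int[] p2, Random rnd) {
        int n = p1.length;
        TreeSet<Integer> usedBins = new TreeSet<>();
        for (int v : p1) usedBins.add(v);
        List<Integer> bins = new ArrayList<>(usedBins);
        int b = bins.get(rnd.nextInt(bins.size()));            // random bin of parent 1

        List<Integer> s1 = new ArrayList<>(), s2 = new ArrayList<>();
        for (int j = 0; j < n; j++) {
            if (p1[j] == b) s1.add(j);                          // content of bin b in parent 1
            if (p2[j] == b) s2.add(j);                          // content of bin b in parent 2
        }

        int[] child = p2.clone();                               // other bins inherit from parent 2
        for (int j : s1) child[j] = b;                          // bin b keeps parent 1's content

        // Items that parent 2 kept in bin b but that were pushed out are re-placed
        // via the positional mapping s1[i] <-> s2[i] (in the example of Fig. 2,
        // item 4 goes where item 2 sat in parent 2, item 5 where item 3 sat).
        for (int i = 0; i < s2.size(); i++) {
            int displaced = s2.get(i);
            if (s1.contains(displaced)) continue;               // already kept in bin b
            if (i < s1.size() && p2[s1.get(i)] != b) {
                child[displaced] = p2[s1.get(i)];
            }
            // else: it stays in bin b as inherited from parent 2; any resulting
            // overload is left to the repair operator (assumption).
        }
        return child;
    }
}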

Mutation Operation. Besides the crossover operation, the mutation operator is used to explore more regions of the solution search space, hoping to find new better solutions. We mutate some generated offspring by using the split-bin mechanism proposed in [22]. It consists of choosing randomly one used bin and splitting its content over two bins. In our algorithm, we have applied this mechanism in the following manner: the first part of the items is kept in its original bin, and the second part is moved into another used bin that can hold it without violating the capacity constraint. If no used bin can pack it, a new one is opened.
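A sketch of this split-bin mutation on the same assignment representation is given below; the choice of which half of the bin's content to move and the search order for a receiving bin are our own assumptions:

import java.util.ArrayList;
import java.util.List;
import java.util.Random;
import java.util.TreeSet;

// Sketch of the split-bin mutation: split the content of one random used bin,
// keep the first part in place and move the second part to a bin that can hold it.
class SplitBinMutation {
    static void mutate(int[] assign, int[] w, int[] h, int W, int H, int d, Random rnd) {
        TreeSet<Integer> used = new TreeSet<>();
        for (int v : assign) used.add(v);
        List<Integer> usedBins = new ArrayList<>(used);
        int b = usedBins.get(rnd.nextInt(usedBins.size()));      // random used bin

        List<Integer> content = new ArrayList<>();
        for (int j = 0; j < assign.length; j++) if (assign[j] == b) content.add(j);
        if (content.size() < 2) return;                          // nothing to split

        List<Integer> secondPart = content.subList(content.size() / 2, content.size());
        int area = 0;
        for (int j : secondPart) area += w[j] * h[j];

        // look for another used bin that can bear the second part without exceeding W*H
        int target = -1;
        for (int k : usedBins) {
            if (k == b) continue;
            int load = 0;
            for (int j = 0; j < assign.length; j++) if (assign[j] == k) load += w[j] * h[j];
            if (load + area <= W * H) { target = k; break; }
        }
        if (target < 0) {                                        // otherwise open a new bin
            for (int k = 0; k < d; k++) if (!used.contains(k)) { target = k; break; }
        }
        if (target >= 0) for (int j : secondPart) assign[j] = target;
    }
}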
Update of Crow Memory. After mutation, the quality of each offspring position is evaluated using the fitness function and compared with the memory of each crow. If the offspring position is better than the memorized position so far, the crow then updates its memory with the offspring position. Otherwise, it keeps its former memory.

3.4 The Framework of CSGA


The overall algorithm is summarized as follows.
Step 1. Initialize the population with N crows and set the control parameters, such as the maximal number of generations, the flight length of the crows and their awareness probability. For each crow i, set $M^i(G=1) = X^i(G=1)$.
Step 2. Evaluate the feasibility of the solutions, and restore the feasibility of infeasible solutions with the proposed repair operator. Then, calculate their fitness values.
Step 3. For each crow i, apply a binary tournament selection to select a crow j.
Step 4. Update the position of each crow i using the following equations:

$$\text{If } r_j \ge AP_j, \text{ then } X^i(G+1) = X^i(G) \otimes X^j(G) \qquad (7)$$

$$\text{Else, } X^i(G+1) = X^i(G) \otimes M^i(G) \qquad (8)$$

Step 5. Restore the feasibility of infeasible offspring positions.
Step 6. Apply the mutation strategy to the generated offspring positions with probability $p_m$.
Step 7. Calculate the fitness function of each offspring position, and update the memory of each crow by Eq. (2).
Step 8. Loop to Step 3 until the maximum number of generations is reached, then output the best found memory as the best solution of the 2D-BPP.
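Putting the steps together, the overall loop might be organized as follows; this is a high-level sketch only, reusing the hypothetical helper classes sketched in the previous subsections (Tournament, BinCrossover, SplitBinMutation) on the assignment-vector encoding, with a simplified inline repair, and it is not the authors' implementation:

import java.util.Random;

// High-level sketch of the CSGA loop (Steps 1-8) on the item-to-bin assignment
// representation; fitness and a simplified repair are inlined and are assumptions.
class CSGASketch {
    static double fitness(int[] assign) {                      // Eq. (6): 1 / used bins
        java.util.Set<Integer> used = new java.util.TreeSet<>();
        for (int b : assign) used.add(b);
        return 1.0 / used.size();
    }

    static void repair(int[] assign, int[] w, int[] h, int W, int H, int d) {
        int cap = W * H;                                       // move items out of overloaded bins
        int[] load = new int[d];
        for (int j = 0; j < assign.length; j++) load[assign[j]] += w[j] * h[j];
        for (int j = 0; j < assign.length; j++) {
            int b = assign[j], a = w[j] * h[j];
            if (load[b] <= cap) continue;
            for (int k = 0; k < d; k++)
                if (load[k] + a <= cap) { load[b] -= a; load[k] += a; assign[j] = k; break; }
        }
    }

    static int[] run(int n, int d, int[] w, int[] h, int W, int H,
                     int popSize, int maxGen, double ap, double pm, Random rnd) {
        int[][] pos = new int[popSize][n], mem = new int[popSize][];
        for (int i = 0; i < popSize; i++) {                    // Steps 1-2: random init, repair, memory = position
            for (int j = 0; j < n; j++) pos[i][j] = rnd.nextInt(d);
            repair(pos[i], w, h, W, H, d);
            mem[i] = pos[i].clone();
        }
        for (int g = 0; g < maxGen; g++) {
            double[] fit = new double[popSize];
            for (int i = 0; i < popSize; i++) fit[i] = fitness(pos[i]);
            for (int i = 0; i < popSize; i++) {
                int j = Tournament.selectCrow(fit, rnd);       // Step 3
                int[] mate = (rnd.nextDouble() >= ap) ? pos[j] : mem[i];   // Step 4: Eqs. (7)/(8)
                int[] child = BinCrossover.cross(pos[i], mate, rnd);
                repair(child, w, h, W, H, d);                  // Step 5
                if (rnd.nextDouble() < pm)                     // Step 6
                    SplitBinMutation.mutate(child, w, h, W, H, d, rnd);
                if (fitness(child) >= fitness(mem[i])) mem[i] = child;     // Step 7: Eq. (2)
                pos[i] = child;
            }
        }
        int best = 0;                                          // Step 8: best memorized solution
        for (int i = 1; i < popSize; i++) if (fitness(mem[i]) > fitness(mem[best])) best = i;
        return mem[best];
    }
}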

4 Simulation
4.1 Simulation Design
The proposed CSGA is coded in Java on a PC with an Intel Core i5 at 2.5 GHz and 4.0 GB of RAM, running the 64-bit Windows 7 operating system. The performance of CSGA is examined on various test instances of the 2D-BPP available at the URL [27]. We should note that the considered test data are divided into different classes. Ten classes are generated randomly by varying the capacity of the bins as well as the range of item sizes. The first six classes were introduced by Berkey and Wang [12], while the last four classes were proposed by Martello and Vigo [28]. In each class, identical bins are considered, while the items are characterized by different sizes that are uniformly generated in the same interval of values. Each class is also divided into five subclasses according to the number of available items (n = 20, 40, 60, 80, 100), and each subclass contains ten problem instances, which gives a total of 50 problem instances per class. During the experiments, four problem classes are considered (see Table 1). Three classes that seem to be representative are chosen from the Berkey and Wang classes, while only one class is selected from the Martello and Vigo classes to represent that group. Meanwhile, we considered just three subclasses, with n = 20, 60, 100, corresponding respectively to small-, medium- and large-size problems.

Table 1. Information about test instances used in experiments


Problem class | Bin capacity | Item height | Item width
Class 1 | W = H = 10  | [1, 10]  | [1, 10]
Class 3 | W = H = 40  | [1, 35]  | [1, 35]
Class 5 | W = H = 100 | [1, 100] | [1, 100]
Class 9 | W = H = 100 | 70% in [1/2 H, H] and 10% in [2/3 H, H], [1, 1/2 H] | 70% in [1/2 W, W] and 10% in [2/3 W, W], [1, 1/2 W]

To validate the efficiency of CSGA, the latter is compared against the binary particle swarm optimization algorithm (BPSO) proposed by Kennedy and Eberhart [29], and the standard genetic algorithm (GA). We should note that the BPSO has undergone minor modifications in order to adapt it to the 2D-BPP context, especially in the encoding scheme, which is similar to that of CSGA, and in the fact that it incorporates the above-mentioned repair operator in order to meet the capacity constraint of the bins. Moreover, the standard GA uses the same genetic operators as those used by CSGA (see Subsect. 3.3). The control parameters of each algorithm are reported in Table 2. There, Popsize means the population size, Maxgen indicates the maximum number of generations, and BTS-size indicates the size of the binary tournament selection. Furthermore, wmin and wmax denote respectively the minimum and maximum inertia weight, while c1 and c2 denote the acceleration coefficients. Finally, Vmin and Vmax refer to the minimum and maximum velocity of particles. It is worth mentioning that the parameters of CSGA are set based upon many preliminary independent experiments, while the BPSO parameters are chosen according to the settings reported by other authors.

Table 2. Parameter settings


Algorithms | Parameter values
CSGA | Popsize = 100; FL = 2; AP = 0,01; pm = 0,10; Maxgen = 100
GA   | Popsize = 100; BTS-size = 2; pc = 0,95; pm = 0,01; Maxgen = 100
BPSO | Popsize = 100; wmin = 0,4; wmax = 0,9; c1 = c2 = 1,5; Vmin = −2; Vmax = 2; Maxgen = 100

4.2 Simulation Results


All test cases of CSGA as well as of the comparative algorithms were evaluated over 30 independent runs, and the average results are retained. To assess our results in terms of fitness values, we apply the Wilcoxon test in order to determine whether the average fitness values of CSGA are significantly different from those of BPSO and GA. The confidence level is fixed at 0,95, and the obtained p-values are shown in Table 3. We use SPSS Statistics 22 for the statistical testing. If the p-value is less than 0,05, we reject the null hypothesis, which assumes that there is no significant difference between the two compared algorithms, and accept the alternative hypothesis, which assumes that there is a significant difference between them; otherwise, we retain the null hypothesis. The value of R− (respectively, R+) in Table 3 indicates the sum of the ranks corresponding to negative (respectively, positive) differences; the ranks are computed on the absolute values of the differences between the results of the two compared algorithms.

Table 3. Wilcoxon test results on average fitness value of CSGA against other algorithms
Problem class | Number of items | Compared algorithms | R+ | R− | p-value | s
Class 1 | 20  | CSGA - GA   | 55   | 0    | 0,005 | 1
Class 1 | 20  | CSGA - BPSO | 18,5 | 2,5  | 0,093 | 0
Class 1 | 60  | CSGA - GA   | 55   | 0    | 0,005 | 1
Class 1 | 60  | CSGA - BPSO | 27,5 | 27,5 | 1,000 | 0
Class 1 | 100 | CSGA - GA   | 55   | 0    | 0,005 | 1
Class 1 | 100 | CSGA - BPSO | 55   | 0    | 0,005 | 1
Class 3 | 20  | CSGA - GA   | 55   | 0    | 0,004 | 1
Class 3 | 20  | CSGA - BPSO | 2    | 13   | 0,136 | 0
Class 3 | 60  | CSGA - GA   | 55   | 0    | 0,005 | 1
Class 3 | 60  | CSGA - BPSO | 0    | 55   | 0,005 | −1
Class 3 | 100 | CSGA - GA   | 55   | 0    | 0,005 | 1
Class 3 | 100 | CSGA - BPSO | 2    | 53   | 0,009 | −1
Class 5 | 20  | CSGA - GA   | 55   | 0    | 0,005 | 1
Class 5 | 20  | CSGA - BPSO | 4    | 41   | 0,028 | −1
Class 5 | 60  | CSGA - GA   | 55   | 0    | 0,005 | 1
Class 5 | 60  | CSGA - BPSO | 0    | 55   | 0,005 | −1
Class 5 | 100 | CSGA - GA   | 55   | 0    | 0,005 | 1
Class 5 | 100 | CSGA - BPSO | 0    | 55   | 0,005 | −1

Table 4. Comparative results of CSGA versus GA and BPSO using instances with 40 items
Test instance | LB* | CSGA used bins | CSGA CPU time | GA used bins | GA CPU time | BPSO used bins | BPSO CPU time
Instance1  | 25 | 29 | 0,067 | 37 | 2,013 | 34 | 4,346
Instance2  | 32 | 29 | 0,066 | 36 | 1,916 | 35 | 4,383
Instance3  | 29 | 29 | 1,914 | 35 | 1,914 | 34 | 4,386
Instance4  | 31 | 28 | 0,063 | 35 | 2,085 | 34 | 4,472
Instance5  | 27 | 27 | 0,064 | 33 | 1,993 | 29 | 4,382
Instance6  | 29 | 29 | 0,067 | 36 | 2,030 | 35 | 4,393
Instance7  | 24 | 26 | 0,068 | 32 | 1,920 | 29 | 4,421
Instance8  | 26 | 28 | 0,063 | 35 | 2,006 | 34 | 4,676
Instance9  | 21 | 25 | 0,062 | 33 | 1,995 | 30 | 4,389
Instance10 | 34 | 30 | 0,064 | 38 | 2,028 | 39 | 4,423
Average    | 27,8 | 28 | 0,250 | 35 | 1,990 | 33,3 | 4,427

The value of s in Table 3 shows the statistical result of the pairwise comparison: s = 1 indicates that the first algorithm is significantly better than the second, while s = −1 indicates that the first algorithm is significantly worse than the second. Both values 1 and −1 mean that there is a significant difference, whereas s = 0 indicates that there is no significant difference between the two compared algorithms. From Table 3, we can then observe that CSGA is superior to the standard GA. However, it is sometimes nearly equivalent to BPSO and sometimes it is outperformed by BPSO.
Tables 4, 5 and 6 report the empirical results of CSGA, GA and BPSO in terms of the average number of used bins obtained over 30 runs and the average execution time. These tests are performed on class 9. In each table, the first column indicates the test instance of class 9, the second column records the benchmark lower bound of the 2D-BPP (see URL [27]), and the last columns report the average results obtained by the three compared algorithms.
Table 5. Comparative results of CSGA versus GA and BPSO using instances with 80 items
Test instance | LB* | CSGA used bins | CSGA CPU time | GA used bins | GA CPU time | BPSO used bins | BPSO CPU time
Instance1  | 59 | 60 | 0,264 | 70 | 7,816 | 65 | 34,436
Instance2  | 58 | 59 | 0,271 | 69 | 7,718 | 66 | 39,465
Instance3  | 57 | 59 | 0,252 | 70 | 8,001 | 66 | 34,008
Instance4  | 53 | 60 | 0,263 | 69 | 7,753 | 65 | 37,740
Instance5  | 62 | 61 | 0,259 | 74 | 7,772 | 72 | 32,446
Instance6  | 62 | 59 | 0,251 | 71 | 7,576 | 68 | 31,868
Instance7  | 59 | 59 | 0,262 | 70 | 7,833 | 66 | 39,240
Instance8  | 58 | 58 | 0,264 | 71 | 7,902 | 68 | 32,110
Instance9  | 49 | 54 | 0,261 | 59 | 40,232 | 59 | 34,454
Instance10 | 60 | 61 | 0,274 | 74 | 7,575 | 71 | 32,291
Average    | 57,7 | 59 | 0,262 | 69,7 | 11,018 | 66,6 | 34,806

Table 6. Comparative results of CSGA versus GA and BPSO using instances with 100 items
Test instance | LB* | CSGA used bins | CSGA CPU time | GA used bins | GA CPU time | BPSO used bins | BPSO CPU time
Instance1  | 71 | 75 | 0,377 | 87 | 11,755 | 81 | 63,273
Instance2  | 64 | 69 | 0,371 | 86 | 11,785 | 81 | 64,554
Instance3  | 68 | 61 | 0,384 | 82 | 11,767 | 71 | 65,076
Instance4  | 78 | 76 | 0,372 | 92 | 11,872 | 88 | 62,890
Instance5  | 65 | 73 | 0,388 | 84 | 11,892 | 76 | 74,067
Instance6  | 71 | 74 | 0,381 | 78 | 64,425 | 78 | 73,664
Instance7  | 66 | 69 | 0,380 | 87 | 12,084 | 81 | 64,983
Instance8  | 74 | 73 | 0,375 | 90 | 11,859 | 86 | 62,992
Instance9  | 66 | 66 | 0,379 | 85 | 12,039 | 78 | 73,892
Instance10 | 72 | 75 | 0,384 | 84 | 64,165 | 84 | 65,849
Average    | 69,5 | 71,1 | 0,379 | 85,5 | 22,364 | 80,4 | 67,124

From Tables 4, 5 and 6, we can see that the number of bins used by CSGA is, in each instance, close or equal to the corresponding lower bound, and sometimes even lower than the latter. Furthermore, the proposed algorithm yields better results in less computational time than both BPSO and GA. The computation time is expressed in seconds.

4.3 Results Analysis


From Table 3, the outcomes of the experiments on classes 1, 3 and 5 show that the average results of CSGA are better than those of GA but worse than those of BPSO. For these test classes, the difficulty grows quite regularly with the number of available items (see [28]). However, from Tables 4, 5 and 6, in which class 9 is used as test data, the computations show that CSGA allocates the available items to a smaller number of bins than GA and BPSO, and even than LB*. The data set generated for class 9 seems to be easier than the above-mentioned problem classes, because it contains a high percentage of large items and therefore leaves few free areas to be managed. Such instances illustrate real-life situations [5].
It is interesting to see that CSGA performs better than GA regardless of the type of instances. Nevertheless, it generally performs worse than BPSO on difficult instances (classes 1, 3 and 5), but it performs better than BPSO in more realistic situations, such as the test instances of class 9, and yields high-quality results in a small amount of time.

5 Conclusions

In this paper, we proposed a crow search algorithm based on genetic operators for solving the 2D-BPP. We used various benchmark tests to examine the performance of the proposed algorithm. In addition, we compared our algorithm with two state-of-the-art algorithms of the same nature (i.e. bio-inspired algorithms). The experimental results showed that the proposed approach gives better results than GA regardless of the type of instances. However, our algorithm is outperformed by BPSO on difficult instances, although it performs better than BPSO on more realistic instances. In addition, our algorithm is considerably faster than both comparative algorithms on easy as well as difficult instances. These results are very encouraging and demonstrate the efficiency of the CSGA. Since our algorithm incorporates the genetic operators into the core idea of CSA, it can be applied, like the GA, to several optimization problems defined on discrete as well as continuous search spaces. So, in future research, we hope to exploit this for the 2D-BPP variants that take into account the guillotine constraint and/or allow rotation of items by 90°, in addition to other important real-world applications coming from different fields.

References
1. Wäscher, G., Haußner, H., Schumann, H.: An improved typology of cutting and packing
problems. Eur. J. Oper. Res. 183, 1109–1130 (2007)
2. Crainic, T.G., et al.: Bin packing problems with uncertainty on item characteristics: An
application to capacity planning in logistics. Procedia-Soc. Behav. Sci. 111, 654–662 (2014)
3. Wee, T.S., Magazine, M.J.: Assembly line balancing as generalized bin packing. Oper. Res.
Lett. 1, 56–58 (1982)
4. Song, W., et al.: Adaptive resource provisioning for the cloud using online bin packing.
IEEE Trans. Comput. 63, 2647–2660 (2014)
5. Lodi, A., Martello, S., Vigo, D.: Heuristic and metaheuristic approaches for a class of two-
dimensional bin packing problems. INFORMS J. Comput. 11, 345–357 (1999)
6. Epstein, L., Van Stee, R.: Optimal online algorithms for multidimensional packing problems.
SIAM J. Comput. 35, 431–448 (2005)
7. Laabadi, S.: A new algorithm for the Bin-packing problem with fragile objects. In: 3rd
International Conference on Logistics Operations Management, pp. 1–7. IEEE, Fez (2016)
8. Bansal, N., Liu, Z., Sankar, A.: Bin-packing with fragile objects and frequency allocation in
cellular networks. Wirel. Netw. 15, 821–830 (2009)
9. Shakhsi, N.M., Joulaei, F., Razmi, J.: Extending two-dimensional bin packing problem:
consideration of priority for items. J. Ind. Syst. Eng. 3, 72–84 (2009)
10. Hamdi-Dhaoui, K., Labadie, N., Yalaoui, A.: Algorithms for the two dimensional bin
packing problem with partial conflicts. RAIRO-Oper. Res. 46, 41–62 (2012)
11. Khanafer, A., Clautiaux, F., Talbi, E.G.: Tree-decomposition based heuristics for the two-
dimensional bin packing problem with conflicts. Comput. Oper. Res. 39, 54–63 (2012)
12. Berkey, J.O., Wang, P.Y.: Two-dimensional finite bin-packing algorithms. J. Oper. Res. Soc.
38, 423–429 (1987)
13. Monaci, M., Toth, P.: A set-covering-based heuristic approach for bin-packing problems.
INFORMS J. Comput. 18, 71–85 (2006)

14. Baumgartner, L., Schmid, V., Blum, C.: Solving the two-dimensional bin packing problem
with a probabilistic multi-start heuristic. In: Coello, C.A.C. (ed.) LION 2011. LNCS, vol.
6683, pp. 76–90. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-25566-3_6
15. Wong, L.: Heuristic placement routines for two-dimensional rectangular bin packing
problems. Ph.D. thesis, Universiti Putra Malaysia (2009)
16. Parreño, F., et al.: A hybrid GRASP/VND algorithm for two-and three-dimensional bin
packing. Ann. Oper. Res. 179, 203–220 (2010)
17. Lai, K.K., Chan, J.W.: Developing a simulated annealing algorithm for the cutting stock
problem. Comput. Ind. Eng. 32, 115–127 (1997)
18. Gonçalves, J.F., Resende, M.G.: A biased random key genetic algorithm for 2D and 3D bin-
packing problems. Int. J. Prod. Econ. 145, 500–510 (2013)
19. Soke, A., Bingul, Z.: Hybrid genetic algorithm and simulated annealing for two-dimensional
non-guillotine rectangular packing problems. Eng. Appl. Artif. Intell. 19, 557–567 (2006)
20. Blum, C., Schmid, V.: Solving the 2D bin packing problem by means of a hybrid
evolutionary algorithm. Procedia Comput. Sci. 18, 899–908 (2013)
21. Shin, Y.B., Kita, E.: Solving two-dimensional packing problem using particle swarm
optimization. Comput. Assist. Methods Eng. Sci. 19, 241–255 (2017)
22. Liu, D.S., et al.: On solving multiobjective bin packing problems using evolutionary particle
swarm optimization. Eur. J. Oper. Res. 190, 357–382 (2008)
23. Reeves, C., Rowe, J.E.: Genetic Algorithms: Principles and Perspectives: A Guide to GA
Theory. Springer, New York (2002). https://doi.org/10.1007/b101880
24. Askarzadeh, A.: A novel metaheuristic method for solving constrained engineering
optimization problems: crow search algorithm. Comput. Struct. 169, 1–12 (2016)
25. De Souza, R.C.T., et al.: A V-shaped binary crow search algorithm for feature selection. In:
IEEE Congress on Evolutionary Computation, pp. 1–8. IEEE, Rio de Janeiro (2018)
26. Goldberg, D., Lingle, R.: Alleles, loci, and the travelling salesman problem. In: First
International Conference on Genetic Algorithms and their Applications, pp. 154–159.
Lawrence Erlbaum Associates, Hillsdale (1985)
27. http://or.dei.unibo.it/library/two-dimensional-bin-packing-problem
28. Martello, S., Vigo, D.: Exact solution of the two-dimensional finite bin packing problem.
Manag. Sci. 44, 388–399 (1998)
29. Kennedy, J., Eberhart, R.C.: A discrete binary version of the particle swarm algorithm. In:
International Conference on Systems, Man, and Cybernetics. Computational Cybernetics
and Simulation, vol. 5, pp. 4104–4108 (1997)
