Abstract—In this paper, an optimization problem belonging to the project management family is considered, where the objective is to maximize the benefit of a set of projects. Selecting the projects to be performed with a limited budget is a decision process that becomes difficult for large-scale problems. In this paper, an approach based on a genetic algorithm is presented for solving large-scale instances of the tackled problem with the goal of maximizing the benefits. First, a mathematical programming model of the problem is presented. Then, a heuristic method is proposed based on a genetic algorithm and random neighborhood search techniques. The study simulates a practical situation as an optimization problem and highlights the effectiveness of genetic algorithms and random search techniques for solving it. The presented method is competitive since it is able to produce high-quality solutions in acceptable solution time. As shown in the computational results section, the proposed genetic algorithm for decision-making in project management needs 13.7 s on average, the random neighborhood search needs 14.1 s, and the greedy procedure needs a negligible amount of time. Although the proposed genetic algorithm requires more solution time than the greedy and random neighborhood algorithms, its solution quality is better. In addition, the presented work highlights the effectiveness of optimization procedures in decision-making to justify the investment budget and thus maximize the benefits of organizations, personnel, and others.

Keywords— Optimization, Genetic algorithm, Random search, Project Management.

I. INTRODUCTION

To conduct a project that satisfies quality and functional requirements within a budget and cost limit, while achieving high economic and social benefits, project management is recommended. Successfully operating a project requires a high level of coordination and cooperation in all aspects, including the choice of appropriate goods that achieve a high benefit [1], [2]. This is a practical situation belonging to the family of project management problems, which concerns a project and how to exploit the available resources efficiently. In such situations, it is difficult to make a suitable decision for choosing a subset of items that maximizes the benefit without exceeding the predetermined budget. For this reason, it is necessary to possess a decision support system which can aid in such situations [3]. The problem is abbreviated to PMP. Clearly, when the number of available items is very large, the problem is NP-hard. In order to simplify the treatment of such problems, it can be simulated as a well-known combinatorial optimization problem, the knapsack problem (abbreviated to KP). In fact, there is a wide variety of practical situations in real-life problems in various domains that can be simulated as the KP [4].

Let the goods be a set I of n items and the limited finance be the capacity c of the knapsack, where each item i is characterized by a profit p_i and a cost w_i. The objective of the problem is to select a subset of items so that the sum of the selected items' profits is maximized, while the sum of the selected items' costs does not exceed the limited budget capacity c. The mathematical programming model of the considered problem can be stated as follows:

Max: f(x) = \sum_{i=1}^{n} p_i x_i                          (1)

s.t. \sum_{i=1}^{n} w_i x_i \le c                           (2)

x_i \in \{0, 1\}, \quad i \in I = \{1, \dots, n\}           (3)

The decision variable x_i is equal to 1 (i.e., x_i = 1) if item i is selected (included in the solution), and x_i = 0 otherwise (out of the solution). In this integer linear program there are three equations. Equation (1) is the objective function, whose goal is to maximize the total profit of the selected items. Constraint (2) is the capacity constraint, ensuring that the sum of the costs of the selected items does not exceed the limited finance. Meanwhile, constraint (3) imposes that each item is either selected or not selected (it is not allowed to select a fractional item) [5]. In order to avoid trivial cases, it is assumed that all input data are positive integers [6].
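To make the model concrete, the following minimal C++ sketch shows one way such an instance can be represented and how a candidate selection vector x is checked against (1)-(3). The Instance type, the function names, and the toy data are our own illustration, not taken from the paper.

#include <iostream>
#include <vector>

// Illustrative representation of a PMP/KP instance: item i has profit p[i] and cost w[i],
// and c is the limited budget (knapsack capacity). All data are assumed to be positive integers.
struct Instance {
    std::vector<int> p, w;
    int c;
};

// Objective (1): total profit of the selected items, where x[i] is 0 or 1 as required by (3).
long long objective(const Instance& in, const std::vector<int>& x) {
    long long total = 0;
    for (std::size_t i = 0; i < x.size(); ++i) total += static_cast<long long>(x[i]) * in.p[i];
    return total;
}

// Constraint (2): the total cost of the selected items must not exceed the budget c.
bool feasible(const Instance& in, const std::vector<int>& x) {
    long long cost = 0;
    for (std::size_t i = 0; i < x.size(); ++i) cost += static_cast<long long>(x[i]) * in.w[i];
    return cost <= in.c;
}

int main() {
    Instance in{{60, 100, 120}, {10, 20, 30}, 50};  // toy data for illustration only
    std::vector<int> x{1, 1, 0};                    // select items 0 and 1
    std::cout << "feasible = " << feasible(in, x)
              << ", objective = " << objective(in, x) << "\n";  // prints: feasible = 1, objective = 160
    return 0;
}

Such a representation is essentially all that the heuristics of Section III need, since they only evaluate the objective and check the feasibility of candidate vectors x.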
The rest of the paper is organized as follows. The second section reviews some related works. The third section presents a decision support approach which can help to choose a subset of items among a given set in order to achieve the objective of the problem: the highest possible profit with the limited finance. The fourth section evaluates the performance of the proposed approach and analyzes the obtained results. Finally, the fifth section summarizes the contribution of the paper.

II. RELATED WORKS

As mentioned earlier, the PMP is a realistic problem that can be simulated as the KP; therefore, procedures that can solve knapsack-family problems can also be applied to solve the PMP (for more details, see [4]). Chen et al. (2019) used an improved genetic algorithm to make emergency decisions under restricted-resource situations. They used a prospect-theory-based genetic algorithm to solve the allocation problems of emergency resources between multiple areas under emergency status and restricted resources [7].
Ezugwu et al. (2019) used various genetic algorithms to solve the 0-1 knapsack problem (KP01) in order to increase the benefit of the selected objects without surpassing the capacity. Among the algorithms used were dynamic programming (DP) and branch and bound (BB), which were shown to produce the best optimal values with the lowest computational time. A genetic algorithm was further used to solve constrained knapsack problems in dynamic environments [8]. Qian et al. (2019) used an algorithm based on memory updating and schemes of environmental reaction, which provides multiple solutions in the memory that can adapt to different dynamic environments. As an environmental change arises, the algorithm, considering the Hamming distance, replaces the worst solution with the most convenient one [9]. Fan et al. (2012) provided a new model by incorporating cost as a critical variable and considering soft logic as a key point in the optimization process. This method allows project team members to minimize the overall cost of the project, especially in projects having repetitive activities, through selection of the optimum output rate, i.e., the Number of units that a repetitive Activity can proceed at One Time (NAOT), from a set of possible alternatives [10]. Wei et al. (2019) used a Pareto mathematical model with a genetic algorithm combined with other algorithms to determine the optimal duration and cost requirement of a multi-objective optimization problem in a construction project. Unlike traditional optimization models, the Pareto-based model boosts the degree of fitness between the genetic algorithm and the actual project by modifying the time-cost curve and improving the parameters of both the traditional genetic algorithm and the operation process design [11].

In this paper, a specific case in the project management family is introduced in which the objective is to propose a subset of items, among a large number of items to be traded, that makes the project successful and yields high profit rates. Indeed, this work proposes a decision support procedure that yields approximate solutions based on genetic algorithms. The proposed algorithm has been compared with a random neighborhood search algorithm and a greedy algorithm. The experimental results show that the proposed genetic algorithm outperforms both algorithms: the random neighborhood search and the greedy algorithm.

III. A GENETIC ALGORITHM FOR SOLVING PMP

Genetic Algorithms (abbreviated to GA) are global search heuristics used to find approximate (or possibly optimal) solutions. They have been considered effective for optimizing large-scale problems, when the search space is very large or too complex for analytic treatment. Genetic algorithms are a class of evolutionary algorithms inspired by the evolution of living organisms, where techniques such as inheritance, mutation, selection, and crossover are used. Such evolutionary biology techniques can be used to implement genetic algorithms, which in turn act as a search technique for solving optimization problems [12]. This section illustrates our proposed method for optimizing the PMP. The proposed method is based on the principles of GA and is composed of three main steps. First, a feasible solution is created using a greedy procedure. Second, a population of solutions is created using a random neighborhood search procedure. Third and last, four evolutionary biology techniques are applied to the population: inheritance, mutation, selection, and crossover. Inheritance, selection, and crossover are used as intensification strategies, while mutation is used as a diversification strategy in order to escape from a series of local optimum solutions and evolve toward better ones. Both intensification and diversification strategies are used to explore the huge solution space of the considered problem and escape from a series of local optimum solutions.

A. A Starting Solution

As with most solution procedures for combinatorial optimization problems, the first step involves finding a fast starting feasible solution [13]. Among heuristic methods, greedy methods are easy to implement and yield a feasible solution in a very short time; for this reason, a greedy algorithm is proposed to build a starting solution. The proposed approach builds a solution piece by piece, focusing on immediate improvement. As expected, the result of such procedures is mostly of moderate quality, but the solution time required is very small.

Algorithm I: A Greedy Procedure for PMP
Input: An instance of PMP.
Output: A starting solution x.
1:  Let C be the available project finance;
2:  Let N be the total number of items;
3:  Let α be the residual project budget;
4:  Let Y be the objective value;
5:  Set Y = 0;
6:  Set α = C;
7:  Let i = 0;
8:  While (i is less than N)
9:      if (α - wi is greater than or equal to 0)
10:         xi = 1;
11:         α = α - wi;
12:         Y = Y + pi;
13:     else
14:         xi = 0;
15:     end if
16:     i = i + 1;
17: End while;
18: Return x and Y;

Algorithm I illustrates the main steps of the proposed greedy procedure. The proposed greedy builds a starting feasible solution step by step, sequentially, without looking ahead (i.e., without considering any consequence for the future regarding the best achievable profit). Line 1 initializes C as the available finance of the project. Line 2 initializes N as the total number of items. In line 3, α represents the residual budget, which is used while building the solution. In line 4, Y represents the objective value. Algorithm I starts by setting Y to zero and α to C, as mentioned in lines 5 and 6 respectively. Lines 8-17 form the main loop of the algorithm, showing the main steps of the proposed procedure. The loop iterates over the total number of items (N). In each iteration, an item i is tested
whether its inclusion in the solution violates the total available budget (line 9). If it does not violate the constraint, the item is considered a member of the solution by setting xi = 1 (see line 10). Then, the remaining budget is updated using the equation α = α - wi (see line 11), and the objective function Y is updated using the equation Y = Y + pi (see line 12); that is, the objective function is increased by the benefit of the newly considered item. Otherwise (see line 14), the corresponding item is rejected, xi = 0 (i.e., not included in the solution). Finally (see line 18), the algorithm returns a starting feasible solution, represented by a vector x of items and its related objective value Y.
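As an illustration, the following is a minimal, self-contained C++ sketch of such a greedy construction under the same logic as Algorithm I (items are scanned in their given order and accepted whenever they fit the residual budget). The names greedyStart and GreedyResult and the toy data are our own assumptions, not from the paper.

#include <iostream>
#include <vector>

// Greedy construction in the spirit of Algorithm I: scan items in the given order
// and accept an item whenever it still fits the residual budget.
struct GreedyResult {
    std::vector<int> x;   // x[i] = 1 if item i is selected, 0 otherwise
    long long value = 0;  // objective value Y
};

GreedyResult greedyStart(const std::vector<int>& p, const std::vector<int>& w, int c) {
    GreedyResult r;
    r.x.assign(p.size(), 0);
    int residual = c;                      // alpha in the paper's notation
    for (std::size_t i = 0; i < p.size(); ++i) {
        if (residual - w[i] >= 0) {        // inclusion does not violate the budget
            r.x[i] = 1;
            residual -= w[i];
            r.value += p[i];
        }
    }
    return r;
}

int main() {
    std::vector<int> p{60, 100, 120}, w{10, 20, 30};    // toy data for illustration only
    GreedyResult r = greedyStart(p, w, 50);
    std::cout << "greedy value = " << r.value << "\n";  // prints 160: items 0 and 1 fit, item 2 does not
    return 0;
}

A common refinement, not stated in the paper, would be to sort the items by profit-to-cost ratio before scanning; the sketch keeps the plain sequential order described in Algorithm I.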
B. Population of Solutions Using a Random Neighborhood Procedure

The second step in the proposed method involves creating a population of candidate solutions to the considered optimization problem. A random neighborhood search procedure is proposed to create the population. It consists of three main steps. First, destroy the starting solution by randomly removing β% of its items to produce a destroyed one. Second, repair the destroyed solution by adding other items in order to yield a new feasible solution. Third and last, update the population with the newly generated solution. These three steps are repeated until the required population size is reached.

Algorithm II: A Random Neighborhood Procedure to Create a Population
Input: A starting solution obtained from Algorithm I.
Output: "Pop", a population of solutions.
1:  Let "pop" be a population of candidate solutions;
2:  Let β be a starting solution, where β = x, the starting solution obtained from Algorithm I;
3:  Let i = no. of iterations;
4:  while (i > 0)
5:      Remove randomly a subset of β to yield a random reduced problem;
6:      Repair the reduced problem using Algorithm I to yield a new feasible solution;
7:      If (the new solution is better than at least one of pop's solutions) or ("pop" has not reached its maximum size)
8:          Update pop with the new solution by replacing a weak solution with it;
9:      End if;
10:     i = i - 1;
11: end while;
12: Return pop;

Algorithm II illustrates the main steps of the proposed neighborhood procedure for creating the population. The procedure creates candidate solutions by randomly destroying part of the starting solution and then rebuilding it using the same greedy procedure illustrated in Algorithm I. Line 1 initializes "pop" as the population, i.e., the set of candidate solutions, whose number is limited by the chosen population size. In line 2, β is initialized as the starting solution obtained from Algorithm I, where β = x. Line 3 initializes the number of iterations, i.e., a stopping condition. Lines 4-11 illustrate the main steps of the algorithm. In each iteration, the starting solution (obtained from Algorithm I) is destroyed by randomly removing a subset of the items of β to produce a destroyed solution (see line 5). Then, the destroyed solution is repaired and completed using the same greedy procedure explained in Algorithm I (see line 6). The newly generated solution is used to update the population (lines 7 and 8) in only two cases: (i) the new solution is added if and only if the population has not reached its maximum size, and (ii) the new solution replaces a weak solution in the population if it is better than that solution and the population has reached its maximum size.
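To illustrate the destroy-and-repair idea, here is a small, self-contained C++ sketch under our own assumptions (a fixed removal probability for each selected item, and a greedy refill in index order that skips the just-removed items); names such as destroyAndRepair are illustrative and not from the paper.

#include <iostream>
#include <random>
#include <vector>

// One destroy-and-repair move in the spirit of Algorithm II:
// randomly drop a fraction of the selected items, then greedily add other items
// that still fit the residual budget (as in Algorithm I).
std::vector<int> destroyAndRepair(std::vector<int> x, const std::vector<int>& w,
                                  int c, double removeFrac, std::mt19937& rng) {
    std::bernoulli_distribution drop(removeFrac);
    std::vector<char> removed(x.size(), 0);
    int residual = c;
    for (std::size_t i = 0; i < x.size(); ++i) {
        if (x[i] == 1 && drop(rng)) { x[i] = 0; removed[i] = 1; }  // destroy step
        if (x[i] == 1) residual -= w[i];                           // budget used by what remains
    }
    for (std::size_t i = 0; i < x.size(); ++i)                     // repair step: add other items
        if (x[i] == 0 && !removed[i] && residual - w[i] >= 0) { x[i] = 1; residual -= w[i]; }
    return x;
}

long long objective(const std::vector<int>& x, const std::vector<int>& p) {
    long long v = 0;
    for (std::size_t i = 0; i < x.size(); ++i) v += static_cast<long long>(x[i]) * p[i];
    return v;
}

int main() {
    std::vector<int> p{60, 100, 120, 80}, w{10, 20, 30, 15};  // toy data for illustration only
    std::vector<int> x{1, 1, 0, 0};                           // a starting solution for budget c = 50
    std::mt19937 rng(42);
    std::vector<std::vector<int>> pop;                        // population built from repeated moves
    for (int it = 0; it < 5; ++it)
        pop.push_back(destroyAndRepair(x, w, 50, 0.3, rng));
    std::cout << "population size = " << pop.size()
              << ", first member value = " << objective(pop.front(), p) << "\n";
    return 0;
}

In the paper's procedure, each new solution would additionally be compared against the current population and either appended (if the population is not yet full) or swapped in for a weaker member; the sketch omits that bookkeeping for brevity.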
C. Genetic Algorithm Implementation

The random neighborhood search algorithm, as illustrated in the previous section, can enhance the starting solution by randomly removing some items and adding others while creating the population. However, a genetic algorithm can evolve the population towards better solutions by relying on four evolutionary techniques: inheritance, mutation, selection, and crossover (see Algorithm III).

Algorithm III illustrates the main steps of the proposed genetic algorithm. Line 1 initializes x as the starting solution obtained from Algorithm I. Line 2 initializes "pop" as the population obtained from Algorithm II. In line 3, N is initialized as the stopping criterion, i.e., the algorithm stops when it reaches its maximum number of iterations N. Lines 4-13 show the main steps of the algorithm. First (line 5), two solutions are selected randomly from the current population. Second (see line 6), a single-point crossover is applied to produce an offspring. Third (line 7), a mutation strategy is applied to ensure the diversification of the solution procedure: some items are selected randomly to change state from 1 to 0, producing an evolved solution and diversifying the search. The mutation may degrade the solution; therefore (see line 8), Algorithm I is then applied to enhance the current solution. Fourth (see line 9), the objective function of the new offspring is calculated; if it is better than a solution in the population (see line 10), then that weak solution is replaced with the produced offspring. Fifth and last (see line 11), Sbest is updated with the best solution obtained.

Algorithm III: A Genetic Algorithm to Reproduce a Series of Evolved Solutions
Input: "Pop", the population obtained from Algorithm II.
Output: Sbest, the best solution.
1:  Let x be a starting solution obtained from Algorithm I;
2:  Let pop be an initial population from Algorithm II;
3:  Let N be the number of iterations, i.e., the stopping criterion;
4:  While (N is greater than 0)
5:      Select randomly two parents to reproduce;
6:      Apply crossover to produce an offspring;
7:      Apply mutation to the offspring;
8:      Apply Algorithm I to enhance the current solution;
9:      Observe the objective function of the offspring;
10:     Update pop with the offspring by replacing the weakest individual with it;
11:     Update Sbest with the best offspring;
12:     N = N - 1;
13: End while;
14: Return Sbest and Y;
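The following self-contained C++ sketch shows one way such a loop can be realized under our own assumptions (random parent selection, single-point crossover, a 1-to-0 mutation followed by a greedy repair, and replacement of the weakest member); it is an illustration of the scheme in Algorithm III, not the authors' implementation.

#include <algorithm>
#include <iostream>
#include <random>
#include <vector>

using Chromosome = std::vector<int>;  // x[i] = 1 if item i is selected

long long fitness(const Chromosome& x, const std::vector<int>& p) {
    long long v = 0;
    for (std::size_t i = 0; i < x.size(); ++i) v += static_cast<long long>(x[i]) * p[i];
    return v;
}

int cost(const Chromosome& x, const std::vector<int>& w) {
    int s = 0;
    for (std::size_t i = 0; i < x.size(); ++i) s += x[i] * w[i];
    return s;
}

// Greedy repair in the spirit of Algorithm I: drop items while infeasible,
// then add items that still fit the residual budget.
void repair(Chromosome& x, const std::vector<int>& w, int c) {
    for (std::size_t i = 0; i < x.size() && cost(x, w) > c; ++i) x[i] = 0;
    int residual = c - cost(x, w);
    for (std::size_t i = 0; i < x.size(); ++i)
        if (x[i] == 0 && residual - w[i] >= 0) { x[i] = 1; residual -= w[i]; }
}

int main() {
    // Toy instance for illustration only.
    std::vector<int> p{60, 100, 120, 80, 40}, w{10, 20, 30, 15, 5};
    int c = 50;
    std::mt19937 rng(7);
    std::uniform_int_distribution<std::size_t> pickItem(0, p.size() - 1);

    // Initial population: random feasible chromosomes (the paper builds it with Algorithm II instead).
    std::vector<Chromosome> pop(8, Chromosome(p.size(), 0));
    for (auto& ind : pop) { for (auto& g : ind) g = rng() % 2; repair(ind, w, c); }
    std::uniform_int_distribution<std::size_t> pickParent(0, pop.size() - 1);

    Chromosome best = pop.front();
    for (int it = 0; it < 1000; ++it) {
        Chromosome a = pop[pickParent(rng)];                  // selection: two random parents
        Chromosome b = pop[pickParent(rng)];
        std::size_t cut = pickItem(rng);                      // single-point crossover
        Chromosome child(a.begin(), a.begin() + cut);
        child.insert(child.end(), b.begin() + cut, b.end());
        child[pickItem(rng)] = 0;                             // mutation: set one random gene to 0
        repair(child, w, c);                                  // enhance with the greedy pass
        auto weakest = std::min_element(pop.begin(), pop.end(),
            [&](const Chromosome& u, const Chromosome& v) { return fitness(u, p) < fitness(v, p); });
        if (fitness(child, p) > fitness(*weakest, p)) *weakest = child;  // replace weakest member
        if (fitness(child, p) > fitness(best, p)) best = child;          // track Sbest
    }
    std::cout << "best value = " << fitness(best, p) << " (cost " << cost(best, w) << ")\n";
    return 0;
}

In the paper, the initial population comes from Algorithm II and the loop runs for a fixed number of iterations N (up to 1.2 million in the experiments); the sketch uses a small random population and 1000 iterations purely to keep the example short.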
IV. COMPUTATION RESULTS

In this section, the results of the proposed genetic algorithm for decision-making in project management (abbreviated to GAPM) are reported. The computational experiments were performed on a computer with a Core i5 CPU at 2.5 GHz. The proposed algorithm was coded in C++. The tested instance consisted of 400 items, each with a cost and a profit value, and the limited finance is $2000. The results obtained by the proposed GAPM are compared with those obtained by a greedy algorithm and a random neighborhood search algorithm.

In the first step of the computational results, the stopping criterion of the proposed genetic algorithm (Algorithm III) was fixed to 200000 (200k) iterations, and the performance of the algorithm was then evaluated by varying the size of the population (see Table I). In other words, the genetic algorithm iterates 200k times for each of the following populations: the first suggested population is pop = 100, the second is pop = 200, the third is pop = 400, and finally, the fourth suggested population is pop = 800. Table I illustrates the variation of the solution values and the required running time achieved by the proposed algorithm over the treated instance.

TABLE I. THE EFFECT OF THE VARIATION OF THE SIZE OF THE POPULATION

Tr.         pop = 100         pop = 200         pop = 400         pop = 800
            t(s)    Val.      t(s)    Val.      t(s)    Val.      t(s)    Val.
1st         1.993   17468     2.098   18147     2.211   18337     2.62    18846
2nd         1.996   17712     2.113   18350     2.228   17989     2.601   18964
3rd         2.004   17930     2.114   17847     2.224   18612     2.601   19051
4th         2.002   17150     2.107   18124     2.219   19005     2.592   18410
5th         1.987   17936     2.119   18746     2.206   17833     2.613   18593
6th         2.007   17593     2.108   17976     2.213   18444     2.61    18174
7th         1.995   17757     2.126   18352     2.22    18428     2.606   18564
8th         1.996   18000     2.094   18416     2.223   18294     2.61    18161
9th         1.998   17076     2.115   17820     2.22    18678     2.604   18444
10th        2.006   18277     2.134   17992     2.208   18746     2.605   18099
Av. Sol.    1.998   17689.9   2.112   18177     2.217   18436.6   2.606   18530.6
Best Sol.   1.987   18277     2.094   18416     2.206   20288     2.601   20434

Table I shows the effectiveness of the proposed method as the population size varies. The first column (Tr.) shows the number of the trial, where 10 trials have been performed. Columns 2 to 5 (pop = 100 to 800) report the solution time and the solution value obtained when the population size varies among 100, 200, 400, and 800 candidate solutions respectively. Row 13 (Av. Sol.) shows the average time and average solution value. Row 14 (Best Sol.) shows the best time and solution value achieved by the algorithm with the related population. As shown in the table, increasing the size of the population increases the quality of the solutions. In fact, the average solutions for populations of 400 and 800 are nearly equal, while the solution time required for a population of 800 is larger than for a population of 400. Since the aim is to propose a fast algorithm, for the next steps of the computational results the GAPM algorithm has been tuned to a population size of 400.

For the next step in the computational results (see Table II), the size of the population was set to 400 solutions (i.e., pop = 400) and the stopping criterion (i.e., the number of iterations N) was varied among 200k, 500k, 800k, and 1200k. Table II illustrates the effectiveness of the proposed method when the population size is fixed to 400 while the stopping criterion (N) varies. The first column (Tr.) shows the number of the trial, where 10 trials have been performed. Columns 2 to 5 (N = 200k, N = 500k, N = 800k, and N = 1200k) report the solution time and the solution value obtained when the population is fixed to 400 and the stopping criterion N is set to 200000, 500000, 800000, and 1200000 iterations respectively. Row 13 (Av. Sol.) reports the average solution value achieved and its required solution time. Row 14 (Best Sol.) reports the best solution achieved over the 10 trials and its required solution time.

Table II shows that the quality of the solutions and their time increase with the number of iterations. This is because a random procedure is used to select the solutions from the population for the crossover technique, so the chance of producing a better solution increases with the number of iterations. The best average solution is obtained with 1200k iterations. Therefore, for the next and last step of the computational results, the GAPM algorithm has been tuned to a population size of 400 and a stopping criterion of 1200000 iterations.

TABLE II. THE EFFECT OF THE VARIATION OF THE NUMBER OF ITERATIONS

Tr.         N = 200k          N = 500k          N = 800k          N = 1200k
            t(s)    Val.      t(s)     Val.     t(s)    Val.      t(s)      Val.
1st         2.24    18728     5.716    18388    9.274   18147     13.616    20203
2nd         2.228   17816     5.693    18652    9.228   18350     13.757    18655
3rd         2.209   18598     5.704    18654    9.15    17847     13.712    20454
4th         2.205   18148     5.684    19436    9.241   18124     13.678    20508
5th         2.227   18691     5.802    19537    9.156   18746     13.748    20671
6th         2.213   19000     5.674    18643    9.036   17976     13.697    20007
7th         2.212   18153     5.731    18824    9.171   18352     13.671    18586
8th         2.221   18326     5.667    18501    9.281   18416     13.732    19054
9th         2.214   19318     5.766    18782    9.269   17820     13.627    20424
10th        2.219   17852     5.828    18959    9.156   17992     13.746    19256
Av. Sol.    2.218   18463     5.7265   18837.6  9.196   19058.6   13.6984   19781.8
Best Sol.   2.212   19318     5.667    19537    9.15    19888     13.616    20671
For the next and last step in the computational results (see Table III), the performance of the GAPM algorithm has been compared with the performance of a greedy algorithm and a random neighborhood search algorithm. GAPM is illustrated in Algorithm III, the greedy procedure in Algorithm I, and the neighborhood search algorithm in Algorithm II. Both GAPM and the neighborhood search algorithm have been tuned to a population size of 400 and 1200000 iterations.

TABLE III. PERFORMANCE OF GAPM COMPARED WITH A GREEDY AND A RANDOM NEIGHBORHOOD ALGORITHM

            Greedy Algo.      Random neighborhood      GAPM
                              (1200k iterations)       (1200k iterations)
            t(s)    Val.      t(s)     Val.            t(s)    Val.
Av. Val.    ~0      13504     14.1     13987           13.7    19781.8

As shown in Table III, the average solution value obtained by GAPM is better than those obtained by both the greedy and the random neighborhood search procedures. Indeed, GAPM is able to reach an average value of about 19781.8, whereas the greedy and random neighborhood search algorithms reach 13504 and 13987 respectively. Regarding the required solution time, the GAPM average solution needs 13.7 s, the random neighborhood search needs 14.1 s, and the greedy procedure needs a negligible amount of time. Although GAPM requires more solution time than the greedy algorithm, its solution quality is better.
V. CONCLUSION

In this paper, a real-life problem belonging to the project management family has been tackled, where the objective is to maximize the profit by choosing a subset of items to be traded with a limited budget. First, the problem has been simulated as an optimization problem; then, an algorithm has been proposed as a decision-support algorithm to help in optimizing the problem. The proposed method is based on the principles of genetic algorithms and a random neighborhood search technique. It consists of three main steps. First, a greedy procedure is used to create a starting feasible solution. Second, a random neighborhood search technique is used to generate a random population of solutions. Third and last, the four genetic principles, inheritance, mutation, selection, and crossover, are used in order to enhance the quality of the solutions. Computational results showed that the proposed method (GAPM) is competitive since it yields fast solutions of high quality and outperformed the random neighborhood search and greedy algorithms.

REFERENCES

[1] A. Badiru, "Project Management: Systems, Principles, and Applications", CRC Press, 2011.
[2] C. Artigues, S. Demassey, and E. Néron, "Resource-Constrained Project Scheduling: Models, Algorithms, Extensions and Applications", John Wiley & Sons, 2013.
[3] N. Pillay and R. Qu, "Hyper-Heuristics: Theory and Applications", Springer, 2018.
[4] H. Kellerer, U. Pferschy, and D. Pisinger, "Knapsack Problems", Springer-Verlag, Berlin Heidelberg, 2014.
[5] D. Du and P. Pardalos, "Handbook of Combinatorial Optimization", Springer Science & Business Media, 2013.
[6] M. Hifi and N. Otmani, "An algorithm for the disjunctively constrained knapsack problem", International Journal of Operational Research, 13(1): 22-43, 2012.
[7] L. Chen, Y. Wang, and G. Guo, "An Improved Genetic Algorithm for Emergency Decision Making under Resource Constraints Based on Prospect Theory", Algorithms, 12, 43, 2019.
[8] A. Ezugwu, V. Pillay, D. Hirasen, K. Sivanarain, and M. Govender, "A Comparative Study of Meta-Heuristic Optimization Algorithms for 0-1 Knapsack Problem: Some Initial Results", DOI: 10.1109/ACCESS.2019.2908489, March 24, 2019.
[9] S. Qian, Y. Liu, Y. Ye, and G. Xu, "An enhanced genetic algorithm for constrained knapsack problems in dynamic environments", Natural Computing, 18: 913-932, 2019.
[10] S. Fan, K. Sun, and Y. Wang, "GA optimization model for repetitive projects with soft logic", Automation in Construction: 253-261, 2012.
[11] H. Wei, S. Yichao, and K. Dewei, "Optimization Model Calculation of Construction Cost and Time Based on Genetic Algorithm", Earth Environ. Sci., 242: 062044, 2019.
[12] S. Forrest, "Genetic Algorithms: Principles of Natural Selection Applied to Computation", Science, 261: 872-878, 1993.
[13] S. Dasgupta, C. Papadimitriou, and U. Vazirani, "Algorithms", McGraw-Hill, Inc., 2006.