Unit II
The methods that apply trial‐and‐error search include (1) sampling grid, (2) random
sampling, and (3) targeted sampling.
• The goal of the sampling grid is to evaluate all possible solutions and choose the
best one.
• If the problem is discrete, the sampling grid evaluates all possible solutions
against the constraints.
• The solution that satisfies all the constraints and has the best objective function
value among all feasible solutions is chosen as the optimum.
Because the number of possible solutions grows rapidly with the number of decision
variables, the sampling grid method is practical for relatively small problems only,
as sketched below.
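A minimal sketch of the sampling grid as exhaustive enumeration (the two-variable problem below is hypothetical; with more decision variables the number of grid points grows exponentially, which is why the method suits small problems only):

```python
import itertools

# Hypothetical example: minimize f(x1, x2) = (x1 - 3)^2 + (x2 - 7)^2
# subject to x1 + x2 <= 12, with x1, x2 in {0, 1, ..., 9} (100 grid points).
def objective(x1, x2):
    return (x1 - 3) ** 2 + (x2 - 7) ** 2

def is_feasible(x1, x2):
    return x1 + x2 <= 12

best, best_value = None, float("inf")
# The sampling grid evaluates every possible solution against the constraint
# and keeps the feasible one with the best objective function value.
for x1, x2 in itertools.product(range(10), repeat=2):
    if is_feasible(x1, x2) and objective(x1, x2) < best_value:
        best, best_value = (x1, x2), objective(x1, x2)

print(best, best_value)  # (3, 7) 0
```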
• Suppose there are S possible solutions, among which r = 1 is the optimal one, and
that K possible solutions are chosen randomly among the S possible ones to be evaluated.
• First, let us consider that the random selection is done without replacement,
and let Z denote the number of optimal solutions found in the randomly
chosen sample of K possible solutions.
• Z can take only the value 0 or 1 in this instance.
• The probability that the optimal solution is found follows from the hypergeometric
distribution: P(Z = 1) = K/S. For example, if there are S = 10^6 possible solutions
and K = 10^5 of them are sampled randomly without replacement, P(Z = 1) = 0.1, or 10%.
• If the sampling is instead done with replacement, P(Z ≥ 1) = 1 − ((S − 1)/S)^K
= 1 − (999999/10^6)^(10^5) ≈ 0.095, or 9.5%.
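These probabilities can be checked numerically (a quick sketch using the example's values S = 10^6 and K = 10^5):

```python
S = 10**6  # total number of possible solutions
K = 10**5  # number of randomly sampled solutions

# Without replacement (hypergeometric, r = 1 optimal solution):
print(K / S)                    # 0.1, i.e., a 10% chance

# With replacement: each draw misses the optimum with probability (S - 1)/S.
print(1 - ((S - 1) / S) ** K)   # ~0.095, i.e., a 9.5% chance
```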
• The sampling grid and random sampling are not efficient or practical methods
to solve real‐world engineering problems
Random sampling
• A shortcoming of these methods is that they require the entire decision space to
be searched.
Targeted Sampling
• It searches the decision space, taking into account the knowledge gained from
previously tested possible solutions
• Selects the next sample solutions based on results from previously tested
solutions (see the sketch after this list).
• Targeted sampling is the basis of all meta‐heuristic and evolutionary algorithms
that rely on a systematic search to find an optimum.
• Meta‐heuristic and evolutionary algorithms are typically applied to calculate
near‐optimal solutions of problems that cannot be solved easily or at all using
other techniques
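A minimal illustration of targeted sampling (a hill-climbing-style sketch on a hypothetical one-dimensional problem; the bounds and step size are assumptions):

```python
import random

def objective(x):
    # Hypothetical one-dimensional objective to be minimized.
    return (x - 2.5) ** 2

# Targeted sampling: each new candidate is drawn near the best solution
# found so far, so the search exploits knowledge from past evaluations.
best = random.uniform(-10.0, 10.0)
for _ in range(1000):
    candidate = best + random.gauss(0.0, 0.5)  # sample close to the incumbent
    if objective(candidate) < objective(best):
        best = candidate

print(round(best, 3))  # approaches 2.5
```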
Definition of Terms of Meta‐Heuristic and Evolutionary Algorithms
Optimization
• Finding appropriate values for the decision variables of an optimization problem so that
the objective function is optimized (minimized or maximized)
Constraints
• Constraints delimit the feasible space of solutions of an optimization problem.
Fitness Function
• The value of the objective function is not always the chosen measure of the desirability
of a solution; the fitness function is the measure actually used to compare solutions
during the search, and it may be a transformation of the objective function.
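For example (a sketch; the reciprocal mapping below is one common choice among many), a minimization objective can be transformed into a fitness value so that more desirable solutions receive higher fitness:

```python
def fitness(objective_value):
    # Smaller objective values (better solutions in a minimization
    # problem) map to larger fitness values; assumes objective_value >= 0.
    return 1.0 / (1.0 + objective_value)

print(fitness(0.0), fitness(9.0))  # 1.0 0.1
```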
Principles of Meta‐Heuristic and Evolutionary Algorithms
• A key principle is the relation between the simulation model and the optimization
algorithm: the optimization algorithm proposes values of the decision variables, and
the simulation model evaluates their consequences for the objective and constraints.
Classification of Meta‐Heuristic and Evolutionary Algorithms
• Some algorithms are inspired by natural processes, such as the genetic algorithm (GA),
ant colony optimization (ACO), and honey-bee mating optimization (HBMO)
• There are other types of algorithms, such as Tabu Search (TS), whose origins are
unrelated to natural processes
• Although TS is classified as a non-nature-inspired algorithm, it takes advantage of
artificial-intelligence concepts such as memory
• During the search for an optimum, these algorithms generate random values of the decision variables.
• Continuous decision variables are assigned values randomly between their lower and upper
boundaries
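A minimal sketch of this random assignment (the three variables and their boundaries are assumed for illustration):

```python
import random

lower = [0.0, -5.0, 10.0]  # assumed lower boundaries of three variables
upper = [1.0,  5.0, 20.0]  # assumed upper boundaries

# Each continuous decision variable x_i receives a uniform random value
# between its boundaries: x_i = L_i + u * (U_i - L_i), u uniform on [0, 1].
solution = [lo + random.random() * (up - lo) for lo, up in zip(lower, upper)]
print(solution)
```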
Dealing with Constraints
Removal Method
• The removal method eliminates each possible solution that does not satisfy the
constraints
Disadvantages
• This method does not distinguish between solutions with small and large
constraint violations.
• Even infeasible solutions may yield clues about the optimal solution.
Refinement Method
• This method does not delete any of the infeasible solutions from the search
process.
• The refinement method refines infeasible solutions to render them feasible
solutions.
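A minimal sketch of refinement (here a simple repair that clips out-of-range variables back to their boundaries; practical refinement operators are problem-specific):

```python
def refine(solution, lower, upper):
    # Instead of discarding an infeasible solution, repair it by
    # clipping each variable back into its feasible range.
    return [min(max(x, lo), up) for x, lo, up in zip(solution, lower, upper)]

print(refine([1.7, -0.3], lower=[0.0, 0.0], upper=[1.0, 1.0]))  # [1.0, 0.0]
```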
Penalty Functions
• The application of penalty functions to avoid infeasible solutions overcomes
the shortcomings of the removal and refinement methods.
• This method adds a penalty function to the objective function of a minimization
problem (or subtracts it from the objective function of a maximization problem).
X = a solution of the optimization problem; f(X) = value of the objective function of
solution X; G(X) = a constraint function whose value must not exceed δ1 (delta);
H(X) = a constraint function whose value must not be less than δ2; Z(X) = a constraint
function whose value must equal δ3.
The penalized objective function then takes the form
F(X) = f(X) + Φ(G(X)) + Θ(H(X)) + Ψ(Z(X)),
where Φ (phi), Θ (theta), and Ψ (psi) are penalty functions that take nonzero values
only when their respective constraints are violated, that is, when G(X) > δ1,
H(X) < δ2, or Z(X) ≠ δ3.
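A sketch of the penalized objective for a minimization problem, following the notation above (the linear penalty terms and the weights of 100.0 are assumptions; quadratic penalties are also common):

```python
def penalized_objective(f, G, H, Z, d1, d2, d3,
                        phi=100.0, theta=100.0, psi=100.0):
    # Each penalty term is positive only when its constraint is violated:
    # G(X) must not exceed d1, H(X) must not fall below d2, Z(X) must equal d3.
    penalty = (phi * max(0.0, G - d1)      # Phi: excess over d1
               + theta * max(0.0, d2 - H)  # Theta: shortfall below d2
               + psi * abs(Z - d3))        # Psi: deviation from d3
    return f + penalty  # added for minimization; subtracted for maximization

# G = 3.0 violates G <= 2.0 by 1.0, so the objective 5.0 is penalized by 100.
print(penalized_objective(f=5.0, G=3.0, H=1.0, Z=2.0, d1=2.0, d2=0.0, d3=2.0))
```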
• Through the iterations, the algorithm evolves new solutions from the current ones.
Common selection methods are Boltzmann selection, the roulette wheel, and
tournament selection.
A selection method with high selective pressure most likely selects the best solutions and
eliminates the worst ones at every step of the search.
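Of the selection methods listed, the roulette wheel is the simplest to sketch (fitness values are assumed non-negative; selection probability is proportional to fitness, giving moderate selective pressure):

```python
import random

def roulette_wheel(fitnesses):
    # Spin a wheel whose slots are sized in proportion to fitness:
    # better solutions are more likely, but not certain, to be chosen.
    pick = random.uniform(0.0, sum(fitnesses))
    running = 0.0
    for i, f in enumerate(fitnesses):
        running += f
        if running >= pick:
            return i  # index of the selected solution
    return len(fitnesses) - 1  # guard against floating-point round-off

print(roulette_wheel([1.0, 3.0, 6.0]))  # index 2 wins about 60% of the time
```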
General Algorithm
• All meta-heuristic and evolutionary algorithms begin by generating initial (possible or
tentative) solution(s)
• These initial solutions are then improved iteratively by the algorithm
• The iterative procedure is as follows (a code skeleton appears after the steps):
Step 0: Read input data.
Step 1: Generate initial possible or tentative solutions randomly or deterministically.
Step 2: Evaluate the fitness values of all current solutions.
Step 3: Rename the current solutions as old solutions.
Step 4: Rank all the old solutions and identify the best among them, those with
relatively high fitness values.
Step 5: Select a subset of the old solutions with relatively high fitness values.
Step 6: Generate new solutions.
Step 7: Evaluate the fitness value of the newly generated solutions.
Step 8: If termination criteria are not satisfied, go to step 3; otherwise go to step 9.
Step 9: Report all the most recently calculated solutions or the best solution
achieved at the time when the algorithm terminates execution.
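The nine steps map onto a generic skeleton (a hedged sketch: generate_initial, evaluate_fitness, select, and generate_new are placeholders for algorithm-specific operators, and higher fitness is assumed better):

```python
def general_algorithm(generate_initial, evaluate_fitness, select, generate_new,
                      max_iterations=100):
    # Step 0 (reading input data) is assumed to be done by the caller.
    solutions = generate_initial()                          # Step 1
    fitness = [evaluate_fitness(s) for s in solutions]      # Step 2
    for _ in range(max_iterations):                         # Step 8: termination
        # Steps 3-4: treat the current solutions as old ones and rank them.
        ranked = sorted(zip(fitness, solutions),
                        key=lambda pair: pair[0], reverse=True)
        parents = select(ranked)                            # Step 5
        solutions = generate_new(parents)                   # Step 6
        fitness = [evaluate_fitness(s) for s in solutions]  # Step 7
    # Step 9: report the best solution achieved at termination.
    return max(zip(fitness, solutions), key=lambda pair: pair[0])[1]
```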
Performance Evaluation of Meta‐Heuristic and Evolutionary Algorithms
• Run No. 1 converges fast but to a solution that is clearly non-optimal.
• Run No. 2 reaches an acceptable solution that is close enough to the global optimum, but
its convergence speed is relatively low.
• Run No. 3 achieves a near‐optimal solution with the fastest convergence rate.
The following factors determine the quality of the algorithm's performance:
(1) capacity to reach near‐optimal solutions consistently, that is, across several runs solving a
given problem,