Chapter 74

Genetic Algorithms
At each point during the search process we maintain a "generation" of "individuals." Each individual is a data structure representing the "genetic structure" of a possible solution or hypothesis. Like a chromosome, the genetic structure of an individual is described using a fixed, finite alphabet. In GAs, the alphabet {0, 1} is usually used. This string is interpreted as a solution to the problem we are trying to solve.

For example, say we want to find the optimal quantity of the three major ingredients in a recipe (say, sugar, wine, and sesame oil). We can use the alphabet {1, 2, 3, ..., 9}, denoting the number of ounces of each ingredient. Some possible solutions are 1-1-1, 2-1-4, and 3-3-1.

As another example, the traveling salesperson problem is the problem of finding the optimal path to traverse, say, 10 cities. The salesperson may start in any city. A solution is a permutation of the 10 cities: 1-4-2-3-6-7-9-8-5-10.

As another example, say we want to represent a rule-based system. Given a rule such as "If color=red and size=small and shape=round then object=apple", we can describe it as a bit string by first assuming each of the attributes can take on a fixed set of possible values. Say color ∈ {red, green, blue}, size ∈ {small, big}, shape ∈ {square, round}, and fruit ∈ {orange, apple, banana, pear}. Then we could represent the value for each attribute as a sub-string of length equal to the number of possible values of that attribute. For example, color=red could be represented by 100, color=green by 010, and color=blue by 001. Note also that we can represent color=red or blue by 101, and any color (i.e., a "don't care") by 111. Doing this for each attribute, the above rule might then look like: 100 10 01 0100. A set of rules is then represented by concatenating together each rule's 11-bit string. For another example see page 620 in the textbook for a bit-string representation of a logical conjunction.

74.1.4 Genetic algorithm vocabulary

Explanation of Genetic Algorithm terms:

Genetic Algorithms                 Explanation
Chromosome (string, individual)    Solution (coding)
Genes (bits)                       Part of solution
Locus                              Position of gene
Alleles                            Values of gene
Phenotype                          Decoded solution
Genotype                           Encoded solution

74.2 The Canonical Genetic Algorithm

74.2.1 Concepts

Genetic Algorithms are search algorithms that are based on concepts of natural selection and natural genetics. The genetic algorithm was developed to simulate some of the processes observed in natural evolution, a process that operates on chromosomes (organic devices for encoding the structure of living beings). The genetic algorithm differs from other search methods in that it searches among a population of points, and works with a coding of the parameter set rather than the parameter values themselves. It also uses objective function information without any gradient information. The transition scheme of the genetic algorithm is probabilistic, whereas traditional methods use gradient information. Because of these features, genetic algorithms are used as general-purpose optimization algorithms. They also provide a means to search irregular spaces and hence are applied to a variety of function optimization, parameter estimation and machine learning applications.

74.2.2 Basic Principle

The working principle of a canonical GA is illustrated in Fig. 74.1. The major steps involved are the generation of a population of solutions, finding the objective function and fitness function, and the application of genetic operators. These aspects are described briefly below and in detail in the following subsections.

/* Algorithm GA */
formulate initial population
randomly initialize population
repeat
    evaluate objective function
    find fitness function
    apply genetic operators
        reproduction
        crossover
        mutation
until stopping criteria

Figure 74.1: The Working Principle of a Simple Genetic Algorithm
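The loop in Fig. 74.1 can be sketched in Python as follows. This is a minimal, hypothetical illustration only: the one-max fitness function, the string length and the parameter values below are placeholder assumptions, not part of the text.

```python
import random

STRING_LEN = 8   # placeholder chromosome length
POP_SIZE = 10    # placeholder population size (kept even for pairing)


def fitness(bits):
    # Placeholder objective: count of 1-bits ("one-max").
    return sum(bits)


def ga(generations=30, pc=0.8, pm=0.05):
    # formulate and randomly initialize the population
    pop = [[random.randint(0, 1) for _ in range(STRING_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(generations):             # repeat ... until stopping criteria
        scores = [fitness(s) for s in pop]   # evaluate objective/fitness
        # reproduction: fitness-proportionate selection into a mating pool
        # (+1 keeps every selection weight positive)
        pool = random.choices(pop, weights=[f + 1 for f in scores], k=POP_SIZE)
        nxt = []
        for i in range(0, POP_SIZE, 2):      # crossover on pairs from the pool
            a, b = pool[i][:], pool[i + 1][:]
            if random.random() < pc:
                site = random.randint(1, STRING_LEN - 1)
                a[site:], b[site:] = b[site:], a[site:]
            nxt += [a, b]
        for s in nxt:                        # mutation: occasional bit flips
            for j in range(STRING_LEN):
                if random.random() < pm:
                    s[j] = 1 - s[j]
        pop = nxt
    return max(pop, key=fitness)             # best string in the final population


print(ga())
```

Each pass through the loop mirrors one line of the pseudocode: evaluation, reproduction, crossover and mutation together produce the next generation.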
An important characteristic of genetic algorithms is the coding of the variables that describe the problem. The most common coding method is to transform the variables to a binary string or vector; GAs perform best when solution vectors are binary. If the problem has more than one variable, a multi-variable coding is constructed by concatenating as many single-variable codings as the number of variables in the problem. A genetic algorithm processes a number of solutions simultaneously. Hence, in the first step a population having P individuals is generated by pseudo-random generators, each of whose individuals represents a feasible solution. This is a representation of solution vectors in the solution space and is called the initial population. This ensures the search to be robust and unbiased, as it starts from a wide range of points in the solution space.

In the next step, individual members of the population are evaluated to find the objective function value. In this step, the exterior penalty function method is utilized to transform a constrained optimization problem into an unconstrained one; this is exclusively problem specific. In the third step, the objective function is mapped into a fitness function that computes a fitness value for each member of the population. This is followed by the application of GA operators.

Figure 74.2: The basic GA operations: One generation is broken down into a selection phase and a recombination phase. Strings are assigned into adjacent slots during selection.

74.2.3 Working Principle

To illustrate the working principles of GAs, an unconstrained optimization problem is considered. Let us consider the following maximization problem:

    Maximize f(x),  x_i^l ≤ x_i ≤ x_i^u,  i = 1, 2, ..., N,    (74.1)

where x_i^l and x_i^u are the lower and upper bounds the variable x_i can take. Although a maximization problem is considered here, a minimization problem can also be handled using GAs. The working of GAs is completed by performing the following tasks.

74.2.4 Coding

In order to use GAs to solve the above problem (equation 74.1), the variables x_i are first coded in some string structures. It is important to mention here that the coding of the variables is not absolutely necessary. There exist some studies where GAs are directly used on the variables themselves, but here these exceptions are ignored and the working principles of a simple genetic algorithm are discussed. Binary-coded strings having 1's and 0's are mostly used. The length of the string is usually determined
according to the desired solution accuracy. For example, if four bits are used to code each variable in a two-variable optimization problem, the strings (0000 0000) and (1111 1111) would represent the points (x_1^l, x_2^l)^T and (x_1^u, x_2^u)^T respectively, because the sub-strings (0000) and (1111) have the minimum and the maximum decoded values. Any other eight-bit string can be found to represent a point in the search space according to a fixed mapping rule. Usually, the following linear mapping rule is used:

    x_i = x_i^l + [(x_i^u − x_i^l) / (2^β − 1)] Σ_{j=0}^{β−1} γ_j 2^j    (74.2)

In the above equation, the variable x_i is coded with a sub-string s_i of length β. The decoded value of the binary sub-string s_i is calculated as Σ_{j=0}^{β−1} γ_j 2^j, where γ_j ∈ {0, 1} is the j-th bit and the sub-string is written as (γ_{β−1} γ_{β−2} ... γ_1 γ_0). For example, a four-bit string (0111) has a decoded value equal to ((1)2^0 + (1)2^1 + (1)2^2 + (0)2^3) or 7. It is worthwhile to mention here that with four bits to code each variable, there are only 2^4 or 16 distinct sub-strings possible, because each bit position can take a value of either 0 or 1. The accuracy that can be obtained with a four-bit coding is only approximately 1/16th of the search space. But as the string length is increased by one, the obtainable accuracy increases exponentially, to 1/32nd of the search space. It is not necessary to code all variables with equal sub-string lengths. The length of a sub-string representing a variable depends on the desired accuracy in that variable: the longer the string length, the greater the accuracy. The relationship between string length β and precision α is

    (x_i^u − x_i^l) 10^α ≤ 2^β − 1.    (74.3)

Once the coding of the variables is complete, the corresponding point x = (x_1, x_2, ..., x_N)^T can be found (Eq. 74.2). Thereafter, the function value at the point x can also be calculated by substituting x in the given objective function f(x).

Figure 74.3: Coding in GA (binary strings in the coding space map to feasible, infeasible and illegal points in the solution space)

74.2.5 Fitness Function

As mentioned earlier, GAs mimic the survival-of-the-fittest principle of nature to make a search process. Therefore, GAs are naturally suitable for solving maximization problems; minimization problems are usually transformed into maximization problems by a suitable transformation. In general, a fitness function F(i) is first derived from the objective function and used in successive genetic operations. Fitness in the biological sense is a quality value which is a measure of the reproductive efficiency of chromosomes. In genetic algorithms, fitness is used to allocate reproductive traits to the individuals in the population and thus acts as a measure of goodness to be maximized. This means that individuals with a higher fitness value will have a higher probability of being selected as candidates for further examination. Certain genetic operators require that the fitness function be non-negative, although other operators do not have this requirement. For maximization problems, the fitness function can be considered to be the same as the objective function, or F(i) = O(i). For minimization problems, to generate non-negative values in all cases and to reflect the relative fitness of individual strings, it is necessary to map the underlying natural objective function to fitness function form. A number of such transformations are possible. Two commonly adopted fitness mappings are presented below:

    F(x) = 1 / (1 + f(x))    (74.4)

This transformation does not alter the location of the minimum, but converts a minimization problem to an equivalent maximization problem. An alternate function to transform the objective function to get the fitness value F(i) is given below:

    F(i) = V − O(i) P / Σ_{i=1}^{P} O(i),    (74.5)

where O(i) is the objective function value of the i-th individual, P is the population size and V is a large value chosen to ensure non-negative fitness values. The value of V adopted in this work is the maximum value of the second term of equation 74.5, so that the fitness value corresponding to the maximum value of the objective function is zero. This transformation also does not alter the location of the solution, but converts a minimization problem to an equivalent maximization problem. The fitness function value of a string is known as the string fitness.
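As a small illustration of equations 74.2 and 74.4, the decoding of a binary sub-string and its mapping to a variable range can be sketched in Python (the function names are our own, not from the text):

```python
def decode(substring):
    # Decoded value of a bit string (gamma_{beta-1} ... gamma_1 gamma_0),
    # i.e. the sum of gamma_j * 2^j used in Eq. 74.2.
    beta = len(substring)
    return sum(int(substring[beta - 1 - j]) * 2 ** j for j in range(beta))


def map_to_variable(substring, x_low, x_high):
    # Linear mapping rule of Eq. 74.2:
    # x = x^l + (x^u - x^l) / (2^beta - 1) * decoded value
    beta = len(substring)
    return x_low + (x_high - x_low) / (2 ** beta - 1) * decode(substring)


def fitness_from_min(f_value):
    # Eq. 74.4: turns a (non-negative) minimization objective into a
    # fitness value to be maximized.
    return 1.0 / (1.0 + f_value)


print(decode("0111"))                  # 7, as in the text
print(map_to_variable("0000", 0, 6))   # lower bound: 0.0
print(map_to_variable("1111", 0, 6))   # upper bound: 6.0
```

Note that the mapping sends the all-zeros string to the lower bound and the all-ones string to the upper bound, exactly as described for the strings (0000) and (1111) above.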
74.2.7 Reproduction
A string with a smaller fitness value represents a smaller range in cumulative probability values and has a smaller probability of being copied into the mating pool.

Stochastic remainder selection: A better selection scheme is also presented here. The basic idea of this selection is to remove or copy strings depending on the values of their reproduction counts. This is achieved by computing the reproduction count associated with each string. The reproduction count is computed based on the fitness value by stochastic remainder selection without replacement, as it is superior to other schemes; hence this scheme is recommended. First the probability of selection p_s is calculated as p_s = F(i) / Σ F(i). The expected number of individuals of each string is calculated as e_i = p_s × P, where P is the population size. The fractional parts of e_i are treated as probabilities with which individuals are selected for reproduction. One by one, Bernoulli trials (i.e., weighted coin tosses) are performed using the fractional part of e_i. For example, a string with e_i = 1.5 will surely get a single count, and another with a probability of 0.5. This is continued till all the candidates in the population are examined. Reproduction is done based on this computed reproduction count. Individuals with a count of 0 are eliminated from the population. Other individuals with non-zero counts get multiple copies in the population, equal to the value of their counts. The size of the population is kept constant, and this completes the reproduction operation. Different selection schemes vary in principle by assigning different numbers of copies to better strings in the population, but in all selection schemes the essential idea is that more copies are allocated to the strings with higher fitness values.

74.2.8 Crossover

A crossover operator is used to recombine two strings to get a better string. In the crossover operation, a recombination process creates different individuals in the successive generations by combining material from two individuals of the previous generation. In reproduction, good strings in a population are probabilistically assigned a larger number of copies and a mating pool is formed. It is important to note that no new strings are formed in the reproduction phase. In the crossover operator, new strings are created by exchanging information among strings of the mating pool.

The two strings participating in the crossover operation are known as parent strings and the resulting strings are known as children strings. It is intuitive from this construction that good sub-strings from parent strings can be combined to form a better child string, if an appropriate site is chosen. With a random site, the children strings produced may or may not have a combination of good sub-strings from parent strings, depending on whether or not the crossing site falls in the appropriate place. But this is not a matter of serious concern, because if good strings are created by crossover, there will be more copies of them in the next mating pool generated by the reproduction operator. It is clear from this discussion that the effect of crossover may be detrimental or beneficial. Thus, in order to preserve some of the good strings that are already present in the mating pool, not all strings in the mating pool are used in crossover. When a crossover probability, defined here as p_c, is used, only 100 p_c per cent of the strings in the population are used in the crossover operation, and 100(1 − p_c) per cent of the population remain as they are in the current population. The crossover operator is mainly responsible for the search for new strings, even though the mutation operator is also used for this purpose, sparingly.

Many crossover operators exist in the GA literature; one site crossover and two site crossover are the most common ones adopted. In most crossover operators, two strings are picked from the mating pool at random and some portion of the strings is exchanged between them. A one site crossover operator is performed by randomly choosing a crossing site along the string and exchanging all bits on the right side of the crossing site, as shown in Fig. 74.5.

String 1  011|01100        String 1  011|11001
String 2  110|11001        String 2  110|01100
   Before crossover            After crossover

Figure 74.5: One site crossover operation

In one site crossover, a crossover site is selected randomly (shown as vertical lines). The portions to the right of the selected site of these two strings are exchanged to form a new pair of strings. The new strings are thus a combination of the old strings. Two site crossover is a variation of the one site crossover, except that two crossover sites are chosen and the bits between the sites are exchanged, as shown in Fig. 74.6.

String 1  011|011|00       String 1  011|110|00
String 2  110|110|01       String 2  110|011|01
   Before crossover            After crossover

Figure 74.6: Two site crossover operation

One site crossover is more suitable when the string length is small, while two site crossover is suitable for long strings; hence the present work adopts a two site crossover. The underlying objective of crossover is to exchange information between strings to get a string that is possibly better than the parents.

74.2.9 Mutation

Mutation adds new information in a random way to the genetic search process and ultimately helps to avoid getting trapped at local optima. It is an operator that introduces diversity in the population whenever the population tends to become homogeneous due to repeated use of the reproduction and crossover operators. Mutation may cause the chromosomes of individuals to be different from those of their parent individuals.

Mutation is, in a way, the process of randomly disturbing genetic information. It operates at the bit level: when the bits are being copied from the current string to the new string, there is a probability that each bit may become mutated. This probability is usually quite small and is called the mutation probability p_m. A coin-toss mechanism is employed: if a random number between zero and one is less than the mutation probability, then the bit is inverted, so that a zero becomes a one and a one becomes a zero. This helps to introduce a bit of diversity into the population by scattering occasional points. This random scattering may find a better optimum, or even modify a part of the genetic code that will be beneficial in later operations. On the other hand, it might produce a weak individual that will never be selected for further operations.

The need for mutation is to create a point in the neighborhood of the current point, thereby achieving a local search around the current solution. Mutation is also used to maintain diversity in the population. For example, the following population having four eight-bit strings may be considered:

01101011
00111101
00010110
01111100

It can be noticed that all four strings have a 0 in the leftmost bit position. If the true optimum solution requires a 1 in that position, then neither the reproduction nor the crossover operator described above will be able to create a 1 in that position. The inclusion of mutation introduces a probability p_m of turning that 0 into a 1.

These three operators are simple and straightforward. The reproduction operator selects good strings, and the crossover operator recombines good sub-strings from good strings together, hopefully, to create a better sub-string. The mutation operator alters a string locally, expecting a better string. Even though none of these claims are guaranteed and/or tested while creating a string, it is expected that if bad strings are created they will be eliminated by the reproduction operator in the next generation, and if good strings are created, they will be increasingly emphasized. Further insight into these operators, different ways of implementing them, and some mathematical foundations of genetic algorithms can be obtained from the GA literature.

Application of these operators on the current population creates a new population. This new population is used to generate subsequent populations and so on, yielding solutions that are closer to the optimum solution. The values of the objective function of the individuals of the new population are again determined by decoding the strings. These values express the fitness of the solutions of the new generations. This completes one cycle of the genetic algorithm, called a generation. In each generation, if the solution is improved, it is stored as the best solution. This is repeated till convergence.

74.3 GA - Illustrated

To understand the working of GA, a simple two-variable function is solved using GA. The detailed steps and the solution obtained are listed below. Consider the following minimization problem:

    f(x_1, x_2) = (x_1^2 + x_2 − 11)^2 + (x_1 + x_2^2 − 7)^2    (74.8)

in the interval 0 ≤ x_1, x_2 ≤ 6. The true solution to this problem is (3, 2)^T, having a function value equal to zero.

Step 1: To solve this problem using a genetic algorithm, a binary coding is chosen to represent the variables x_1 and x_2. In the calculation here, 10 bits are chosen for each variable, thereby making the total string length equal to 20. With 10 bits, we can get a solution accuracy of (6 − 0)/(2^10 − 1), or 0.006, in the interval (0, 6). The crossover and mutation probabilities are assigned to be 0.8 and 0.05 respectively. The population size is 20 and the number of generations is 30. The built-in C random number generator is used, and stochastic sampling without replacement is used for selection. The next step is to evaluate each string in the population; we calculate the fitness of the first string.

Step 2: The next step is to calculate the fitness of each member of the population. This is done by decoding the strings.
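The crossover and mutation operators described above can be sketched in Python, using the strings of Figs. 74.5 and 74.6 (a minimal illustration; the function names and the string-based representation are our own):

```python
import random


def one_site_crossover(s1, s2, site):
    # Exchange all bits to the right of the crossing site (Fig. 74.5).
    return s1[:site] + s2[site:], s2[:site] + s1[site:]


def two_site_crossover(s1, s2, a, b):
    # Exchange the bits lying between the two crossing sites (Fig. 74.6).
    return s1[:a] + s2[a:b] + s1[b:], s2[:a] + s1[a:b] + s2[b:]


def mutate(s, pm=0.05, rng=random):
    # Coin-toss mechanism: flip each bit independently with probability pm.
    return "".join(b if rng.random() >= pm else "10"[int(b)] for b in s)


print(one_site_crossover("01101100", "11011001", 3))
# → ('01111001', '11001100'), the exchange of Fig. 74.5
print(two_site_crossover("01101100", "11011001", 3, 6))
# → ('01111000', '11001101'), the exchange of Fig. 74.6
```

With the crossing site after the third bit, the tails 01100 and 11001 swap between the two parents, exactly as the one site crossover figure shows; the two site variant swaps only the middle segment.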
Multiple optimal solutions can be captured in the population easily, thereby reducing the effort of applying the same algorithm many times.

74.4 Differences between GAs and traditional methods

Genetic algorithms differ from conventional optimization and search procedures in several fundamental ways. As seen from the above description of the working principles of GAs, they are radically different from most of the traditional optimization methods; the fundamental differences are described subsequently. GAs work with a string-coding of variables instead of the variables themselves. The advantage of working with a coding of variables is that the coding discretizes the search space, even though the function may be continuous. On the other hand, since GAs require only function values at various discrete points, a discrete or discontinuous function can be handled with no extra burden. This allows GAs to be applied to a wide variety of problems. Another advantage is that the GA operators exploit the similarities in string structures to make an effective search. The most striking difference between GAs and many traditional optimization methods is that GAs work with a population of points instead of a single point. Because more than one string is processed simultaneously, it is very likely that the expected GA solution may be a global solution. Even though some traditional algorithms are population based, like Box's evolutionary optimization and complex search methods, those methods do not use previously obtained information efficiently.

Figure 74.9: Transition in GA (the population of solution points moves through the search space toward the solution point)

74.4.1 Exploitation and exploration

Search is one of the more universal problem-solving methods for problems where one cannot determine a priori the sequence of steps leading to a solution. Search can be performed with either blind strategies or heuristic strategies. Blind search strategies do not use information about the problem domain. Heuristic search strategies use additional information to guide the search along the best search directions. There are two important issues in search strategies: exploiting the best solution and exploring the search space. Hill-climbing is an example of a strategy which exploits the best solution for possible improvement while ignoring the exploration of the search space. Random search is an example of a strategy which explores the search space while ignoring the exploitation of the promising regions of the search space. Genetic algorithms are a class of general-purpose search methods that can make a remarkable balance between exploration and exploitation of the search space. At the beginning of a genetic search, there is a widely random and diverse population, and the crossover operator tends to perform a widespread search, exploring the whole solution space. As high-fitness solutions develop, the crossover operator provides exploration in the neighborhood of each of them. In other words, the kind of search (exploitation or exploration) a crossover performs is determined by the environment of the genetic system (the diversity of the population), not by the operator itself. In addition, simple genetic operators are designed as general-purpose search methods (domain-independent search methods); they perform essentially a blind search and cannot guarantee an improved offspring.

In general, GAs are useful when:

1. The search space is large, complex or poorly understood;

2. Domain knowledge is scarce or expert knowledge is difficult to encode to narrow the search space;

3. No mathematical analysis is available;

4. Traditional search methods fail.

The advantage of the GA approach is the ease with which it can handle arbitrary kinds of constraints and objectives; all such things can be handled as weighted components of the fitness function, making it easy to adapt the GA scheduler to the particular requirements of a very wide range of possible overall objectives.

74.4.2 Population-based search

Generally, as shown in Figure 1, an algorithm for solving an optimization problem is a sequence of computational steps which asymptotically converge to an optimal solution. Most classical optimization methods generate a deterministic sequence of computations based on the gradient or higher-order derivatives of the objective function. This point-to-point approach carries the danger of falling into local optima. Genetic algorithms perform a multi-directional search by maintaining a population of potential solutions. The population-to-population approach attempts to make the search escape from local optima. The population undergoes a simulated evolution: at each generation the relatively good solutions are reproduced, while the relatively bad solutions die. Genetic algorithms use probabilistic transition rules to select which individuals reproduce and which die, so as to guide their search toward regions of the search space with likely improvement.

GAs have been used for problem-solving and for modeling. GAs are applied to many scientific and engineering problems, in business and entertainment, including:

1. Optimization: GAs have been used in a wide variety of optimization tasks, including numerical optimization, and combinatorial optimization problems such as the traveling salesman problem (TSP), circuit design [Louis 1993], job shop scheduling [Goldstein 1991] and video & sound quality optimization.

2. Automatic programming: GAs have been used to evolve computer programs for specific tasks, and to design other computational structures, for example, cellular automata and sorting networks.

3. Machine and robot learning: GAs have been used for many machine-learning applications, including classification and prediction, and protein structure prediction. GAs have also been used to design neural networks, to evolve rules for learning classifier systems or symbolic production systems, and to design and control robots.

7. Population genetics models: GAs have been used to study questions in population genetics, such as "under what conditions will a gene for recombination be evolutionarily viable?" Interactions between evolution and learning: GAs have been used to study how individual learning and species evolution affect one another.

8. Models of social systems: GAs have been used to study evolutionary aspects of social systems, such as the evolution of cooperation [Chughtai 1995], the evolution of communication, and trail-following behavior in ants.

74.6 Conclusion

Genetic Algorithms are easy to apply to a wide range of problems, from optimization problems like the traveling salesperson problem, to inductive concept learning, scheduling, and layout problems. The results can be very good on some problems, and rather poor on others. If only mutation is used, the algorithm is very slow; crossover makes the algorithm significantly faster. GA is a kind of hill-climbing search; more specifically, it is very similar to a randomized beam search. As with all hill-climbing algorithms, there is a problem of local maxima. Local maxima in a genetic problem are those individuals that get stuck with a pretty good, but not optimal, fitness measure, where any small mutation gives worse fitness. Fortunately, crossover can help them get out of a local maximum. Also, mutation is a random process, so it is possible that we may have a sudden large mutation that gets these individuals out of this situation. (In fact, these individuals never get out; it is their offspring that get out of local maxima.) One significant difference between GAs and hill-climbing is that it is generally a good idea in GAs to fill the local maxima up with individuals. Overall, GAs have fewer problems with local maxima than back-propagation neural networks.

If the conception of a computer algorithm being based on the evolution of organisms is surprising, the extensiveness with which this algorithm is applied in so many areas is no less than astonishing. These applications, be they commercial, educational or scientific, are increasingly dependent on this algorithm, the Genetic Algorithm. Its usefulness and gracefulness in solving problems has made it a favored choice over the traditional methods, namely gradient search, random search and others. GAs are very helpful when the developer does not have precise domain expertise, because GAs possess the ability to explore and learn from their domain.

In this report, we have placed more emphasis on explaining the use of GAs in many areas of engineering and commerce. We believe that, through working out these interesting examples, one could grasp the idea of GAs with greater ease. We have also discussed the uncertainty about whether computer-generated life could exist as a real life form. The discussion is far from conclusive, and whether artificial life will become real life remains to be seen. In the future, we may witness the development of variants of GAs tailored to very specific tasks. This might defy the very principle of GAs, that they are ignorant of the problem domain when used to solve a problem. But we may find that this practice makes GAs even more powerful.

74.7 Acknowledgments

I wish to thank several of my students and the staff of NPTEL for their contribution to this lecture. I also wish to thank Prof. Rajeev and Prof. Mohan of IIT Madras, who introduced me to the world of Genetic Algorithms. I also appreciate your constructive feedback, which may be sent to [email protected]. Prof. Tom V. Mathew, Department of Civil Engineering, Indian Institute of Technology Bombay, India.

References

1. Genetic algorithms: principles and perspectives: a guide to GA theory. Kluwer Academic, Boston, 2002.

2. David Beasley, David R. Bull, and Ralph R. Martin. An overview of genetic algorithms: Part 2, research topics. University Computing, 15(4):170–181, 1993. citeseer.nj.nec.com/article/beasley93overview.html.

3. L. P. Chambers. Practical handbook of genetic algorithms: Applications, Vol. I. CRC Press, Boca Raton, Florida, 1995.

4. P. Charbonneau. An introduction to genetic algorithms for numerical optimization, 1998. http://www.hao.ucar.edu/public.research/si/pikaia/tutorial.h

5. K. Deb. Optimization for engineering design: Algorithms and Examples. Prentice Hall, India, 1998.

6. Mitsuo Gen and Runwei Cheng. Genetic algorithms and engineering optimization. John Wiley, New York, 2000.

7. D. E. Goldberg. Genetic Algorithms in search, optimization and machine learning. Addison-Wesley, Massachusetts, 1989.

8. J. H. Holland. Adaptation in Natural and Artificial Systems. 2nd ed., MIT Press, Cambridge.

9. Michael D. Vose. Simple genetic algorithm: foundations and theory. MIT Press, 1999.

10. D. Whitley. A genetic algorithm tutorial, 2001. http://samizdat.mines.edu/ga tutorial.