
Learning Genetic Algorithms with Python: Empower the performance of Machine Learning and AI models with the capabilities of a powerful search algorithm (English Edition)
Ebook · 449 pages · 3 hours


About this ebook

Genetic algorithms are one of the most straightforward and powerful techniques used in machine learning. This book, ‘Learning Genetic Algorithms with Python’, guides the reader from the basics of genetic algorithms all the way to their practical implementation in production environments.

Each chapter gives the reader an intuitive understanding of a concept. You will learn how to build a genetic algorithm from scratch and apply it to real-life problems. Through practical, illustrated examples, you will learn to design and choose the best model architecture for a particular task. With cutting-edge examples such as the radar and football manager problem statements, you will learn to solve high-dimensional big data challenges by optimizing genetic algorithms.
Language: English
Publisher: BPB Online LLP
Release date: Feb 13, 2021
ISBN: 9788194837749


    Book preview

    Learning Genetic Algorithms with Python - Ivan Gridin

    CHAPTER 1

    Introduction

    As we know, evolution is one of nature's most effective adaptation mechanisms. It is a way to arrive at extraordinary and complex solutions. Understanding the principles of evolution gave us a new approach called genetic algorithms. We will now explore this rather beautiful, simple, and effective approach to problem-solving.

    Structure

    In this chapter, we will discuss the following topics:

    Nature of genetic algorithm

    Applicability of genetic algorithms

    Pros and cons of genetic algorithms

    Your first genetic algorithm

    1.1 Nature of genetic algorithm

    The rapid development of AI has made it possible for humans to obtain solutions to abstract problems. Complex computational problems that are very difficult to solve by classical methods can now be solved by AI.

    One of the most powerful techniques to solve such complex problems is genetic algorithms (GAs), which are based on the principles of an evolutionary approach.

    In the late 1960s, the American researcher J. Holland proposed finding solutions to optimization problems using the methods and evolutionary models of animal populations in nature. Since evolution's basic laws were investigated and described by genetics, the proposed approach was called genetic algorithms. A GA is a randomly directed search algorithm based on the mechanisms of natural selection and natural genetics. It implements the principle of survival of the fittest, forming and changing the search process through evolutionary modeling.

    The basic steps in natural evolution are as follows:

    Selection: Charles Darwin formulated the laws of natural selection in his book On the Origin of Species. The central postulate is that individuals who can better solve their problems survive and reproduce more. In GAs, each individual is a solution to some problem; according to this principle, individuals who solve the problem better have a greater chance of surviving and leaving offspring.

    Crossover: The offspring's chromosome is made up of parts derived from the parents' chromosomes. This principle was discovered in 1865 by G. Mendel.

    Mutation: In 1900, H. de Vries discovered the principle of random change. Initially, this term was used to describe significant changes in descendants’ properties that were not present in their parents. By analogy, genetic algorithms use a similar mechanism to change offspring’s properties, thereby increasing individuals’ diversity in a population.
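Taken together, these three steps form one generation of a genetic algorithm. A minimal sketch of a single generation, using the classic OneMax toy problem (maximizing the number of 1-bits in a chromosome — an illustration of the idea, not the book's own example), might look like this:

```python
import random

random.seed(42)

# Fitness: count of 1-bits (the OneMax toy problem)
def fitness(ind):
    return sum(ind)

# A small random population of 8-bit chromosomes
population = [[random.randint(0, 1) for _ in range(8)] for _ in range(6)]

# Selection: the fitter of two random individuals survives
selected = [max(random.sample(population, 2), key=fitness)
            for _ in range(len(population))]

# Crossover: the child chromosome combines parts of both parents
cut = random.randrange(1, 8)
child = selected[0][:cut] + selected[1][cut:]

# Mutation: flip one random gene to keep the population diverse
pos = random.randrange(8)
child[pos] = 1 - child[pos]
```

Repeating these steps over many generations drives the population's fitness upward.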

    Genetic algorithms have the following characteristics:

    Easy to implement

    Used for a wide range of tasks

    They do not require any additional information about the nature of the problem

    Easy and convenient to parallelize

    1.2 Applicability of genetic algorithms

    As a solution, the GA tries to find the extremum of some function that characterizes the quality of a solution to the problem. Generally, the GA does not guarantee that the solution found is the best possible one. Usually, this is not required; it is only important that the found solution is good enough for the problem being solved.

    The areas of application of GAs include the following:

    Search for extremum of various functions

    Finding the shortest paths (traveling salesman problem)

    Combinatorial optimization

    Tasks of placement and scheduling

    Automatic programming tasks

    AI tasks (choosing the structure and parameters of artificial neural networks)

    In real-world scenarios, GAs are used to develop AI systems: scheduling aircraft routes at airports, finding the optimal behavior of robots, constructing investment portfolios, and so on.

    1.3 Pros and cons of genetic algorithms

    Like any other approach to problem-solving, GAs have their pros and cons as well. Understanding these features will allow you to solve practical problems in a better way.

    The pros of genetic algorithms are as follows:

    A wide range of tasks to be solved: GAs are successfully applied in the following areas – combinatorial optimization, finance (portfolio optimization), machine learning (feature extraction, neural network hyperparameter optimization), code-breaking, game theory, natural sciences, and so on.

    Ease of implementation: The algorithm consists of a few conceptually simple steps – selection, crossover, and mutation. This simplicity makes the method accessible to a wide range of developers.

    Resistance to dynamic changes in problem conditions: The GA is able to retrain if the conditions of the problem change when searching for a solution.

    The ability for self-adaptation: GAs are able, after a certain period of evolution, to adapt to the conditions of the problem being solved.

    Ease of scaling: GAs can easily be used on big data spread over distributed systems. As a highly parallel process, a GA can be easily parallelized, which makes it possible to accelerate the search proportionally with an increase in computing power.

    Solving problems for which there is no solution experience: One of the biggest advantages of GAs is their ability to investigate problems for which there is no relevant solution experience. It should be noted that expert assessments are often used to solve difficult-to-formalize problems, but they sometimes give less acceptable solutions than automated methods.

    The cons of genetic algorithms are as follows:

    The complexity of representing an individual in a population and determining the fitness function.

    For real problems, it is initially not at all obvious in what form the set of an individual's genes should be represented for a successful solution to the problem, or how the quality of a particular individual should be assessed.

    The choice of parameters of the architecture of the GA.

    There are no effective criteria for the termination of the algorithm.

    Not efficient for finding the extremum of smooth functions with a single extremum, where classical optimization methods do better.

    They require considerable computing resources.

    When solving problems, premature convergence can occur; therefore, GAs generally do not guarantee finding the global extremum.

    1.4 Your first genetic algorithm

    Well, let’s try to build our first GA solution. We will start with a trivial example that shows us the basics.

    Let’s say we have the following function, sin(x) - 0.2 * abs(x). Refer to the following figure 1.1:

    Figure 1.1: sin(x) - 0.2 * |x|

    We will find the maximum of the preceding function.

    This function has several local maxima. All individuals in the population of our GA will try to climb as high as possible.

    Let’s see the GA in action. Execute the following code from ch1/your_first_genetic_algorithm.py (we will cover the details in future chapters):

    Import part

    import random
    from typing import List

    import numpy as np
    import matplotlib.pyplot as plt

    Auxiliary GA operations

    def _utils_constraints(g, min, max):
        # Clamp a gene value to the [min, max] interval
        if max and g > max:
            g = max
        if min and g < min:
            g = min
        return g

    def crossover_blend(g1, g2, alpha, min=None, max=None):
        # Blend crossover: children are sampled from an interval
        # stretched around the parents by the factor alpha
        shift = (1. + 2. * alpha) * random.random() - alpha
        new_g1 = (1. - shift) * g1 + shift * g2
        new_g2 = shift * g1 + (1. - shift) * g2
        return _utils_constraints(new_g1, min, max), _utils_constraints(new_g2, min, max)

    def mutate_gaussian(g, mu, sigma, min=None, max=None):
        # Gaussian mutation: add normally distributed noise to the gene
        mutated_gene = g + random.gauss(mu, sigma)
        return _utils_constraints(mutated_gene, min, max)

    def select_tournament(population, tournament_size):
        # Tournament selection: the fittest of several random candidates wins
        new_offspring = []
        for _ in range(len(population)):
            candidates = [random.choice(population) for _ in range(tournament_size)]
            new_offspring.append(max(candidates, key=lambda ind: ind.fitness))
        return new_offspring

    def func(x):
        return np.sin(x) - .2 * abs(x)

    def get_best(population):
        best = population[0]
        for ind in population:
            if ind.fitness > best.fitness:
                best = ind
        return best

    def plot_population(population, number_of_population):
        best = get_best(population)
        x = np.linspace(-10, 10)
        plt.plot(x, func(x), '--', color='blue')
        plt.plot([ind.get_gene() for ind in population], [ind.fitness for ind in population], 'o', color='orange')
        plt.plot([best.get_gene()], [best.fitness], 's', color='green')
        plt.title(f'Generation number {number_of_population}')
        plt.show()
        plt.close()
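To get a feel for the blend crossover and the role of the [-10, 10] clamp, here is a quick standalone check. The operator mirrors the listing above; _constrain is a simplified stand-in for _utils_constraints so the snippet runs on its own:

```python
import random

def _constrain(g, lo, hi):
    # Simplified clamp: keep the gene inside [lo, hi]
    return max(lo, min(hi, g))

def crossover_blend(g1, g2, alpha, lo=-10, hi=10):
    # Same blend crossover as in the listing
    shift = (1. + 2. * alpha) * random.random() - alpha
    new_g1 = (1. - shift) * g1 + shift * g2
    new_g2 = shift * g1 + (1. - shift) * g2
    return _constrain(new_g1, lo, hi), _constrain(new_g2, lo, hi)

random.seed(0)
c1, c2 = crossover_blend(-5.0, 5.0, alpha=1)
# With alpha = 1 the children can land well outside the segment
# between the parents, which is exactly why the clamp is needed
```

With this seed both children fall outside [-10, 10] before clamping, so the constraint snaps them back to the boundary.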

    Individual class

    class Individual:

        def __init__(self, gene_list: List[float]) -> None:
            self.gene_list = gene_list
            self.fitness = func(self.gene_list[0])

        def get_gene(self):
            return self.gene_list[0]

        @classmethod
        def crossover(cls, parent1, parent2):
            child1_gene, child2_gene = crossover_blend(parent1.get_gene(), parent2.get_gene(), 1, -10, 10)
            return Individual([child1_gene]), Individual([child2_gene])

        @classmethod
        def mutate(cls, ind):
            mutated_gene = mutate_gaussian(ind.get_gene(), 0, 1, -10, 10)
            return Individual([mutated_gene])

        @classmethod
        def select(cls, population):
            return select_tournament(population, tournament_size=3)

        @classmethod
        def create_random(cls):
            return Individual([random.randrange(-1000, 1000) / 100])

    GA flow

    random.seed(52)
    # random.seed(16)  # local maximum

    POPULATION_SIZE = 10
    CROSSOVER_PROBABILITY = .8
    MUTATION_PROBABILITY = .1
    MAX_GENERATIONS = 10

    first_population = [Individual.create_random() for _ in range(POPULATION_SIZE)]
    plot_population(first_population, 0)

    generation_number = 0
    population = first_population.copy()

    while generation_number < MAX_GENERATIONS:
        generation_number += 1
        # SELECTION
        offspring = Individual.select(population)
        # CROSSOVER
        crossed_offspring = []
        for ind1, ind2 in zip(offspring[::2], offspring[1::2]):
            if random.random() < CROSSOVER_PROBABILITY:
                kid1, kid2 = Individual.crossover(ind1, ind2)
                crossed_offspring.append(kid1)
                crossed_offspring.append(kid2)
            else:
                crossed_offspring.append(ind1)
                crossed_offspring.append(ind2)
        # MUTATION
        mutated_offspring = []
        for mutant in crossed_offspring:
            if random.random() < MUTATION_PROBABILITY:
                new_mutant = Individual.mutate(mutant)
                mutated_offspring.append(new_mutant)
            else:
                mutated_offspring.append(mutant)
        population = mutated_offspring.copy()
        plot_population(population, generation_number)

    Now, let’s examine how individuals of each population behave during each generation. Refer to the following graphs:

    Figure 1.2: Generation 1

    In the preceding figure 1.2, the first generation is just a random distribution of points on the curve. We denote the green point as the
