Genetic Algorithm

This document outlines a genetic algorithm for optimizing (minimizing) mathematical test functions, specifically the Rosenbrock function. It includes functions for initializing a population, selecting parents, performing crossover, and applying mutation. The algorithm iteratively evolves the population over a specified number of generations and returns the best solution found along with its fitness value.

Mathematical Function Optimization:
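
For reference, the n-dimensional Rosenbrock function minimized below is

f(x) = \sum_{i=1}^{n-1} \left[ 100\,(x_{i+1} - x_i^2)^2 + (1 - x_i)^2 \right],

with global minimum f(x) = 0 at x = (1, 1, \ldots, 1), so fitness values close to zero indicate a good solution.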


import numpy as np

def rosenbrock(x):
    # Rosenbrock objective: the global minimum is 0 at x = (1, 1, ..., 1).
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2.0) ** 2.0 + (1 - x[:-1]) ** 2.0)

def initialize_population(pop_size, dim):
    # Random initial population, sampled uniformly from [-5, 5] in each dimension.
    return np.random.uniform(-5, 5, (pop_size, dim))

def selection(population, fitness, num_parents):
    # Truncation selection: keep the num_parents individuals with the lowest fitness.
    parents = population[np.argsort(fitness)][:num_parents]
    return parents

def crossover(parents, pop_size):
    # Single-point crossover at the midpoint of the chromosome.
    offspring = []
    crossover_point = parents.shape[1] // 2
    for _ in range(pop_size - parents.shape[0]):
        parent1, parent2 = parents[np.random.choice(parents.shape[0], 2, replace=False)]
        offspring.append(np.concatenate((parent1[:crossover_point],
                                         parent2[crossover_point:])))
    return np.array(offspring)

def mutation(offspring, mutation_rate):
    # With probability mutation_rate, resample one randomly chosen gene of an offspring.
    for i in range(offspring.shape[0]):
        if np.random.rand() < mutation_rate:
            mutation_point = np.random.randint(offspring.shape[1])
            offspring[i, mutation_point] = np.random.uniform(-5, 5)
    return offspring

def genetic_algorithm(pop_size, dim, num_generations, mutation_rate, num_parents):
    population = initialize_population(pop_size, dim)
    for generation in range(num_generations):
        fitness = np.array([rosenbrock(ind) for ind in population])
        parents = selection(population, fitness, num_parents)
        offspring = crossover(parents, pop_size)
        offspring = mutation(offspring, mutation_rate)
        # Elitism: the selected parents survive alongside the new offspring.
        population = np.vstack((parents, offspring))
    # Re-evaluate the final population before reporting the best individual.
    fitness = np.array([rosenbrock(ind) for ind in population])
    best_solution = population[np.argmin(fitness)]
    return best_solution, rosenbrock(best_solution)

pop_size = 50
dim = 10
num_generations = 100
mutation_rate = 0.1
num_parents = 20

best_solution, best_fitness = genetic_algorithm(pop_size, dim, num_generations,
                                                mutation_rate, num_parents)

print("Best Solution:", best_solution)
print("Best Fitness (Objective Value):", best_fitness)

OUTPUT:

(The program output is not reproduced here; the script prints the best solution vector found and its Rosenbrock fitness value, which vary from run to run because the algorithm is stochastic.)
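
Because the algorithm is stochastic, a minimal sketch (illustrative only, reusing the parameters defined above) is to fix the NumPy seed for reproducibility and keep the best of several independent trials:

# Illustrative sketch: run several independent GA trials and keep the best result.
np.random.seed(0)  # make the stochastic runs reproducible
trials = [genetic_algorithm(pop_size, dim, num_generations, mutation_rate, num_parents)
          for _ in range(5)]
best_solution, best_fitness = min(trials, key=lambda t: t[1])  # smallest fitness wins
print("Best fitness over 5 trials:", best_fitness)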
