Genetic Algorithm¶
This class represents a basic (\(\mu+\lambda\)) genetic algorithm for single-objective problems. The figure below shows the general flow of a genetic algorithm. In the following, it is explained how pymoo allows each of these modules to be customized.
Initial Population: A starting population is sampled in the beginning. In this framework, this can be either a Sampling object, which defines different initial sampling strategies, a Population where the X and F values are set, or a simple numpy array of shape (pop_size x n_var).
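As a minimal illustration (not pymoo's actual Sampling implementation), an initial population can be drawn uniformly at random within the variable bounds:

```python
import numpy as np

# Hypothetical sketch of a random initial sampling strategy: draw a
# (pop_size x n_var) matrix uniformly between the lower and upper bounds.
def sample_initial_population(pop_size, n_var, xl, xu, seed=None):
    rng = np.random.default_rng(seed)
    return xl + rng.random((pop_size, n_var)) * (xu - xl)

X = sample_initial_population(pop_size=100, n_var=13, xl=0.0, xu=1.0, seed=1)
print(X.shape)  # (100, 13)
```

A numpy array of this shape could then be passed as the initial population.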
Evaluation: Each individual is evaluated using the problem defined to be solved.
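For illustration only (the actual evaluation is delegated to the problem object), evaluating a population amounts to applying the objective function to every row at once, here with a simple sphere function:

```python
import numpy as np

# Vectorized evaluation of a population: each row of X is one individual,
# and the sphere function sum(x_i^2) serves as a stand-in objective.
def evaluate(X):
    return np.sum(X ** 2, axis=1)

X = np.array([[1.0, 2.0], [0.0, 0.0]])
F = evaluate(X)
print(F)  # [5. 0.]
```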
Survival: It is very often the core of the genetic algorithm. For a simple single-objective genetic algorithm, the individuals can be sorted by their fitness, and survival of the fittest can be applied.
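A minimal sketch of such fitness-based survival (illustrative names, not pymoo's API): sort the combined pool by objective value and keep the best individuals.

```python
import numpy as np

# Survival of the fittest for minimization: keep the n_survive individuals
# with the lowest objective values out of the combined parent+offspring pool.
def survival(X, F, n_survive):
    order = np.argsort(F)  # ascending: lower F is better
    return X[order[:n_survive]], F[order[:n_survive]]

X = np.array([[0.1], [0.9], [0.5], [0.3]])
F = np.array([4.0, 1.0, 3.0, 2.0])
X_surv, F_surv = survival(X, F, n_survive=2)
print(F_surv)  # [1. 2.]
```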
Selection: At the beginning of the recombination process, individuals need to be selected to participate in mating. Depending on the crossover, a different number of parents is used to create a new offspring. Different kinds of selection can improve the convergence of the algorithm.
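A common choice is binary tournament selection; the sketch below (illustrative, not pymoo's TournamentSelection) draws two random candidates per slot and keeps the fitter one.

```python
import numpy as np

# Binary tournament selection for minimization: for each of n_select slots,
# draw two random individuals and return the index of the one with lower F.
def binary_tournament(F, n_select, rng):
    candidates = rng.integers(0, len(F), size=(n_select, 2))
    return np.where(F[candidates[:, 0]] <= F[candidates[:, 1]],
                    candidates[:, 0], candidates[:, 1])

rng = np.random.default_rng(1)
F = np.array([3.0, 1.0, 2.0, 5.0])
parents = binary_tournament(F, n_select=4, rng=rng)
```

Because the fitter candidate always wins, better individuals are selected as parents more often, which drives convergence.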
Crossover: When the parents are selected, the actual mating is done. A crossover operator combines parents into one or several offspring. Commonly, problem information, such as the variable bounds, is needed to perform the mating. For more customized problems, even more information might be necessary (e.g. the current generation, a diversity measure of the population, …).
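As a simple stand-in for the default simulated binary crossover, a uniform crossover illustrates the idea: each variable is taken from either parent with equal probability, producing two complementary offspring.

```python
import numpy as np

# Uniform crossover sketch: a random boolean mask decides, per variable,
# which parent contributes the gene; the two offspring are complementary.
def uniform_crossover(p1, p2, rng):
    mask = rng.random(p1.shape) < 0.5
    c1 = np.where(mask, p1, p2)
    c2 = np.where(mask, p2, p1)
    return c1, c2

rng = np.random.default_rng(1)
p1, p2 = np.zeros(5), np.ones(5)
c1, c2 = uniform_crossover(p1, p2, rng)
```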
Mutation: It is performed after the offspring are created through crossover. Usually, the mutation is executed with a predefined probability. This operator helps to increase the diversity of the population.
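A minimal mutation sketch under these assumptions (Gaussian perturbation instead of pymoo's polynomial mutation): each variable is perturbed with a fixed probability and clipped back to the bounds.

```python
import numpy as np

# Per-variable mutation: with probability prob, add Gaussian noise to a
# variable, then clip the result to the [xl, xu] bounds.
def mutate(X, xl, xu, prob, sigma, rng):
    mask = rng.random(X.shape) < prob
    noise = rng.normal(0.0, sigma, size=X.shape)
    return np.clip(np.where(mask, X + noise, X), xl, xu)

rng = np.random.default_rng(1)
X = rng.random((100, 13))
X_mut = mutate(X, xl=0.0, xu=1.0, prob=0.1, sigma=0.1, rng=rng)
```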
Example¶
[1]:
from pymoo.algorithms.so_genetic_algorithm import GA
from pymoo.factory import get_problem
from pymoo.optimize import minimize

problem = get_problem("g01")

algorithm = GA(
    pop_size=100,
    eliminate_duplicates=True)

res = minimize(problem,
               algorithm,
               termination=('n_gen', 50),
               seed=1,
               verbose=False)

print("Best solution found: \nX = %s\nF = %s" % (res.X, res.F))
Best solution found:
X = [0.99846487 0.99896452 0.99227818 0.98115003 0.97598486 0.93133579
0.9989595 0.99283295 0.93527323 2.74333764 2.73101357 2.83238454
0.99763872]
F = [13.99514066]
API¶

pymoo.algorithms.so_genetic_algorithm.GA(pop_size=100, sampling=FloatRandomSampling(), selection=TournamentSelection(), crossover=SimulatedBinaryCrossover(), mutation=PolynomialMutation(), eliminate_duplicates=True, n_offsprings=None, **kwargs)

This class represents the abstract class for any algorithm to be implemented. Most importantly, it provides the solve method that is used to optimize a given problem. The solve method provides a wrapper function which validates the input.
Parameters

problem : class
    Problem to be solved by the algorithm.

termination : class
    Object that tells the algorithm when to terminate.

seed : int
    Random seed to be used. The same seed is supposed to return the same result. If set to None, a seed is chosen randomly and stored in the result object to ensure reproducibility.

verbose : bool
    If true, information during the algorithm execution is displayed.

callback : func
    A callback function can be passed that is executed every generation. The parameters for the function are the algorithm itself, the number of evaluations so far and the current population.

    def callback(algorithm):
        pass

save_history : bool
    If true, a snapshot of each generation is saved.

pf : np.array
    The Pareto front for the given problem. If provided, performance metrics are printed during execution.

return_least_infeasible : bool
    Whether the algorithm should return the least infeasible solution if no feasible solution was found.

evaluator : class
    The evaluator, which can be used to make modifications before calling the evaluate function of a problem.