Version: 0.5.0

# GA: Genetic Algorithm¶

This class represents a basic $(\mu+\lambda)$ genetic algorithm for single-objective problems. The figure below shows the general flow of a genetic algorithm. In the following, it is explained how pymoo can be customized.

1. Initial Population: A starting population is sampled in the beginning. In this framework, it can be provided as a Sampling object, which defines different initial sampling strategies, as a Population whose X and F values are already set, or as a plain NumPy array of shape (pop_size, n_var).

2. Evaluation: Each individual is evaluated on the problem that is to be solved.

3. Survival: This is often the core of a genetic algorithm. For a simple single-objective genetic algorithm, the individuals can be sorted by their fitness, and survival of the fittest can be applied.

4. Selection: At the beginning of the recombination process, individuals need to be selected to participate in mating. Depending on the crossover, a different number of parents needs to be selected. A suitable selection scheme can improve the convergence of the algorithm.

5. Crossover: Once the parents are selected, the actual mating is performed. A crossover operator combines parents into one or several offspring. Commonly, problem information, such as the variable bounds, is needed to perform the mating. For more customized problems, even more information might be necessary (e.g. the current generation, a diversity measure of the population, …).

6. Mutation: Mutation is performed after the offspring are created through crossover. Usually, it is executed with a predefined probability. This operator helps to increase the diversity of the population.
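Taken together, the six steps above can be sketched as a toy $(\mu+\lambda)$ loop in plain NumPy. This is a didactic reduction, not pymoo's implementation; the function name toy_ga and all its parameters are made up for illustration:

```python
import numpy as np

def toy_ga(f, n_var, mu=20, lam=20, n_gen=50, sigma=0.1, seed=1):
    rng = np.random.default_rng(seed)
    # 1. initial population and 2. evaluation
    X = rng.uniform(-1, 1, size=(mu, n_var))
    F = np.apply_along_axis(f, 1, X)
    for _ in range(n_gen):
        # 4. selection: binary tournaments pick lam parents
        a, b = rng.integers(0, mu, size=(2, lam))
        parents = np.where((F[a] <= F[b])[:, None], X[a], X[b])
        # 5. crossover: uniform crossover with a shuffled mate
        mates = parents[rng.permutation(lam)]
        mask = rng.random(parents.shape) < 0.5
        off = np.where(mask, parents, mates)
        # 6. mutation: Gaussian perturbation with probability 1/n_var
        mut = rng.random(off.shape) < 1.0 / n_var
        off = off + mut * rng.normal(0.0, sigma, size=off.shape)
        # 2. evaluate offspring, 3. (mu+lambda) survival of the fittest
        X = np.vstack([X, off])
        F = np.concatenate([F, np.apply_along_axis(f, 1, off)])
        best = np.argsort(F)[:mu]
        X, F = X[best], F[best]
    return X[0], F[0]

# minimize the sphere function in 5 variables
x, fx = toy_ga(lambda x: np.sum(x ** 2), n_var=5)
print(fx)
```

Because survival is elitist (the best mu of parents plus offspring are kept), the best objective value can only improve from generation to generation.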

## Example¶

[1]:

from pymoo.algorithms.soo.nonconvex.ga import GA
from pymoo.factory import get_problem
from pymoo.optimize import minimize

problem = get_problem("g01")

algorithm = GA(
    pop_size=100,
    eliminate_duplicates=True)

res = minimize(problem,
               algorithm,
               seed=1,
               verbose=False)

print("Best solution found: \nX = %s\nF = %s" % (res.X, res.F))


Best solution found:
X = [1.         0.99999999 1.         0.99999998 0.99999996 1.
0.99999992 0.99999999 1.         2.99906987 2.99525757 2.98473236
0.99999999]
F = [-14.97905952]
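The run above relies on GA's default tournament selection. As a rough illustration of the idea only (not pymoo's TournamentSelection operator), a binary tournament over a fitness array might look like this; the helper name binary_tournament is made up for the sketch:

```python
import numpy as np

def binary_tournament(F, n_select, rng):
    """Pick n_select parent indices; each winner is the fitter
    (lower F, assuming minimization) of two randomly drawn individuals."""
    winners = np.empty(n_select, dtype=int)
    for i in range(n_select):
        a, b = rng.integers(0, len(F), size=2)
        winners[i] = a if F[a] <= F[b] else b
    return winners

rng = np.random.default_rng(1)
F = np.array([3.0, 1.0, 4.0, 0.5, 2.0])
parents = binary_tournament(F, 4, rng)
print(parents)  # indices of the tournament winners
```

Fitter individuals are selected more often, which biases the mating pool toward good solutions while still giving weaker individuals a chance.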


## API¶

pymoo.algorithms.soo.nonconvex.ga.GA(pop_size=100, sampling=FloatRandomSampling(), selection=TournamentSelection(), crossover=SimulatedBinaryCrossover(), mutation=PolynomialMutation(), survival=FitnessSurvival(), eliminate_duplicates=True, n_offsprings=None, display=SingleObjectiveDisplay(), **kwargs)

This class is based on the abstract base class that every algorithm has to implement. Most importantly, it provides the solve method, which is used to optimize a given problem.

The solve method is a wrapper function that also validates the input.
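The defaults above include simulated binary crossover (SBX). A stripped-down, unbounded SBX for a single real-valued parent pair could be sketched as follows; this is illustrative only (the function name sbx_pair is made up), and pymoo's operator additionally handles variable bounds and a crossover probability:

```python
import numpy as np

def sbx_pair(p1, p2, eta=15, rng=None):
    """Simulated binary crossover for one parent pair (no bounds handling).
    Larger eta keeps offspring closer to their parents."""
    if rng is None:
        rng = np.random.default_rng()
    u = rng.random(p1.shape)
    # spread factor: contracting for u <= 0.5, expanding otherwise
    beta = np.where(u <= 0.5,
                    (2 * u) ** (1 / (eta + 1)),
                    (1 / (2 * (1 - u))) ** (1 / (eta + 1)))
    c1 = 0.5 * ((1 + beta) * p1 + (1 - beta) * p2)
    c2 = 0.5 * ((1 - beta) * p1 + (1 + beta) * p2)
    return c1, c2

rng = np.random.default_rng(1)
c1, c2 = sbx_pair(np.zeros(3), np.ones(3), eta=15, rng=rng)
print(c1, c2)
```

A useful property of SBX is that the offspring pair preserves the parents' mean: c1 + c2 always equals p1 + p2.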

Parameters
problem : Problem

Problem to be solved by the algorithm

termination : Termination

Object that tells the algorithm when to terminate.

seed : int

Random seed to be used. The same seed is supposed to return the same result. If set to None, a seed is chosen randomly and stored in the result object to ensure reproducibility.

verbose : bool

If true, information about the algorithm execution is displayed.

callback : func

A callback function can be passed that is executed every generation. The function is called with the algorithm object itself, which provides access to the number of evaluations so far and the current population.

def callback(algorithm):
    pass

save_history : bool

If true, a current snapshot of each generation is saved.

pf : numpy.array

The Pareto-front for the given problem. If provided, performance metrics are printed during execution.

return_least_infeasible : bool

Whether the algorithm should return the least infeasible solution if no feasible solution was found.

evaluator : Evaluator

The evaluator which can be used to make modifications before calling the evaluate function of a problem.