Biased Initialization#
One way of customizing an algorithm is to use a biased initial population. This can be very helpful if expert knowledge already exists and known solutions should be improved. Below, two different ways of initialization are shown: a) providing only the design space values of the variables, and b) providing a Population object in which the objectives and constraints are already set and do not need to be calculated again.
NOTE: This works with all population-based algorithms in pymoo, i.e., all algorithms that inherit from GeneticAlgorithm. For local-search-based algorithms, the initial solution can be provided by setting x0 instead of sampling.
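For example, a local-search algorithm such as PatternSearch can be warm-started from a known solution via x0. A minimal sketch (the import path and the sphere test problem are assumptions and may differ between pymoo versions):

import numpy as np
from pymoo.algorithms.soo.nonconvex.pattern import PatternSearch
from pymoo.problems import get_problem
from pymoo.optimize import minimize

problem = get_problem("sphere")

# a known starting solution (here simply a random point in the design space)
x0 = np.random.random(problem.n_var)

# the local search starts from x0 instead of sampling an initial population
algorithm = PatternSearch(x0=x0)

res = minimize(problem, algorithm, ('n_gen', 10), seed=1, verbose=False)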
By Array#
[1]:
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.problems import get_problem
from pymoo.optimize import minimize
problem = get_problem("zdt2")

# design space values of the 300 initial solutions (random here for demonstration)
X = np.random.random((300, problem.n_var))

algorithm = NSGA2(pop_size=100, sampling=X)

minimize(problem,
         algorithm,
         ('n_gen', 10),
         seed=1,
         verbose=True)
==========================================================================
n_gen | n_eval | n_nds | igd | gd | hv
==========================================================================
1 | 300 | 7 | 3.4498120761 | 3.5720363508 | 0.000000E+00
2 | 400 | 10 | 3.4498120761 | 3.6900721146 | 0.000000E+00
3 | 500 | 8 | 3.3156556714 | 3.6739753199 | 0.000000E+00
4 | 600 | 6 | 3.1791347805 | 3.3565188180 | 0.000000E+00
5 | 700 | 6 | 3.0018445382 | 3.1936330666 | 0.000000E+00
6 | 800 | 7 | 2.7697440492 | 3.0296347551 | 0.000000E+00
7 | 900 | 5 | 2.6794514616 | 2.8805573058 | 0.000000E+00
8 | 1000 | 6 | 2.5320222264 | 2.6748708277 | 0.000000E+00
9 | 1100 | 8 | 2.2682622087 | 2.3985960717 | 0.000000E+00
10 | 1200 | 12 | 2.1003467070 | 2.4401607727 | 0.000000E+00
[1]:
<pymoo.core.result.Result at 0x107731090>
By Population (pre-evaluated)#
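Because the objectives and constraints are set on the Population object beforehand, the first generation does not consume any function evaluations (note that n_eval starts at 0 in the log below).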
[2]:
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.problems import get_problem
from pymoo.core.evaluator import Evaluator
from pymoo.core.population import Population
from pymoo.optimize import minimize
problem = get_problem("zdt2")

# create the initial population and evaluate it, so the objectives and
# constraints are already known when the algorithm starts
X = np.random.random((300, problem.n_var))
pop = Population.new("X", X)
Evaluator().eval(problem, pop)

algorithm = NSGA2(pop_size=100, sampling=pop)

minimize(problem,
         algorithm,
         ('n_gen', 10),
         seed=1,
         verbose=True)
==========================================================================
n_gen | n_eval | n_nds | igd | gd | hv
==========================================================================
1 | 0 | 10 | 3.0731111570 | 3.5509953393 | 0.000000E+00
2 | 100 | 11 | 2.9880803008 | 3.6187327244 | 0.000000E+00
3 | 200 | 12 | 2.9880803008 | 3.3645507936 | 0.000000E+00
4 | 300 | 7 | 2.5790751253 | 3.2335589851 | 0.000000E+00
5 | 400 | 12 | 2.3565452599 | 3.0401088150 | 0.000000E+00
6 | 500 | 9 | 2.2763320737 | 2.6122510588 | 0.000000E+00
7 | 600 | 8 | 2.2509934597 | 2.4301576015 | 0.000000E+00
8 | 700 | 10 | 1.9639887257 | 2.3668197539 | 0.000000E+00
9 | 800 | 8 | 1.4716815079 | 1.9850583407 | 0.000000E+00
10 | 900 | 9 | 1.4716815079 | 1.8232881293 | 0.000000E+00
[2]:
<pymoo.core.result.Result at 0x11a450d00>