Biased Initialization¶
One way of customizing an algorithm is to use a biased initial population. This can be very helpful if expert knowledge already exists and known solutions should be improved further. In the following, two different ways of initialization are shown: a) providing only the design-space values of the variables and b) providing a Population
object in which the objectives and constraints are already set and do not need to be calculated again.
NOTE: This works with all population-based algorithms in pymoo, i.e., technically speaking, all algorithms which inherit from GeneticAlgorithm
. For local-search based algorithms, the initial solution can be provided by setting x0
instead of sampling
, as sketched below.
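For instance, a local search such as pattern search can be started from a known solution via x0. This is only a minimal sketch: the single-objective "sphere" test problem and the import path of PatternSearch are assumptions here and may differ between pymoo versions.

import numpy as np
from pymoo.algorithms.soo.nonconvex.pattern import PatternSearch  # import path may vary by pymoo version
from pymoo.problems import get_problem
from pymoo.optimize import minimize

problem = get_problem("sphere")

# expert-provided starting point (here simply the midpoint of the bounds)
x0 = (problem.xl + problem.xu) / 2.0

# local-search algorithms take a single initial solution via x0 instead of sampling
algorithm = PatternSearch(x0=x0)

res = minimize(problem, algorithm, seed=1, verbose=False)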
By Array¶
[1]:
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.problems import get_problem
from pymoo.optimize import minimize
problem = get_problem("zdt2")
X = np.random.random((300, problem.n_var))
algorithm = NSGA2(pop_size=100, sampling=X)
minimize(problem,
algorithm,
('n_gen', 10),
seed=1,
verbose=True)
==========================================================================
n_gen | n_eval | n_nds | igd | gd | hv
==========================================================================
1 | 300 | 5 | 3.0138058941 | 3.7176787630 | 0.000000E+00
2 | 400 | 6 | 3.0138058941 | 3.6427723242 | 0.000000E+00
3 | 500 | 8 | 3.0138058941 | 3.4364790298 | 0.000000E+00
4 | 600 | 11 | 3.0138058941 | 3.4386766063 | 0.000000E+00
5 | 700 | 15 | 2.8667629021 | 3.3320777680 | 0.000000E+00
6 | 800 | 4 | 2.5658326517 | 2.8253961582 | 0.000000E+00
7 | 900 | 8 | 2.4141409799 | 2.6622877311 | 0.000000E+00
8 | 1000 | 8 | 2.1928645655 | 2.4698842478 | 0.000000E+00
9 | 1100 | 7 | 2.1863131225 | 2.4396968173 | 0.000000E+00
10 | 1200 | 9 | 2.0191938459 | 2.1108713214 | 0.000000E+00
[1]:
<pymoo.core.result.Result at 0x104ed8b30>
By Population (pre-evaluated)¶
If the initial solutions have already been evaluated, their objective and constraint values can be passed along in a Population object. The algorithm then does not evaluate them again, which is why n_eval starts at 0 in the log below.
[2]:
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.problems import get_problem
from pymoo.core.evaluator import Evaluator
from pymoo.core.population import Population
from pymoo.optimize import minimize
problem = get_problem("zdt2")
# create initial data and set to the population object
X = np.random.random((300, problem.n_var))
pop = Population.new("X", X)
Evaluator().eval(problem, pop)
algorithm = NSGA2(pop_size=100, sampling=pop)
minimize(problem,
algorithm,
('n_gen', 10),
seed=1,
verbose=True)
==========================================================================
n_gen | n_eval | n_nds | igd | gd | hv
==========================================================================
1 | 0 | 7 | 3.3187610973 | 3.9446592819 | 0.000000E+00
2 | 100 | 7 | 3.3187610973 | 3.6947717967 | 0.000000E+00
3 | 200 | 6 | 2.9518388038 | 3.6145482342 | 0.000000E+00
4 | 300 | 7 | 2.9518388038 | 3.5043045355 | 0.000000E+00
5 | 400 | 5 | 2.9432434290 | 3.3264026468 | 0.000000E+00
6 | 500 | 8 | 2.8173758892 | 2.8948286380 | 0.000000E+00
7 | 600 | 5 | 2.4927446235 | 2.6612847083 | 0.000000E+00
8 | 700 | 4 | 2.4859345973 | 2.6081961608 | 0.000000E+00
9 | 800 | 5 | 2.3022041851 | 2.4928715407 | 0.000000E+00
10 | 900 | 8 | 2.3022041851 | 2.2670321519 | 0.000000E+00
[2]:
<pymoo.core.result.Result at 0x104f0d640>