Biased Initialization¶
One way of customizing an algorithm is to supply a biased initial population. This can be very helpful if expert knowledge is already available and known solutions should be improved further. In the following, two different ways of initialization are shown: a) passing only the design space values of the variables, and b) passing a Population
object in which the objectives and constraints are already set and therefore do not need to be evaluated again.
NOTE: This works with all population-based algorithms in pymoo, i.e., technically speaking, with all algorithms that inherit from GeneticAlgorithm
. For local-search-based algorithms, the initial solution can be provided by setting x0
instead of sampling
, as shown in the sketch below.
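A minimal sketch of the local-search case follows. It assumes the single-objective "sphere" test problem and the PatternSearch algorithm; the starting point x0 is just an illustrative guess, not a value taken from this page.
import numpy as np
from pymoo.algorithms.soo.nonconvex.pattern import PatternSearch
from pymoo.problems import get_problem
from pymoo.optimize import minimize

problem = get_problem("sphere")

# illustrative starting point (an assumed expert guess, not a known optimum)
x0 = np.full(problem.n_var, 0.5)

# the local search starts from x0 instead of drawing an initial sample
algorithm = PatternSearch(x0=x0)

res = minimize(problem, algorithm, seed=1, verbose=False)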
By Array¶
[1]:
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.problems import get_problem
from pymoo.optimize import minimize

problem = get_problem("zdt2")

# design space values only; objectives and constraints are evaluated by the algorithm
X = np.random.random((300, problem.n_var))

algorithm = NSGA2(pop_size=100, sampling=X)

minimize(problem,
         algorithm,
         ('n_gen', 10),
         seed=1,
         verbose=True)
==========================================================================
n_gen | n_eval | n_nds | igd | gd | hv
==========================================================================
1 | 300 | 3 | 3.1420452829 | 3.1994780681 | 0.000000E+00
2 | 400 | 5 | 3.0414757847 | 3.4436017730 | 0.000000E+00
3 | 500 | 5 | 2.6567210587 | 3.3254690927 | 0.000000E+00
4 | 600 | 5 | 2.3229724294 | 2.6825979150 | 0.000000E+00
5 | 700 | 8 | 1.7015039477 | 2.3511355341 | 0.000000E+00
6 | 800 | 7 | 1.6630101217 | 1.7968312673 | 0.000000E+00
7 | 900 | 6 | 1.3226871000 | 1.9917156630 | 0.000000E+00
8 | 1000 | 4 | 1.3226871000 | 1.1836092686 | 0.000000E+00
9 | 1100 | 5 | 1.2824360868 | 1.4520198639 | 0.000000E+00
10 | 1200 | 9 | 1.1338264126 | 1.1647797734 | 0.000000E+00
[1]:
<pymoo.core.result.Result at 0x7f7f0a318fd0>
By Population (pre-evaluated)¶
[2]:
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.problems import get_problem
from pymoo.core.evaluator import Evaluator
from pymoo.core.population import Population
from pymoo.optimize import minimize

problem = get_problem("zdt2")

# create initial data and set to the population object
X = np.random.random((300, problem.n_var))
pop = Population.new("X", X)
Evaluator().eval(problem, pop)

algorithm = NSGA2(pop_size=100, sampling=pop)

minimize(problem,
         algorithm,
         ('n_gen', 10),
         seed=1,
         verbose=True)
==========================================================================
n_gen | n_eval | n_nds | igd | gd | hv
==========================================================================
1 | 0 | 10 | 3.7658693587 | 3.9065078970 | 0.000000E+00
2 | 100 | 4 | 3.1955011925 | 3.3561578888 | 0.000000E+00
3 | 200 | 3 | 2.6711445903 | 2.9717542152 | 0.000000E+00
4 | 300 | 4 | 2.5318797772 | 2.7706347082 | 0.000000E+00
5 | 400 | 5 | 2.0923789500 | 2.1824969365 | 0.000000E+00
6 | 500 | 4 | 1.7572820023 | 1.9821361334 | 0.000000E+00
7 | 600 | 8 | 1.5130224748 | 2.0044436873 | 0.000000E+00
8 | 700 | 10 | 1.3591191014 | 1.8476065358 | 0.000000E+00
9 | 800 | 11 | 1.3214394872 | 1.6207413422 | 0.000000E+00
10 | 900 | 4 | 1.1839834216 | 0.9601325831 | 0.000000E+00
[2]:
<pymoo.core.result.Result at 0x7f7f0a172070>