Biased Initialization¶
One way of customizing an algorithm is to use a biased initial population. This can be very helpful if expert knowledge is already available and known solutions should be improved further. In the following, two different ways of initialization are shown: a) providing just the design space values of the variables and b) providing a Population
object in which the objectives and constraints are already set and do not need to be calculated again.
NOTE: This works with all population-based algorithms in pymoo, i.e., technically speaking, with all algorithms that inherit from GeneticAlgorithm. For local-search based algorithms, the initial solution can be provided by setting x0 instead of sampling.
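For instance, a minimal sketch of the x0 variant for a local-search algorithm could look as follows; the choice of PatternSearch, the Himmelblau problem, and the starting point are purely illustrative assumptions, not part of the original example:

import numpy as np
from pymoo.algorithms.soo.nonconvex.pattern import PatternSearch
from pymoo.problems import get_problem
from pymoo.optimize import minimize

problem = get_problem("himmelblau")

# start the local search from a user-defined point instead of a random sample
algorithm = PatternSearch(x0=np.array([1.0, 2.0]))

res = minimize(problem, algorithm, seed=1, verbose=False)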
By Array¶
[1]:
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.problems import get_problem
from pymoo.optimize import minimize
problem = get_problem("zdt2")
# provide 300 randomly created candidate solutions as the initial sampling
X = np.random.random((300, problem.n_var))

algorithm = NSGA2(pop_size=100, sampling=X)

minimize(problem,
         algorithm,
         ('n_gen', 10),
         seed=1,
         verbose=True)
==========================================================================
n_gen | n_eval | n_nds | igd | gd | hv
==========================================================================
1 | 300 | 9 | 3.4263166968 | 3.6065637248 | 0.000000E+00
2 | 400 | 9 | 3.4263166968 | 3.6065637248 | 0.000000E+00
3 | 500 | 9 | 3.1254966147 | 3.5042317638 | 0.000000E+00
4 | 600 | 8 | 3.1254966147 | 3.3424242973 | 0.000000E+00
5 | 700 | 9 | 2.7730924862 | 2.9476011711 | 0.000000E+00
6 | 800 | 6 | 2.7651185744 | 3.0604868293 | 0.000000E+00
7 | 900 | 6 | 2.7204373268 | 2.5981733455 | 0.000000E+00
8 | 1000 | 9 | 2.4744829151 | 2.6817757290 | 0.000000E+00
9 | 1100 | 10 | 2.0338516500 | 2.3534914670 | 0.000000E+00
10 | 1200 | 11 | 1.6560598627 | 2.2848483862 | 0.000000E+00
[1]:
<pymoo.core.result.Result at 0x109b9c3b0>
By Population (pre-evaluated)¶
[2]:
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.problems import get_problem
from pymoo.core.evaluator import Evaluator
from pymoo.core.population import Population
from pymoo.optimize import minimize
problem = get_problem("zdt2")
# create the initial data and set it to the population object
X = np.random.random((300, problem.n_var))
pop = Population.new("X", X)

# evaluate the population so that objectives and constraints are already assigned
Evaluator().eval(problem, pop)

algorithm = NSGA2(pop_size=100, sampling=pop)

minimize(problem,
         algorithm,
         ('n_gen', 10),
         seed=1,
         verbose=True)
==========================================================================
n_gen | n_eval | n_nds | igd | gd | hv
==========================================================================
1 | 0 | 12 | 3.2297464619 | 3.6925488405 | 0.000000E+00
2 | 100 | 11 | 3.2297464619 | 3.5903848768 | 0.000000E+00
3 | 200 | 7 | 3.0523599646 | 3.4791563522 | 0.000000E+00
4 | 300 | 8 | 2.6639663788 | 3.1920129823 | 0.000000E+00
5 | 400 | 10 | 2.6165004620 | 3.1022423363 | 0.000000E+00
6 | 500 | 2 | 2.5960444794 | 2.4970731136 | 0.000000E+00
7 | 600 | 9 | 2.4282609218 | 2.5870466534 | 0.000000E+00
8 | 700 | 8 | 2.1444465512 | 2.4875942051 | 0.000000E+00
9 | 800 | 9 | 2.1444465512 | 2.3615570636 | 0.000000E+00
10 | 900 | 9 | 1.9416099053 | 2.6063042431 | 0.000000E+00
[2]:
<pymoo.core.result.Result at 0x10c931550>
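As a small follow-up sketch (not part of the original example), the pre-computed values stored in the Population can be inspected before it is passed to the algorithm; this also illustrates why n_eval is 0 in the first generation above:

# objective values assigned by Evaluator().eval are stored in the individuals
F = pop.get("F")
print(F.shape)  # (300, 2) for ZDT2

# constraint values would be available via pop.get("G") if the problem defined any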