Checkpoints¶
Sometimes it might be useful to store checkpoints while executing an algorithm, in particular when a run is very time-consuming. pymoo allows resuming a run by serializing the algorithm object and loading it again. Resuming runs from checkpoints is possible

- the functional way by calling the minimize method,
- the object-oriented way by repeatedly calling the next() method, or
- from a text file (biased initialization from a Population).
Functional¶
[1]:
import dill
from pymoo.problems import get_problem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize
from pymoo.termination.max_gen import MaximumGenerationTermination
problem = get_problem("zdt1", n_var=5)
algorithm = NSGA2(pop_size=100)
res = minimize(problem,
               algorithm,
               ('n_gen', 5),
               seed=1,
               copy_algorithm=False,
               verbose=True)

with open("checkpoint", "wb") as f:
    dill.dump(algorithm, f)

with open("checkpoint", 'rb') as f:
    checkpoint = dill.load(f)
    print("Loaded Checkpoint:", checkpoint)

# only necessary if the termination criterion of the checkpoint has already been met
checkpoint.termination = MaximumGenerationTermination(20)

res = minimize(problem,
               checkpoint,
               seed=1,
               copy_algorithm=False,
               verbose=True)
==========================================================================
n_gen | n_eval | n_nds | igd | gd | hv
==========================================================================
1 | 100 | 6 | 0.5914067243 | 2.8577180757 | 0.0841819857
2 | 200 | 11 | 0.4020636193 | 2.3258052118 | 0.2017028459
3 | 300 | 8 | 0.4014091479 | 2.2380261229 | 0.2017028459
4 | 400 | 10 | 0.3671965043 | 2.2070624810 | 0.2022059328
5 | 500 | 12 | 0.3350427005 | 1.6666495281 | 0.2397953014
Loaded Checkpoint: <pymoo.algorithms.moo.nsga2.NSGA2 object at 0x11b5b7bf0>
6 | 600 | 13 | 0.3227256544 | 0.9352859544 | 0.2659725563
7 | 700 | 11 | 0.3055304975 | 1.1478467381 | 0.3139739030
8 | 800 | 17 | 0.2005783077 | 0.4489098490 | 0.3858676610
9 | 900 | 22 | 0.1783441151 | 0.3113390864 | 0.4175602099
10 | 1000 | 19 | 0.1557197308 | 0.3059781469 | 0.4477686211
11 | 1100 | 21 | 0.1103558304 | 0.3417948992 | 0.4828691061
12 | 1200 | 28 | 0.0982890707 | 0.4542665371 | 0.5026069756
13 | 1300 | 35 | 0.0827463686 | 0.0927288571 | 0.5292750011
14 | 1400 | 42 | 0.0741488824 | 0.0798134409 | 0.5472131953
15 | 1500 | 34 | 0.0552699567 | 0.0623859263 | 0.5743751696
16 | 1600 | 41 | 0.0486735914 | 0.0568380485 | 0.5863266610
17 | 1700 | 44 | 0.0414556396 | 0.0527576674 | 0.5986381682
18 | 1800 | 47 | 0.0334624332 | 0.0459012539 | 0.6104355679
19 | 1900 | 60 | 0.0267201070 | 0.0363456877 | 0.6221119168
20 | 2000 | 74 | 0.0223180784 | 0.0283592645 | 0.6291423025
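In the example above, the checkpoint is written only once, after minimize has returned. For very long runs it can be safer to dump the algorithm object periodically while the optimization is still running, for example from a callback. The following is only a minimal sketch of this idea, reusing problem, NSGA2 and minimize from the cell above; the CheckpointCallback class, the file name and the interval of 5 generations are illustrative choices, not part of the example above.

import dill
from pymoo.core.callback import Callback

class CheckpointCallback(Callback):

    def __init__(self, path="checkpoint", every=5):
        super().__init__()
        self.path = path
        self.every = every

    def notify(self, algorithm):
        # dump the whole algorithm object every `every` generations
        if algorithm.n_gen % self.every == 0:
            with open(self.path, "wb") as f:
                dill.dump(algorithm, f)

res = minimize(problem,
               NSGA2(pop_size=100),
               ('n_gen', 20),
               seed=1,
               callback=CheckpointCallback(path="checkpoint", every=5),
               verbose=False)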
Object Oriented¶
[2]:
import dill
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.problems import get_problem
problem = get_problem("zdt1", n_var=5)
algorithm = NSGA2(pop_size=100)
algorithm.setup(problem, seed=1, termination=('n_gen', 20))
for k in range(5):
    algorithm.next()
    print(algorithm.n_gen)

with open("checkpoint", "wb") as f:
    dill.dump(algorithm, f)

with open("checkpoint", 'rb') as f:
    checkpoint = dill.load(f)
    print("Loaded Checkpoint:", checkpoint)

while checkpoint.has_next():
    checkpoint.next()
    print(checkpoint.n_gen)
2
3
4
5
6
Loaded Checkpoint: <pymoo.algorithms.moo.nsga2.NSGA2 object at 0x11b5104a0>
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
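Once has_next() returns False, the run has terminated and the result can be obtained directly from the algorithm object. A minimal sketch, assuming the checkpoint object from the cell above:

# build the result object from the finished algorithm
res = checkpoint.result()
print("Number of non-dominated solutions:", len(res.F))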
From a Text File¶
First, load the data from a file. Usually, this includes the design variables X, the objective values F (and the constraints G). Here, they are created randomly. Always make sure the Problem you are solving returns the same values for the given X values; otherwise, the data might be misleading for the algorithm.

(This is not the case here. It is really JUST for illustration purposes.)
[3]:
import numpy as np
from pymoo.problems.single import G1
problem = G1()
N = 300
np.random.seed(1)
X = np.random.random((N, problem.n_var))
# here F and G are re-evaluated - in practice you would load them from files too
F, G = problem.evaluate(X, return_values_of=["F", "G"])
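In practice, X, F and G would be loaded from disk instead of being re-evaluated. A minimal sketch, assuming the arrays had previously been stored as NumPy .npy files (the file names below are purely hypothetical):

# hypothetical files written earlier, e.g. with np.save("X.npy", X)
X = np.load("X.npy")
F = np.load("F.npy")
G = np.load("G.npy")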
Then, create a population object using your data:
[4]:
from pymoo.core.evaluator import Evaluator
from pymoo.core.population import Population
from pymoo.problems.static import StaticProblem
# now the population object with all its attributes is created (CV, feasible, ...)
pop = Population.new("X", X)
pop = Evaluator().eval(StaticProblem(problem, F=F, G=G), pop)
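A quick sanity check confirms that the evaluator has attached the mentioned attributes (F, CV, feasible, ...) to the population; the attribute names follow Population.get as used elsewhere in pymoo:

# the population now carries objective values and constraint information
print(pop.get("F").shape)                          # objective values
print(pop.get("CV").min(), pop.get("CV").max())    # constraint violations
print(pop.get("feasible").sum(), "feasible individuals")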
And finally, run the algorithm with the non-random initial population by passing sampling=pop:
[5]:
from pymoo.algorithms.soo.nonconvex.ga import GA
from pymoo.optimize import minimize
# the algorithm is now called with the population - biased initialization
algorithm = GA(pop_size=100, sampling=pop)
res = minimize(problem,
               algorithm,
               ('n_gen', 10),
               seed=1,
               verbose=True)
=================================================================================================
n_gen | n_eval | cv_min | cv_avg | f_avg | f_min | f_gap
=================================================================================================
1 | 0 | 0.000000E+00 | 0.1192400898 | -1.037973E+00 | -3.869005E+00 | 1.113099E+01
2 | 100 | 0.000000E+00 | 0.000000E+00 | -2.232147E+00 | -3.889330E+00 | 1.111067E+01
3 | 200 | 0.000000E+00 | 0.000000E+00 | -2.814318E+00 | -4.444381E+00 | 1.055562E+01
4 | 300 | 0.000000E+00 | 0.000000E+00 | -3.400805E+00 | -5.284807E+00 | 9.7151931823
5 | 400 | 0.000000E+00 | 0.000000E+00 | -3.925938E+00 | -5.875509E+00 | 9.1244906583
6 | 500 | 0.000000E+00 | 0.000000E+00 | -4.527273E+00 | -6.098598E+00 | 8.9014020630
7 | 600 | 0.000000E+00 | 0.000000E+00 | -5.143766E+00 | -6.862846E+00 | 8.1371544108
8 | 700 | 0.000000E+00 | 0.000000E+00 | -5.748626E+00 | -7.936588E+00 | 7.0634122602
9 | 800 | 0.000000E+00 | 0.000000E+00 | -6.248923E+00 | -8.528763E+00 | 6.4712366199
10 | 900 | 0.000000E+00 | 0.000000E+00 | -6.759579E+00 | -8.604401E+00 | 6.3955990295
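Afterwards, the best solution found can be read from the result object as usual, for example:

# best feasible solution found by the run with biased initialization
print("X =", res.X)
print("F =", res.F)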