Checkpoints¶
Sometimes it might be useful to store checkpoints while executing an algorithm, in particular if a run is very time-consuming. pymoo offers to resume a run by serializing the algorithm object and loading it again later. Resuming runs from checkpoints is possible

- the functional way, by calling the minimize method,
- the object-oriented way, by repeatedly calling the next() method, or
- from a text file (biased initialization from a Population object).
Functional¶
[1]:
import dill

from pymoo.problems import get_problem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize
from pymoo.termination.max_gen import MaximumGenerationTermination

problem = get_problem("zdt1", n_var=5)

algorithm = NSGA2(pop_size=100)

res = minimize(problem,
               algorithm,
               ('n_gen', 5),
               seed=1,
               copy_algorithm=False,
               verbose=True)

with open("checkpoint", "wb") as f:
    dill.dump(algorithm, f)

with open("checkpoint", 'rb') as f:
    checkpoint = dill.load(f)
    print("Loaded Checkpoint:", checkpoint)

# only necessary if the termination criterion has already been met for the checkpoint
checkpoint.termination = MaximumGenerationTermination(20)

res = minimize(problem,
               checkpoint,
               seed=1,
               copy_algorithm=False,
               verbose=True)
==========================================================================
n_gen | n_eval | n_nds | igd | gd | hv
==========================================================================
1 | 100 | 6 | 0.5914067243 | 2.8577180757 | 0.0841819857
2 | 200 | 11 | 0.5585768211 | 2.9109400192 | 0.0841819857
3 | 300 | 7 | 0.5585768211 | 1.4181629093 | 0.0841819857
4 | 400 | 10 | 0.3825358404 | 0.8914187867 | 0.2070607532
5 | 500 | 15 | 0.3261566436 | 0.6950892340 | 0.2438988346
Loaded Checkpoint: <pymoo.algorithms.moo.nsga2.NSGA2 object at 0x7f9f534a6d30>
6 | 600 | 14 | 0.2923258920 | 0.6466955820 | 0.2642929306
7 | 700 | 16 | 0.2463025717 | 0.4725884895 | 0.3260276052
8 | 800 | 15 | 0.2129582883 | 0.2431807423 | 0.3683041634
9 | 900 | 20 | 0.1923479589 | 0.2177922486 | 0.4110152031
10 | 1000 | 27 | 0.1415830515 | 0.1783638654 | 0.4561861955
11 | 1100 | 28 | 0.1035407442 | 0.1535122238 | 0.4937232514
12 | 1200 | 28 | 0.0880462696 | 0.1237908540 | 0.5205170196
13 | 1300 | 30 | 0.0678094011 | 0.0789824650 | 0.5500254394
14 | 1400 | 38 | 0.0590814807 | 0.0680016007 | 0.5649987773
15 | 1500 | 43 | 0.0433866819 | 0.0434990796 | 0.5919874829
16 | 1600 | 47 | 0.0338105422 | 0.0307276242 | 0.6109725279
17 | 1700 | 51 | 0.0296574791 | 0.0256044693 | 0.6181001831
18 | 1800 | 60 | 0.0250598414 | 0.0205876786 | 0.6268654140
19 | 1900 | 66 | 0.0195637417 | 0.0176046603 | 0.6341363564
20 | 2000 | 68 | 0.0164909665 | 0.0137287454 | 0.6390702006
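For very long runs, it can also be convenient to write checkpoints periodically instead of only once at the end of a run. The following is a minimal sketch of that idea using pymoo's Callback mechanism; the class name, file path, and interval are illustrative choices, not part of pymoo itself:

import dill

from pymoo.core.callback import Callback


class CheckpointCallback(Callback):
    # illustrative helper: dump the algorithm object every `every` generations
    def __init__(self, path="checkpoint", every=5):
        super().__init__()
        self.path = path
        self.every = every

    def notify(self, algorithm):
        if algorithm.n_gen % self.every == 0:
            with open(self.path, "wb") as f:
                dill.dump(algorithm, f)

Such a callback can be passed to minimize (or to setup in the object-oriented interface) through the callback argument.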
Object Oriented¶
[2]:
import dill

from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.problems import get_problem

problem = get_problem("zdt1", n_var=5)

algorithm = NSGA2(pop_size=100)
algorithm.setup(problem, seed=1, termination=('n_gen', 20))

for k in range(5):
    algorithm.next()
    print(algorithm.n_gen)

with open("checkpoint", "wb") as f:
    dill.dump(algorithm, f)

with open("checkpoint", 'rb') as f:
    checkpoint = dill.load(f)
    print("Loaded Checkpoint:", checkpoint)

while checkpoint.has_next():
    checkpoint.next()
    print(checkpoint.n_gen)
2
3
4
5
6
Loaded Checkpoint: <pymoo.algorithms.moo.nsga2.NSGA2 object at 0x7f9f53531400>
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
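Once the loop has terminated, the final result can be taken directly from the resumed algorithm object. A short sketch of retrieving it via the algorithm's result() method, which returns a Result object analogous to the one returned by minimize:

# collect the result from the resumed algorithm object
res = checkpoint.result()
print(res.F[:3])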
From a Text File¶
First, load the data from a file. Usually, this will include the variables X, the objective values F (and the constraints G). Here, they are created randomly. Always make sure the Problem you are solving would return the same values for the given X values; otherwise, the data might be misleading for the algorithm.

(This is not the case here. It is really JUST for illustration purposes.)
[3]:
import numpy as np

from pymoo.problems.single import G1

problem = G1()

N = 300
np.random.seed(1)
X = np.random.random((N, problem.n_var))

# here F and G are re-evaluated - in practice you want to load them from files too
F, G = problem.evaluate(X, return_values_of=["F", "G"])
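In a real use case, the arrays would be read from files instead of being re-evaluated. A minimal sketch, assuming X, F, and G were previously stored as plain-text files with np.savetxt (the file names here are hypothetical):

import numpy as np

# hypothetical file names - replace them with wherever your run stored its data
X = np.loadtxt("X.txt")
F = np.loadtxt("F.txt")
G = np.loadtxt("G.txt")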
Then, create a population object using your data:
[4]:
from pymoo.core.evaluator import Evaluator
from pymoo.core.population import Population
from pymoo.problems.static import StaticProblem
# now the population object with all its attributes is created (CV, feasible, ...)
pop = Population.new("X", X)
pop = Evaluator().eval(StaticProblem(problem, F=F, G=G), pop)
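If you want to verify that the evaluation results have been attached correctly, the population's attributes can be inspected before passing it to the algorithm (an optional check, not part of the original example):

# attributes set by the Evaluator on each individual
print(pop.get("F").shape)       # objective values
print(pop.get("CV")[:3])        # constraint violations
print(pop.get("feasible")[:3])  # feasibility flags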
And finally run it with a non-random initial population (sampling=pop):
[5]:
from pymoo.algorithms.soo.nonconvex.ga import GA
from pymoo.optimize import minimize

# the algorithm is now called with the population - biased initialization
algorithm = GA(pop_size=100, sampling=pop)

res = minimize(problem,
               algorithm,
               ('n_gen', 10),
               seed=1,
               verbose=True)
=================================================================================================
n_gen | n_eval | cv_min | cv_avg | f_avg | f_min | f_gap
=================================================================================================
1 | 0 | 0.000000E+00 | 0.1192400898 | -1.037973E+00 | -3.869005E+00 | 1.113099E+01
2 | 100 | 0.000000E+00 | 0.000000E+00 | -2.313258E+00 | -3.889330E+00 | 1.111067E+01
3 | 200 | 0.000000E+00 | 0.000000E+00 | -3.011123E+00 | -5.681386E+00 | 9.3186144627
4 | 300 | 0.000000E+00 | 0.000000E+00 | -3.832939E+00 | -6.151585E+00 | 8.8484149398
5 | 400 | 0.000000E+00 | 0.000000E+00 | -4.687608E+00 | -6.525307E+00 | 8.4746928550
6 | 500 | 0.000000E+00 | 0.000000E+00 | -5.600223E+00 | -7.318898E+00 | 7.6811020464
7 | 600 | 0.000000E+00 | 0.000000E+00 | -6.224394E+00 | -7.318898E+00 | 7.6811020464
8 | 700 | 0.000000E+00 | 0.000000E+00 | -6.843656E+00 | -8.988414E+00 | 6.0115855450
9 | 800 | 0.000000E+00 | 0.000000E+00 | -7.405048E+00 | -9.148926E+00 | 5.8510736836
10 | 900 | 0.000000E+00 | 0.000000E+00 | -8.051612E+00 | -1.110301E+01 | 3.8969949615