Checkpoints#
Sometimes it can be useful to store checkpoints while executing an algorithm, especially when a run is very time-consuming. pymoo allows resuming a run by serializing the algorithm object and loading it again. Resuming a run from a checkpoint is possible

- the functional way, by calling the minimize method,
- the object-oriented way, by repeatedly calling the next() method, or
- from a text file (Biased Initialization from Population).

Each of these is demonstrated below.
Functional#
[1]:
import dill
from pymoo.problems import get_problem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize
from pymoo.termination.max_gen import MaximumGenerationTermination
problem = get_problem("zdt1", n_var=5)
algorithm = NSGA2(pop_size=100)
res = minimize(problem,
               algorithm,
               ('n_gen', 5),
               seed=1,
               copy_algorithm=False,
               verbose=True)

with open("checkpoint", "wb") as f:
    dill.dump(algorithm, f)

with open("checkpoint", 'rb') as f:
    checkpoint = dill.load(f)
    print("Loaded Checkpoint:", checkpoint)

# only necessary if the termination criterion of the checkpoint has already been met
checkpoint.termination = MaximumGenerationTermination(20)

res = minimize(problem,
               checkpoint,
               seed=1,
               copy_algorithm=False,
               verbose=True)
==========================================================================
n_gen | n_eval | n_nds | igd | gd | hv
==========================================================================
1 | 100 | 6 | 0.8742936714 | 1.9701230739 | 0.000000E+00
2 | 200 | 3 | 0.6930051537 | 2.5115499551 | 0.0287298304
3 | 300 | 6 | 0.3983530489 | 0.9573652368 | 0.1695873896
4 | 400 | 9 | 0.3980883998 | 1.0536023525 | 0.1697396964
5 | 500 | 9 | 0.2551072315 | 0.3807202663 | 0.2909694485
Loaded Checkpoint: <pymoo.algorithms.moo.nsga2.NSGA2 object at 0x110b97fd0>
6 | 600 | 13 | 0.2252873024 | 0.5289527997 | 0.3028264758
7 | 700 | 18 | 0.1737469155 | 0.4043226128 | 0.3826159675
8 | 800 | 15 | 0.1520133308 | 0.1899146865 | 0.4229656474
9 | 900 | 19 | 0.1427353066 | 0.4004651833 | 0.4433181273
10 | 1000 | 23 | 0.1211343603 | 0.1497953666 | 0.4712236136
11 | 1100 | 29 | 0.1042113275 | 0.1901471505 | 0.4945848094
12 | 1200 | 27 | 0.0804965627 | 0.0839135479 | 0.5330384810
13 | 1300 | 24 | 0.0656606491 | 0.0651949038 | 0.5602749504
14 | 1400 | 36 | 0.0512114551 | 0.0498005872 | 0.5844583784
15 | 1500 | 43 | 0.0435970954 | 0.0460358793 | 0.5966502391
16 | 1600 | 52 | 0.0375939154 | 0.0360078151 | 0.6065968144
17 | 1700 | 53 | 0.0292151025 | 0.0299334176 | 0.6190011330
18 | 1800 | 53 | 0.0234917771 | 0.0218485061 | 0.6280646427
19 | 1900 | 55 | 0.0191801943 | 0.0165098624 | 0.6353122677
20 | 2000 | 70 | 0.0157804814 | 0.0144835141 | 0.6402342634
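The resumed run returns a regular result object, and the checkpoint itself is just an ordinary file on disk. A minimal sketch of inspecting the result and cleaning up afterwards, assuming the cell above has been executed:

import os

# the resumed run produces a regular result object with the final non-dominated set
print(res.X.shape, res.F.shape)

# the checkpoint file can be removed once it is no longer needed
os.remove("checkpoint")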
Object Oriented#
[2]:
import dill
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.problems import get_problem
problem = get_problem("zdt1", n_var=5)
algorithm = NSGA2(pop_size=100)
algorithm.setup(problem, seed=1, termination=('n_gen', 20))
for k in range(5):
    algorithm.next()
    print(algorithm.n_gen)

with open("checkpoint", "wb") as f:
    dill.dump(algorithm, f)

with open("checkpoint", 'rb') as f:
    checkpoint = dill.load(f)
    print("Loaded Checkpoint:", checkpoint)

while checkpoint.has_next():
    checkpoint.next()
    print(checkpoint.n_gen)
2
3
4
5
6
Loaded Checkpoint: <pymoo.algorithms.moo.nsga2.NSGA2 object at 0x105d4e050>
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
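Because the loop is under your control in the object-oriented approach, a checkpoint can also be written periodically, for example every few generations, so an interrupted run loses little progress. A minimal sketch of this idea (the interval of five generations is an arbitrary choice):

import dill
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.problems import get_problem

problem = get_problem("zdt1", n_var=5)
algorithm = NSGA2(pop_size=100)
algorithm.setup(problem, seed=1, termination=('n_gen', 20))

while algorithm.has_next():
    algorithm.next()

    # write a checkpoint every 5 generations so an interrupted run can be resumed
    if algorithm.n_gen % 5 == 0:
        with open("checkpoint", "wb") as f:
            dill.dump(algorithm, f)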
From a Text File#
First, load the data from a file. Usually, this will include the variables X, the objective values F (and the constraints G). Here, they are created randomly instead. Always make sure the Problem you are solving would actually return the same values for the given X; otherwise, the data can be misleading for the algorithm.
(That is not the case here; the random values are used purely for illustration.)
[3]:
import numpy as np
from pymoo.problems.single import G1
problem = G1()
N = 300
np.random.seed(1)
X = np.random.random((N, problem.n_var))
# here F and G are re-evaluated - in practice you would load them from files too
F, G = problem.evaluate(X, return_values_of=["F", "G"])
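In a real setup, the arrays would be read from disk rather than re-evaluated. A minimal sketch, assuming X, F and G were previously stored as NumPy .npy files (the file names below are hypothetical):

import numpy as np

# hypothetical file names - replace them with wherever the data was actually saved
X = np.load("X.npy")  # decision variables, shape (N, n_var)
F = np.load("F.npy")  # objective values, shape (N, n_obj)
G = np.load("G.npy")  # constraint values, shape (N, n_ieq_constr)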
Then, create a population object using your data:
[4]:
from pymoo.core.evaluator import Evaluator
from pymoo.core.population import Population
from pymoo.problems.static import StaticProblem
# now the population object with all its attributes is created (CV, feasible, ...)
pop = Population.new("X", X)
pop = Evaluator().eval(StaticProblem(problem, F=F, G=G), pop)
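The attributes attached by the evaluator can be inspected directly on the population, for instance (a minimal sketch, assuming the cell above has been run):

# objective values, constraint violation and feasibility are now set per individual
print(pop.get("F").shape)
print(pop.get("CV").min())
print(pop.get("feasible").sum())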
And finally, run the algorithm with the non-random initial population by passing sampling=pop:
[5]:
from pymoo.algorithms.soo.nonconvex.ga import GA
from pymoo.optimize import minimize
# the algorithm is now called with the population - biased initialization
algorithm = GA(pop_size=100, sampling=pop)
res = minimize(problem,
               algorithm,
               ('n_gen', 10),
               seed=1,
               verbose=True)
=================================================================================================
n_gen | n_eval | cv_min | cv_avg | f_avg | f_min | f_gap
=================================================================================================
1 | 0 | 0.000000E+00 | 0.1192400898 | -1.037973E+00 | -3.869005E+00 | 1.113099E+01
2 | 100 | 0.000000E+00 | 0.000000E+00 | -2.271906E+00 | -3.869005E+00 | 1.113099E+01
3 | 200 | 0.000000E+00 | 0.000000E+00 | -2.818306E+00 | -4.062760E+00 | 1.093724E+01
4 | 300 | 0.000000E+00 | 0.000000E+00 | -3.303222E+00 | -4.811746E+00 | 1.018825E+01
5 | 400 | 0.000000E+00 | 0.000000E+00 | -3.740373E+00 | -5.676009E+00 | 9.3239906574
6 | 500 | 0.000000E+00 | 0.000000E+00 | -4.239895E+00 | -5.676009E+00 | 9.3239906574
7 | 600 | 0.000000E+00 | 0.000000E+00 | -4.700497E+00 | -5.848033E+00 | 9.1519665825
8 | 700 | 0.000000E+00 | 0.000000E+00 | -5.200533E+00 | -6.552622E+00 | 8.4473781141
9 | 800 | 0.000000E+00 | 0.000000E+00 | -5.632671E+00 | -6.634862E+00 | 8.3651379271
10 | 900 | 0.000000E+00 | 0.000000E+00 | -6.069812E+00 | -7.208883E+00 | 7.7911169982