Gradients

Warning

Gradient calculation is not supported in the current version and needs to be reworked.

If the problem is implemented using autograd, gradients are available out of the box through automatic differentiation. Let us consider the following problem definition for a simple quadratic function without any constraints:

[1]:
import numpy as np

import pymoo.gradient.toolbox as anp
from pymoo.core.problem import Problem
from pymoo.gradient.automatic import AutomaticDifferentiation


class MyProblem(Problem):

    def __init__(self):
        super().__init__(n_var=10, n_obj=1, xl=-5, xu=5)

    def _evaluate(self, x, out, *args, **kwargs):
        out["F"] = anp.sum(anp.power(x, 2), axis=1)

problem = AutomaticDifferentiation(MyProblem())

The gradients can be retrieved by appending dF to the return_values_of parameter:

[2]:
X = np.array([np.arange(10)]).astype(float)
F, dF = problem.evaluate(X, return_values_of=["F", "dF"])
---------------------------------------------------------------------------
UnboundLocalError                         Traceback (most recent call last)
Cell In[2], line 2
      1 X = np.array([np.arange(10)]).astype(float)
----> 2 F, dF = problem.evaluate(X, return_values_of=["F", "dF"])

File ~/workspace/pymoo/pymoo/core/problem.py:257, in Problem.evaluate(self, X, return_values_of, return_as_dictionary, *args, **kwargs)
    254     only_single_value = not (isinstance(X, list) or isinstance(X, np.ndarray))
    256 # this is where the actual evaluation takes place
--> 257 _out = self.do(X, return_values_of, *args, **kwargs)
    259 out = {}
    260 for k, v in _out.items():
    261
    262     # copy it to a numpy array (it might be one of jax at this point)

File ~/workspace/pymoo/pymoo/gradient/automatic.py:54, in AutomaticDifferentiation.do(self, x, return_values_of, *args, **kwargs)
     51     from pymoo.gradient.grad_autograd import autograd_vectorized_value_and_grad
     52     out, grad = autograd_vectorized_value_and_grad(f, x)
---> 54 for k, v in grad.items():
     55     out["d" + k] = v
     57 return out

UnboundLocalError: cannot access local variable 'grad' where it is not associated with a value
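
As the warning at the top notes, the automatic differentiation backend currently raises an UnboundLocalError, so dF cannot be obtained this way at the moment. Until the gradient module is reworked, the gradient can be approximated numerically instead; the following central-difference sketch uses only the plain problem evaluation. The numerical_gradient helper is illustrative and not part of pymoo:

[ ]:
import numpy as np

# Illustrative helper (not part of pymoo): central-difference
# approximation of the objective gradients.
def numerical_gradient(problem, X, eps=1e-6):
    n_rows, n_var = X.shape
    dF = np.zeros((n_rows, problem.n_obj, n_var))
    for j in range(n_var):
        Xp, Xm = X.copy(), X.copy()
        Xp[:, j] += eps
        Xm[:, j] -= eps
        Fp = problem.evaluate(Xp, return_values_of=["F"])
        Fm = problem.evaluate(Xm, return_values_of=["F"])
        dF[:, :, j] = (Fp - Fm) / (2 * eps)
    return dF

X = np.array([np.arange(10)]).astype(float)
F = MyProblem().evaluate(X, return_values_of=["F"])
dF = numerical_gradient(MyProblem(), X)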

The resulting gradients are stored in dF, and its shape is (n_rows, n_obj, n_var):

[ ]:
print(X, F)
print(dF.shape)
print(dF)
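
For this quadratic, the result can also be verified by hand: since F(x) = sum(x_i^2), each partial derivative is dF/dx_j = 2 * x_j, so for a single row and a single objective dF has shape (1, 1, 10). A quick sanity check against the numerical approximation above, reusing the illustrative numerical_gradient helper:

[ ]:
# Analytical gradient of F(x) = sum(x_i^2) is 2x; insert the
# objective axis to match the (n_rows, n_obj, n_var) convention.
expected = (2 * X)[:, None, :]
print(expected.shape)
print(np.allclose(numerical_gradient(MyProblem(), X), expected, atol=1e-4))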

Analogously, the gradients of the constraints can be retrieved by appending dG to return_values_of.
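
For illustration, a constrained variant of the problem above might look as follows. The n_ieq_constr argument and the G <= 0 feasibility convention follow pymoo's current API, but given the warning at the top, requesting dG can be expected to fail in the same way as dF until the gradient module is reworked:

[ ]:
class MyConstrainedProblem(Problem):

    def __init__(self):
        # one inequality constraint (feasible if G <= 0)
        super().__init__(n_var=10, n_obj=1, n_ieq_constr=1, xl=-5, xu=5)

    def _evaluate(self, x, out, *args, **kwargs):
        out["F"] = anp.sum(anp.power(x, 2), axis=1)
        # require the sum of the variables to be at most one
        out["G"] = anp.sum(x, axis=1) - 1


problem = AutomaticDifferentiation(MyConstrainedProblem())

# dG is expected to have shape (n_rows, n_ieq_constr, n_var)
F, dF, G, dG = problem.evaluate(X, return_values_of=["F", "dF", "G", "dG"])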