An implementation of the well-known Hooke and Jeeves pattern search for single-objective optimization, which alternates between exploration moves and pattern moves. For more information, such as pseudocode and visualizations of the search space, we refer to Wikipedia.
from pymoo.algorithms.soo.nonconvex.pattern_search import PatternSearch
from pymoo.problems.single import Himmelblau
from pymoo.optimize import minimize

problem = Himmelblau()

algorithm = PatternSearch()

res = minimize(problem,
               algorithm,
               verbose=False,
               seed=1)

print("Best solution found: \nX = %s\nF = %s" % (res.X, res.F))
Best solution found:
X = [ 3.58442689 -1.84812727]
F = [1.25059633e-10]
PatternSearch(self, init_delta=0.25, rho=0.5, step_size=1.0, display=PatternSearchDisplay(), **kwargs)
An implementation of well-known Hooke and Jeeves Pattern Search.
The initial point where the local search should be initiated. If not provided, n_sample_points points are created using Latin hypercube sampling, and the best solution found is set to x0.
Number of sample points to be used to determine the initial search point. (Only used if x0 is not provided.)
The delta value used for the exploration move. If lower and upper bounds are provided, the value is relative to the overall search space. For instance, a value of 0.25 means that the pattern is initially created at a distance of 25% of the search space range from the initial search point.
If the move was unsuccessful, the delta value is reduced by multiplying it by the value provided. For instance, rho=0.5 implies that the search continues with delta/2.
After the exploration move, the new center is determined by following a promising direction. This value defines how large the step in this direction will be.
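To make the interplay of init_delta, rho, and step_size concrete, here is a minimal, self-contained sketch of the Hooke and Jeeves scheme, independent of pymoo. It is a simplification (the pattern point is accepted only if it directly improves, and delta is an absolute value rather than relative to bounds), not the library's actual implementation; parameter names mirror the constructor above.

```python
def pattern_search(f, x0, init_delta=0.25, rho=0.5, step_size=1.0,
                   tol=1e-8, max_iter=1000):
    def explore(center, delta):
        # exploration move: perturb each coordinate by +/- delta
        # and keep any change that improves the objective
        x, fx = list(center), f(center)
        for i in range(len(x)):
            for d in (delta, -delta):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    break
        return x, fx

    x, fx = list(x0), f(x0)
    delta = init_delta
    while delta > tol and max_iter > 0:
        max_iter -= 1
        x_new, f_new = explore(x, delta)
        if f_new < fx:
            # pattern move: step beyond x_new along the promising
            # direction x -> x_new, scaled by step_size
            x_pat = [ni + step_size * (ni - xi) for xi, ni in zip(x, x_new)]
            f_pat = f(x_pat)
            if f_pat < f_new:
                x, fx = x_pat, f_pat
            else:
                x, fx = x_new, f_new
        else:
            # unsuccessful exploration: shrink the pattern by rho
            delta *= rho
    return x, fx

# example: minimize the sphere function starting at (2, -3)
x, fx = pattern_search(lambda v: sum(c * c for c in v), [2.0, -3.0])
```

Exploration alone can only halve delta until the pattern fits inside the current basin, while the pattern move accelerates progress whenever consecutive exploration moves point in the same direction.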