Pattern Search

An implementation of the well-known Hooke and Jeeves pattern search [22] for single-objective optimization, which alternates between exploration moves and pattern moves. For more information, such as pseudocode and visualizations of the behavior in the search space, we refer to Wikipedia for now.
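To make the idea concrete, below is a minimal sketch of the Hooke and Jeeves loop in plain NumPy. This is an illustration only, not pymoo's implementation; the function hooke_jeeves and its argument names are invented for this example and mirror the delta, rho, and step_size parameters documented in the API below.

import numpy as np

def hooke_jeeves(f, x0, delta=0.25, rho=0.5, step_size=1.0, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    fx = f(x)

    def explore(center, f_center):
        # exploration move: perturb each coordinate by +/- delta and keep improvements
        point, f_point = center.copy(), f_center
        for i in range(len(point)):
            for sign in (+1.0, -1.0):
                trial = point.copy()
                trial[i] += sign * delta
                f_trial = f(trial)
                if f_trial < f_point:
                    point, f_point = trial, f_trial
                    break
        return point, f_point

    while delta > tol:
        x_new, f_new = explore(x, fx)
        if f_new < fx:
            # pattern moves: keep jumping along the promising direction while it improves
            while f_new < fx:
                pattern = x_new + step_size * (x_new - x)
                x, fx = x_new, f_new
                x_new, f_new = explore(pattern, f(pattern))
        else:
            # exploration around the current point failed: shrink the pattern
            delta *= rho
    return x, fx

def himmelblau(v):
    return (v[0]**2 + v[1] - 11)**2 + (v[0] + v[1]**2 - 7)**2

x_best, f_best = hooke_jeeves(himmelblau, [0.0, 0.0])

Exploration moves probe each coordinate direction individually, while pattern moves accelerate along a direction that has already proven successful.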

[1]:
from pymoo.algorithms.soo.nonconvex.pattern import PatternSearch
from pymoo.problems.single import Himmelblau
from pymoo.optimize import minimize


problem = Himmelblau()

algorithm = PatternSearch()

res = minimize(problem,
               algorithm,
               verbose=False,
               seed=1)

print("Best solution found: \nX = %s\nF = %s" % (res.X, res.F))
Best solution found:
X = [ 3.58442865 -1.84812627]
F = [6.43655303e-12]

API

class pymoo.algorithms.soo.nonconvex.pattern.PatternSearch(self, init_delta=0.25, init_rho=0.5, step_size=1.0, output=SingleObjectiveOutput(), **kwargs)

An implementation of well-known Hooke and Jeeves Pattern Search.

Parameters
x0 : numpy.array

The initial point from which the local search is started. If not provided, n_sample_points points are created using Latin hypercube sampling, and the best solution found is used as x0.

n_sample_points : int

Number of sample points used to determine the initial search point. (Only used if x0 is not provided.)

delta : float

The delta value used for the exploration move. If lower and upper bounds are provided, the value is relative to the overall search space. For instance, a value of 0.25 means that the initial pattern is created at a distance of 25% of the search space from the initial search point.

rho : float

If the exploration move was unsuccessful, the delta value is reduced by multiplying it with this value. For instance, a value of 0.5 means the search continues with delta/2.

step_size : float

After the exploration move, the new center is determined by following a promising direction. This value defines how large the step in this direction will be.
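For instance, a minimal sketch of constructing the algorithm with custom settings, assuming (as the parameter list above suggests) that x0 and n_sample_points are forwarded through **kwargs:

import numpy as np

from pymoo.algorithms.soo.nonconvex.pattern import PatternSearch
from pymoo.optimize import minimize
from pymoo.problems.single import Himmelblau

# start from a user-defined point with a smaller initial pattern;
# x0 is assumed to be forwarded via **kwargs as described above
algorithm = PatternSearch(init_delta=0.1,
                          init_rho=0.5,
                          step_size=1.0,
                          x0=np.array([0.0, 0.0]))

res = minimize(Himmelblau(), algorithm, seed=1, verbose=False)
print(res.X, res.F)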
