Pattern search (optimization)



[Figure: Example of convergence of a direct search method on the Broyden function]

Pattern search (PS) is a family of numerical optimization methods that do not require a gradient of the objective function. As a result, PS can be applied to functions that are not continuous or differentiable. Such optimization methods are also known as direct-search, derivative-free, or black-box methods.

The name "pattern search" was coined by Hooke and Jeeves.[1] An early and simple PS variant is attributed to Fermi and Metropolis from their time at the Los Alamos National Laboratory, as described by Davidon,[2] who summarized the algorithm as follows:


They varied one theoretical parameter at a time by steps of the same magnitude, and when no such increase or decrease in any one parameter further improved the fit to the experimental data, they halved the step size and repeated the process until the steps were deemed sufficiently small.
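The procedure Davidon describes can be sketched in a few lines of Python. This is an illustrative reconstruction, not Fermi and Metropolis's original code; the function name, tolerance, and starting step are assumptions:

```python
def fermi_metropolis_search(f, x0, step=1.0, tol=1e-6):
    """Minimize f by varying one parameter at a time by +/- step.
    When no single-parameter move improves f, halve the step and
    repeat until the step is deemed sufficiently small."""
    x = list(x0)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            # try increasing, then decreasing, parameter i
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    improved = True
                    break
        if not improved:
            step /= 2.0  # halve the step size and repeat
    return x, fx
```

For example, minimizing a simple quadratic such as f(x, y) = (x - 1)^2 + (y + 2)^2 from the origin drives the iterate to (1, -2).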

Convergence

A convergent pattern-search method was proposed by Yu, who proved convergence using the theory of positive bases.[3] Later, Torczon, Lagarias, and coauthors[4][5] used positive-basis techniques to prove the convergence of another pattern-search method on a specific class of functions. Outside of such classes, pattern search is a heuristic that can provide useful approximate solutions for some problems but is not guaranteed to converge to a solution; indeed, pattern-search methods can converge to non-stationary points on some relatively tame problems.[6][7]
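As a concrete illustration of the kind of method these convergence results address, the following sketch polls the positive basis {+e_i, -e_i}: each iteration evaluates the 2n axis-aligned neighbors of the incumbent point (2n + 1 points counting the center) and halves the step when none improves the objective. This is a minimal compass-search sketch under assumed parameter names, not Torczon's exact algorithm:

```python
def compass_search(f, x0, step=1.0, tol=1e-8):
    """Pattern search over the positive basis {+e_i, -e_i}: each
    iteration evaluates the 2n-point stencil around the incumbent
    and moves to the best improving point; if no poll point
    improves, the step (mesh size) is halved."""
    x = list(x0)
    fx = f(x)
    n = len(x)
    while step > tol:
        # build the 2n poll points of the compass stencil
        polls = []
        for i in range(n):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                polls.append(trial)
        best = min(polls, key=f)
        fbest = f(best)
        if fbest < fx:
            x, fx = best, fbest  # successful poll: move, keep the step
        else:
            step *= 0.5          # unsuccessful poll: refine the mesh
    return x, fx
```

Because the poll directions positively span the space, no descent direction can be missed as the mesh is refined, which is the geometric fact the positive-basis convergence proofs rest on.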

See also

  • Golden-section search conceptually resembles PS in its narrowing of the search range, but applies only to single-dimensional search spaces.
  • The Nelder–Mead method, also known as the downhill simplex method, conceptually resembles PS in its narrowing of the search range for multi-dimensional search spaces, but does so by maintaining n + 1 points for an n-dimensional search space, whereas PS methods evaluate 2n + 1 points (the central point and two points in each dimension).
  • Luus–Jaakola samples from a uniform distribution surrounding the current position and uses a simple formula for exponentially decreasing the sampling range.
  • Random search is a related family of optimization methods which sample from a hypersphere surrounding the current position.
  • Random optimization is a related family of optimization methods which sample from a normal distribution surrounding the current position.

Applications

Robust optimization for oil field development planning

Many of the optimization problems in science and engineering involve nonlinear objective functions. Advanced forms of pattern search optimization have been used for optimization of hydrocarbon field development planning.[8]

References

  1. Hooke, R.; Jeeves, T. A. (1961). ""Direct search" solution of numerical and statistical problems". Journal of the ACM. 8 (2): 212–229.
  2. Davidon, W. C. (1991). "Variable metric method for minimization". SIAM Journal on Optimization. 1 (1): 1–17.
  3. Yu, Wen Ci (1979). "Positive basis and a class of direct search techniques". Scientia Sinica [Zhongguo Kexue]: 53–68.
    • Yu, Wen Ci (1979). "The convergent property of the simplex evolutionary technique". Scientia Sinica [Zhongguo Kexue]: 69–77.
  4. Torczon, V. (1997). "On the convergence of pattern search algorithms". SIAM Journal on Optimization. 7 (1): 1–25.
  5. Lagarias, J. C.; Reeds, J. A.; Wright, M. H.; Wright, P. E. (1998). "Convergence properties of the Nelder–Mead simplex method in low dimensions". SIAM Journal on Optimization. 9 (1): 112–147.
  6. Powell, Michael J. D. (1973). "On Search Directions for Minimization Algorithms". Mathematical Programming. 4: 193–201.
  7. McKinnon, K. I. M. (1998). "Convergence of the Nelder–Mead simplex method to a non-stationary point". SIAM Journal on Optimization. 9 (1): 148–158. (algorithm summary online)
  8. (citation not recoverable from the source)