Optimization and Engineering, Volume 5, Issue 2, pp 101–122

Convergence Results for Generalized Pattern Search Algorithms are Tight

  • Charles Audet

DOI: 10.1023/B:OPTE.0000033370.66768.a9

Cite this article as:
Audet, C. Optimization and Engineering (2004) 5: 101. doi:10.1023/B:OPTE.0000033370.66768.a9


The convergence theory of generalized pattern search algorithms for unconstrained optimization guarantees, under mild conditions, that the method produces a limit point satisfying first-order optimality conditions related to the local differentiability of the objective function. By exploiting the flexibility allowed by the algorithm, we derive six small-dimensional examples showing that the convergence results are tight in the sense that they cannot be strengthened without additional assumptions, i.e., that certain requirements imposed on pattern search algorithms are not merely artifacts of the proofs.
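To fix ideas, a generalized pattern search iteration can be sketched as a simple coordinate search. The sketch below is an illustrative simplification of the algorithm class (the function names, polling order, and halving rule are my own choices, not the specific algorithms or examples studied in the paper); note that the mesh-size update uses a rational factor, echoing the rationality requirement discussed in the abstract.

```python
# Minimal coordinate-search sketch of a generalized pattern search (GPS)
# iteration for unconstrained minimization. Illustrative only: it polls
# the 2n coordinate directions and refines the mesh size by a rational
# factor (halving) when no polling point improves the incumbent.

def pattern_search(f, x0, delta=1.0, tol=1e-8, max_iter=10_000):
    x = list(x0)
    n = len(x)
    for _ in range(max_iter):
        if delta < tol:
            break
        improved = False
        # Poll step: evaluate f at the mesh points x +/- delta * e_i.
        for i in range(n):
            for sign in (1.0, -1.0):
                trial = x[:]
                trial[i] += sign * delta
                if f(trial) < f(x):
                    x = trial
                    improved = True
                    break
            if improved:
                break
        if not improved:
            delta *= 0.5  # mesh refinement by a rational factor
    return x, delta

# Example: minimize a smooth convex quadratic with minimizer (1, 0).
xmin, d = pattern_search(lambda v: (v[0] - 1.0) ** 2 + v[1] ** 2, [5.0, 5.0])
```

On a smooth convex quadratic like this one, the iterates reach the minimizer and the mesh size is then driven below the tolerance; the paper's examples show how much weaker the guarantees become once smoothness or parameter rationality is relaxed.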

In particular, we first show the necessity of the requirement that some algorithmic parameters be rational. We then show that, even for continuously differentiable functions, the method may generate infinitely many limit points, some of which may have non-zero gradients. Finally, we consider functions that are not strictly differentiable. We show that even when a single limit point is generated, the gradient may be non-zero, and zero may be excluded from the generalized gradient; therefore, the method does not necessarily produce a Clarke stationary point.
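For reference, the Clarke stationarity notion invoked above can be stated with the standard definition of the generalized directional derivative (the notation below is the usual one from non-smooth analysis, not taken from this abstract):

```latex
% Clarke's generalized directional derivative of a locally Lipschitz f
% at x in the direction d:
f^{\circ}(x; d) \;=\; \limsup_{\substack{y \to x \\ t \downarrow 0}}
  \frac{f(y + t d) - f(y)}{t}.
% x is Clarke stationary when f^{\circ}(x; d) \ge 0 for every direction d,
% equivalently when 0 \in \partial f(x), the generalized gradient.
```

The paper's non-smooth examples exhibit a limit point at which zero lies outside this generalized gradient, so Clarke stationarity fails.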

Keywords: pattern search algorithms · convergence analysis · unconstrained optimization · non-smooth analysis · Clarke derivatives

Copyright information

© Kluwer Academic Publishers 2004

Authors and Affiliations

  • Charles Audet
    1. Département de Mathématiques et de Génie Industriel, GERAD and École Polytechnique de Montréal, Montréal, Canada
