
Soft Computing, Volume 15, Issue 11, pp 2287–2298

VXQR: derivative-free unconstrained optimization based on QR factorizations

  • Arnold Neumaier
  • Hannes Fendl
  • Harald Schilly
  • Thomas Leitner

Abstract

This paper presents the basic features of a new family of algorithms for unconstrained derivative-free optimization, based on line searches along directions generated from QR factorizations of past direction matrices. The emphasis is on fast descent with a small number of function evaluations, so that the algorithms can be used for fairly expensive functions. The theoretical total time overhead per function evaluation is of order \(O(n^2)\), where n is the problem dimension, but the observed overhead is much smaller. Numerical results are given for VXQR1, a particular algorithm from this family implemented in Matlab, evaluated on the scalability test set of Herrera et al. (http://www.sci2s.ugr.es/eamhco/CFP.php, 2010) for problems in dimensions n ∈ {50, 100, 200, 500, 1000}. Performance depends strongly on the graph \(\{(t,f(x+th))\mid t\in[0,1]\}\) of the function along line segments. The algorithm is typically very fast on smooth problems whose graphs are not too rugged, and on problems with a roughly separable structure. It typically performs poorly on problems where the graph along many directions is highly multimodal without a pronounced overall slope (e.g., smooth functions with superimposed oscillations of significant size), where the graphs along many directions are piecewise constant (e.g., problems minimizing a maximum norm), or where the function overflows on the major part of the search region and no starting point with a finite function value is known.
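The core mechanism described above can be made concrete with a small example. The following Matlab fragment is a minimal sketch, not the authors' VXQR1 code: it performs coarse grid line searches along the orthonormal directions obtained from a QR factorization of a matrix D of past step directions. The objective function f, the grid used for the line search, and the rule for updating D are simplifying assumptions made only for illustration.

    % Minimal illustrative sketch (not the authors' VXQR1 implementation):
    % line searches along orthonormal directions obtained from a QR
    % factorization of a matrix D of past step directions. The objective f,
    % the coarse grid line search, and the update rule for D are
    % simplifying assumptions made only for this example.
    f  = @(x) sum(x.^2);             % hypothetical smooth objective
    n  = 50;                         % problem dimension
    x  = 10*randn(n,1);  fx = f(x);  % starting point and its function value
    D  = eye(n);                     % direction matrix, initially the coordinate directions

    for sweep = 1:10
        [Q, ~] = qr(D);              % orthonormal columns spanning the stored directions
        for k = 1:n
            h  = Q(:, k);            % search along the segment {x + t*h : t in [-1, 1]}
            ts = linspace(-1, 1, 21);
            fs = arrayfun(@(t) f(x + t*h), ts);
            [fmin, j] = min(fs);
            if fmin < fx             % accept the best trial point on this line
                step = ts(j) * h;
                x  = x + step;  fx = fmin;
                D  = [D(:, 2:end), step/norm(step)];  % remember the successful direction
            end
        end
    end
    fprintf('best function value found: %g\n', fx)

Replacing the stored directions by the orthonormal columns of Q prevents the search directions from becoming nearly linearly dependent as successful steps accumulate; this is the role the QR factorization plays in the sketch.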

Keywords

Derivative-free optimization · Black box optimization · Scalability · High-dimensional · Global optimization · Line search · Expensive objective function

References

  1. Ashlock D (2006) Evolutionary computation for modeling and optimization. Springer, Berlin
  2. Auger A, Hansen N (2005) A restart CMA evolution strategy with increasing population size. In: The 2005 IEEE congress on evolutionary computation, vol 2, pp 1769–1776
  3. Bélisle CJ, Romeijn HE, Smith RL (1993) Hit-and-run algorithms for generating multivariate distributions. Math Oper Res 18:255–266
  4. Conn AR, Scheinberg K, Vicente LN (2009) Introduction to derivative-free optimization. SIAM, Philadelphia, PA
  5. Csendes T, Pál L, Sendin JOH, Banga JR (2008) The GLOBAL optimization method revisited. Optim Lett 2:445–454
  6. Dorigo M, Stützle T (2004) Ant colony optimization. MIT Press, Cambridge
  7. Eberhart R, Kennedy J (1995) A new optimizer using particle swarm theory. In: Proceedings of the sixth international symposium on micro machine and human science. Nagoya, Japan, pp 39–43
  8. Eshelman LJ, Schaffer JD (1993) Real-coded genetic algorithms and interval schemata. In: Whitley D (ed) Foundations of genetic algorithms workshop (FOGA-92). Vail, CO, pp 187–202
  9. Fan SS, Zahara E (2007) A hybrid simplex search and particle swarm optimization for unconstrained optimization. Eur J Oper Res 181:527–548
  10. Gilmore P, Kelley CT (1995) An implicit filtering algorithm for optimization of functions with many local minima. SIAM J Optim 5:269–285
  11. Glover F (1995) Tabu thresholding: improved search by nonmonotonic trajectories. ORSA J Comput 7:426–442
  12. Hansen N (2006) The CMA evolution strategy: a comparing review. In: Lozano JA (ed) Towards a new evolutionary computation. Advances on estimation of distribution algorithms. Springer, Berlin, pp 75–102
  13. Hansen N, Auger A, Ros R, Finck S, Pošik P (2010) Comparing results of 31 algorithms from the black-box optimization benchmarking BBOB-2009. Manuscript. http://www.lri.fr/hansen/gecco09-results-2010.pdf
  14. Herrera F, Lozano M, Molina D (2010) Test suite for the special issue of Soft Computing on scalability of evolutionary algorithms and other metaheuristics for large scale continuous optimization problems. http://www.sci2s.ugr.es/eamhco/CFP.php
  15. Hooke R, Jeeves TA (1961) "Direct search" solution of numerical and statistical problems. J ACM 8:212–229
  16. Huyer W, Neumaier A (1999) Global optimization by multilevel coordinate search. J Glob Optim 14:331–355
  17. Huyer W, Neumaier A (2008) SNOBFIT—stable noisy optimization by branch and fit. ACM Trans Math Softw 35 (Article 9)
  18. Jones DR, Schonlau M, Welch WJ (1998) Efficient global optimization of expensive black-box functions. J Glob Optim 13:455–492
  19. Jones DR, Perttunen CD, Stuckman BE (1993) Lipschitzian optimization without the Lipschitz constant. J Optim Theory Appl 79:157–181
  20. Kelley CT (1999) Detection and remediation of stagnation in the Nelder–Mead algorithm using a sufficient decrease condition. SIAM J Optim 10:43–55
  21. Kolda TG, Lewis RM, Torczon VJ (2003) Optimization by direct search: new perspectives on some classical and modern methods. SIAM Rev 45:385–482
  22. Michalewicz Z, Fogel DB (2004) How to solve it: modern heuristics. Springer, Berlin
  23. NAG Library Chapter Contents (2009) E05—global optimization of a function. NAG Library, Mark 22. http://www.nag.co.uk/numeric/FL/nagdoc_fl22/xhtml/E05/e05conts.xml
  24. Nelder JA, Mead R (1965) A simplex method for function minimization. Comput J 7:308–313
  25. Neumaier A (2001) Introduction to numerical analysis. Cambridge University Press, Cambridge
  26. Pintér JD (1995) Global optimization in action: continuous and Lipschitz optimization. Implementations and applications. Kluwer, Dordrecht
  27. Powell MJD (2008) Developments of NEWUOA for minimization without derivatives. IMA J Numer Anal 28:649–664
  28. Rios LM (2009) Algorithms for derivative-free optimization. PhD thesis, University of Illinois at Urbana-Champaign
  29. Rios LM, Sahinidis NV (2009) Derivative-free optimization: a review of algorithms and comparison of software implementations. In: Advances in optimization II (AIChE 2009)
  30. Sacks J, Welch WJ, Mitchell TJ, Wynn HP (1989) Design and analysis of computer experiments (with discussion). Stat Sci 4:409–435
  31. Storn R, Price K (1997) Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces. J Glob Optim 11:341–359
  32. Torczon V (1991) On the convergence of multidirectional search algorithms. SIAM J Optim 1:123–145
  33. Törn A, Žilinskas A (1989) Global optimization. Springer, New York
  34. Van Laarhoven PJM, Aarts EHL (1987) Simulated annealing: theory and applications. Kluwer, Dordrecht

Copyright information

© Springer-Verlag 2010

Authors and Affiliations

  • Arnold Neumaier¹
  • Hannes Fendl¹
  • Harald Schilly¹
  • Thomas Leitner¹

  1. Fakultät für Mathematik, Universität Wien, Wien, Austria
