
Log-log Convergence for Noisy Optimization

  • S. Astete-Morales
  • J. Liu
  • Olivier Teytaud
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8752)

Abstract

We consider noisy optimization problems, without assuming that the variance of the noise vanishes in the neighborhood of the optimum. We show mathematically that simple rules with an exponential number of resamplings lead to a \(\log \)-\(\log \) convergence rate: the \(\log \) of the distance to the optimum is linear in the \(\log \) of the number of resamplings. The same holds when the number of resamplings is polynomial in the inverse step-size. We show empirically that this convergence rate is also obtained with a polynomial number of resamplings. In this polynomial resampling setting, using classical evolution strategies with an ad hoc choice of the number of resamplings, we seemingly obtain the same rate as specific Estimation of Distribution Algorithms designed for the noisy setting.
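As a concrete illustration of these resampling rules, here is a minimal sketch (in Python) of a (1+1)-evolution strategy on the noisy sphere. It is an assumed setup, not the parameterization of the proofs: the success-rule constants, the noise level, and the exponent in the polynomial schedule are illustrative choices.

    import math
    import random

    def noisy_eval(x, noise_std=1.0):
        # E[f(x)] = ||x||^2, plus additive Gaussian noise whose variance
        # does not vanish near the optimum (constant-noise setting).
        return sum(xi * xi for xi in x) + random.gauss(0.0, noise_std)

    def averaged(x, r):
        # Average of r independent noisy evaluations of x.
        return sum(noisy_eval(x) for _ in range(r)) / r

    def resampling_es(dim=2, iterations=40, zeta=1.1, rule="exponential"):
        # (1+1)-ES with a success-based step-size rule; each parent/offspring
        # comparison uses fitness averaged over r_n resamplings.
        x = [random.uniform(-1.0, 1.0) for _ in range(dim)]
        step, evals, history = 1.0, 0, []
        for n in range(1, iterations + 1):
            if rule == "exponential":
                r = math.ceil(zeta ** n)   # exponential rule: r_n = ceil(zeta^n)
            else:
                r = math.ceil(step ** -2)  # polynomial in the inverse step-size
                                           # (the exponent 2 is an assumption)
            y = [xi + step * random.gauss(0.0, 1.0) for xi in x]
            if averaged(y, r) <= averaged(x, r):
                x, step = y, 1.5 * step    # success: enlarge the step-size
            else:
                step *= 1.5 ** (-0.25)     # failure: shrink the step-size
            evals += 2 * r
            history.append((evals, math.sqrt(sum(xi * xi for xi in x))))
        return history  # pairs (cumulative evaluations, distance to optimum)

With the exponential rule, plotting \(\log \) of the distance against \(\log \) of the cumulative number of evaluations for the returned history should exhibit the linear shape described above.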

We also experiment with non-adaptive polynomial resampling rules. Compared to the state of the art, our results provide (i) proofs of \(\log \)-\(\log \) convergence for evolution strategies (which were not covered by existing results) in the case of objective functions with quadratic expectations and constant noise, (ii) \(\log \)-\(\log \) rates also for objective functions with expectation \({\mathbb E}[f(x)] = ||x-x^*||^p\), where \(x^*\) represents the optimum, and (iii) experiments with parameterizations different from those considered in the proofs. These results suggest some simple reevaluation schemes. This paper extends [1].
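Checking such a rate empirically amounts to fitting, by least squares, the slope of \(\log \) of the distance to the optimum against \(\log \) of the number of evaluations; a good linear fit is precisely the \(\log \)-\(\log \) convergence discussed above. The sketch below is self-contained and uses a synthetic history so that it runs on its own; in practice the history would come from a run such as the one sketched after the first paragraph.

    import math

    def loglog_slope(history):
        # Ordinary least-squares slope of log(distance) versus
        # log(cumulative evaluations).
        pts = [(math.log(e), math.log(d)) for e, d in history if d > 0]
        mean_x = sum(u for u, _ in pts) / len(pts)
        mean_y = sum(v for _, v in pts) / len(pts)
        sxy = sum((u - mean_x) * (v - mean_y) for u, v in pts)
        sxx = sum((u - mean_x) ** 2 for u, _ in pts)
        return sxy / sxx

    # Synthetic history in which distance ~ evaluations^(-1/2):
    history = [(10 * 2 ** k, (10 * 2 ** k) ** -0.5) for k in range(12)]
    print(loglog_slope(history))  # prints approximately -0.5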

References

  1. Astete-Morales, S., Liu, J., Teytaud, O.: Noisy optimization convergence rates. In: GECCO (Companion), pp. 223–224 (2013)
  2. Jones, D., Schonlau, M., Welch, W.: Efficient global optimization of expensive black-box functions. J. Global Optim. 13(4), 455–492 (1998)
  3. Auger, A., Jebalia, M., Teytaud, O.: Algorithms (X, sigma, eta): quasi-random mutations for evolution strategies. In: Talbi, E.-G., Liardet, P., Collet, P., Lutton, E., Schoenauer, M. (eds.) EA 2005. LNCS, vol. 3871, pp. 296–307. Springer, Heidelberg (2006)
  4. Jebalia, M., Auger, A., Hansen, N.: Log-linear convergence and divergence of the scale-invariant (1+1)-ES in noisy environments. Algorithmica 59(3), 425–460 (2010)
  5. Arnold, D.V., Beyer, H.G.: A general noise model and its effects on evolution strategy performance. IEEE Trans. Evol. Comput. 10, 380–391 (2006)
  6. Finck, S., Beyer, H.G., Melkozerov, A.: Noisy optimization: a theoretical strategy comparison of ES, EGS, SPSA & IF on the noisy sphere. In: GECCO, pp. 813–820 (2011)
  7. Coulom, R.: CLOP: confident local optimization for noisy black-box parameter tuning. In: van den Herik, H.J., Plaat, A. (eds.) ACG 2011. LNCS, vol. 7168, pp. 146–157. Springer, Heidelberg (2012)
  8. Coulom, R., Rolet, P., Sokolovska, N., Teytaud, O.: Handling expensive optimization with large noise. In: Foundations of Genetic Algorithms (2011)
  9. Teytaud, O., Decock, J.: Noisy optimization complexity. In: FOGA 2013 - Foundations of Genetic Algorithms XII, Adelaide, Australia (2013)
  10. Beyer, H.G.: The Theory of Evolution Strategies. Natural Computing Series. Springer, Heidelberg (2001)
  11. Yang, X., Birkfellner, W., Niederer, P.: Optimized 2D/3D medical image registration using the estimation of multivariate normal algorithm (EMNA). In: Biomedical Engineering (2005)
  12. Anderson, E.J., Ferris, M.C.: A direct search algorithm for optimization with noisy function evaluations. SIAM J. Optim. 11, 837–857 (2001)
  13. Lucidi, S., Sciandrone, M.: A derivative-free algorithm for bound constrained optimization. Comput. Optim. Appl. 21, 119–142 (2002)
  14. Kim, S., Zhang, D.: Convergence properties of direct search methods for stochastic optimization. In: Proceedings of the Winter Simulation Conference, WSC '10, pp. 1003–1011 (2010)
  15. Hansen, N., Niederberger, S., Guzzella, L., Koumoutsakos, P.: A method for handling uncertainty in evolutionary optimization with an application to feedback control of combustion. IEEE Trans. Evol. Comput. 13, 180–197 (2009)
  16. Villemonteix, J., Vazquez, E., Walter, E.: An informational approach to the global optimization of expensive-to-evaluate functions. J. Global Optim. 44, 509–534 (2008)
  17. Fabian, V.: Stochastic approximation of minima with improved asymptotic speed. Ann. Math. Stat. 38, 191–200 (1967)
  18. Rolet, P., Teytaud, O.: Bandit-based estimation of distribution algorithms for noisy optimization: rigorous runtime analysis. In: Proceedings of LION 4, pp. 97–110; presented at TRSH 2009, Birmingham (2009)
  19. Auger, A.: Convergence results for the (1,\(\lambda \))-SA-ES using the theory of \(\varphi \)-irreducible Markov chains. Theor. Comput. Sci. 334(1–3), 35–69 (2005)
  20. Fournier, H., Teytaud, O.: Lower bounds for comparison based evolution strategies using VC-dimension and sign patterns. Algorithmica 59, 387–408 (2010)

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. TAO (Inria), LRI, UMR 8623 (CNRS – Université Paris-Sud), Orsay, France