
Preventing Premature Convergence and Proving the Optimality in Evolutionary Algorithms

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 8752)

Abstract

Evolutionary Algorithms (EAs) usually carry out an efficient exploration of the search space, but often get trapped in local minima and do not prove the optimality of the solution. Interval-based techniques, on the other hand, yield a numerical proof of optimality of the solution. However, they may fail to converge within a reasonable time due to their exponential complexity and their inability to quickly compute a good approximation of the global minimum. The contribution of this paper is a hybrid algorithm called Charibde in which a particular EA, Differential Evolution, cooperates with a branch-and-bound algorithm endowed with interval propagation techniques. It prevents premature convergence toward local optima and is highly competitive with both deterministic and stochastic existing approaches. We demonstrate its efficiency on a benchmark of highly multimodal problems, for which we provide previously unknown global minima and certification of optimality.
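The abstract describes a cooperation scheme in which an evolutionary algorithm supplies increasingly good upper bounds on the global minimum while an interval branch-and-bound uses those bounds to discard boxes and ultimately certify optimality. The Python sketch below illustrates that idea in a deliberately simplified, sequential form; it is not the authors' Charibde implementation. The 2-D Rosenbrock objective, the hand-written natural interval extension, the alternating schedule (one DE generation, then a few branch-and-bound steps) and all function names are assumptions made for this example, and the interval propagation/contractor machinery used by Charibde is omitted.

import heapq
import random

# --- tiny interval arithmetic: an interval is a (lo, hi) pair ----------------
def iadd(a, b): return (a[0] + b[0], a[1] + b[1])
def isub(a, b): return (a[0] - b[1], a[1] - b[0])
def iscale(a, k): return (k * a[0], k * a[1])    # assumes k >= 0
def isq(a):                                       # interval square
    s0, s1 = a[0] * a[0], a[1] * a[1]
    if a[0] <= 0.0 <= a[1]:
        return (0.0, max(s0, s1))
    return (min(s0, s1), max(s0, s1))

def f(x, y):   # real-valued objective: Rosenbrock, global minimum 0 at (1, 1)
    return (x - 1.0) ** 2 + 100.0 * (y - x * x) ** 2

def F(X, Y):   # natural interval extension of f: encloses the range of f over a box
    return iadd(isq(isub(X, (1.0, 1.0))), iscale(isq(isub(Y, isq(X))), 100.0))

def de_generation(pop, fit, bounds, w=0.7, cr=0.9):
    # One DE/rand/1/bin generation with greedy replacement (population updated in place).
    dim, n = len(bounds), len(pop)
    for i in range(n):
        a, b, c = random.sample([j for j in range(n) if j != i], 3)
        jrand = random.randrange(dim)
        trial = []
        for d in range(dim):
            v = pop[a][d] + w * (pop[b][d] - pop[c][d])
            x = v if (random.random() < cr or d == jrand) else pop[i][d]
            trial.append(min(max(x, bounds[d][0]), bounds[d][1]))
        ft = f(*trial)
        if ft < fit[i]:
            pop[i], fit[i] = trial, ft

def hybrid_optimize(bounds, n_pop=30, n_gen=300, tol=1e-6):
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_pop)]
    fit = [f(*ind) for ind in pop]
    best_ub = min(fit)                            # best value found so far by the EA
    heap = [(F(*bounds)[0], tuple(bounds))]       # boxes ordered by interval lower bound
    for _ in range(n_gen):
        de_generation(pop, fit, bounds)           # 1. the EA refines the upper bound
        best_ub = min(best_ub, min(fit))
        for _ in range(10):                       # 2. a few branch-and-bound steps
            if not heap:                          # no box left: best_ub is certified (within tol)
                return best_ub, heap
            lb, box = heapq.heappop(heap)
            if lb > best_ub - tol:                # smallest lower bound already too high:
                heap.clear()                      # every remaining box can be discarded
                continue
            d = max(range(len(box)), key=lambda k: box[k][1] - box[k][0])
            lo, hi = box[d]                       # bisect the widest dimension
            for half in ((lo, 0.5 * (lo + hi)), (0.5 * (lo + hi), hi)):
                child = list(box)
                child[d] = half
                clb = F(*child)[0]
                if clb <= best_ub - tol:          # keep only boxes that may still improve
                    heapq.heappush(heap, (clb, tuple(child)))
    return best_ub, heap

best, remaining = hybrid_optimize([(-5.0, 5.0), (-5.0, 5.0)])
print("best upper bound:", best, "| unexplored boxes:", len(remaining))

When the box queue empties, no remaining region can contain a value more than tol below the best point found by Differential Evolution, which is the kind of numerical certificate the paper refers to; in this sketch the guarantee holds only up to floating-point rounding, whereas rigorous implementations rely on outward-rounded interval arithmetic.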


Notes

  1. Corresponding solutions are available upon request.


Author information

Corresponding author

Correspondence to Charlie Vanaret.


Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Vanaret, C., Gotteland, J.-B., Durand, N., Alliot, J.-M. (2014). Preventing Premature Convergence and Proving the Optimality in Evolutionary Algorithms. In: Legrand, P., Corsini, M.-M., Hao, J.-K., Monmarché, N., Lutton, E., Schoenauer, M. (eds) Artificial Evolution. EA 2013. Lecture Notes in Computer Science, vol. 8752. Springer, Cham. https://doi.org/10.1007/978-3-319-11683-9_3

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-11683-9_3


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-11682-2

  • Online ISBN: 978-3-319-11683-9

  • eBook Packages: Computer Science (R0)
