Evolutionary Elementary Cooperative Strategy for Global Optimization

  • Crina Grosan
  • Ajith Abraham
  • Monica Chis
  • Tae-Gyu Chang
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4253)


Nonlinear function optimization remains a challenging problem of great practical importance. This paper proposes a novel optimization technique called the Evolutionary Elementary Cooperative Strategy (EECS), which integrates ideas from interval division into an evolutionary scheme. We compare the performance of the proposed algorithm with that of three well-established global optimization techniques, namely Interval Branch and Bound with Local Sampling (IVL), Advanced Scatter Search (ASS), and the Simplex Coding Genetic Algorithm (SCGA). We also present the results obtained by EECS for higher-dimensional functions. Empirical results on the functions considered show that the proposed method is promising.
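The abstract describes EECS only at a high level: interval division embedded in an evolutionary scheme. The paper's actual operators are not reproduced here; the following is a generic, hypothetical sketch of that combination, assuming a one-dimensional objective, uniform random sampling inside each subinterval, and truncation selection of the most promising intervals. The function name and all parameters are illustrative, not the authors' method.

```python
import random

def interval_evolutionary_minimize(f, lo, hi, generations=30,
                                   splits=4, samples=5, keep=2):
    """Hypothetical sketch (not the paper's EECS): shrink a population of
    candidate intervals toward a minimizer of f on [lo, hi] by repeated
    subdivision plus survival-of-the-fittest selection."""
    intervals = [(lo, hi)]
    best_x, best_val = lo, f(lo)
    for _ in range(generations):
        # Subdivide every surviving interval into equal-width children.
        children = []
        for a, b in intervals:
            w = (b - a) / splits
            children.extend((a + i * w, a + (i + 1) * w) for i in range(splits))
        # Score each child by the best of a few random samples inside it.
        scored = []
        for a, b in children:
            xs = [random.uniform(a, b) for _ in range(samples)]
            val, x = min((f(x), x) for x in xs)
            if val < best_val:
                best_val, best_x = val, x
            scored.append((val, (a, b)))
        # Selection: only the `keep` most promising intervals survive.
        scored.sort(key=lambda t: t[0])
        intervals = [iv for _, iv in scored[:keep]]
    return best_x, best_val
```

For example, minimizing (x - 2)^2 on [-10, 10] with this sketch converges close to x = 2, since each generation the surviving intervals shrink by a factor of `splits` around the best sampled region.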


Keywords: Global Optimization, Scatter Search, Dimension Number, Global Optimization Technique, Space Division




References

  1. Clausen, J., Zilinskas, A.: Subdivision, sampling and initialization strategies for simplical branch and bound in global optimization. Computers and Mathematics with Applications 44, 943–955 (2002)
  2. Csallner, A.E.: Lipschitz continuity and the termination of interval methods for global optimization. Computers and Mathematics with Applications 42, 1035–1042 (2001)
  3. Floudas, C.A., Pardalos, P.M.: A Collection of Test Problems for Constrained Global Optimization Algorithms. Springer, Heidelberg (1990)
  4. Grosan, C., Abraham, A.: A simple strategy for nonlinear optimization. In: Proceedings of the Third International Conference on Neural, Parallel and Scientific Computation, Atlanta, USA (in press, 2006)
  5. Hedar, A.R., Fukushima, M.: Simplex coding genetic algorithm for the global optimization of nonlinear functions. In: Tanino, T., Tanaka, T., Inuiguchi, M. (eds.) Multi-Objective Programming and Goal Programming, pp. 135–140. Springer, Heidelberg (2003)
  6. Laguna, M., Marti, R.: Scatter Search: Methodology and Implementations. Kluwer Academic Publishers, Dordrecht (2003)
  7. Laguna, M., Marti, R.: Experimental testing of advanced scatter search designs for global optimization of multimodal functions. Journal of Global Optimization 33, 235–255 (2005)
  8. Pardalos, P.M., Romeijn, H.E.: Handbook of Global Optimization. Kluwer Academic Publishers, Boston (2002)
  9. Sun, M., Johnson, A.W.: Interval branch and bound with local sampling for constrained global optimization. Journal of Global Optimization 33, 61–82 (2005)
  10. Tsoulos, I.G., Lagaris, I.E.: MinFinder: Locating all the local minima of a function. Computer Physics Communications 174, 166–179 (2006)
  11. Van Voorhis, T.: A global optimization algorithm using Lagrangian underestimates and the interval Newton method. Journal of Global Optimization 24, 349–370 (2002)
  12. (accessed on May 20, 2006)
  13.

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Crina Grosan (1)
  • Ajith Abraham (2)
  • Monica Chis (3)
  • Tae-Gyu Chang (2)
  1. Department of Computer Science, Babeş-Bolyai University, Cluj-Napoca, Romania
  2. School of Computer Science and Engineering, Chung-Ang University, Seoul, Korea
  3. Avram Iancu University, Cluj-Napoca, Romania
