Journal of Heuristics, Volume 7, Issue 3, pp 215–233

UEGO, an Abstract Clustering Technique for Multimodal Global Optimization

  • Márk Jelasity
  • Pilar Martínez Ortigosa
  • Inmaculada García


This paper presents UEGO, a new general technique for accelerating and/or parallelizing existing search methods. The skeleton of the algorithm is a parallel hill climber in which the separate hill climbers work in restricted search regions (or clusters) of the search space. The volume of the clusters decreases as the search proceeds, which results in a cooling effect similar to simulated annealing. UEGO can also be parallelized effectively, since communication between the clusters is minimal; the purpose of this communication is to ensure that each hill is explored by only one hill climber. UEGO makes periodic attempts to find new hills to climb. Empirical results are presented, including an analysis of the effects of the user-given parameters and a comparison with a hill climber and a genetic algorithm (GA).
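The mechanism described in the abstract can be illustrated with a minimal sketch: a set of "species" (hill climbers), each confined to a ball of the search space whose radius shrinks at every level (the cooling effect), with periodic sampling of fresh centers to find new hills and a merge step so that each hill is explored by only one climber. All function names, schedules, and parameter values below are hypothetical illustrations, not the paper's actual algorithm or settings.

```python
import math
import random

# Example multimodal objective (to be maximized): several peaks on [0, 10].
def f(x):
    return math.sin(3 * x) + 0.5 * math.sin(7 * x)

LOWER, UPPER = 0.0, 10.0

def hill_climb(center, radius, evals=50):
    """Stochastic hill climbing restricted to the ball [center-radius, center+radius]."""
    best = center
    for _ in range(evals):
        cand = best + random.uniform(-radius, radius)
        cand = min(max(cand, LOWER), UPPER)
        # Only accept moves that stay inside this species' region.
        if abs(cand - center) <= radius and f(cand) > f(best):
            best = cand
    return best

def uego_sketch(levels=5, max_species=10, seed=0):
    random.seed(seed)
    radius = (UPPER - LOWER) / 2           # initial radius covers the whole space
    species = [(LOWER + UPPER) / 2]        # start with a single species
    for level in range(levels):
        # 1. Periodic attempt to find new hills: sample fresh centers at random.
        while len(species) < max_species:
            species.append(random.uniform(LOWER, UPPER))
        # 2. Each species climbs within its own (shrinking) region.
        species = [hill_climb(c, radius) for c in species]
        # 3. Merge species closer than the radius, keeping the better center,
        #    so that one hill is explored by only one climber.
        species.sort()
        merged = []
        for c in species:
            if merged and abs(c - merged[-1]) < radius:
                if f(c) > f(merged[-1]):
                    merged[-1] = c
            else:
                merged.append(c)
        species = merged
        radius *= 0.5                      # "cooling": halve the radius each level
    return max(species, key=f)

best = uego_sketch()
```

Because the clusters interact only in the merge step, the per-species climbs are independent and could run in parallel with minimal communication, which is the property the abstract emphasizes.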


Keywords: Artificial Intelligence, Search Space, Simulated Annealing, Global Optimization, Empirical Result





Copyright information

© Kluwer Academic Publishers 2001

Authors and Affiliations

  • Márk Jelasity (1)
  • Pilar Martínez Ortigosa (2)
  • Inmaculada García (3)
  1. Research Group on Artificial Intelligence, MTA-JATE, Szeged, Hungary
  2. Department of Computer Architecture and Electronics, University of Almería, Almería, Spain
  3. Department of Computer Architecture and Electronics, University of Almería, Almería, Spain
