Particle Swarm Optimization with Resets – Improving the Balance between Exploration and Exploitation

  • Yenny Noa Vargas
  • Stephen Chen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6438)

Abstract

Exploration and exploitation are two important factors to consider in the design of optimization techniques. Two new techniques are introduced for particle swarm optimization: “resets” increase exploitation and “delayed updates” increase exploration. In general, the added exploitation from resets helps more with the lbest topology, which is more explorative, while the added exploration from delayed updates helps more with the gbest topology, which is more exploitive.
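The page gives no implementation details beyond the abstract, so the following is only a minimal sketch, in Python, of a particle swarm optimizer with the two named options. The interpretations used here, that a "reset" returns a stagnant particle to its personal best with zero velocity and that "delayed updates" commit personal-best improvements only at the end of each iteration, are illustrative assumptions rather than the authors' definitions, as are the function name pso and all parameter values.

    import random

    def pso(f, dim, n_particles=50, iters=1000, lbest=True,
            reset_after=25, delayed_updates=False,
            lo=-5.0, hi=5.0, w=0.72984, c1=1.496172, c2=1.496172):
        """Minimal PSO sketch with optional 'resets' and 'delayed updates'.

        Assumed interpretations (not taken from the paper):
          - reset: after `reset_after` iterations without improving its personal
            best, a particle returns to that personal best with zero velocity
            (added exploitation).
          - delayed updates: personal-best improvements are committed only at
            the end of an iteration instead of immediately (added exploration).
        """
        pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_val = [f(p) for p in pos]
        stagnation = [0] * n_particles

        def nbest(i):
            # lbest: ring neighborhood of size 3; gbest: the whole swarm.
            idx = [(i - 1) % n_particles, i, (i + 1) % n_particles] if lbest \
                else range(n_particles)
            return pbest[min(idx, key=lambda j: pbest_val[j])]

        for _ in range(iters):
            pending = []  # improvements held back when updates are delayed
            for i in range(n_particles):
                nb = nbest(i)
                for d in range(dim):
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * random.random() * (pbest[i][d] - pos[i][d])
                                 + c2 * random.random() * (nb[d] - pos[i][d]))
                    pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
                val = f(pos[i])
                if val < pbest_val[i]:
                    stagnation[i] = 0
                    if delayed_updates:
                        pending.append((i, pos[i][:], val))
                    else:
                        pbest[i], pbest_val[i] = pos[i][:], val
                else:
                    stagnation[i] += 1
                    if reset_after and stagnation[i] >= reset_after:
                        # Reset: pull the stagnant particle back to its personal best.
                        pos[i], vel[i] = pbest[i][:], [0.0] * dim
                        stagnation[i] = 0
            for i, p, val in pending:  # commit delayed updates once per iteration
                pbest[i], pbest_val[i] = p, val

        best = min(range(n_particles), key=lambda i: pbest_val[i])
        return pbest[best], pbest_val[best]

    if __name__ == "__main__":
        sphere = lambda x: sum(v * v for v in x)
        print(pso(sphere, dim=10, iters=200))

The toggles mirror the abstract's pairing: resets (exploitation) would typically be paired with lbest=True, and delayed updates (exploration) with lbest=False (gbest).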

Keywords

Particle Swarm Optimization · Search Intensification · Search Diversification

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Yenny Noa Vargas, Department of Artificial Intelligence and Computational Systems, University of Havana, Havana, Cuba
  • Stephen Chen, School of Information Technology, York University, Toronto, Canada
