Frontiers of Computer Science, Volume 12, Issue 1, pp 122–134

Distributed learning particle swarm optimizer for global optimization of multimodal problems

Research Article

Abstract

Particle swarm optimizer (PSO) is an effective tool for solving many optimization problems. However, it can easily become trapped in local optima when solving complex, multimodal, nonseparable problems. This paper presents a novel algorithm, the distributed learning particle swarm optimizer (DLPSO), for solving multimodal nonseparable problems. The strategy of DLPSO is to extract good vector information from local vectors distributed around the search space and then to form a new vector that can jump out of local optima and be optimized further. Experimental studies on a set of test functions show that DLPSO outperforms several peer algorithms on optimization problems with few interactions between variables.
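The abstract describes DLPSO only at a high level, and the distributed-learning step that recombines promising information from local vectors is not specified here, so it is not reproduced. For orientation, the sketch below shows the canonical global-best PSO update that such variants build on. The function name, parameter values (inertia weight w = 0.729, acceleration coefficients c1 = c2 = 1.49445), bounds, and the sphere test function are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=1000, lb=-100.0, ub=100.0,
        w=0.729, c1=1.49445, c2=1.49445, seed=0):
    """Minimal global-best PSO sketch (standard inertia-weight update)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, (n_particles, dim))   # particle positions
    v = np.zeros((n_particles, dim))              # particle velocities
    pbest = x.copy()                              # personal best positions
    pbest_val = np.apply_along_axis(f, 1, x)      # personal best fitness
    g = pbest[np.argmin(pbest_val)].copy()        # global best position
    g_val = pbest_val.min()

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # velocity update: inertia + cognitive + social components
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lb, ub)                # move and keep in bounds
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val               # update personal bests
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        if pbest_val.min() < g_val:               # update global best
            g_val = pbest_val.min()
            g = pbest[np.argmin(pbest_val)].copy()
    return g, g_val

# Usage example on the sphere function (hypothetical test setup)
best_x, best_f = pso(lambda z: float(np.sum(z**2)), dim=10)
```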

Keywords

particle swarm optimizer (PSO), orthogonal experimental design (OED), swarm intelligence



Acknowledgements

The authors would like to acknowledge the financial support of the National Natural Science Foundation of China (Grant Nos. 51575544 and 51275353), the Macao Science and Technology Development Fund (108/2012/A3 and 110/2013/A3), and the Research Committee of the University of Macau (MYRG2015-00194-FST, MYRG203(Y1-L4)-FST11-LYM).

Supplementary material

11704_2016_5373_MOESM1_ESM.ppt (supplementary material, approximately 230 KB)


Copyright information

© Higher Education Press and Springer-Verlag GmbH Germany 2018

Authors and Affiliations

  1. Department of Electromechanical Engineering, Faculty of Science and Technology, University of Macau, Macao, China
  2. Industrial and Systems Engineering, Faculty of Engineering, The Hong Kong Polytechnic University, Hong Kong, China
  3. Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen, China
