
Arabian Journal for Science and Engineering, Volume 39, Issue 6, pp 4683–4697

Autonomous Particles Groups for Particle Swarm Optimization

  • Seyedali Mirjalili
  • Andrew Lewis
  • Ali Safa Sadiq
Research Article - Computer Engineering and Computer Science

Abstract

In this paper, a modified particle swarm optimization (PSO) algorithm called autonomous groups particle swarm optimization (AGPSO) is proposed to further alleviate two problems in solving high-dimensional problems: entrapment in local minima and slow convergence. The main idea of the AGPSO algorithm is inspired by the diversity of individuals in bird flocks and insect swarms. In natural colonies, individuals are not all alike in intelligence and ability, yet each performs its duties as a member of the colony, and each individual's ability can be useful in a particular situation. In this paper, a mathematical model of diverse particle groups, called autonomous groups, is proposed. In other words, functions with different slopes, curvatures, and intercepts are employed to tune the social and cognitive parameters of the PSO algorithm, giving particles different behaviors as in natural colonies. The results show that PSO with autonomous groups of particles outperforms conventional PSO and several recent modifications of it in terms of escaping local minima and convergence speed. The results also indicate that dividing particles into groups and allowing them different individual and social thinking can significantly improve the performance of PSO.
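To make the idea in the abstract concrete, the following is a minimal Python sketch of a PSO loop in which the swarm is split into autonomous groups, each tuning its cognitive (c1) and social (c2) coefficients with its own schedule over the iterations. The particular schedules, the inertia-weight rule, and names such as `agpso_sketch` and `n_groups` are illustrative assumptions, not the exact group update functions or parameter values defined in the paper.

```python
import numpy as np

def sphere(x):
    """Benchmark objective: sphere function (global minimum 0 at the origin)."""
    return np.sum(x ** 2)

def agpso_sketch(obj, dim=30, n_particles=30, n_groups=4, max_iter=500,
                 lb=-100.0, ub=100.0, seed=0):
    """Illustrative PSO with autonomous particle groups.

    Each group follows its own schedule for c1 (cognitive) and c2 (social);
    the schedules below are placeholders with different slopes/curvatures,
    not the functions reported in the paper.
    """
    rng = np.random.default_rng(seed)

    # Placeholder per-group coefficient schedules, t in [0, 1]:
    # c1 decays (individual thinking weakens), c2 grows (social pull strengthens),
    # but each group follows a curve with a different shape.
    schedules = [
        lambda t: (2.5 - 2.0 * t,              0.5 + 2.0 * t),              # linear
        lambda t: (2.5 - 2.0 * t ** 2,         0.5 + 2.0 * t ** 2),         # quadratic
        lambda t: (2.5 - 2.0 * np.sqrt(t),     0.5 + 2.0 * np.sqrt(t)),     # root
        lambda t: (0.5 + 2.0 * np.exp(-4 * t), 2.5 - 2.0 * np.exp(-4 * t)), # exponential
    ]
    group = np.arange(n_particles) % n_groups           # assign particles to groups

    x = rng.uniform(lb, ub, (n_particles, dim))          # positions
    v = np.zeros((n_particles, dim))                     # velocities
    pbest, pbest_val = x.copy(), np.array([obj(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for it in range(max_iter):
        t = it / max_iter
        w = 0.9 - 0.5 * t                                # linearly decreasing inertia
        for i in range(n_particles):
            c1, c2 = schedules[group[i]](t)              # group-specific coefficients
            r1, r2 = rng.random(dim), rng.random(dim)
            v[i] = (w * v[i]
                    + c1 * r1 * (pbest[i] - x[i])        # cognitive component
                    + c2 * r2 * (gbest - x[i]))          # social component
            x[i] = np.clip(x[i] + v[i], lb, ub)
            val = obj(x[i])
            if val < pbest_val[i]:                       # update personal best
                pbest[i], pbest_val[i] = x[i].copy(), val
        gbest = pbest[np.argmin(pbest_val)].copy()       # update global best
    return gbest, obj(gbest)

if __name__ == "__main__":
    best, best_val = agpso_sketch(sphere)
    print("best value found:", best_val)
```

The velocity and position updates are the standard PSO equations; the only change is that c1 and c2 are looked up per particle according to its group, which is what gives the groups their different search behaviors.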

Keywords

PSO · Social behavior · Social coefficient · Cognitive coefficient · Function optimization · Autonomous particles groups



Copyright information

© King Fahd University of Petroleum and Minerals 2014

Authors and Affiliations

  • Seyedali Mirjalili (1)
  • Andrew Lewis (1)
  • Ali Safa Sadiq (2)
  1. School of Information and Communication Technology, Griffith University, Brisbane, QLD, Australia
  2. Faculty of Computing, Universiti Teknologi Malaysia, UTM Skudai, Malaysia
