An adaptive parallel particle swarm optimization for numerical optimization problems
Parallelization is an efficient way to improve the performance of particle swarm optimization (PSO). One approach is multiple-population parallelization, in which three parameters must be set manually in advance: the migration interval, migration rate, and migration direction, which decide when, how many, and from which subpopulation to which subpopulation particles are migrated, respectively. Setting these parameters manually in advance has two shortcomings. First, good particles cannot be migrated in time, because particles can only be migrated at the fixed interval and in the fixed direction. Second, many unnecessary migrations take place, because a fixed fraction of the particles in each subpopulation is migrated at every interval in the fixed direction. Both shortcomings may prevent parallel PSO from finding high-quality solutions quickly, and the unnecessary migrations incur a large communication cost. Inspired by the phenomenon of osmosis, this paper presents a multiple-population parallel version of PSO based on osmosis, which adaptively decides when, how many, and from which subpopulation to which subpopulation particles are migrated. Numerical experiments demonstrate its usefulness, especially on high-dimensional functions.
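To make the three manually set migration parameters concrete, the following is a minimal sketch of conventional multiple-population PSO with fixed migration interval, rate, and direction (a ring topology). All function names, coefficient values, and the replacement policy here are illustrative assumptions for exposition, not the paper's osmosis-based algorithm, which replaces these fixed settings with adaptive decisions.

```python
import random

def sphere(x):
    # Benchmark objective: sum of squares, with minimum 0 at the origin.
    return sum(xi * xi for xi in x)

def multipop_pso(f, dim=10, n_subpops=4, subpop_size=20, iters=200,
                 migration_interval=20, migration_rate=0.1, seed=0):
    rng = random.Random(seed)
    # Initialize each subpopulation: positions, velocities, personal bests.
    pops = []
    for _ in range(n_subpops):
        pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(subpop_size)]
        vel = [[0.0] * dim for _ in range(subpop_size)]
        pops.append({"pos": pos, "vel": vel, "pbest": [p[:] for p in pos]})
    w, c1, c2 = 0.729, 1.494, 1.494  # commonly used constriction-style coefficients

    for t in range(iters):
        for pop in pops:
            g = min(pop["pbest"], key=f)  # best solution found in this subpopulation
            for i in range(subpop_size):
                for d in range(dim):
                    r1, r2 = rng.random(), rng.random()
                    pop["vel"][i][d] = (w * pop["vel"][i][d]
                                        + c1 * r1 * (pop["pbest"][i][d] - pop["pos"][i][d])
                                        + c2 * r2 * (g[d] - pop["pos"][i][d]))
                    pop["pos"][i][d] += pop["vel"][i][d]
                if f(pop["pos"][i]) < f(pop["pbest"][i]):
                    pop["pbest"][i] = pop["pos"][i][:]

        # Fixed-parameter migration: every `migration_interval` iterations,
        # copy the best `migration_rate` fraction of particles from each
        # subpopulation to the next one in a fixed ring direction, replacing
        # the destination's worst particles. This happens regardless of
        # whether the migration is actually useful at that moment.
        if (t + 1) % migration_interval == 0:
            n_mig = max(1, int(migration_rate * subpop_size))
            for k, src in enumerate(pops):
                dst = pops[(k + 1) % n_subpops]
                best = sorted(range(subpop_size), key=lambda i: f(src["pbest"][i]))[:n_mig]
                worst = sorted(range(subpop_size), key=lambda i: f(dst["pbest"][i]),
                               reverse=True)[:n_mig]
                for s, d_i in zip(best, worst):
                    dst["pbest"][d_i] = src["pbest"][s][:]
                    dst["pos"][d_i] = src["pos"][s][:]

    return min((min(pop["pbest"], key=f) for pop in pops), key=f)

best = multipop_pso(sphere)
```

The sketch makes the abstract's two shortcomings visible: a particle that becomes very good just after a migration step must wait a full interval before it can spread, and every migration step moves `n_mig` particles per subpopulation whether or not the destination needs them.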
Keywords: PSO · Parallel · Multiple population · Osmosis · Migration · Adaptive
This paper is supported by the National Natural Science Foundation of China (Grant numbers 61562071, 61773410, 61165003, 61472143), the Scientific Research Special Plan of Guangzhou Science and Technology Programme (Grant no. 201607010045), and the Natural Science Foundation of Jiangxi Province (Grant no. 20151BAB207020).
Compliance with ethical standards
Conflict of interest
The authors declare that there is no conflict of interest regarding the publication of this paper.