Gradient-based adaptive particle swarm optimizer with improved extremal optimization
Most real-world applications can be formulated as optimization problems, which commonly suffer from becoming trapped in local optima. In this paper, we exploit the global search capability of particle swarm optimization (PSO) and the local search ability of extremal optimization (EO), and propose a gradient-based adaptive PSO with improved EO (called GAPSO-IEO) that mitigates entrapment in local optima in high-dimensional search spaces while reducing the time complexity of the algorithm. In the proposed algorithm, the improved EO (IEO) is adaptively incorporated into PSO, according to the evolutionary states of the swarm estimated from the gradients of the particles' fitness functions, to keep particles from being trapped in local optima. We also improve the mutation strategy of EO by performing polynomial mutation (PLM) on each particle as a whole, instead of on each component of the particle, so that the algorithm is insensitive to the dimensionality of the swarm. The proposed algorithm is tested on several unimodal and multimodal benchmark functions and on the Berkeley Segmentation Dataset and Benchmark (BSDS300). The experimental results show that the proposed approach outperforms state-of-the-art algorithms in efficiency and achieves better performance on high-dimensional tasks.
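The hybrid idea described above can be sketched in code. The following is a minimal, hypothetical illustration, not the authors' GAPSO-IEO implementation: a standard PSO loop in which a whole-vector polynomial mutation (the per-particle PLM strategy mentioned in the abstract) is applied to the worst particle each iteration as a simple stand-in for the IEO local-search step. The benchmark function, swarm parameters, and the trigger for the mutation step are all assumptions made for the sketch; the paper's gradient-based state estimation is omitted.

```python
# Hypothetical sketch of a PSO loop with a per-particle polynomial-mutation
# local-search step; NOT the authors' GAPSO-IEO algorithm.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Unimodal benchmark: f(x) = sum(x_i^2), global minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def polynomial_mutation(x, low, high, eta=20.0):
    """Polynomial mutation (PLM) applied to the whole particle vector at once,
    mirroring the per-particle (rather than per-component) strategy."""
    u = rng.random(x.shape)
    delta = np.where(
        u < 0.5,
        (2.0 * u) ** (1.0 / (eta + 1.0)) - 1.0,
        1.0 - (2.0 * (1.0 - u)) ** (1.0 / (eta + 1.0)),
    )
    return np.clip(x + delta * (high - low), low, high)

def pso_plm_sketch(f, dim=10, n_particles=30, iters=200, low=-5.0, high=5.0):
    # Standard global-best PSO initialization.
    x = rng.uniform(low, high, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pbest_f = np.array([f(xi) for xi in x])
    g = pbest[np.argmin(pbest_f)].copy()
    g_f = float(pbest_f.min())
    w, c1, c2 = 0.7, 1.5, 1.5  # assumed inertia/acceleration coefficients

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, low, high)
        fx = np.array([f(xi) for xi in x])
        improved = fx < pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = fx[improved]
        if pbest_f.min() < g_f:
            g_f = float(pbest_f.min())
            g = pbest[np.argmin(pbest_f)].copy()

        # Local-search step in the spirit of IEO: mutate the worst particle
        # with whole-vector PLM and keep the candidate only if it improves.
        worst = int(np.argmax(pbest_f))
        cand = polynomial_mutation(x[worst], low, high)
        cf = f(cand)
        if cf < pbest_f[worst]:
            x[worst] = cand
            pbest[worst] = cand
            pbest_f[worst] = cf
            if cf < g_f:
                g_f, g = cf, cand.copy()
    return g, g_f
```

Running `pso_plm_sketch(sphere)` drives the swarm toward the origin on the 10-dimensional sphere function; because PLM perturbs the whole vector, its cost per mutation does not grow with the number of mutated components, which is the dimension-insensitivity the abstract claims for the per-particle strategy.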
Keywords: Gradient · Mutation strategy · Adaptive particle swarm optimizer · Improved extremal optimization
The authors would like to thank the anonymous referees for their useful comments. This work is supported by the National Natural Science Foundation of China (No. 61461021) and the Shanghai Science and Technology Committee (No. 15590501300).