Abstract
Particle swarm optimization (PSO) has been one of the most widely used stochastic optimization algorithms among researchers and scientists over the last two decades, and the pattern search (PS) method is one of the most important local optimization algorithms. In this paper, we test three methods of hybridizing PSO and PS to improve the global minima found and the robustness of the search. All three methods run PSO first, followed by PS. The first method runs PSO with a large number of particles for a limited number of iterations. The second method runs PSO normally until its stopping tolerance is reached. The third method runs PSO normally until the average particle distance from the global best location falls within a threshold. Numerical results on non-differentiable test functions show that all three methods improve the global minima found and the robustness compared with PSO alone. The third hybrid method was also applied to a basin network optimization problem, where it outperformed both a PSO with filter method and a genetic algorithm with implicit filtering.
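The abstract does not give pseudocode, so the following is a minimal sketch of the third hybrid's structure, assuming a standard inertia-weight PSO followed by a simple compass-style pattern search started from the global best. All function names, parameter names, and values here (`pso_then_ps`, `switch_tol`, the 0.7/1.5/1.5 coefficients, and so on) are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of the third hybrid strategy (assumed, not the authors' code):
# run PSO until the mean particle distance to the global best falls below a
# threshold, then polish the global best with a simple compass pattern search.
import numpy as np

def pso_then_ps(f, lb, ub, n_particles=40, max_iter=500, switch_tol=1e-3,
                ps_step=0.5, ps_tol=1e-8, seed=0):
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size

    # --- PSO phase (standard inertia-weight formulation, assumed parameters) ---
    x = rng.uniform(lb, ub, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(max_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lb, ub)
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
        # Switching criterion of the third hybrid: average distance to the global best
        if np.mean(np.linalg.norm(x - g, axis=1)) < switch_tol:
            break

    # --- Pattern search phase: compass search started from the PSO global best ---
    best, best_val, step = g.copy(), f(g), ps_step
    while step > ps_tol:
        moved = False
        for i in range(dim):
            for d in (+step, -step):
                trial = best.copy()
                trial[i] = np.clip(trial[i] + d, lb[i], ub[i])
                tv = f(trial)
                if tv < best_val:
                    best, best_val, moved = trial, tv, True
        if not moved:
            step *= 0.5  # shrink the mesh when no poll point improves
    return best, best_val
```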
Acknowledgements
This work was partially supported by funds made available under a State University of New York Expanded Investment and Performance Award to the State University of New York at Geneseo.
Appendix
Benchmark Functions
| Name | Function | Bounds | Optimal value | Optimal location |
|---|---|---|---|---|
| Alpine 1 | \(f(\mathbf{x})=\sum_{i=1}^{n}\vert x_i\sin(x_i)+0.1x_i\vert\) | \(x_i\in[-10,10]\) | 0 | (0, ..., 0) |
| Bartels Conn | \(f(\mathbf{x})=\vert x_1^2+x_2^2+x_1x_2\vert+\vert\sin(x_1)\vert+\vert\cos(x_2)\vert\) | \(x_i\in[-500,500]\) | 1 | (0, 0) |
| Bukin N. 4 | \(f(\mathbf{x})=100x_2^2+0.01\vert x_1+10\vert\) | \(x_1\in[-15,-5]\), \(x_2\in[-3,3]\) | 0 | (-10, 0) |
| Bukin N. 6 | \(f(\mathbf{x})=100\sqrt{\vert x_2-0.01x_1^2\vert}+0.01\vert x_1+10\vert\) | \(x_1\in[-15,-5]\), \(x_2\in[-3,3]\) | 0 | (-10, 1) |
| Corana | \(f(\mathbf{x})=\sum_{i=1}^{n}g_i(x_i)\), where \(g_i(x_i)=0.15\,d_i(z_i-0.05\operatorname{sgn}(z_i))^2\) if \(\vert x_i-z_i\vert<A\) and \(g_i(x_i)=d_i x_i^2\) otherwise, with \(z_i=0.2\lfloor\vert x_i/0.2\vert+0.49999\rfloor\operatorname{sgn}(x_i)\), \(A=0.05\), \(d_i\in\{1,1000,10,100\}\) | \(x_i\in[-500,500]\) | 0 | (0, 0, 0, 0) |
| Cosine mixture | \(f(\mathbf{x})=-0.1\sum_{i=1}^{n}\cos(5\pi x_i)-\sum_{i=1}^{n}x_i^2\) | \(x_i\in[-1,1]\) | 0.2 if \(n=2\), 0.4 if \(n=4\) | (0, ..., 0) |
| Cross-in-tray | \(f(\mathbf{x})=-0.0001\left(\left\vert\sin(x_1)\sin(x_2)\exp\left(\left\vert 100-\frac{\sqrt{x_1^2+x_2^2}}{\pi}\right\vert\right)\right\vert+1\right)^{0.1}\) | \(x_1,x_2\in[-10,10]\) | -2.06261218 | \((\pm 1.349406685353340,\ \pm 1.349406608602084)\) |
| Holder table | \(f(\mathbf{x})=-\left\vert\sin(x_1)\cos(x_2)\exp\left(\left\vert 1-\frac{\sqrt{x_1^2+x_2^2}}{\pi}\right\vert\right)\right\vert\) | \(x_1,x_2\in[-10,10]\) | -19.2085 | \((\pm 8.05502,\ \pm 9.66459)\) |
| Powell sum | \(f(\mathbf{x})=\sum_{i=1}^{n}\vert x_i\vert^{i+1}\) | \(x_i\in[-1,1]\) | 0 | (0, ..., 0) |
| Price 1 | \(f(\mathbf{x})=(\vert x_1\vert-5)^2+(\vert x_2\vert-5)^2\) | \(x_i\in[-500,500]\) | 0 | \((\pm 5,\ \pm 5)\) |
| Schwefel | \(f(\mathbf{x})=418.9829\,n-\sum_{i=1}^{n}x_i\sin(\sqrt{\vert x_i\vert})\) | \(x_i\in[-500,500]\) | 0 | (420.9687, ..., 420.9687) |
| Schwefel 2.20 | \(f(\mathbf{x})=\sum_{i=1}^{n}\vert x_i\vert\) | \(x_i\in[-100,100]\) | 0 | (0, ..., 0) |
| Schwefel 2.21 | \(f(\mathbf{x})=\max_{i=1,\ldots,n}\vert x_i\vert\) | \(x_i\in[-100,100]\) | 0 | (0, ..., 0) |
| Schwefel 2.22 | \(f(\mathbf{x})=\sum_{i=1}^{n}\vert x_i\vert+\prod_{i=1}^{n}\vert x_i\vert\) | \(x_i\in[-100,100]\) | 0 | (0, ..., 0) |
| Step | \(f(\mathbf{x})=\sum_{i=1}^{n}\lfloor\vert x_i\vert\rfloor\) | \(x_i\in[-100,100]\) | 0 | (0, ..., 0) |
| Step 2 | \(f(\mathbf{x})=\sum_{i=1}^{n}(\lfloor x_i+0.5\rfloor)^2\) | \(x_i\in[-100,100]\) | 0 | (0.5, ..., 0.5) |
| Step 3 | \(f(\mathbf{x})=\sum_{i=1}^{n}\lfloor x_i^2\rfloor\) | \(x_i\in[-100,100]\) | 0 | (0, ..., 0) |
| Stepint | \(f(\mathbf{x})=25+\sum_{i=1}^{n}\lfloor x_i\rfloor\) | \(x_i\in[-5.12,5.12]\) | 0 | (0, ..., 0) |
| Xin-She Yang | \(f(\mathbf{x})=\sum_{i=1}^{n}\epsilon_i\vert x_i\vert^{i}\) | \(x_i\in[-5,5]\) | 0 | (0, ..., 0) |
| Xin-She Yang N. 2 | \(f(\mathbf{x})=\left(\sum_{i=1}^{n}\vert x_i\vert\right)\exp\left(-\sum_{i=1}^{n}\sin(x_i^2)\right)\) | \(x_i\in[-2\pi,2\pi]\) | 0 | (0, ..., 0) |
| Xin-She Yang N. 4 | \(f(\mathbf{x})=\left[\sum_{i=1}^{n}\sin^2(x_i)-\exp\left(-\sum_{i=1}^{n}x_i^2\right)\right]\exp\left(-\sum_{i=1}^{n}\sin^2\sqrt{\vert x_i\vert}\right)\) | \(x_i\in[-10,10]\) | -1 | (0, ..., 0) |
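The listed optimal values and locations are convenient for sanity-checking an optimizer. The snippet below is a small illustrative sketch, not taken from the paper, implementing two of the benchmarks above in Python; the function names are our own.

```python
# Minimal implementations (assumed, for illustration) of two of the
# non-differentiable benchmarks listed above.
import numpy as np

def alpine1(x):
    """Alpine 1: sum of |x_i*sin(x_i) + 0.1*x_i|; minimum 0 at the origin."""
    x = np.asarray(x, float)
    return np.sum(np.abs(x * np.sin(x) + 0.1 * x))

def schwefel_2_22(x):
    """Schwefel 2.22: sum of |x_i| plus product of |x_i|; minimum 0 at the origin."""
    x = np.abs(np.asarray(x, float))
    return np.sum(x) + np.prod(x)

# Spot-check at the optima listed in the table:
assert alpine1(np.zeros(4)) == 0.0
assert schwefel_2_22(np.zeros(4)) == 0.0
```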
Cite this article
Koessler, E., Almomani, A. Hybrid particle swarm optimization and pattern search algorithm. Optim Eng 22, 1539–1555 (2021). https://doi.org/10.1007/s11081-020-09534-7