Abstract
In this study, we propose a novel hybrid global–local optimization algorithm, NPSOG, which combines particle swarm optimization (PSO) [16] with a gradient method [21, 23] to solve a class of global optimization problems for continuously differentiable functions. At each iteration of the PSO algorithm, when a specific condition governed by the loudness parameter is met, we perform an exploitation step using a gradient method. The loudness parameter was first introduced by Yang in [28]. Our experimental results on standard test functions indicate that NPSOG can improve the performance of PSO considerably. In addition, comparison with a classical hybridization scheme demonstrates its viability as an optimization method.
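The hybrid scheme described in the abstract can be sketched as follows: a standard PSO loop in which, whenever a random draw falls below a loudness parameter, the current global best is refined by a few gradient steps before the loudness decays. This is a minimal illustration under assumptions, not the authors' exact NPSOG: the function names (`npsog_sketch`, `sphere`), the fixed gradient step length, the decay factor `alpha`, and the PSO coefficients are all placeholders chosen for the demo.

```python
import numpy as np

def sphere(x):
    """Simple continuously differentiable test function f(x) = ||x||^2."""
    return float(np.dot(x, x))

def sphere_grad(x):
    return 2.0 * x

def npsog_sketch(f, grad, dim=5, n_particles=20, iters=200,
                 w=0.7, c1=1.5, c2=1.5, loudness=1.0, alpha=0.97, seed=0):
    """Hedged sketch: PSO with a loudness-gated gradient exploitation step."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                             # particle velocities
    pbest = x.copy()                                 # personal bests
    pbest_f = np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()             # global best
    g_f = pbest_f.min()

    for _ in range(iters):
        # Standard PSO velocity/position update (inertia + cognitive + social).
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        if pbest_f.min() < g_f:
            g = pbest[np.argmin(pbest_f)].copy()
            g_f = pbest_f.min()

        # Exploitation: when a random draw falls below the loudness,
        # refine the global best with a few gradient descent steps.
        if rng.random() < loudness:
            y = g.copy()
            for _ in range(5):
                y = y - 0.1 * grad(y)    # fixed step; a line search fits here
            if f(y) < g_f:
                g, g_f = y, f(y)
            loudness *= alpha            # decay, as in Yang's bat algorithm [28]
    return g, g_f

best, best_f = npsog_sketch(sphere, sphere_grad)
print(best_f)
```

In the chapter, the local phase is a gradient method in the spirit of [21, 23], where a Barzilai–Borwein-type step or a line search would replace the fixed step used above.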
Notes
- 1.
- 2.
The loudness parameter was first introduced by Yang in his well-known article on the bat algorithm [28].
- 3.
The standard deviation indicates the stability of an algorithm; a more stable algorithm produces a smaller value of this measure.
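The stability measure from note 3 can be illustrated concretely: run a stochastic optimizer several times with different seeds and report the mean and standard deviation of the best values found. This is a minimal sketch in which plain random search stands in for any stochastic optimizer; the function names and parameter values are illustrative only.

```python
import numpy as np

def random_search(f, dim, evals, rng):
    """Stand-in stochastic optimizer: best of uniform random samples."""
    pts = rng.uniform(-5.0, 5.0, (evals, dim))
    return min(f(p) for p in pts)

sphere = lambda x: float(np.dot(x, x))

# 30 independent runs with different seeds, as is common in benchmarking.
results = [random_search(sphere, 2, 500, np.random.default_rng(s))
           for s in range(30)]
mean, std = np.mean(results), np.std(results)
print(f"mean={mean:.4f}  std={std:.4f}")  # smaller std -> more stable across runs
```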
References
Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10(1), 147–161 (2008)
Armijo, L.: Minimization of functions having Lipschitz first partial derivatives. Pac. J. Math. 16(1), 1–3 (1966)
Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)
Bansal, J.C., Singh, P.K., Saraswat, M., Verma, A., Jadon, S.S., Abraham, A.: Inertia weight strategies in particle swarm optimization. In: Third World Congress on Nature and Biologically Inspired Computing, Salamanca, pp. 633–640 (2011)
Birgin, E.G., Martínez, J.M., Raydan, M.: Nonmonotone spectral projected gradient methods on convex sets. SIAM J. Optim. 10(4), 1196–1211 (2000)
Birgin, E.G., Raydan, M.: SPG: software for convex-constrained optimization. ACM Trans. Math. Softw. 27(3), 340–349 (2001)
Chakri, A., Khelif, R., Benouaret, M., Yang, X.S.: New directional bat algorithm for continuous optimization problems. Expert Syst. Appl. 69, 159–175 (2017)
Clerc, M., Kennedy, J.: The particle swarm – explosion, stability, and convergence in a multidimensional complex space. IEEE Trans. Evol. Comput. 6(1), 58–73 (2002)
Colorni, A., Dorigo, M., Maniezzo, V.: An investigation of some properties of an “ant algorithm”. In: Proc. Parallel Problem Solving from Nature Conference, pp. 509–520. Elsevier Publishing (1992)
Dai, Y.H., Hager, W.W., Schittkowski, K., Zhang, H.: The cyclic Barzilai-Borwein method for unconstrained optimization. IMA J. Numer. Anal. 26(3), 1–24 (2006)
Fletcher, R.: Low storage methods for unconstrained optimization. In: Lect. Appl. Math., vol. 26, pp. 165–179. American Mathematical Society, Providence, RI (1990)
Glover, F.: Future paths for integer programming and links to artificial intelligence. Comput. Oper. Res. 13(5), 533–549 (1986)
Goldstein, A.A.: On steepest descent. SIAM J. Control. 3, 147–151 (1965)
Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM J. Numer. Anal. 23, 707–716 (1986)
Holland, J.H.: Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. University of Michigan Press, Ann Arbor, MI (1975)
Kennedy, J., Eberhart, R.: Particle swarm optimization. In: Proc. IEEE Int. Conf. Neural Networks, Perth, WA, Australia, vol. 4, pp. 1942–1948 (1995)
Kennedy, J., Mendes, R.: Population structure and particle swarm performance. In: Proc. Congress on Evolutionary Computation (CEC’02), vol. 2, pp. 1671–1676 (2002)
Kirkpatrick, S., Gelatt, C.D., Vecchi, M.P.: Optimization by simulated annealing. Science 220(4598), 671–680 (1983)
La Cruz, W., Noguera, N.: Hybrid spectral gradient method for the unconstrained minimization problem. J. Global Optim. 44, 193–212 (2009)
Lakhbab, H., El Bernoussi, S.: Hybrid nonmonotone spectral gradient method for the unconstrained minimization problem. Comput. Appl. Math. 36(3), 1421–1430 (2017)
Lakhbab, H., El Bernoussi, S.: A hybrid method based on particle swarm optimization and nonmonotone spectral gradient method for unconstrained optimization problem. Int. J. Math. Anal. 6(60), 2963–2976 (2012)
Raydan, M.: On the Barzilai and Borwein choice of steplength for the gradient method. IMA J. Numer. Anal. 13, 321–326 (1993)
Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7(1), 26–33 (1997)
Shi, Y., Eberhart, R.C.: A modified particle swarm optimizer. In: Proc. IEEE Inter. Conf. on Evolutionary Computation, Anchorage, AK, USA, pp. 69–73 (1998)
Shi, Y., Eberhart, R.C.: Experimental study of particle swarm optimization. In: Proc. 1999 Congress on Evol. Comput-CEC99, Washington, DC, USA, vol. 3, pp. 1945–1950 (1999)
Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)
Xin, J., Chen, G., Hai, Y.: A particle swarm optimizer with multistage linearly-decreasing inertia weight. In: Int. Conf. on Computational Sciences and Optimization (CSO 2009), vol. 1, pp. 505–508 (2009)
Yang, X.S.: A new metaheuristic bat-inspired algorithm. In: González, J.R., Pelta, D.A., Cruz, C., Terrazas, G., Krasnogor, N. (eds) Nature Inspired Cooperative Strategies for Optimization. Studies in Computational Intelligence, vol. 284. Springer, Berlin, Heidelberg (2010)
Zhang, Y., Sun, W., Qi, L.: A nonmonotone filter Barzilai-Borwein method for optimization. Asia Pac. J. Oper. Res. 27(1), 55–69 (2010)
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this chapter
Lakhbab, H. (2020). NPSOG: A New Hybrid Method for Unconstrained Differentiable Optimization. In: Machado, J., Özdemir, N., Baleanu, D. (eds) Numerical Solutions of Realistic Nonlinear Phenomena. Nonlinear Systems and Complexity, vol 31. Springer, Cham. https://doi.org/10.1007/978-3-030-37141-8_9
Print ISBN: 978-3-030-37140-1
Online ISBN: 978-3-030-37141-8
eBook Packages: Mathematics and Statistics (R0)