
NPSOG: A New Hybrid Method for Unconstrained Differentiable Optimization

Chapter in:
Numerical Solutions of Realistic Nonlinear Phenomena

Part of the book series: Nonlinear Systems and Complexity ((NSCH,volume 31))


Abstract

In this study, we propose a novel hybrid global–local optimization algorithm called NPSOG, which combines particle swarm optimization (PSO) [16] with a gradient method [21, 23] to solve a class of global optimization problems for continuously differentiable functions. In this method, at each iteration of the PSO algorithm, and under a specific condition governed by a loudness parameter, we perform an exploitation step using a gradient method. The loudness parameter was first introduced by Yang in [28]. Our experimental results on a set of test functions indicate that the NPSOG algorithm can improve the performance of PSO considerably. In addition, its viability as an optimization method is demonstrated by comparison with a classical hybridization scheme.
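The global–local hybridization described above can be sketched in code. The following is a minimal illustration, not the chapter's actual method: it assumes a standard inertia-weight PSO, uses plain steepest descent on the global best as the exploitation step (the chapter builds on a nonmonotone spectral gradient method [20, 21, 23]), and triggers that step with probability given by a geometrically decaying loudness parameter in the spirit of Yang's bat algorithm [28]. All parameter values and the trigger rule are illustrative assumptions.

```python
import numpy as np

def npsog_sketch(f, grad, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0),
                 w=0.7, c1=1.5, c2=1.5, loudness=0.9, alpha=0.97, step=0.01,
                 seed=0):
    """Illustrative PSO/gradient hybrid; NPSOG's exact rules differ."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                          # particle velocities
    pbest = x.copy()                              # personal bests
    pbest_val = np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()        # global best
    g_val = pbest_val.min()
    A = loudness                                  # loudness parameter
    for _ in range(iters):
        # standard inertia-weight PSO velocity and position update
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        if pbest_val.min() < g_val:
            g = pbest[np.argmin(pbest_val)].copy()
            g_val = pbest_val.min()
        # exploitation step: a gradient step from the global best,
        # accepted only if it improves, triggered by the loudness A
        if rng.random() < A:
            cand = g - step * grad(g)
            cv = f(cand)
            if cv < g_val:
                g, g_val = cand, cv
        A *= alpha                                # decay the loudness
    return g, g_val

# usage: minimize the sphere function in 5 dimensions
sphere = lambda z: float(np.dot(z, z))
sphere_grad = lambda z: 2.0 * z
best, best_val = npsog_sketch(sphere, sphere_grad, dim=5)
```

The exploitation step gives the swarm a local refinement around its current best, while the decaying loudness shifts the balance from exploitation back toward pure PSO exploration over time.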


Notes

  1. The idea of this work was inspired by [19, 20].

  2. The loudness parameter was first introduced by Yang in his well-known article on the bat algorithm [28].

  3. The standard deviation is used to indicate the stability of an algorithm: a more stable algorithm produces a smaller value of this measure.
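The stability criterion in Note 3 can be illustrated with hypothetical run results (the values below are invented for illustration, not taken from the chapter's experiments):

```python
import statistics

# Best objective values from repeated independent runs of two
# hypothetical algorithms on the same test function.
runs_a = [0.011, 0.012, 0.010, 0.013, 0.011]   # small spread: stable
runs_b = [0.001, 0.250, 0.040, 0.130, 0.009]   # large spread: unstable

std_a = statistics.stdev(runs_a)
std_b = statistics.stdev(runs_b)
# A smaller standard deviation across runs indicates a more stable algorithm.
```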

References

  1. Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10(1), 147–161 (2008)

  2. Armijo, L.: Minimization of functions having Lipschitz continuous first partial derivatives. Pac. J. Math. 16(1), 1–3 (1966)

  3. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)

  4. Bansal, J.C., Singh, P.K., Saraswat, M., Verma, A., Jadon, S.S., Abraham, A.: Inertia weight strategies in particle swarm optimization. In: Third World Congress on Nature and Biologically Inspired Computing, Salamanca, pp. 633–640 (2011)

  5. Birgin, E.G., Martínez, J.M., Raydan, M.: Nonmonotone spectral projected gradient methods on convex sets. SIAM J. Optim. 10(4), 1196–1211 (2000)

  6. Birgin, E.G., Raydan, M.: SPG: software for convex-constrained optimization. ACM Trans. Math. Softw. 27(3), 340–349 (2001)

  7. Chakri, A., Khelif, R., Benouaret, M., Yang, X.S.: New directional bat algorithm for continuous optimization problems. Expert Syst. Appl. 69, 159–175 (2017)

  8. Clerc, M., Kennedy, J.: The particle swarm - explosion, stability, and convergence in a multidimensional complex space. IEEE Trans. Evol. Comput. 6(1), 58–73 (2002)

  9. Colorni, A., Dorigo, M., Maniezzo, V.: An investigation of some properties of an ant algorithm. In: Proc. Parallel Problem Solving from Nature Conference, pp. 509–520. Elsevier (1992)

  10. Dai, Y.H., Hager, W.W., Schittkowski, K., Zhang, H.: The cyclic Barzilai-Borwein method for unconstrained optimization. IMA J. Numer. Anal. 26(3), 1–24 (2006)

  11. Fletcher, R.: Low storage methods for unconstrained optimization. In: Lect. Appl. Math., vol. 26, pp. 165–179. American Mathematical Society, Providence, RI (1990)

  12. Glover, F.: Future paths for integer programming and links to artificial intelligence. Comput. Oper. Res. 13(5), 533–549 (1986)

  13. Goldstein, A.A.: On steepest descent. SIAM J. Control 3, 147–151 (1965)

  14. Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton's method. SIAM J. Numer. Anal. 23, 707–716 (1986)

  15. Holland, J.H.: Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. University of Michigan Press, Ann Arbor, MI (1975)

  16. Kennedy, J., Eberhart, R.: Particle swarm optimization. In: Proc. IEEE Int. Conf. on Neural Networks, Perth, WA, Australia, vol. 4, pp. 1942–1948 (1995)

  17. Kennedy, J., Mendes, R.: Population structure and particle swarm performance. In: Proc. Congress on Evolutionary Computation (CEC'02), vol. 2, pp. 1671–1676 (2002)

  18. Kirkpatrick, S., Gelatt, C.D., Vecchi, M.P.: Optimization by simulated annealing. Science 220(4598), 671–680 (1983)

  19. La Cruz, W., Noguera, N.: Hybrid spectral gradient method for the unconstrained minimization problem. J. Global Optim. 44, 193–212 (2009)

  20. Lakhbab, H., El Bernoussi, S.: Hybrid nonmonotone spectral gradient method for the unconstrained minimization problem. Comput. Appl. Math. 36(3), 1421–1430 (2017)

  21. Lakhbab, H., El Bernoussi, S.: A hybrid method based on particle swarm optimization and nonmonotone spectral gradient method for unconstrained optimization problem. Int. J. Math. Anal. 6(60), 2963–2976 (2012)

  22. Raydan, M.: On the Barzilai and Borwein choice of steplength for the gradient method. IMA J. Numer. Anal. 13, 321–326 (1993)

  23. Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7(1), 26–33 (1997)

  24. Shi, Y., Eberhart, R.C.: A modified particle swarm optimizer. In: Proc. IEEE Int. Conf. on Evolutionary Computation, Anchorage, AK, USA, pp. 69–73 (1998)

  25. Shi, Y., Eberhart, R.C.: Empirical study of particle swarm optimization. In: Proc. 1999 Congress on Evolutionary Computation (CEC99), Washington, DC, USA, vol. 3, pp. 1945–1950 (1999)

  26. Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)

  27. Xin, J., Chen, G., Hai, Y.: A particle swarm optimizer with multi-stage linearly-decreasing inertia weight. In: Proc. Int. Conf. on Computational Sciences and Optimization (CSO 2009), vol. 1, pp. 505–508 (2009)

  28. Yang, X.S.: A new metaheuristic bat-inspired algorithm. In: González, J.R., Pelta, D.A., Cruz, C., Terrazas, G., Krasnogor, N. (eds.) Nature Inspired Cooperative Strategies for Optimization. Studies in Computational Intelligence, vol. 284. Springer, Berlin, Heidelberg (2010)


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Lakhbab, H. (2020). NPSOG: A New Hybrid Method for Unconstrained Differentiable Optimization. In: Machado, J., Özdemir, N., Baleanu, D. (eds) Numerical Solutions of Realistic Nonlinear Phenomena. Nonlinear Systems and Complexity, vol 31. Springer, Cham. https://doi.org/10.1007/978-3-030-37141-8_9
