A sensitivity analysis method aimed at enhancing the metaheuristics for continuous optimization

  • Peio Loubière
  • Astrid Jourdan
  • Patrick Siarry
  • Rachid Chelouah

Abstract

An efficient covering of the search space is an important issue when dealing with metaheuristics. Sensitivity analysis methods aim at evaluating the influence of each variable of a problem on the response of a model (i.e., the objective function). Such methods provide knowledge about the function's behavior and are therefore well suited for guiding metaheuristics. However, to correctly evaluate the influence of each dimension, usual sensitivity analysis methods either require many evaluations of the objective function or are constrained to a specific experimental design. In this paper, we propose a new method, with a low computational cost, which can be embedded in metaheuristics to improve their search process. It builds on two global sensitivity analysis methods: the linear correlation coefficient technique and Morris' method. We propose to transform the global study of a nonlinear model into local studies of quasi-linear sub-parts of the model, in order to evaluate the global influence of each input variable on the model. This sensitivity analysis method reuses evaluations of the objective function already performed by the metaheuristic to compute a weight for each variable. The metaheuristic then generates new solutions by choosing which dimensions to offset according to these weights. Tests on usual benchmark functions of sensitivity analysis and continuous optimization (CEC 2013) lead to two findings. Firstly, our sensitivity analysis method provides good results: it correctly ranks the influence of each dimension. Secondly, integrating a sensitivity analysis method into a metaheuristic (here, Differential Evolution and ABC with modification rate) improves its results.
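The weighting scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: it stands in for the hybrid linear-correlation/Morris estimator by computing, for each dimension, the absolute Pearson correlation between that variable and the objective values over solutions the metaheuristic has already evaluated, then sampling dimensions to perturb in proportion to the resulting weights. All function names here are hypothetical.

```python
import math
import random


def sensitivity_weights(population, fitnesses):
    """Estimate each dimension's influence as the absolute linear
    correlation between that variable and the objective value, over
    already-evaluated solutions (illustrative stand-in for the paper's
    hybrid LCC/Morris scheme). Returns weights normalized to sum to 1."""
    n, d = len(population), len(population[0])
    mean_f = sum(fitnesses) / n
    var_f = sum((f - mean_f) ** 2 for f in fitnesses)
    weights = []
    for j in range(d):
        xs = [sol[j] for sol in population]
        mean_x = sum(xs) / n
        var_x = sum((x - mean_x) ** 2 for x in xs)
        cov = sum((x - mean_x) * (f - mean_f) for x, f in zip(xs, fitnesses))
        denom = math.sqrt(var_x * var_f)
        weights.append(abs(cov / denom) if denom > 0 else 0.0)
    total = sum(weights)
    return [w / total if total > 0 else 1.0 / d for w in weights]


def pick_dimensions(weights, k, rng=random):
    """Sample k distinct dimensions to offset, with probability
    proportional to their sensitivity weights (roulette-wheel
    selection without replacement)."""
    dims = list(range(len(weights)))
    chosen = []
    for _ in range(min(k, len(dims))):
        total = sum(weights[i] for i in dims)
        r = rng.random() * total
        acc = 0.0
        for i in dims:
            acc += weights[i]
            if acc >= r:
                chosen.append(i)
                dims.remove(i)
                break
    return chosen
```

In a metaheuristic loop, `sensitivity_weights` would be refreshed from the current population at no extra cost in objective evaluations, and `pick_dimensions` would replace the uniform dimension choice of, e.g., ABC's modification-rate mechanism.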

Keywords

Continuous optimization · Metaheuristics · Sensitivity analysis · Linear correlation coefficients · Morris' method

References

  1. Akay B, Karaboga D (2012) A modified artificial bee colony algorithm for real-parameter optimization. Inf Sci 192:120–142
  2. Bartz-Beielstein T, Preuss M (2007) Experimental research in evolutionary computation. In: Proceedings of the 9th annual conference companion on genetic and evolutionary computation, GECCO '07, pp 3001–3020. ACM, New York, NY, USA
  3. Campolongo F, Cariboni J, Saltelli A (2007) An effective screening design for sensitivity analysis of large models. Environ Modell Softw 22(10):1509–1518
  4. Chelouah R, Siarry P (2000a) A continuous genetic algorithm designed for the global optimization of multimodal functions. J Heuristics 6(2):191–213
  5. Chelouah R, Siarry P (2000b) Tabu search applied to global optimization. Eur J Oper Res 123(2):256–270
  6. Cropp RA, Braddock RD (2002) The new Morris method: an efficient second-order screening method. Reliab Eng Syst Saf 78(1):77–83
  7. Das S, Mullick SS, Suganthan P (2016) Recent advances in differential evolution – an updated survey. Swarm Evolut Comput 27:1–30
  8. Dréo J, Siarry P (2007) Hybrid continuous interacting ant colony aimed at enhanced global optimization. Algorithmic Oper Res 2(1):52–64
  9. Iooss B, Lemaître P (2015) A review on global sensitivity analysis methods. In: Uncertainty management in simulation-optimization of complex systems: algorithms and applications, pp 101–122. Springer US, Boston, MA
  10. Jourdan A (2012) Global sensitivity analysis using complex linear models. Stat Comput 22(3):823–831
  11. Jourdan A, Franco J (2010) Optimal Latin hypercube designs for the Kullback–Leibler criterion. AStA Adv Stat Anal 94(4):341–351
  12. Juan AA, Faulin J, Grasman SE, Rabe M, Figueira G (2015) A review of simheuristics: extending metaheuristics to deal with stochastic combinatorial optimization problems. Oper Res Perspect 2:62–72
  13. Karaboga D (2005) An idea based on honey bee swarm for numerical optimization. Tech Rep TR06, Erciyes University
  14. Karaboga D, Gorkemli B, Ozturk C, Karaboga N (2012) A comprehensive survey: artificial bee colony (ABC) algorithm and applications. Artif Intell Rev 42(1):21–57
  15. Lai KK, Yu L, Huang W, Wang S (2006) A novel support vector machine metamodel for business risk identification. In: Pacific Rim international conference on artificial intelligence, pp 980–984. Springer, Berlin
  16. Li G, Niu P, Xiao X (2012) Development and investigation of efficient artificial bee colony algorithm for numerical function optimization. Appl Soft Comput 12(1):320–332
  17. Liang JJ, Qu BY, Suganthan PN, Hernández-Díaz AG (2013) Problem definitions and evaluation criteria for the CEC 2013 special session on real-parameter optimization. Technical Report 201212, Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China and Nanyang Technological University, Singapore
  18. Loubière P, Jourdan A, Siarry P, Chelouah R (2016) A sensitivity analysis method for driving the artificial bee colony algorithm's search process. Appl Soft Comput 41:515–531
  19. Marrel A, Iooss B, Laurent B, Roustant O (2009) Calculations of Sobol indices for the Gaussian process metamodel. Reliab Eng Syst Saf 94(3):742–751
  20. Price K, Storn RM, Lampinen JA (2005) Differential evolution: a practical approach to global optimization (Natural Computing Series). Springer, Secaucus
  21. Queipo NV, Haftka RT, Shyy W, Goel T, Vaidyanathan R, Tucker PK (2005) Surrogate-based analysis and optimization. Prog Aerosp Sci 41(1):1–28
  22. Rana S, Jasola S, Kumar R (2011) A review on particle swarm optimization algorithms and their applications to data clustering. Artif Intell Rev 35(3):211–222
  23. Robinson T, Eldred M, Willcox K, Haimes R (2008) Surrogate-based optimization using multifidelity models with variable parameterization and corrected space mapping. AIAA J 46(11):2814–2822
  24. Saltelli A (2002) Sensitivity analysis for importance assessment. Risk Anal 22(3):579–590
  25. Simon D (2008) Biogeography-based optimization. IEEE Trans Evolut Comput 12(6):702–713
  26. Soares J, Borges N, Vale Z, Oliveira PdM (2016) Enhanced multi-objective energy optimization by a signaling method. Energies 9(10):807
  27. Sudret B (2008) Global sensitivity analysis using polynomial chaos expansions. Reliab Eng Syst Saf 93(7):964–979
  28. van Griensven A, Meixner T, Grunwald S, Bishop T, Diluzio M, Srinivasan R (2006) A global sensitivity analysis tool for the parameters of multi-variable catchment models. J Hydrol 324(1–4):10–23
  29. van der Merwe R, Leen TK, Lu Z, Frolov S, Baptista AM (2007) Fast neural network surrogates for very high dimensional physics-based models in computational oceanography. Neural Netw 20(4):462–478
  30. Zhang H, Qin C, Luo Y (2014) Neural-network-based constrained optimal control scheme for discrete-time switched nonlinear system using dual heuristic programming. IEEE Trans Autom Sci Eng 11(3):839–849
  31. Zhang H, Wang Z, Liu D (2014) A comprehensive review of stability analysis of continuous-time recurrent neural networks. IEEE Trans Neural Netw Learn Syst 25(7):1229–1262
  32. Ziliani L, Surian N, Coulthard T, Tarantola S (2013) Reduced-complexity modeling of braided rivers: assessing model performance by sensitivity analysis, calibration, and validation. J Geophys Res Earth Surf 118(4):2243–2262

Copyright information

© Springer Science+Business Media Dordrecht 2017

Authors and Affiliations

  • Peio Loubière (1)
  • Astrid Jourdan (1)
  • Patrick Siarry (2)
  • Rachid Chelouah (1)
  1. École internationale des sciences du traitement de l'information (EISTI), Cergy-Pontoise, France
  2. Université de Paris-Est, LISSI, UPEC, Vitry-sur-Seine, France