Evolutionary Intelligence

Volume 4, Issue 2, pp 81–97

Surrogate-assisted clonal selection algorithms for expensive optimization problems

  • Heder S. Bernardino
  • Helio J. C. Barbosa
  • Leonardo G. Fonseca
Special Issue


Clonal selection algorithms are computational methods, inspired by the behavior of the immune system, that can be applied to solve optimization problems. However, like other nature-inspired algorithms, they can require a large number of objective function evaluations to reach a satisfactory solution. When those evaluations involve a computationally expensive simulation model, their cost becomes prohibitive. In this paper we analyze the use of surrogate models to enhance the performance of a clonal selection algorithm. Computational experiments are conducted to assess the performance of the presented techniques on a benchmark of 22 test problems under a fixed budget of objective function evaluations. The comparisons show that in most cases the use of surrogate models significantly improves the performance of the baseline clonal selection algorithm.
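The central idea described above, spending true evaluations sparingly by pre-screening candidate solutions with a cheap model, can be sketched in a few lines. The following is a minimal illustrative example, not the paper's exact algorithm: the sphere function stands in for the expensive simulation, and a simple k-nearest-neighbor (similarity-based) surrogate and Gaussian hypermutation are assumed for the sake of the sketch.

```python
import math
import random

def expensive_f(x):
    """Stand-in for a costly simulation model (sphere function, minimized)."""
    return sum(xi * xi for xi in x)

def surrogate_estimate(x, archive, k=3):
    """Similarity-based surrogate: mean true fitness of the k nearest
    previously evaluated points in the archive."""
    nearest = sorted((math.dist(x, xa), fa) for xa, fa in archive)[:k]
    return sum(f for _, f in nearest) / len(nearest)

def mutate(x, sigma, rng):
    """Gaussian hypermutation of a candidate solution."""
    return [xi + rng.gauss(0.0, sigma) for xi in x]

def surrogate_assisted_csa(dim=2, pop_size=8, n_clones=6, budget=150, seed=1):
    rng = random.Random(seed)
    rand_point = lambda: [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    # Every true evaluation is archived; the archive feeds the surrogate.
    archive = [(p, expensive_f(p)) for p in (rand_point() for _ in range(pop_size))]
    evals = len(archive)
    population = list(archive)
    while evals < budget:
        population.sort(key=lambda xf: xf[1])  # rank parents (minimization)
        for rank in range(len(population)):
            if evals >= budget:
                break
            x, f = population[rank]
            sigma = 0.1 * (rank + 1)  # worse-ranked parents mutate more
            clones = [mutate(x, sigma, rng) for _ in range(n_clones)]
            # Pre-screen all clones with the cheap surrogate; spend a true
            # (expensive) evaluation only on the most promising one.
            best_clone = min(clones, key=lambda c: surrogate_estimate(c, archive))
            fc = expensive_f(best_clone)
            evals += 1
            archive.append((best_clone, fc))
            if fc < f:  # elitist replacement
                population[rank] = (best_clone, fc)
    return min(population, key=lambda xf: xf[1])
```

Under the fixed budget, each parent's cloning step costs one true evaluation instead of `n_clones` evaluations, which is where the savings discussed in the paper come from; the quality of the search then depends on how well the surrogate ranks the clones.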


Keywords: Clonal selection · Artificial immune system · Optimization · Surrogate model



The authors acknowledge the support received from CNPq (308317/2009-2) and FAPERJ (grants E-26/102.825/2008 and E-26/100.308/2010).


Copyright information

© Springer-Verlag 2011

Authors and Affiliations

  • Heder S. Bernardino (1)
  • Helio J. C. Barbosa (1, 2)
  • Leonardo G. Fonseca (2)
  1. Laboratório Nacional de Computação Científica (LNCC), Petrópolis, Brazil
  2. Universidade Federal de Juiz de Fora (UFJF), Juiz de Fora, Brazil
