Engineering with Computers

Volume 26, Issue 1, pp 81–98

Multiobjective global surrogate modeling, dealing with the 5-percent problem

  • Dirk Gorissen
  • Ivo Couckuyt
  • Eric Laermans
  • Tom Dhaene
Original Article


When dealing with computationally expensive simulation codes or process measurement data, surrogate modeling methods are firmly established as facilitators for design space exploration, sensitivity analysis, visualization, prototyping and optimization. Typically, the model parameter (hyperparameter) optimization problem at the heart of global surrogate modeling is formulated as a single-objective problem: models are generated according to a single criterion (accuracy). However, this requires an engineer to fix a single accuracy target and error measure upfront, which is hard to do when the behavior of the response is unknown. Likewise, the different outputs of a multi-output system are typically modeled separately by independent models. Here, too, a multiobjective approach would benefit the domain expert by revealing correlations between outputs and by enabling automatic model type selection for each output dynamically. With this paper the authors attempt to increase awareness of the subtleties involved and discuss a number of solutions and applications. In particular, we present a multiobjective framework for global surrogate model generation that helps tackle both problems and that is applicable in both the static and the sequential design (adaptive sampling) case.
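The core idea of Pareto-based model selection can be illustrated with a minimal sketch (this is not the authors' implementation; the candidate model names and error values below are hypothetical): each candidate surrogate is scored on several objectives, e.g. a validation error per output of a multi-output system, and instead of collapsing the scores into one number, the non-dominated set is returned to the engineer.

```python
# Illustrative sketch of Pareto-based surrogate model selection
# (minimization of two objectives, e.g. an error per model output).
# Candidate names and scores are hypothetical.

def dominates(a, b):
    """True if score vector a Pareto-dominates b (all <=, at least one <)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated subset of (name, scores) pairs."""
    return [(n, s) for n, s in candidates
            if not any(dominates(s2, s) for _, s2 in candidates if s2 != s)]

# Hypothetical surrogates scored as (error on output 1, error on output 2):
candidates = [
    ("kriging",  (0.02, 0.30)),
    ("rbf",      (0.10, 0.05)),
    ("ann",      (0.04, 0.08)),
    ("rational", (0.12, 0.40)),  # dominated by kriging
]

front = pareto_front(candidates)
print(sorted(name for name, _ in front))  # ['ann', 'kriging', 'rbf']
```

No single model type wins on both outputs, so the front itself carries the information a scalarized criterion would hide, which is precisely the motivation for treating model selection multiobjectively.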


Surrogate modeling · Metamodeling · Error estimation · Performance measures · Model selection · Multiobjective optimization · Adaptive sampling · Hyperparameter optimization



The authors wish to thank Markus Ganser and Karen Grossenbacher from BMW for making the automotive data available and for the fruitful discussions. The authors also thank Jeroen Croon from the NXP-TSMC Research Center, Device Modeling Department, Eindhoven, The Netherlands, for the LNA simulation code. Finally, the authors thank Matthias Ihme from Stanford University for providing the chemistry combustion data.



Copyright information

© Springer-Verlag London Limited 2009

Authors and Affiliations

  • Dirk Gorissen (1) (Email author)
  • Ivo Couckuyt (1)
  • Eric Laermans (1)
  • Tom Dhaene (1)

  1. Department of Information Technology (INTEC), IBBT, Ghent University, Ghent, Belgium
