A Classifier-Assisted Framework for Expensive Optimization Problems: A Knowledge-Mining Approach

Conference paper, Learning and Intelligent Optimization (LION 2011)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 6683)


Abstract

Real-world engineering design optimization problems often rely on computationally expensive simulations in place of laboratory experiments. A common approach is to approximate the expensive simulation with a computationally cheaper model, resulting in a model-assisted optimization algorithm. A prevalent issue in such problems is that the simulation may crash for some input vectors, a scenario which increases the optimization difficulty and wastes computer resources. A common way to handle such vectors is to assign them a penalized fitness and incorporate them into the model's training set, but this can severely deform the model and degrade the optimization efficacy. As an alternative, we propose a classifier-assisted framework in which a classifier is incorporated into the optimization search and biases the optimizer away from vectors predicted to crash the simulator, with no model deformation. Performance analysis shows that the proposed framework improves on the penalty approach, and that it may be possible to 'knowledge-mine' the classifier in a post-optimization stage to gain new insights into the problem being solved.
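The core idea of the abstract can be illustrated with a minimal sketch: maintain a record of which evaluated vectors crashed the simulator, train a classifier on that record, and reject candidate vectors the classifier predicts will crash, so that crashed vectors never enter the model's training data. Everything below is hypothetical and illustrative only: the `simulate` function, its crash region, the 1-nearest-neighbour classifier, and the random-search loop are stand-ins, not the paper's actual simulator, classifier, or optimizer.

```python
import math
import random

def simulate(x):
    """Toy 'expensive simulation': returns a fitness, or None to mimic a crash.
    The crash region (x0 * x1 < -0.25) is an arbitrary illustration."""
    if x[0] * x[1] < -0.25:
        return None
    return sum(xi ** 2 for xi in x)  # sphere-like fitness, lower is better

def predicts_crash(x, history):
    """1-nearest-neighbour classifier over previously evaluated vectors:
    predict 'crash' if the nearest evaluated vector crashed."""
    nearest = min(history, key=lambda rec: math.dist(x, rec[0]))
    return nearest[1] is None

random.seed(0)
history = []   # list of (vector, raw result or None) pairs
best = None

for _ in range(200):
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]
    # Bias the search away from predicted-crash vectors instead of
    # assigning them a penalized fitness -- crashed vectors are never
    # added to a surrogate's training set, so the model is not deformed.
    if history and predicts_crash(x, history):
        continue  # skip: do not spend a simulation on a likely crash
    y = simulate(x)
    history.append((x, y))
    if y is not None and (best is None or y < best):
        best = y

print(best is not None and best >= 0.0)  # True
```

After the run, the trained classifier itself (here, the `history` of crash/no-crash labels) can be inspected to characterize the crash region, which is the 'knowledge-mining' step the abstract refers to.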





Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Tenne, Y., Izui, K., Nishiwaki, S. (2011). A Classifier-Assisted Framework for Expensive Optimization Problems: A Knowledge-Mining Approach. In: Coello, C.A.C. (eds) Learning and Intelligent Optimization. LION 2011. Lecture Notes in Computer Science, vol 6683. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-25566-3_12

Download citation

  • DOI: https://doi.org/10.1007/978-3-642-25566-3_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-25565-6

  • Online ISBN: 978-3-642-25566-3

  • eBook Packages: Computer Science (R0)
