Multiobjective Optimization on a Budget of 250 Evaluations

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3410)

Abstract

In engineering and other ‘real-world’ applications, multiobjective optimization problems must frequently be tackled on a tight evaluation budget of tens or hundreds of function evaluations, rather than thousands. In this paper, we investigate two algorithms that use advanced initialization and search strategies to operate better under these conditions. The first algorithm, Bin_MSOPS, uses a binary search tree to divide up the decision space, and tries to sample from the largest empty regions near ‘fit’ solutions. The second algorithm, ParEGO, begins with solutions arranged in a Latin hypercube and, after every function evaluation, updates a Gaussian process surrogate model of the search landscape, which it uses to estimate the solution of largest expected improvement. The two algorithms are tested on a benchmark suite of nine functions of two and three objectives, on a total budget of only 250 function evaluations each. Results indicate that the two algorithms search the space in very different ways, and this can be used to understand differences in their performance. Both algorithms perform well, but ParEGO comes out on top in seven of the nine test cases after 100 function evaluations, and in six after the full 250 evaluations.
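
To make the ParEGO procedure summarized above concrete, the following is a minimal sketch of that kind of loop, not the authors' implementation: it substitutes scikit-learn's GaussianProcessRegressor for the DACE model used in the paper, draws a fresh weight vector for an augmented Tchebycheff scalarization at every iteration, and replaces the paper's inner evolutionary search with simple random candidate sampling. The function names, initial design size, and candidate count are illustrative choices.

# Minimal ParEGO-style loop (illustrative sketch, not the paper's code):
# Latin hypercube initial design, augmented Tchebycheff scalarization with a
# fresh weight vector each iteration, a Gaussian process surrogate refit after
# every function evaluation, and the next point chosen by maximizing expected
# improvement over randomly sampled candidates (the paper uses a DACE model
# and an evolutionary inner search instead).
import numpy as np
from scipy.stats import norm, qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def tchebycheff(F, lam, rho=0.05):
    # Augmented Tchebycheff scalarization of an (n, k) matrix of objective values.
    w = F * lam
    return w.max(axis=1) + rho * w.sum(axis=1)

def expected_improvement(mu, sigma, y_best):
    # EI for minimization; larger values indicate more promising candidates.
    sigma = np.maximum(sigma, 1e-12)
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def parego_sketch(evaluate, dim, n_obj, budget=250, n_init=21, seed=0):
    # 'evaluate' maps a point in [0, 1]^dim to a vector of n_obj objective values.
    rng = np.random.default_rng(seed)
    X = qmc.LatinHypercube(d=dim, seed=seed).random(n_init)   # initial design
    F = np.array([evaluate(x) for x in X])
    while len(X) < budget:
        lam = rng.dirichlet(np.ones(n_obj))                   # random weight vector
        F_norm = (F - F.min(axis=0)) / (F.max(axis=0) - F.min(axis=0) + 1e-12)
        y = tchebycheff(F_norm, lam)                          # scalar cost per point
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)                                          # refit surrogate
        candidates = rng.random((2000, dim))
        mu, sigma = gp.predict(candidates, return_std=True)
        x_next = candidates[np.argmax(expected_improvement(mu, sigma, y.min()))]
        X = np.vstack([X, x_next])
        F = np.vstack([F, evaluate(x_next)])
    return X, F                                               # all evaluated points

Each pass through the loop costs exactly one true function evaluation, which is what keeps the total at the stated budget of 250.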

Keywords

multiobjective optimization · expensive black-box functions · ParEGO · DACE · Bin_MSOPS · landscape approximation · response surfaces · test suites

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  1. School of Chemistry, University of Manchester, Manchester, UK
  2. Cranfield University, Shrivenham, Swindon, UK
