
Reducing Fitness Evaluations Using Clustering Techniques and Neural Network Ensembles

  • Conference paper
Genetic and Evolutionary Computation – GECCO 2004 (GECCO 2004)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3102)

Abstract

In many real-world applications of evolutionary computation, it is essential to reduce the number of fitness evaluations. To this end, computationally efficient approximate models can be constructed to assist the evolutionary algorithm in fitness evaluation. When such models are involved in evolution, it is very important to determine which individuals should be re-evaluated using the original fitness function, so that the evolutionary algorithm converges both quickly and correctly. In this paper, the k-means method is applied to group the individuals of a population into a number of clusters. For each cluster, only the individual closest to the cluster center is evaluated using the expensive original fitness function. The fitness of the other individuals is estimated using a neural network ensemble, which also serves to detect potentially serious prediction errors. Simulation results on three test functions show that the proposed method outperforms the strategy in which only the best individuals according to the approximate model are re-evaluated.
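The evaluation strategy the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a simple k-means partition of the population, an exact evaluation only for the member nearest each cluster center, and an ensemble whose averaged prediction estimates the remaining fitness values (the ensemble's prediction variance could then flag individuals needing re-evaluation). All function names here are illustrative.

```python
import numpy as np

def kmeans(X, k, iters=20, rng=None):
    """Plain k-means: returns cluster labels and centers."""
    rng = np.random.default_rng(rng)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):  # skip empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def evaluate_population(pop, true_fitness, ensemble, k, rng=None):
    """Evaluate only one representative per cluster with the expensive
    true fitness; estimate the rest with the ensemble's mean prediction."""
    labels, centers = kmeans(pop, k, rng=rng)
    fitness = np.empty(len(pop))
    evaluated = np.zeros(len(pop), dtype=bool)
    for j in range(k):
        members = np.flatnonzero(labels == j)
        if members.size == 0:
            continue
        # representative = member closest to the cluster center
        rep = members[np.argmin(
            np.linalg.norm(pop[members] - centers[j], axis=1))]
        fitness[rep] = true_fitness(pop[rep])  # expensive evaluation
        evaluated[rep] = True
    rest = ~evaluated
    if rest.any():
        preds = np.array([m(pop[rest]) for m in ensemble])
        fitness[rest] = preds.mean(axis=0)  # ensemble average as estimate
        # np.var(preds, axis=0) could flag unreliable estimates here
    return fitness, evaluated

# Usage: sphere test function with a toy "ensemble" of biased surrogates.
sphere = lambda x: float(np.sum(x**2))
pop = np.random.default_rng(0).normal(size=(30, 5))
ens = [lambda X, b=b: (X**2).sum(axis=1) + b for b in (0.0, 0.1, -0.1)]
fit, mask = evaluate_population(pop, sphere, ens, k=5, rng=0)
```

With 30 individuals and 5 clusters, at most 5 expensive evaluations are performed per generation; in the paper the surrogates are neural networks trained on previously evaluated individuals rather than closed-form stand-ins.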




Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Jin, Y., Sendhoff, B. (2004). Reducing Fitness Evaluations Using Clustering Techniques and Neural Network Ensembles. In: Deb, K. (eds) Genetic and Evolutionary Computation – GECCO 2004. GECCO 2004. Lecture Notes in Computer Science, vol 3102. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-24854-5_71


  • DOI: https://doi.org/10.1007/978-3-540-24854-5_71

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-22344-3

  • Online ISBN: 978-3-540-24854-5
