Journal of Mathematical Modelling and Algorithms

Volume 5, Issue 4, pp 417–445

Ensemble Learning Using Multi-Objective Evolutionary Algorithms

Article

Abstract

The use of multi-objective evolutionary algorithms for the construction of neural network ensembles is a relatively new area of research. We recently proposed an ensemble learning algorithm called DIVACE (DIVerse and ACcurate Ensemble learning algorithm). DIVACE searches for an optimal trade-off between diversity and accuracy as it constructs an ensemble for a given pattern recognition task, treating the two objectives as explicitly separate evolutionary pressures. A detailed discussion of DIVACE, together with further experimental studies, forms the core of this paper. We also propose a new diversity measure, which we call Pairwise Failure Crediting (PFC); it supplies one of the two evolutionary pressures exerted explicitly in DIVACE. Experiments with this diversity measure and comparisons with previously studied approaches are presented. Detailed analysis of the results shows that DIVACE, as a concept, has promise.
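
To make the two evolutionary pressures concrete, here is a minimal sketch in Python of how each ensemble member might be scored on the two objectives and how Pareto dominance could then drive selection. The pairwise crediting scheme below, which penalises coincident failures, is an assumption made for illustration (the helpers pfc_diversity, dominates and pareto_front are hypothetical names); it is not the exact PFC formulation or the evolutionary machinery defined in the paper.

    import numpy as np

    def accuracy(preds, targets):
        # Objective 1: raw classification accuracy of one member.
        return float(np.mean(preds == targets))

    def pfc_diversity(i, all_preds, targets):
        # Objective 2 (assumed PFC-style crediting): member i earns
        # credit for failing on different patterns than each peer;
        # shared (coincident) failures earn less credit. The paper's
        # exact weighting may differ.
        fail_i = all_preds[i] != targets
        credits = []
        for j, preds_j in enumerate(all_preds):
            if j == i:
                continue
            fail_j = preds_j != targets
            union = np.sum(fail_i | fail_j)
            shared = np.sum(fail_i & fail_j)
            credits.append(1.0 - shared / union if union else 1.0)
        return float(np.mean(credits))

    def dominates(a, b):
        # Pareto dominance, maximising both objectives.
        return all(x >= y for x, y in zip(a, b)) and a != b

    def pareto_front(scores):
        # Indices of non-dominated members: the candidates selection keeps.
        return [i for i, s in enumerate(scores)
                if not any(dominates(t, s) for t in scores if t != s)]

    # Toy usage: three members' predictions on five patterns.
    targets = np.array([0, 1, 0, 1, 1])
    all_preds = [np.array([0, 1, 0, 1, 0]),
                 np.array([0, 1, 1, 1, 1]),
                 np.array([1, 1, 0, 1, 0])]
    scores = [(accuracy(p, targets), pfc_diversity(i, all_preds, targets))
              for i, p in enumerate(all_preds)]
    survivors = pareto_front(scores)

In a full evolutionary loop, the non-dominated members would be retained and varied to produce the next generation, so that the accuracy and diversity pressures act on the population simultaneously.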

Mathematics Subject Classification (2000)

68T05 · 68Q32 · 68Q10

Key words

ensemble learning · diversity · multi-objective learning · neural networks · neuroevolution



Copyright information

© Springer Science+Business Media, Inc. 2006

Authors and Affiliations

Arjun Chandra and Xin Yao, The Centre of Excellence for Research in Computational Intelligence and Applications (CERCIA), School of Computer Science, The University of Birmingham, Edgbaston, UK
