Preliminary Experiments with Ensembles of Neurally Diverse Artificial Neural Networks for Pattern Recognition

  • Abdullahi Adamu
  • Tomas Maul
  • Andrzej Bargiela
  • Christopher Roadknight
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 361)

Abstract

Several approaches have sought to improve fault tolerance by diversifying the redundancy of the individual networks that make up a neural network ensemble, for example by combining networks of different sizes, or by combining different network models such as Radial Basis Function networks and Multilayer Perceptrons. However, there has been no empirical study of hybrid neural networks that use a diverse set of transfer functions, which we would expect to yield more diverse network architectures and thus potentially more diverse error patterns. In this paper, we present an approach that uses transfer function diversity to achieve significant ensemble results. The results show that even with relatively small networks of 5 hidden nodes, and a relatively small ensemble of just 10 members, the ensemble obtains competitive results on the Iris data set. With 20 members, ensembles of similarly small networks also obtain competitive results on other popular data sets such as the Diabetes, Sonar, Hepatitis, and Australian Credit Card problems. In addition, we show that these results can be achieved with a simple sorting and selection of the top N solutions in the population, in contrast to other, computationally more expensive methods of selecting ensemble members, such as selection of the Pareto front or hill-climbing selection.
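To make the selection scheme concrete, the sketch below illustrates the simple sort-and-select idea described above: a population of small networks with heterogeneous hidden-layer transfer functions is ranked by validation accuracy, and the top N are kept as ensemble members that vote on the final prediction. This is a minimal illustration, not the authors' implementation; the class and function names, the particular transfer-function pool, the use of untrained random weights, and the random stand-in data are all assumptions made for brevity (the paper optimizes the networks and evaluates on the benchmark data sets named above).

```python
# Minimal sketch (assumed names and settings, not the authors' code) of
# forming an ensemble by sorting candidate networks and keeping the Top-N,
# where candidates differ in their hidden-layer transfer function.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative pool of transfer functions; the paper draws from a broader,
# more diverse set.
TRANSFER_FUNCTIONS = {
    "tanh": np.tanh,
    "logistic": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "gaussian": lambda x: np.exp(-x ** 2),
    "relu": lambda x: np.maximum(0.0, x),
}

class SmallHybridNet:
    """Tiny one-hidden-layer network (5 hidden nodes, as in the paper)
    whose hidden units use one transfer function drawn from the pool."""
    def __init__(self, n_in, n_hidden, n_out, tf_name):
        self.tf = TRANSFER_FUNCTIONS[tf_name]
        self.W1 = rng.normal(0, 1, (n_in, n_hidden))
        self.W2 = rng.normal(0, 1, (n_hidden, n_out))

    def predict(self, X):
        h = self.tf(X @ self.W1)          # hidden activations
        return np.argmax(h @ self.W2, axis=1)

def top_n_ensemble(candidates, X_val, y_val, n):
    """Rank candidates by validation accuracy and keep the Top-N
    (simple sort-and-select, in contrast to Pareto-front selection)."""
    scored = sorted(candidates,
                    key=lambda net: np.mean(net.predict(X_val) == y_val),
                    reverse=True)
    return scored[:n]

def ensemble_predict(members, X):
    """Combine member outputs by plain majority vote (one assumed rule)."""
    votes = np.stack([m.predict(X) for m in members])   # (n_members, n_samples)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

# Usage with random data standing in for a benchmark such as Iris.
X = rng.normal(size=(150, 4))
y = rng.integers(0, 3, size=150)
population = [SmallHybridNet(4, 5, 3, rng.choice(list(TRANSFER_FUNCTIONS)))
              for _ in range(50)]        # candidate population (untrained here)
ensemble = top_n_ensemble(population, X, y, n=10)
print(ensemble_predict(ensemble, X)[:10])
```

In this sketch the only cost beyond evaluating each candidate once is a single sort of the population, which is what makes the Top-N scheme cheap compared with Pareto-front or hill-climbing selection.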

Keywords

Hybrid Neural Networks · Artificial Neural Networks · Transfer Function · Optimization · Pattern Recognition



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Abdullahi Adamu (1)
  • Tomas Maul (1)
  • Andrzej Bargiela (2)
  • Christopher Roadknight (2)
  1. University of Nottingham - Malaysia Campus, Semenyih, Malaysia
  2. University of Nottingham - UK Campus, Nottingham, United Kingdom
