Casting Random Forests as Artificial Neural Networks (and Profiting from It)

  • Johannes Welbl
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8753)

Abstract

While Artificial Neural Networks (ANNs) are highly expressive models, they are hard to train from limited data. Formalizing a connection between Random Forests (RFs) and ANNs allows exploiting the former to initialize the latter. Further parameter optimization within the ANN framework yields models that are intermediate between RF and ANN, and that outperform both on the majority of the UCI datasets used for benchmarking.
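The core construction can be sketched as follows: each hard tree split becomes a first-hidden-layer tanh unit, each leaf becomes a second-hidden-layer unit that fires only when all splits on its root-to-leaf path agree, and the output layer collects leaf class votes. This is a minimal, hypothetical illustration for a toy single-split tree, not the authors' code; the gain constant and function names are assumptions.

```python
import numpy as np

STRENGTH = 100.0  # large gain makes tanh approximate the hard step decision


def tree_as_ann(x, feature=0, threshold=0.5):
    """Toy tree 'if x[feature] < threshold: class 0 else class 1'
    rewritten as a two-hidden-layer tanh network (illustrative sketch)."""
    # Hidden layer 1: one unit per split node.
    # Activation ~ -1 if x[feature] < threshold (go left), ~ +1 otherwise.
    split = np.tanh(STRENGTH * (x[feature] - threshold))

    # Hidden layer 2: one unit per leaf; a leaf fires (~ +1) only when
    # every split on its root-to-leaf path points toward it.
    left_leaf = np.tanh(STRENGTH * (-split - 0.5))
    right_leaf = np.tanh(STRENGTH * (split - 0.5))

    # Output layer: rescale leaf activations from {-1, +1} to soft
    # indicators in [0, 1]; each leaf votes for its class.
    scores = np.array([(left_leaf + 1) / 2,   # left leaf -> class 0
                       (right_leaf + 1) / 2])  # right leaf -> class 1
    return int(np.argmax(scores))
```

Because every weight is an ordinary network parameter, the initialized model can then be refined by stochastic gradient descent, smoothing the hard splits into oblique, data-adapted decision boundaries.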

Keywords

Hidden Layer, Random Forest, Training Scheme, Stochastic Gradient Descent, Tree Split
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. Heidelberg Collaboratory for Image Processing, Heidelberg, Germany