Casting Random Forests as Artificial Neural Networks (and Profiting from It)
While Artificial Neural Networks (ANNs) are highly expressive models, they are hard to train from limited data. Formalizing a connection between Random Forests (RFs) and ANNs allows exploiting the former to initialize the latter. Further parameter optimization within the ANN framework yields models that are intermediate between RFs and ANNs and outperform both on the majority of the UCI datasets used for benchmarking.
Keywords: Hidden Layer · Random Forest · Training Scheme · Stochastic Gradient Descent · Tree Split
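To make the RF-to-ANN mapping in the abstract concrete, here is a minimal NumPy sketch of a standard way to encode a single decision tree as a two-hidden-layer network: split nodes become first-layer units, leaves become second-layer units, and leaf predictions become output weights. The specific tree, the steepness constants `c1` and `c2`, and all variable names are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

# A tiny regression tree, hard-coded for illustration (an assumed example):
#   if x[0] < 0.5:    -> leaf A (value 0.0)
#   elif x[1] < 0.3:  -> leaf B (value 1.0)
#   else:             -> leaf C (value 2.0)

# Hidden layer 1: one unit per split node; tanh(c1 * (x[f] - t)) is ~ -1
# when the sample goes left at that split and ~ +1 when it goes right.
W1 = np.array([[1.0, 0.0],    # split s0 tests feature 0
               [0.0, 1.0]])   # split s1 tests feature 1
b1 = np.array([-0.5, -0.3])   # negated split thresholds

# Hidden layer 2: one unit per leaf, wired to the splits on its root-to-leaf
# path (-1 = "must go left", +1 = "must go right"); the bias -(path_len - 1)
# makes the unit fire (~ +1) only when every routing decision on the path holds.
W2 = np.array([[-1.0,  0.0],   # leaf A: left at s0
               [ 1.0, -1.0],   # leaf B: right at s0, left at s1
               [ 1.0,  1.0]])  # leaf C: right at s0, right at s1
b2 = np.array([0.0, -1.0, -1.0])

leaf_values = np.array([0.0, 1.0, 2.0])

def forward(x, c1=10.0, c2=10.0):
    """Soft, differentiable emulation of the tree: large c1, c2 recover the
    hard tree routing; smaller values let gradients flow for fine-tuning."""
    a1 = np.tanh(c1 * (W1 @ x + b1))       # soft split decisions in [-1, 1]
    a2 = np.tanh(c2 * (W2 @ a1 + b2))      # soft leaf indicators in [-1, 1]
    return leaf_values @ (a2 + 1.0) / 2.0  # rescale indicators to [0, 1]

print(forward(np.array([0.2, 0.9])))  # ~0.0, tree routes to leaf A
print(forward(np.array([0.8, 0.1])))  # ~1.0, tree routes to leaf B
print(forward(np.array([0.8, 0.7])))  # ~2.0, tree routes to leaf C
```

A forest would average the outputs of one such subnetwork per tree. Because every layer is differentiable, the RF-initialized weights can then be refined by stochastic gradient descent, which is what yields the intermediate RF/ANN models described in the abstract.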