Ensemble Construction via Designed Output Distortion
This paper introduces a new technique for generating regression ensembles. It builds on earlier work that promotes model diversity by injecting noise into the output values, but differs from those methods in its strict requirement that the mean displacement applied to any data point's output value be exactly zero.
It is illustrated how even extremely large displacements can yield prediction accuracy superior to that achieved by bagging.
It is demonstrated how ensembles of models with very high bias may achieve much better prediction accuracy than single models of the same bias, contrary to the conventional belief that ensembling high-bias models serves no purpose.
Finally, it is outlined how the technique may be applied to classification.
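The core idea above can be sketched in code: for each training point, draw one displacement per ensemble member and centre them so they sum to exactly zero across the ensemble, then train each member on its distorted targets and average the predictions. The sketch below is an illustrative assumption, not the paper's implementation; the base regressor (a polynomial least-squares fit), the toy data, and the displacement scale `sigma` are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumed for illustration; not from the paper).
X = np.linspace(0, 2 * np.pi, 40).reshape(-1, 1)
y = np.sin(X).ravel()

M = 8          # ensemble size
sigma = 0.5    # displacement scale (the paper reports even large scales can help)

# Draw displacements for every point across the M members, then subtract
# each column's mean so the mean displacement applied to any data point's
# output value is exactly zero -- the paper's defining constraint.
D = rng.normal(0.0, sigma, size=(M, len(y)))
D -= D.mean(axis=0)

def fit_poly(X, y, degree=5):
    """Least-squares polynomial fit; a stand-in for any base regressor."""
    A = np.vander(X.ravel(), degree + 1)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda Xq: np.vander(Xq.ravel(), degree + 1) @ coef

# Train one model per distorted copy of the outputs.
models = [fit_poly(X, y + d) for d in D]

def ensemble_predict(Xq):
    """Average the member predictions."""
    return np.mean([m(Xq) for m in models], axis=0)
```

Because the displacements for each point cancel across members, the distortion adds diversity without shifting the ensemble's average target, unlike unconstrained output noise.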