Random Oracles for Regression Ensembles

  • Chapter
Ensembles in Machine Learning Applications

Part of the book series: Studies in Computational Intelligence ((SCI,volume 373))

Abstract

This paper considers the use of Random Oracles in ensembles for regression tasks. A Random Oracle model (Kuncheva and Rodríguez, 2007) consists of a pair of models and a fixed, randomly created “oracle”; in the case of the Linear Random Oracle, the oracle is a hyperplane that divides the dataset in two during training and, once the ensemble is trained, decides which of the two models to use for each instance. Random Oracles can serve as the base model for any ensemble method, and they have previously been used for classification. Here, their use for regression is studied on 61 datasets, with Regression Trees as base models and several ensemble methods: Bagging, Random Subspaces, AdaBoost.R2 and Iterated Bagging. For all the considered methods and variants, the ensembles with Random Oracles outperform the corresponding versions without the Oracles.
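The Linear Random Oracle described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the hyperplane is taken as the perpendicular bisector of two randomly chosen training points (following Kuncheva and Rodríguez, 2007), and a trivial mean predictor stands in for the Regression Trees used in the paper; `MeanRegressor` and the class names are hypothetical.

```python
import random

def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(ui * vi for ui, vi in zip(u, v))

class MeanRegressor:
    """Trivial base model: predicts the mean target of its training split.
    Stands in for the Regression Trees used in the paper."""
    def fit(self, X, y):
        self.mean_ = sum(y) / len(y) if y else 0.0
        return self

    def predict(self, x):
        return self.mean_

class LinearRandomOracle:
    """A pair of base models plus a fixed random hyperplane that routes
    each instance to one of them, both at training and prediction time."""
    def __init__(self, base_model=MeanRegressor, seed=None):
        self.base_model = base_model
        self.rng = random.Random(seed)

    def fit(self, X, y):
        # Hyperplane = perpendicular bisector of two random training points:
        # normal w = a - b, offset c = w . midpoint(a, b).
        a, b = self.rng.sample(X, 2)
        self.w = [ai - bi for ai, bi in zip(a, b)]
        self.c = dot(self.w, [(ai + bi) / 2 for ai, bi in zip(a, b)])
        # Split the training data by the hyperplane's side.
        left_X, left_y, right_X, right_y = [], [], [], []
        for xi, yi in zip(X, y):
            if dot(self.w, xi) >= self.c:
                left_X.append(xi); left_y.append(yi)
            else:
                right_X.append(xi); right_y.append(yi)
        # Train one model per side; the oracle stays fixed afterwards.
        self.left = self.base_model().fit(left_X, left_y)
        self.right = self.base_model().fit(right_X, right_y)
        return self

    def predict(self, x):
        # The oracle, not an average, decides which model answers.
        side = self.left if dot(self.w, x) >= self.c else self.right
        return side.predict(x)
```

An instance of `LinearRandomOracle` can then be used as the base model inside Bagging, Random Subspaces, AdaBoost.R2 or Iterated Bagging, so each ensemble member carries its own independent random split.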

This work was supported by the Project TIN2008-03151 of the Spanish Ministry of Education and Science.



References

  1. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)

  2. Breiman, L.: Using iterated bagging to debias regressions. Machine Learning 45, 261–277 (2001)

  3. Demšar, J.: Statistical comparisons of classifiers over multiple data sets. J. Machine Learning Research 7, 1–30 (2006)

  4. Dietterich, T.G.: Approximate statistical tests for comparing supervised classification learning algorithms. Neural Comp. 10, 1895–1923 (1998)

  5. Drucker, H.: Improving regressors using boosting techniques. In: Fisher, D.H. (ed.) Proc. the 14th Int. Conf. Machine Learning, Nashville, TN, pp. 107–115. Morgan Kaufmann, San Francisco (1997)

  6. Elomaa, T., Kääriäinen, M.: An analysis of reduced error pruning. J. Artif. Intell. Research 15, 163–187 (2001)

  7. Freund, Y., Schapire, R.E.: Experiments with a new boosting algorithm. In: Saitta, L. (ed.) Proc. the 13th Int. Conf. Machine Learning, Bari, Italy, pp. 148–156. Morgan Kaufmann, San Francisco (1996)

  8. Freund, Y., Schapire, R.E.: A decision-theoretic generalization of on-line learning and an application to boosting. J. Comp. and Syst. Sciences 55, 119–139 (1997)

  9. Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., Witten, I.H.: The WEKA data mining software: An update. SIGKDD Explorations 11, 10–18 (2009)

  10. Ho, T.K.: The random subspace method for constructing decision forests. IEEE Trans. Pattern Analysis and Machine Intell. 20, 832–844 (1998)

  11. Kuncheva, L.I., Whitaker, C.J.: Measures of diversity in classifier ensembles. Machine Learning 51, 181–207 (2003)

  12. Kuncheva, L.I.: Combining pattern classifiers: Methods and algorithms. John Wiley & Sons, Hoboken (2004)

  13. Kuncheva, L.I., Rodríguez, J.J.: Classifier ensembles with a random linear oracle. IEEE Trans. Knowledge and Data Engineering 19, 500–508 (2007)

  14. Margineantu, D.D., Dietterich, T.G.: Pruning adaptive boosting. In: Fisher, D.H. (ed.) Proc. 14th Int. Conf. Machine Learning, Nashville, TN, pp. 211–218. Morgan Kaufmann, San Francisco (1997)

  15. Rodríguez, J.J., García-Osorio, C., Maudes, J.: Forests of nested dichotomies. Pattern Recognition Letters 31, 125–132 (2010)

  16. Shrestha, D.L., Solomatine, D.P.: Experiments with AdaBoost.RT: An improved boosting scheme for regression. Neural Comp. 18, 1678–1710 (2006)

  17. Suen, Y., Melville, P., Mooney, R.: Combining bias and variance reduction techniques for regression trees. In: Gama, J., Camacho, R., Brazdil, P.B., Jorge, A.M., Torgo, L. (eds.) ECML 2005. LNCS (LNAI), vol. 3720, pp. 741–749. Springer, Heidelberg (2005)

  18. Wang, Y., Witten, I.H.: Induction of model trees for predicting continuous classes. In: van Someren, M., Widmer, G. (eds.) ECML 1997. LNCS, vol. 1224, pp. 128–137. Springer, Heidelberg (1997)

  19. Witten, I.H., Frank, E.: Data mining: Practical machine learning tools and techniques. Morgan Kaufmann, San Francisco (2005)

  20. Zhang, C., Zhang, J., Wang, G.: An empirical study of using Rotation Forest to improve regressors. Applied Mathematics and Comp. 195, 618–629 (2008)

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Pardo, C., Rodríguez, J.J., Díez-Pastor, J.F., García-Osorio, C. (2011). Random Oracles for Regression Ensembles. In: Okun, O., Valentini, G., Re, M. (eds) Ensembles in Machine Learning Applications. Studies in Computational Intelligence, vol 373. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-22910-7_11

  • DOI: https://doi.org/10.1007/978-3-642-22910-7_11

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-22909-1

  • Online ISBN: 978-3-642-22910-7

  • eBook Packages: Engineering, Engineering (R0)
