
An Extended Study of the Discriminant Random Forest

Chapter in: Data Mining

Part of the book series: Annals of Information Systems (AOIS, volume 8)

Abstract

Classification technologies have become increasingly vital to information analysis systems that rely upon collected data to make predictions or informed decisions. Many approaches have been developed, but one of the most successful in recent times is the random forest. The discriminant random forest is a novel extension of the random forest classification methodology that leverages linear discriminant analysis to perform multivariate node splitting during tree construction. An extended study of the discriminant random forest is presented, showing that its individual classifiers are stronger and more diverse than their random forest counterparts, yielding statistically significant reductions in classification error of up to 79.5%. Moreover, empirical tests suggest that this approach is computationally less costly in terms of both memory and runtime. Further enhancements of the methodology are investigated that exhibit significant performance improvements and greater stability at low false alarm rates.
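
The splitting strategy described above, replacing the usual single-feature threshold split at each tree node with a split along a linear discriminant direction computed from a random subset of features, can be illustrated with a short example. The following is a minimal sketch, not the authors' implementation: the function name, the feature-subset size, the use of scikit-learn's LinearDiscriminantAnalysis, and the midpoint threshold between projected class means are all assumptions made for illustration.

```python
# Minimal sketch of an LDA-based node split in a "discriminant forest"-style tree.
# NOTE: illustrative assumption only; this is NOT the chapter's reference code.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def lda_node_split(X, y, n_sub_features, rng):
    """Project a random feature subset onto the Fisher discriminant direction
    and split the node at the midpoint of the projected class means."""
    n_features = X.shape[1]
    subset = rng.choice(n_features, size=min(n_sub_features, n_features), replace=False)

    # Fit a two-class linear discriminant on the sampled feature subset.
    lda = LinearDiscriminantAnalysis()
    lda.fit(X[:, subset], y)

    # Project the node's samples onto the discriminant direction.
    z = lda.transform(X[:, subset]).ravel()

    # Threshold halfway between the projected class means (one simple choice).
    means = [z[y == c].mean() for c in np.unique(y)]
    threshold = float(np.mean(means))

    left_mask = z <= threshold
    return subset, lda, threshold, left_mask


# Toy usage: split one synthetic two-class node.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(50, 5)),
               rng.normal(1.5, 1.0, size=(50, 5))])
y = np.array([0] * 50 + [1] * 50)
subset, lda, threshold, left_mask = lda_node_split(X, y, n_sub_features=3, rng=rng)
print(f"features used: {subset}, threshold: {threshold:.3f}, "
      f"left size: {left_mask.sum()}, right size: {(~left_mask).sum()}")
```

In a full tree, such a split would be chosen at every internal node (typically by optimizing an impurity criterion over candidate feature subsets), and many such trees trained on bootstrap samples would be aggregated into the forest.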



Acknowledgments

This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

Author information


Correspondence to Tracy D. Lemmond, Barry Y. Chen, Andrew O. Hatch, or William G. Hanley.


Copyright information

© 2010 Springer Science+Business Media, LLC

About this chapter

Cite this chapter

Lemmond, T.D., Chen, B.Y., Hatch, A.O., Hanley, W.G. (2010). An Extended Study of the Discriminant Random Forest. In: Stahlbock, R., Crone, S., Lessmann, S. (eds) Data Mining. Annals of Information Systems, vol 8. Springer, Boston, MA. https://doi.org/10.1007/978-1-4419-1280-0_6


  • DOI: https://doi.org/10.1007/978-1-4419-1280-0_6

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-1-4419-1279-4

  • Online ISBN: 978-1-4419-1280-0

  • eBook Packages: Computer Science, Computer Science (R0)
