
A Brief Survey on Random Forest Ensembles in Classification Model

  • Conference paper
International Conference on Innovative Computing and Communications

Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 56)

Abstract

Machine learning has gained popularity in recent times. Among machine learning methods, the decision tree is one of the most sought-after algorithms for classifying or predicting future instances from an already trained data set. Random Forest is an extension of the decision tree that predicts future instances using multiple classifiers rather than a single classifier, improving the accuracy and correctness of the prediction. The performance of the Random Forest model is examined and compared with other classification models with respect to standardization, regularization, correlation, the bias–variance trade-off, and feature selection on the learning models. We also incorporate principled projection strategies that aid in predicting future values. Ensemble techniques are machine learning techniques in which more than one learner is constructed for a given task; the ultimate aim of ensemble methods is to achieve high accuracy with greater performance. Ensembles take a different approach than a single classifier to model the data: multiple base learners are constructed, and their individual predictions are combined by some voting strategy. In the current study, we outline the concept of Random Forest ensembles in classification.
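The idea sketched in the abstract, combining many decision trees by voting instead of relying on one tree, can be illustrated with a short script. This is a minimal sketch, not the authors' experimental setup: it assumes scikit-learn is available and uses a synthetic dataset with illustrative parameters.

```python
# Sketch: a single decision tree vs. a random-forest ensemble on
# synthetic data. All dataset parameters here are illustrative, not
# taken from the paper.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic two-class problem: 1000 samples, 20 features, 8 informative.
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Single classifier: one unpruned decision tree.
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# Ensemble: 100 randomized trees whose predictions are combined by
# (soft) majority voting inside RandomForestClassifier.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

tree_acc = tree.score(X_te, y_te)
forest_acc = forest.score(X_te, y_te)
print(f"single tree: {tree_acc:.3f}  random forest: {forest_acc:.3f}")
```

On data like this the forest typically outperforms the single tree, which is the point the abstract makes about multiple classifiers versus a single classifier.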



Author information


Corresponding author

Correspondence to Sujatha Srinivasan.


Copyright information

© 2019 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Shaik, A.B., Srinivasan, S. (2019). A Brief Survey on Random Forest Ensembles in Classification Model. In: Bhattacharyya, S., Hassanien, A., Gupta, D., Khanna, A., Pan, I. (eds) International Conference on Innovative Computing and Communications. Lecture Notes in Networks and Systems, vol 56. Springer, Singapore. https://doi.org/10.1007/978-981-13-2354-6_27
