Abstract
Machine learning has gained popularity in recent times. Among machine learning methods, the decision tree is one of the most widely used algorithms for classifying or predicting future instances from an already trained data set. Random Forest is an extension of the decision tree that predicts future instances using multiple classifiers rather than a single classifier, improving the accuracy and correctness of the prediction. The performance of the Random Forest model is explored and compared with other classification models with respect to standardization, regularization, correlation, the bias-variance trade-off, and feature selection on the learning models. We incorporate principled projection strategies that aid in predicting future values. Ensemble techniques are machine learning techniques in which more than one learner is constructed for a given task. The ultimate aim of ensemble methods is to achieve high accuracy with greater performance. Ensembles take a different approach from a single classifier to model the data: several individual learners are constructed and combined according to some voting strategy. In the current study, we outline the concept of Random Forest ensembles in classification.
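The abstract's core idea, combining many weak learners trained on resampled data and merging their predictions by majority vote, can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it bags depth-one decision trees ("stumps") in plain Python, with all data and function names chosen here for the example.

```python
import random
from collections import Counter

def train_stump(X, y):
    """Pick the (feature, threshold) split that minimises training errors."""
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [yi for row, yi in zip(X, y) if row[f] <= t]
            right = [yi for row, yi in zip(X, y) if row[f] > t]
            if not left or not right:
                continue
            # majority class on each side of the split
            lc = Counter(left).most_common(1)[0][0]
            rc = Counter(right).most_common(1)[0][0]
            err = sum((row[f] <= t and yi != lc) or (row[f] > t and yi != rc)
                      for row, yi in zip(X, y))
            if err < best_err:
                best_err, best = err, (f, t, lc, rc)
    if best is None:  # degenerate sample: fall back to the majority class
        best = (0, float("inf"), Counter(y).most_common(1)[0][0], None)
    return best

def stump_predict(stump, row):
    f, t, lc, rc = stump
    return lc if row[f] <= t else rc

def bagged_forest(X, y, n_trees=25, seed=0):
    """Draw one bootstrap sample per tree and fit a stump to each."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        stumps.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))
    return stumps

def forest_predict(stumps, row):
    """Combine the individual learners by majority vote."""
    votes = Counter(stump_predict(s, row) for s in stumps)
    return votes.most_common(1)[0][0]

# Toy data: class 1 whenever the first feature exceeds 5.
X = [[1, 3], [2, 7], [4, 1], [6, 2], [8, 9], [9, 4]]
y = [0, 0, 0, 1, 1, 1]
forest = bagged_forest(X, y)
print(forest_predict(forest, [7, 5]))
```

A full Random Forest additionally restricts each split to a random subset of features and grows deeper trees, but the bootstrap-plus-voting structure shown here is the same.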
© 2019 Springer Nature Singapore Pte Ltd.
Cite this paper
Shaik, A.B., Srinivasan, S. (2019). A Brief Survey on Random Forest Ensembles in Classification Model. In: Bhattacharyya, S., Hassanien, A., Gupta, D., Khanna, A., Pan, I. (eds) International Conference on Innovative Computing and Communications. Lecture Notes in Networks and Systems, vol 56. Springer, Singapore. https://doi.org/10.1007/978-981-13-2354-6_27
Print ISBN: 978-981-13-2353-9
Online ISBN: 978-981-13-2354-6