Abstract
In this chapter, we present ETL committee, an ensemble method that uses ETL as the base learner. The ETL committee strategy relies on training data manipulation to create an ensemble of ETL classifiers. ETL committee combines the main ideas of bagging and random subspaces. From bagging, we borrow the bootstrap sampling method. From random subspaces, we use the feature sampling idea. In the ETL committee training, we use ETL with template sampling, which provides an additional randomization step. As far as we know, this is the first study that uses transformation rule learning as the base learner for an ensemble method. This chapter is organized as follows. In Sect. 3.1, we explain the main idea behind ensemble methods. In Sect. 3.2, we detail the ETL committee training phase. In Sect. 3.3, we detail the classification phase. Finally, in Sect. 3.4, we discuss related work.
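The combination of bagging and random subspaces described above can be sketched as follows. This is a minimal illustration, not the chapter's actual algorithm: the function and parameter names (`train_committee`, `feature_frac`) are invented for this sketch, the base learner is left abstract, and ETL's template-sampling step, which happens inside the base learner, is only noted in a comment.

```python
import random
from collections import Counter

def train_committee(examples, features, n_members, feature_frac=0.7, seed=0):
    """Sketch of ensemble training in the ETL-committee style: each member
    is built from a bootstrap sample of the data (bagging) restricted to a
    random subset of the input features (random subspaces)."""
    rng = random.Random(seed)
    members = []
    for _ in range(n_members):
        # Bagging: bootstrap sample, drawn with replacement, same size as the data.
        sample = [rng.choice(examples) for _ in examples]
        # Random subspaces: each member sees only a random feature subset.
        k = max(1, int(feature_frac * len(features)))
        subset = rng.sample(features, k)
        # A real implementation would now train an ETL classifier on
        # (sample, subset); ETL itself adds a third randomization step
        # by sampling the rule templates it considers.
        members.append((sample, subset))
    return members

def majority_vote(predictions):
    """Combine the member classifiers' outputs by unweighted majority voting."""
    return Counter(predictions).most_common(1)[0][0]
```

For example, a ten-example training set with three features yields a committee whose members each train on ten bootstrapped examples and two of the three features.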
Copyright information
© 2012 The Author(s)
Cite this chapter
dos Santos, C.N., Milidiú, R.L. (2012). ETL Committee. In: Entropy Guided Transformation Learning: Algorithms and Applications. SpringerBriefs in Computer Science. Springer, London. https://doi.org/10.1007/978-1-4471-2978-3_3
Print ISBN: 978-1-4471-2977-6
Online ISBN: 978-1-4471-2978-3