Several Speed-Up Variants of Cascade Generalization
Cascade generalization sequentially composes different classification methods into a single framework. However, each later classification method must cope with a larger attribute set. As a result, its learning process slows down, especially on data sets with many class labels and for learning algorithms whose computational complexity grows quickly with the number of attributes. This paper proposes several variants of the original cascade generalization, whose basic idea is to reduce the number of augmented attributes at each step of the Cascade framework. Extensive experimental results show that, as expected, all the variants are much faster than the original method. In addition, all the variants achieve slightly lower error rates than the original Cascade framework.
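The augmentation step at the heart of the framework can be sketched as follows. This is a minimal illustration, not the paper's implementation: the nearest-centroid base learner and the helper names (`centroid_probs`, `cascade_augment`) are hypothetical stand-ins. The point is that the base learner's class-probability outputs are appended to the original attributes, so the next-level learner sees one extra attribute per class label, which is exactly the growth the proposed variants aim to curb.

```python
import numpy as np

def centroid_probs(X_train, y_train, X):
    # Hypothetical base learner: nearest centroid, with a softmax over
    # negative distances as a crude class-probability estimate.
    classes = np.unique(y_train)
    centroids = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    e = np.exp(-d)
    return e / e.sum(axis=1, keepdims=True)

def cascade_augment(X_train, y_train, X):
    # Cascade generalization step: append the base learner's
    # class-probability outputs as new attributes for the next learner.
    probs = centroid_probs(X_train, y_train, X)
    return np.hstack([X, probs])

# Toy data: two Gaussian blobs, one per class.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.5, size=(20, 2)),
               rng.normal(2.0, 0.5, size=(20, 2))])
y = np.array([0] * 20 + [1] * 20)

X_ext = cascade_augment(X, y, X)
print(X.shape, "->", X_ext.shape)  # attribute set grows by one column per class
```

With k class labels, the augmented set gains k attributes per cascaded base classifier, which is why data sets with many class labels slow the later learners most.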
Keywords: Class Label, Classification Algorithm, Base Classifier, Continuous Attribute, Nominal Attribute