Several Speed-Up Variants of Cascade Generalization

  • Zhipeng Xie
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4223)


Cascade generalization sequentially composes different classification methods into a single framework. However, each subsequent classification method has to cope with a larger attribute set. As a result, its learning process slows down, especially on data sets with many class labels and for learning algorithms whose computational complexity grows quickly with the number of attributes. This paper proposes several variants of the original cascade generalization, whose basic idea is to reduce the number of attributes added at each step of the Cascade framework. Extensive experimental results show that all the variants are much faster than the original, as expected. In addition, all the variants achieve a slight reduction in error rate compared with the original Cascade framework.
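The augmentation step described above can be sketched as follows. This is a minimal illustration assumed from the paper's description, not the authors' implementation: a level-1 classifier's class-probability outputs are appended to the original attributes before the level-2 learner is trained, and a speed-up variant keeps only a subset of the probability columns. The toy centroid-based base classifier and the `k` parameter are illustrative assumptions.

```python
import numpy as np

def centroid_proba(X_train, y_train, X):
    """Toy base classifier (assumption, for illustration only):
    softmax over negative distances to per-class centroids."""
    classes = np.unique(y_train)
    centroids = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    e = np.exp(-d - (-d).max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)   # shape (n_samples, n_classes)

def cascade_augment(X_train, y_train, X, k=None):
    """Append base-classifier class probabilities as new attributes.
    With c classes the original Cascade adds c attributes per example;
    a speed-up variant (hypothetical selection rule here) keeps only k columns."""
    P = centroid_proba(X_train, y_train, X)
    if k is not None:
        top = np.argsort(P.mean(axis=0))[::-1][:k]  # keep k most probable classes
        P = P[:, top]
    return np.hstack([X, P])

X = np.array([[0., 0.], [0., 1.], [5., 5.], [5., 6.]])
y = np.array([0, 0, 1, 1])
X_full = cascade_augment(X, y, X)        # 2 original + 2 probability attributes
X_fast = cascade_augment(X, y, X, k=1)   # variant: only 1 added attribute
```

The level-2 learner would then be trained on `X_full` (original Cascade) or on the smaller `X_fast` (a variant), which is where the speed-up comes from when the number of class labels is large.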


Keywords: Class Label · Classification Algorithm · Base Classifier · Continuous Attribute · Nominal Attribute




  1. Blake, C.L., Merz, C.J.: UCI repository of machine learning databases. University of California, Irvine, CA (1998)
  2. Gama, J., Brazdil, P.: Cascade Generalization. Machine Learning 41, 315–343 (2000)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Zhipeng Xie (1)
  1. Department of Computing and Information Technology, Fudan University, Shanghai, China
