Embedding Random Projections in Regularized Gradient Boosting Machines

Part of the Studies in Computational Intelligence book series (SCI, volume 373)


Random Projections are a suitable technique for dimensionality reduction in Machine Learning. In this work, we propose a novel Boosting technique based on embedding Random Projections in a regularized gradient boosting ensemble. Random Projections are studied from different points of view: pure Random Projections, normalized, and uniform binary. Furthermore, we study the effect of keeping or changing the dimensionality of the data space. Experimental results on synthetic and UCI datasets show that Boosting methods with embedded random data projections are competitive with AdaBoost and Regularized Boosting.
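The three projection variants named in the abstract (pure Gaussian, normalized, and uniform binary) can be sketched as random projection matrices applied to the data. The following is a minimal illustrative sketch, not the authors' implementation; the function name, the `1/sqrt(k)` normalization, and the {-1, +1} binary choice are assumptions based on standard random-projection constructions:

```python
import numpy as np

def random_projection(X, k, kind="gaussian", seed=0):
    """Project n x d data X down (or up) to n x k columns with a random matrix.

    kind='gaussian'   : entries drawn i.i.d. from N(0, 1)  (pure projection)
    kind='normalized' : Gaussian entries scaled by 1/sqrt(k), which makes the
                        projection approximately distance-preserving
    kind='binary'     : entries drawn uniformly from {-1, +1}
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    if kind == "gaussian":
        R = rng.standard_normal((d, k))
    elif kind == "normalized":
        R = rng.standard_normal((d, k)) / np.sqrt(k)
    elif kind == "binary":
        R = rng.choice([-1.0, 1.0], size=(d, k))
    else:
        raise ValueError(f"unknown projection kind: {kind}")
    return X @ R
```

Choosing `k` equal to the input dimension corresponds to keeping the dimensionality of the data space, while `k < d` reduces it; both settings are compared in the experiments.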


Keywords: Regularization Parameter, Gaussian Mixture Model, Test Pattern, Random Projection, Dimensionality Reduction Technique
These keywords were added by machine and not by the authors.




Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  1. Computer Vision Center, Barcelona, Spain
  2. Dept. of Applied Mathematics and Analysis, Computer Vision Center, University of Barcelona, Barcelona, Spain
