Applying Least Angle Regression to ELM

  • Hang Shao
  • Nathalie Japkowicz
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7310)


Basic extreme learning machines apply the least-squares solution to compute the neural network's output weights. In the presence of outliers and multi-collinearity, however, the least-squares solution becomes unreliable. To address this problem, a new kind of extreme learning machine is proposed. An outlier detection technique is introduced to locate outliers and limit their interference, and the least-squares solution is replaced by regularization for the output-weight calculation, during which the number of hidden nodes is also chosen automatically. Simulation results show that the proposed model achieves good prediction performance both on normal datasets and on datasets contaminated by outliers.
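The contrast described above can be sketched in a few lines. The snippet below is an illustrative toy, not the authors' implementation: it builds a basic ELM (random, untrained hidden weights; sigmoid activation) and computes output weights two ways. The least-squares solution uses the Moore-Penrose pseudoinverse; a ridge-style regularized solve stands in for the paper's LARS-based scheme, since it likewise stays well-posed when the hidden-layer matrix is ill-conditioned (e.g. under multi-collinearity). All names and parameter values are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (illustrative, not from the paper's experiments).
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = np.sin(3 * X).ravel() + 0.05 * rng.standard_normal(200)

L = 50                                     # number of hidden nodes (assumed)
W = rng.standard_normal((X.shape[1], L))   # random input weights, never trained
b = rng.standard_normal(L)                 # random hidden biases
H = 1.0 / (1.0 + np.exp(-(X @ W + b)))     # hidden-layer output matrix (sigmoid)

# Basic ELM: output weights via the least-squares (pseudoinverse) solution.
beta_ls = np.linalg.pinv(H) @ y

# Regularized alternative (ridge-style stand-in for the LARS-based approach):
# the penalty lam keeps H^T H + lam*I invertible even when H's columns are
# highly correlated.
lam = 1e-2
beta_reg = np.linalg.solve(H.T @ H + lam * np.eye(L), H.T @ y)

rmse = float(np.sqrt(np.mean((H @ beta_reg - y) ** 2)))
print("train RMSE (regularized):", rmse)
```

A LARS or Lasso solver would additionally drive many entries of the output-weight vector to exactly zero, which is how the hidden-node count can be selected automatically; the ridge solve above only shrinks the weights.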


Keywords: Machine learning · Extreme learning machine · Least angle regression · Outlier detection · Robustness





Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Hang Shao
  • Nathalie Japkowicz
  1. School of Electrical and Computer Engineering, University of Ottawa, Ottawa, Canada
