
Boosting the partial least square algorithm for regression modelling

Published in: Journal of Control Theory and Applications

Abstract

Boosting algorithms are a class of general methods for improving the generalization performance of regression models. The main idea is to maintain a distribution over the training set. In order to use this distribution directly, a modified partial least squares (PLS) algorithm is proposed and used as the base learner for nonlinear multivariate regression problems. Experiments on gasoline octane number prediction demonstrate that boosting the modified PLS algorithm achieves better generalization performance than the standard PLS algorithm.
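The abstract only outlines the approach. As a rough illustration of the loop it describes (maintaining a distribution over the training set and feeding it directly to a PLS base learner), the following is a minimal sketch in Python, assuming an AdaBoost.R2-style reweighting scheme (Drucker, 1997) and approximating a weighted PLS fit by rescaling the training rows with the square roots of the sample weights. The paper's actual modified PLS algorithm is not reproduced here, and all function names and parameters are illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression


def boosted_pls_fit(X, y, n_rounds=20, n_components=3):
    """AdaBoost.R2-style boosting loop with a (row-reweighted) PLS base learner."""
    n = len(y)
    w = np.full(n, 1.0 / n)                  # distribution over the training set
    models, betas = [], []
    for _ in range(n_rounds):
        # Use the distribution directly: rescaling rows by sqrt(weight) makes the
        # PLS fit approximate a weighted least-squares criterion.
        s = np.sqrt(w * n)
        pls = PLSRegression(n_components=n_components)
        pls.fit(X * s[:, None], y * s)
        err = np.abs(y - pls.predict(X).ravel())
        loss = err / (err.max() + 1e-12)     # linear loss, scaled to [0, 1]
        eps = float(np.sum(w * loss))        # weighted average loss
        if eps >= 0.5:                       # base learner too weak; stop boosting
            break
        beta = eps / (1.0 - eps)
        models.append(pls)
        betas.append(beta)
        w = w * beta ** (1.0 - loss)         # shrink weights of well-fitted samples
        w = w / w.sum()
    return models, np.asarray(betas)


def boosted_pls_predict(models, betas, X):
    """Combine base predictions by the weighted median, as in AdaBoost.R2."""
    preds = np.column_stack([m.predict(X).ravel() for m in models])
    logw = np.log(1.0 / betas)
    order = np.argsort(preds, axis=1)
    cum = np.cumsum(logw[order], axis=1)
    pick = np.argmax(cum >= 0.5 * logw.sum(), axis=1)
    idx = order[np.arange(len(preds)), pick]
    return preds[np.arange(len(preds)), idx]
```

For comparison, a resampling-based variant of the same scheme is available as scikit-learn's AdaBoostRegressor, which can wrap a PLSRegression base estimator; the sketch above instead passes the weight distribution to the base learner directly, which is the idea the abstract emphasizes.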



Author information


Additional information

This work was supported by the National High-tech Research and Development Program of China (No. 2003AA412110).

Ling YU was born in 1974. She received the M.S. degree in control theory and applications from Zhejiang University in 1998, and is currently an in-service doctoral candidate at Zhejiang University. Her research interests are in the areas of nonlinear identification and multivariate regression analysis.

Tiejun WU was born in 1950. He received the Ph.D. degree in engineering from Zhejiang University in 1988. He is currently a professor at Zhejiang University. His research interests focus on intelligent control and optimization of complex systems.


Cite this article

Yu, L., Wu, T. Boosting the partial least square algorithm for regression modelling. J. Control Theory Appl. 4, 257–260 (2006). https://doi.org/10.1007/s11768-006-5231-z
