A boosting algorithm for regression
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1327)
Part I: Coding and Learning in Biology
A new boosting algorithm, ADABOOST-RΔ, for regression problems is presented, and an upper bound on its error is obtained. Experimental results comparing ADABOOST-RΔ with other learning algorithms are given.
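The details of ADABOOST-RΔ are in the full paper, not on this page. As a hedged illustration of the general boosting-for-regression scheme the abstract refers to, the sketch below implements the standard AdaBoost.R2-style loop (Drucker's well-known variant, used here only as a stand-in): each round fits a weighted base regressor, measures per-example losses scaled to [0, 1], reweights hard examples, and combines rounds by a weighted median. All function names here are illustrative, not from the paper.

```python
import math

def weighted_mean(vals, ws):
    return sum(v * w for v, w in zip(vals, ws)) / sum(ws)

def fit_stump(x, y, w):
    """Weighted regression stump: one threshold split, weighted mean per side."""
    best_err, best = float("inf"), None
    for t in sorted(set(x))[:-1]:
        left = [i for i in range(len(x)) if x[i] <= t]
        right = [i for i in range(len(x)) if x[i] > t]
        ml = weighted_mean([y[i] for i in left], [w[i] for i in left])
        mr = weighted_mean([y[i] for i in right], [w[i] for i in right])
        err = sum(w[i] * (y[i] - (ml if x[i] <= t else mr)) ** 2
                  for i in range(len(x)))
        if err < best_err:
            best_err, best = err, (t, ml, mr)
    t, ml, mr = best
    return lambda v: ml if v <= t else mr

def adaboost_r2(x, y, rounds=20):
    """AdaBoost.R2-style boosting for regression (illustrative sketch)."""
    n = len(x)
    w = [1.0 / n] * n
    models, betas = [], []
    for _ in range(rounds):
        h = fit_stump(x, y, w)
        abs_err = [abs(h(x[i]) - y[i]) for i in range(n)]
        d = max(abs_err) or 1.0
        loss = [e / d for e in abs_err]            # linear loss, scaled to [0, 1]
        eps = max(sum(w[i] * loss[i] for i in range(n)), 1e-12)
        if eps >= 0.5:                             # base learner too weak: stop
            break
        beta = eps / (1.0 - eps)
        models.append(h)
        betas.append(beta)
        if eps <= 1e-12:                           # (near-)perfect weighted fit
            break
        w = [w[i] * beta ** (1.0 - loss[i]) for i in range(n)]
        z = sum(w)
        w = [wi / z for wi in w]                   # renormalise weights

    def predict(v):
        # Weighted median of base predictions, weights log(1/beta).
        preds = sorted((m(v), math.log(1.0 / b)) for m, b in zip(models, betas))
        total = sum(c for _, c in preds)
        acc = 0.0
        for p, c in preds:
            acc += c
            if acc >= 0.5 * total:
                return p
        return preds[-1][0]
    return predict
```

This is a sketch under the assumption that ADABOOST-RΔ follows the same fit/reweight/combine template; the paper's actual loss function, update rule, and error bound may differ.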
© Springer-Verlag Berlin Heidelberg 1997