Abstract
Boosting methods are known to improve the generalization performance of learning algorithms by reducing both bias and variance, or by enlarging the margin of the resulting multi-classifier system. In this contribution we apply AdaBoost to the discrimination of different types of coffee using data produced with an electronic nose. Two groups of coffees (blends and monovarieties), each consisting of seven classes, were analyzed. The boosted ensemble of Multi-Layer Perceptrons halved the classification error on the blends data set and reduced it from 21% to 18% on the more difficult monovarieties data set.
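The paper boosts Multi-Layer Perceptron base learners on seven-class electronic-nose data. As a minimal sketch of the underlying AdaBoost reweighting scheme only, the toy example below substitutes binary labels and one-dimensional decision stumps for the MLPs; all function names and the interval-shaped data set are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def stump_fit(X, y, w):
    """Exhaustive search for the axis-aligned stump with lowest weighted error."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] <= thr, sign, -sign)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best[1:], best[0]

def stump_predict(stump, X):
    j, thr, sign = stump
    return np.where(X[:, j] <= thr, sign, -sign)

def adaboost(X, y, T=20):
    """AdaBoost with labels in {-1, +1}: reweight examples, collect weak learners."""
    n = len(y)
    w = np.full(n, 1.0 / n)            # start from a uniform distribution
    ensemble = []
    for _ in range(T):
        stump, err = stump_fit(X, y, w)
        if err >= 0.5:                 # weak learner no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred)  # up-weight misclassified points
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    """Weighted majority vote of the weak learners."""
    score = sum(a * stump_predict(s, X) for a, s in ensemble)
    return np.sign(score)

# Toy 1-D data: the positive class sits in an interval that no single stump can fit.
X = np.arange(10, dtype=float).reshape(-1, 1)
y = np.array([-1, -1, -1, 1, 1, 1, 1, -1, -1, -1])
ens = adaboost(X, y, T=5)
```

The boosted vote combines stumps that each err on one side of the interval, which is the same mechanism that lets the ensemble of MLPs in the paper outperform any single network.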
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Masulli, F., Pardo, M., Sberveglieri, G., Valentini, G. (2002). Boosting and Classification of Electronic Nose Data. In: Roli, F., Kittler, J. (eds) Multiple Classifier Systems. MCS 2002. Lecture Notes in Computer Science, vol 2364. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45428-4_26
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-43818-2
Online ISBN: 978-3-540-45428-1