Boosting and Classification of Electronic Nose Data

  • Francesco Masulli
  • Matteo Pardo
  • Giorgio Sberveglieri
  • Giorgio Valentini
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2364)

Abstract

Boosting methods are known to improve the generalization performance of learning algorithms, either by reducing both bias and variance or by enlarging the margin of the resulting multi-classifier system. In this contribution we applied AdaBoost to the discrimination of different types of coffee using data produced with an Electronic Nose. Two groups of coffees (blends and monovarieties), consisting of seven classes each, were analyzed. The boosted ensemble of Multi-Layer Perceptrons halved the classification error on the blends data and reduced it from 21% to 18% on the more difficult monovarieties data set.
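
To make the boosting scheme concrete, below is a minimal sketch of AdaBoost.M1 with MLP base learners in Python (using scikit-learn purely for illustration). It is not the authors' implementation: the network sizes, round count, and the weighted-resampling variant (a common way to boost neural networks, since plain MLP training takes no per-example weights) are all assumptions.

```python
# Illustrative AdaBoost.M1 with MLP base learners (a sketch, not the
# paper's actual setup). X, y are assumed to be numpy arrays.
import numpy as np
from sklearn.neural_network import MLPClassifier

def adaboost_m1(X, y, n_rounds=10, hidden=(8,), seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    w = np.full(n, 1.0 / n)                    # example weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        idx = rng.choice(n, size=n, p=w)       # resample by current weights
        clf = MLPClassifier(hidden_layer_sizes=hidden, max_iter=500)
        clf.fit(X[idx], y[idx])
        miss = clf.predict(X) != y
        eps = w[miss].sum()                    # weighted training error
        if eps == 0 or eps >= 0.5:             # AdaBoost.M1 stopping rule
            break
        alpha = np.log((1 - eps) / eps)        # this learner's vote weight
        w *= np.exp(alpha * miss)              # up-weight the mistakes
        w /= w.sum()
        learners.append(clf)
        alphas.append(alpha)
    return learners, alphas

def predict(learners, alphas, X, classes):
    votes = np.zeros((len(X), len(classes)))
    for clf, a in zip(learners, alphas):
        pred = clf.predict(X)
        for k, c in enumerate(classes):
            votes[pred == c, k] += a           # weighted majority vote
    return np.asarray(classes)[votes.argmax(axis=1)]
```

On data shaped like the paper's, `classes` would hold the seven coffee labels; the resampling step makes each new network focus on the examples its predecessors misclassified, which is the mechanism behind the error reductions reported above.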



Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Francesco Masulli (1, 2)
  • Matteo Pardo (3)
  • Giorgio Sberveglieri (3)
  • Giorgio Valentini (1, 4)
  1. INFM, Istituto Nazionale per la Fisica della Materia, Genova, Italy
  2. Dipartimento di Informatica, Università di Pisa, Pisa, Italy
  3. INFM and Dipartimento di Chimica e Fisica, Brescia, Italy
  4. DISI, Dipartimento di Informatica e Scienze dell'Informazione, Università di Genova, Genova, Italy
