
Boosting and Classification of Electronic Nose Data

  • Conference paper
Multiple Classifier Systems (MCS 2002)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2364)

Abstract

Boosting methods are known to improve the generalization performance of learning algorithms, both by reducing bias and variance and by enlarging the margin of the resulting multi-classifier system. In this contribution we applied AdaBoost to the discrimination of different types of coffee using data produced with an electronic nose. Two groups of coffees (blends and monovarieties), each consisting of seven classes, were analyzed. The boosted ensemble of Multi-Layer Perceptrons halved the classification error on the blends data and reduced it from 21% to 18% on the more difficult monovarieties data set.
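The paper boosts Multi-Layer Perceptrons; the boosting scheme itself (AdaBoost.M1: reweight examples after each round, combine weak learners by weighted vote) can be sketched independently of the base learner. The following is a minimal illustration, assuming decision stumps instead of MLPs and a toy one-dimensional binary problem, not the authors' actual experimental setup:

```python
import numpy as np

def stump_train(X, y, w):
    """Exhaustive search for the best weighted 1-feature threshold classifier."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] <= thr, sign, -sign)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best

def adaboost_m1(X, y, T=10):
    """AdaBoost.M1 with decision stumps as weak learners; labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # uniform initial example weights
    ensemble = []
    for _ in range(T):
        err, j, thr, sign = stump_train(X, y, w)
        if err >= 0.5:                           # weak-learning condition violated
            break
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)    # weight of this weak classifier
        pred = np.where(X[:, j] <= thr, sign, -sign)
        w *= np.exp(-alpha * y * pred)           # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, j, thr, sign))
    return ensemble

def predict(ensemble, X):
    """Weighted-majority vote of the weak classifiers."""
    score = np.zeros(len(X))
    for alpha, j, thr, sign in ensemble:
        score += alpha * np.where(X[:, j] <= thr, sign, -sign)
    return np.sign(score)

# Toy 1-D problem: the positive class lies in the middle of the range,
# so no single stump can separate it, but a boosted ensemble can.
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([-1, 1, 1, 1, -1])
ensemble = adaboost_m1(X, y, T=5)
```

In the paper's setting the weak learner is an MLP trained on the reweighted (or resampled) data, but the round structure (weighted error, `alpha`, multiplicative reweighting) is identical.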






Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Masulli, F., Pardo, M., Sberveglieri, G., Valentini, G. (2002). Boosting and Classification of Electronic Nose Data. In: Roli, F., Kittler, J. (eds) Multiple Classifier Systems. MCS 2002. Lecture Notes in Computer Science, vol 2364. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45428-4_26


  • DOI: https://doi.org/10.1007/3-540-45428-4_26


  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43818-2

  • Online ISBN: 978-3-540-45428-1

  • eBook Packages: Springer Book Archive
