
Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3944)

Included in the following conference series: Machine Learning Challenges Workshop (MLCW)

Abstract

I submitted entries for the two classification problems — “Catalysis” and “Gatineau” — in the Evaluating Predictive Uncertainty Challenge. My entry for Catalysis was the best one; my entry for Gatineau was the third best, behind two similar entries by Nitesh Chawla.

The Catalysis dataset was later revealed to be about predicting a property of yeast proteins from expression levels of the genes encoding them. The nature of the Gatineau dataset has not been revealed, for proprietary reasons. The two datasets are similar in number of input variables that are available for predicting the binary outcome (617 for Catalysis, 1092 for Gatineau). They differ substantially in the number of cases available for training (1173 for Catalysis, 5176 for Gatineau) and in the fractions of cases that are in the two classes (43%/57% for Catalysis, 9%/91% for Gatineau).
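The method named in the title is Bayesian neural network classification, in which predictive probabilities are obtained by averaging a network's outputs over a posterior distribution on its weights. The sketch below is purely illustrative and is not the software, priors, or sampler used for the challenge entries: it assumes a tiny one-hidden-layer network with Gaussian priors, synthetic placeholder data, and a simple random-walk Metropolis sampler (Neal's own work uses more sophisticated Markov chain Monte Carlo methods).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary-classification data (placeholder for the challenge datasets).
n, d, h = 200, 5, 8                      # cases, inputs, hidden units (assumed sizes)
X = rng.normal(size=(n, d))
y = (X @ rng.normal(size=d) + 0.5 * rng.normal(size=n) > 0).astype(float)

n_params = d * h + h + h + 1             # W1, b1, W2, b2 flattened into one vector

def unpack(theta):
    """Split the flat parameter vector into the network's weights and biases."""
    i = 0
    W1 = theta[i:i + d * h].reshape(d, h); i += d * h
    b1 = theta[i:i + h]; i += h
    W2 = theta[i:i + h]; i += h
    b2 = theta[i]
    return W1, b1, W2, b2

def logits(theta, X):
    """One hidden layer with tanh units, linear output (the class logit)."""
    W1, b1, W2, b2 = unpack(theta)
    return np.tanh(X @ W1 + b1) @ W2 + b2

def log_post(theta, X, y, sigma_w=1.0):
    """Log posterior: Gaussian prior on all parameters plus Bernoulli likelihood."""
    z = logits(theta, X)
    loglik = np.sum(y * z - np.logaddexp(0.0, z))   # stable log p(y | logits)
    logprior = -0.5 * np.sum(theta ** 2) / sigma_w ** 2
    return loglik + logprior

# Random-walk Metropolis over the flattened weights (a crude stand-in for MCMC
# methods better suited to neural networks, such as Hamiltonian Monte Carlo).
theta = rng.normal(scale=0.1, size=n_params)
lp = log_post(theta, X, y)
samples, step = [], 0.02
for it in range(5000):
    prop = theta + step * rng.normal(size=n_params)
    lp_prop = log_post(prop, X, y)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if it >= 2500 and it % 10 == 0:       # thin after burn-in
        samples.append(theta.copy())

# Predictive probabilities: average the logistic outputs over posterior samples.
probs = np.mean([1.0 / (1.0 + np.exp(-logits(t, X))) for t in samples], axis=0)
print("training accuracy of the posterior-averaged predictor:",
      np.mean((probs > 0.5) == y))
```

The key point the sketch illustrates is that class probabilities come from averaging over many weight vectors drawn from the posterior, rather than from a single fitted network, which is what produces the calibrated predictive uncertainty the challenge evaluated.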




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Neal, R.M. (2006). Classification with Bayesian Neural Networks. In: Quiñonero-Candela, J., Dagan, I., Magnini, B., d’Alché-Buc, F. (eds) Machine Learning Challenges. Evaluating Predictive Uncertainty, Visual Object Classification, and Recognising Textual Entailment. MLCW 2005. Lecture Notes in Computer Science (LNAI), vol 3944. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11736790_2


  • DOI: https://doi.org/10.1007/11736790_2

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-33427-9

  • Online ISBN: 978-3-540-33428-6

  • eBook Packages: Computer Science, Computer Science (R0)
