
Bayesian regression filters and the issue of priors

Published in Neural Computing & Applications

Abstract

We propose a Bayesian framework for regression problems, covering areas usually dealt with by function approximation. An online learning algorithm is derived which solves regression problems with a Kalman filter. Its solution always improves with increasing model complexity, without the risk of over-fitting. In the infinite-dimensional limit it approaches the true Bayesian posterior. The issues of prior selection and over-fitting are also discussed, showing that some commonly held beliefs are misleading. The practical implementation is summarised. Simulations on 13 popular, publicly available data sets demonstrate the method and highlight important issues concerning the choice of priors.
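As an illustrative sketch only (not the authors' exact derivation), online Bayesian regression over a fixed basis expansion can be updated one observation at a time with the standard Kalman-filter equations. The Gaussian basis, its width, and the prior and noise variances below are all assumptions chosen for the example:

```python
import numpy as np

def kalman_regression(X, y, n_basis=10, prior_var=1.0, noise_var=0.1):
    """Online Bayesian linear regression over Gaussian basis functions,
    updated per observation with Kalman-filter equations.
    Names and parameter choices are illustrative, not the paper's."""
    centres = np.linspace(X.min(), X.max(), n_basis)
    width = (X.max() - X.min()) / n_basis

    def phi(x):
        # Gaussian basis expansion of a scalar input
        return np.exp(-0.5 * ((x - centres) / width) ** 2)

    w = np.zeros(n_basis)            # posterior mean of the weights
    P = prior_var * np.eye(n_basis)  # posterior covariance (initially the prior)

    for x, t in zip(X, y):
        h = phi(x)
        s = h @ P @ h + noise_var    # predictive variance of the new target
        k = P @ h / s                # Kalman gain for a scalar observation
        w = w + k * (t - h @ w)      # shift the mean toward the residual
        P = P - np.outer(k, h @ P)   # shrink the covariance
    return w, P, phi

# Fit a noisy sine curve online
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 2.0 * np.pi, 50))
y = np.sin(X) + 0.1 * rng.standard_normal(50)
w, P, phi = kalman_regression(X, y)
pred = np.array([phi(x) @ w for x in X])
rmse = np.sqrt(np.mean((pred - np.sin(X)) ** 2))
```

Because the posterior mean and covariance are carried forward after each observation, a single pass over the data yields the same posterior as a batch fit under the same Gaussian model, which is what makes the Kalman formulation attractive for online regression.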



Author information

Correspondence to Huaiyu Zhu.


About this article

Cite this article

Zhu, H., Rohwer, R. Bayesian regression filters and the issue of priors. Neural Comput & Applic 4, 130–142 (1996). https://doi.org/10.1007/BF01414873
