Soft-Constrained Nonparametric Density Estimation with Artificial Neural Networks

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9896)

Abstract

The estimation of probability density functions (pdf) from unlabeled data samples is a relevant, still open issue in pattern recognition and machine learning. Statistical parametric and nonparametric approaches both present severe drawbacks. Only a few instances of neural networks for pdf estimation are found in the literature, owing to the intrinsic difficulty of unsupervised learning under the necessary integral-equals-one constraint; such networks, in turn, suffer from serious limitations of their own. The paper introduces a soft-constrained algorithm for training a multilayer perceptron (MLP) to estimate pdfs empirically. A variant of the Metropolis-Hastings algorithm, exploiting the probabilistic nature of the MLP itself, is used to satisfy numerically the constraint on the integral of the function learned by the MLP. The preliminary outcomes of a simulation on data drawn from a mixture of Fisher-Tippett pdfs are reported and compared graphically with the estimates yielded by established statistical techniques, showing the viability of the approach.
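The abstract's core ingredients, a nonnegative network output acting as an unnormalized density and a Metropolis-Hastings sampler driven by it, can be illustrated with a minimal sketch. This is not the paper's soft-constrained training variant: the one-hidden-layer network, its fixed weights, and the bounded support `[-10, 10]` used for the numerical normalization check are all hypothetical stand-ins for a trained MLP.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_density(x, w1, b1, w2, b2):
    # Toy one-hidden-layer MLP with a sigmoid output unit,
    # so f(x) >= 0 as required of a pdf estimate.
    h = np.tanh(w1 * x + b1)
    return 1.0 / (1.0 + np.exp(-(np.dot(w2, h) + b2)))

# Hypothetical fixed weights standing in for a trained network.
w1 = np.array([1.5, -0.7, 0.3]); b1 = np.array([0.1, 0.4, -0.2])
w2 = np.array([0.8, -1.1, 0.5]); b2 = -0.3
f = lambda x: mlp_density(x, w1, b1, w2, b2)

def metropolis_hastings(f, n_samples=5000, x0=0.0, step=1.0):
    """Draw samples distributed proportionally to the nonnegative f,
    without knowing its normalization constant."""
    samples, x = [], x0
    fx = f(x)
    for _ in range(n_samples):
        y = x + step * rng.standard_normal()  # symmetric Gaussian proposal
        fy = f(y)
        if rng.random() < fy / fx:            # MH acceptance test
            x, fx = y, fy
        samples.append(x)
    return np.array(samples)

samples = metropolis_hastings(f)

# Numerical check of the integral-equals-one constraint on a
# (hypothetical) bounded support; a soft-constrained scheme would
# push this value toward 1 during training.
grid = np.linspace(-10.0, 10.0, 2001)
integral = np.trapz([f(x) for x in grid], grid)
```

Because MH only requires density ratios, it runs on the unnormalized network output directly; the separate quadrature step makes the integral constraint visible, which is the quantity the paper's algorithm controls during learning.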

Keywords

Density estimation · Nonparametric estimation · Unsupervised learning · Constrained learning · Multilayer perceptron


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Dipartimento di Ingegneria dell’Informazione e Scienze Matematiche, Università degli Studi di Siena, Siena, Italy
