Summary
In this chapter we use the concept of differential entropy and methods for its estimation. We begin by defining the basic terms: entropy, differential entropy, the Kullback–Leibler distance, and the refractory period. We also show the relations between differential entropy and the Kullback–Leibler distance.
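As an illustration of the two central quantities (not an example from the chapter itself), both the differential entropy and the Kullback–Leibler distance have closed forms for exponential distributions, a standard model for positive random variables such as interspike intervals. The function names below are our own; the formulas are the standard ones, h(X) = 1 − ln λ and D(p‖q) = ln(λ_p/λ_q) + λ_q/λ_p − 1.

```python
import math

def exp_entropy(lam):
    """Differential entropy (in nats) of Exp(lam): h = 1 - ln(lam)."""
    return 1.0 - math.log(lam)

def exp_kl(lam_p, lam_q):
    """Kullback-Leibler distance D(p||q) between Exp(lam_p) and Exp(lam_q):
    D = ln(lam_p/lam_q) + lam_q/lam_p - 1, which is 0 iff the rates agree."""
    return math.log(lam_p / lam_q) + lam_q / lam_p - 1.0

print(exp_entropy(1.0))   # 1.0 nats for the unit-rate exponential
print(exp_kl(2.0, 1.0))   # ln(2) - 1/2, approximately 0.193
```

Note that differential entropy, unlike discrete entropy, can be negative (here, whenever λ > e), which is one reason its estimation needs care.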
A detailed description of the methods used then follows. These methods fall into three groups: parametric methods of entropy estimation, “plug-in” entropy estimators based on nonparametric density estimation, and direct entropy estimators. Formulas for direct entropy estimation based on the first four sample moments are introduced.
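As a sketch of the direct-estimator idea, Vasicek's sample-spacing estimator computes the entropy from differences of order statistics, with no intermediate density estimate. The implementation below is a minimal version of the standard formula H = (1/n) Σᵢ log( n (x₍ᵢ₊ₘ₎ − x₍ᵢ₋ₘ₎) / (2m) ), with indices clamped at the sample boundaries; the default window m ≈ √n is a common rule of thumb, not a prescription from the chapter.

```python
import math

def vasicek_entropy(sample, m=None):
    """Vasicek's sample-spacing estimator of differential entropy.

    H = (1/n) * sum_i log( n * (x_(i+m) - x_(i-m)) / (2m) ),
    where x_(.) are order statistics, clamped at the boundaries.
    """
    x = sorted(sample)
    n = len(x)
    if m is None:
        m = max(1, int(round(math.sqrt(n))))  # common heuristic window
    total = 0.0
    for i in range(n):
        lo = x[max(i - m, 0)]       # clamp lower order statistic
        hi = x[min(i + m, n - 1)]   # clamp upper order statistic
        total += math.log(n * (hi - lo) / (2.0 * m))
    return total / n

# Evenly spaced points on (0, 1) mimic a uniform sample, whose true
# differential entropy is 0; the estimate is close but slightly biased.
grid = [(i + 0.5) / 1000 for i in range(1000)]
print(vasicek_entropy(grid))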
The results are illustrated by comparing the entropy estimation methods, each combined with two refractory period estimates. We compare estimates based on the histogram, the kernel density estimator, the sample-spacing method, Vasicek’s method, the nearest-neighbor distance method, and methods based on sample moments.
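For contrast with the direct estimators, a histogram-based “plug-in” estimator first builds a density estimate and then evaluates the entropy integral on it. The sketch below is our own minimal version, assuming equal-width bins over the sample range; bin count and binning rule are illustrative choices, not those used in the chapter.

```python
import math

def hist_entropy(sample, bins=30):
    """Plug-in entropy estimate from an equal-width histogram density:
    H ~ -sum_k p_k * log(p_k / width) over nonempty bins,
    where p_k / width is the piecewise-constant density estimate."""
    lo, hi = min(sample), max(sample)
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in sample:
        k = min(int((x - lo) / width), bins - 1)  # clamp the maximum into the last bin
        counts[k] += 1
    n = len(sample)
    h = 0.0
    for c in counts:
        if c:
            p = c / n
            h -= p * math.log(p / width)
    return h

# Quantile points of a unit-rate exponential stand in for a sample;
# the estimate lands close to the true value 1 - ln(1) = 1.
data = [-math.log(1 - (i + 0.5) / 5000) for i in range(5000)]
print(hist_entropy(data))
```

The result depends visibly on the bin count, which is exactly the sensitivity that motivates comparing plug-in estimators against direct ones.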
References
Beirlant, J., Dudewicz, E. J., Györfi, L., van der Meulen, E. C.: Nonparametric entropy estimation: an overview. Int. J. Math. Stat. Sci., 6, 17–39 (1997).
Cover, T. M., Thomas, J. A.: Elements of Information Theory. John Wiley & Sons, New York (1991).
Johnson, N. L., Kotz, S.: Distributions in Statistics: Continuous Univariate Distributions, Vol. 1. John Wiley & Sons, New York (1970).
Kostal, L., Lansky, P.: Similarity of interspike interval distributions and information gain in stationary neuronal firing. Biol. Cybernet., 94(2), 157–167 (2006).
Reeke, G. N., Coop, A. D.: Estimating the temporal interval entropy of neuronal discharge. Neural Comput., 16, 941–970 (2004).
Vasicek, O.: A test for normality based on sample entropy. J. Roy. Statist. Soc. Ser. B, 38, 54–59 (1976).
© 2008 Birkhäuser Boston
Cite this chapter
Hampel, D. (2008). Estimation of Differential Entropy for Positive Random Variables and Its Application in Computational Neuroscience. In: Deutsch, A., et al. Mathematical Modeling of Biological Systems, Volume II. Modeling and Simulation in Science, Engineering and Technology. Birkhäuser Boston. https://doi.org/10.1007/978-0-8176-4556-4_19
DOI: https://doi.org/10.1007/978-0-8176-4556-4_19
Publisher Name: Birkhäuser Boston
Print ISBN: 978-0-8176-4555-7
Online ISBN: 978-0-8176-4556-4
eBook Packages: Mathematics and Statistics (R0)