Novel techniques for improving NNetEn entropy calculation for short and noisy time series

  • Original Paper
  • Published:
Nonlinear Dynamics

Abstract

Entropy is a fundamental concept in information theory. Conventional entropy measures are sensitive to the length and amplitude of a time series. A newer metric, neural network entropy (NNetEn), was developed to overcome these limitations. NNetEn is computed with a modified LogNNet neural network classification model whose reservoir matrix of N = 19,625 elements must be filled with the input data, yet many practical time series contain far fewer elements. The contribution of this paper is threefold. First, this work investigates different methods of filling the reservoir with time-series (signal) elements; the filling method determines the accuracy of the entropy estimate through the convolution of the studied time series with the LogNNet test data. Six filling methods are proposed for time series of any length 5 ≤ N ≤ 19,625. Two of them (Methods 3 and 6) employ a novel approach that stretches the time series, creating intermediate elements that complement it without changing its dynamics. Methods 3 and 5 prove the most reliable for short time series. Second, the study examines the influence of noise and a constant bias on entropy values. Besides external noise, the bias hyperparameter used in the entropy calculation also plays a critical role. Three types of time series (chaotic, periodic, and binary) with different dynamic properties, signal-to-noise ratios (SNR), and offsets are analyzed. The NNetEn calculation error stays below 10% when the SNR exceeds 30 dB, and entropy decreases as the bias component grows. Third, the article analyzes real EEG biosignals collected in emotion-recognition experiments. The NNetEn measure remains robust under low-amplitude noise with various filters. Thus, NNetEn estimates entropy effectively in real-world environments with ambient noise, white noise, and 1/f noise.
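The stretching idea behind Methods 3 and 6 can be sketched as simple resampling: intermediate points are interpolated between existing samples so that a short series fills the N = 19,625-element reservoir without altering its overall dynamics. The sketch below is illustrative only; it assumes plain linear interpolation, and the function name and details are not taken from the paper (the authors' exact procedures are defined in the article and the accompanying software).

```python
import numpy as np

RESERVOIR_SIZE = 19_625  # reservoir matrix size quoted in the paper


def stretch_series(x, target_len):
    """Resample x to target_len points by linear interpolation.

    Original samples are mapped onto a uniform grid in [0, 1], so the
    stretched series passes through every original value and stays
    within the original amplitude range.
    """
    x = np.asarray(x, dtype=float)
    old_grid = np.linspace(0.0, 1.0, num=len(x))
    new_grid = np.linspace(0.0, 1.0, num=target_len)
    return np.interp(new_grid, old_grid, x)


# Example: fill the reservoir from a 100-sample periodic signal.
short = np.sin(np.linspace(0.0, 4.0 * np.pi, 100))
filled = stretch_series(short, RESERVOIR_SIZE)
```

Because the mapping is monotone and the new values lie between neighboring originals, the stretched series keeps the endpoints and amplitude bounds of the input, which is the sense in which stretching "does not change the dynamics."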

Data availability

Software to calculate NNetEn using all six matrix-filling methods is available in the ResearchGate repository: https://www.researchgate.net/publication/366575849_NNetEn_calculator_1004_for_NNetEn_calculation_with_six_matrix_filling_methods


Funding

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Author information

Corresponding author

Correspondence to Murugappan Murugappan.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Heidari, H., Velichko, A., Murugappan, M. et al. Novel techniques for improving NNetEn entropy calculation for short and noisy time series. Nonlinear Dyn 111, 9305–9326 (2023). https://doi.org/10.1007/s11071-023-08298-w

