
A Novel FD3 Framework for Carbon Emissions Prediction

  • Research
  • Published in: Environmental Modeling & Assessment

Abstract

Monitoring and controlling carbon emissions in the modern era calls for machine learning-based forecasting models. Although various artificial neural networks (ANNs) have been applied to this task, we propose a novel FD3 framework to tackle carbon emissions prediction. Our approach executes three “FD” procedures: (1) frequency decomposition, performed with complementary ensemble empirical mode decomposition with adaptive noise (CEEMDAN), an advanced variant of the well-known empirical mode decomposition (EMD); (2) forecasting with a dendritic neuron model (DNM), which has proved effective on numerous prediction tasks and shows stronger nonlinear fitting ability than traditional network-structured ANNs; and (3) fluctuation density measurement (the FD function), which regulates the prediction strategy applied to each decomposed subseries. In experiments, the FD3 framework outperforms seven baseline models on three widely used time series prediction evaluation metrics. The success of FD3 confirms the validity of the “preprocessing-forecasting” workflow and provides a better solution for carbon emissions prediction. Furthermore, the design of the FD function offers additional insight for signal analysis: the selection of decomposed subseries can have a large impact on reconstructing the original data.
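The sketch below illustrates the general “decompose, measure, forecast” workflow summarized above; it is not the authors' implementation. The CEEMDAN step assumes the PyEMD package (EMD-signal), the zero-crossing-based fluctuation_density function is only a plausible stand-in for the paper's FD function, the 0.3 threshold is arbitrary, and an MLPRegressor replaces the dendritic neuron model purely for illustration.

```python
# Minimal sketch of an FD3-style pipeline: CEEMDAN decomposition, a fluctuation
# density measure per subseries, per-subseries forecasting, then recombination.
# All concrete choices (FD definition, threshold, forecaster) are assumptions.
import numpy as np
from PyEMD import CEEMDAN                        # assumed CEEMDAN implementation (pip install EMD-signal)
from sklearn.neural_network import MLPRegressor  # stand-in forecaster; the paper uses a DNM

def fluctuation_density(subseries: np.ndarray) -> float:
    """Proxy for fluctuation density: zero-crossing rate of the mean-removed
    subseries. The paper's FD function may be defined differently."""
    centered = subseries - subseries.mean()
    crossings = np.sum(np.diff(np.sign(centered)) != 0)
    return crossings / len(subseries)

def make_supervised(series: np.ndarray, lag: int = 7):
    """Build lagged input/target pairs for one-step-ahead forecasting."""
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

def fd3_forecast(series: np.ndarray, lag: int = 7, fd_threshold: float = 0.3) -> float:
    """Decompose the series, choose a strategy for each subseries via its FD value,
    forecast each subseries one step ahead, and sum the component forecasts."""
    imfs = CEEMDAN()(series)          # array of IMFs (plus residue), one row per subseries
    prediction = 0.0
    for imf in imfs:
        if fluctuation_density(imf) > fd_threshold:
            # High-fluctuation, noise-like component: use a naive persistence
            # forecast as a placeholder strategy.
            prediction += imf[-1]
        else:
            X, y = make_supervised(imf, lag)
            model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
            model.fit(X, y)
            prediction += model.predict(imf[-lag:].reshape(1, -1))[0]
    return prediction

# Example on synthetic daily-emissions-like data:
# rng = np.random.default_rng(0)
# t = np.arange(730)
# series = 50 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, t.size)
# print(fd3_forecast(series))
```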


Data Availability

The data used in this work can be accessed at https://carbonmonitor.org.


Funding

This research was partially supported by the Japan Society for the Promotion of Science (JSPS) KAKENHI under Grant JP22H03643, Japan Science and Technology Agency (JST) Support for Pioneering Research Initiated by the Next Generation (SPRING) under Grant JPMJSP2145, and JST through the Establishment of University Fellowships toward the Creation of Science Technology Innovation under Grant JPMJFS2115.

Author information

Authors and Affiliations

Authors

Contributions

Houtian He: writing—original draft, methodology, conceptualization. Tongyan Liu: programming, writing—review and editing, visualization. Qianqian Li: writing—review and editing. Jiaru Yang: writing—review and editing. Rong-Long Wang: writing—review and editing, supervision. Shangce Gao: writing—review and editing, resources, project administration, supervision.

Corresponding author

Correspondence to Shangce Gao.

Ethics declarations

Ethical Approval

Not applicable

Conflict of Interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

He, H., Liu, T., Li, Q. et al. A Novel FD3 Framework for Carbon Emissions Prediction. Environ Model Assess 29, 455–469 (2024). https://doi.org/10.1007/s10666-023-09918-w

