
PM\(_{2.5}\) forecasting based on transformer neural network and data embedding

  • Research
  • Published in: Earth Science Informatics

Abstract

Forecasting time series data is challenging due to the temporal and multivariate dependencies in the data. In this paper, we present a new approach, named TPPM25 (Transformer-based Prediction of PM\(_{2.5}\)), for forecasting PM\(_{2.5}\), a key air quality indicator. It is based on the state-of-the-art Transformer neural network and various data embedding techniques. By performing attention calculations among features across time steps, TPPM25 mimics cognitive attention, selectively enhancing the essential parts of the input data while diminishing the rest. As a result, it effectively captures the temporal relationships between PM\(_{2.5}\) and multiple influencing meteorological features. Experiments demonstrate its effectiveness in comparison with a cutting-edge ensemble deep learning model from Zhang et al. (Inf Sci 544:427–445, 2021): TPPM25 outperforms Zhang et al.'s model under the same experimental setting on a well-researched benchmark dataset. Moreover, whereas Zhang et al.'s model is restricted to univariate PM\(_{2.5}\) prediction, TPPM25 removes this restriction and further improves prediction accuracy when additional influencing meteorological features are considered. Finally, TPPM25 maintains high prediction accuracy over longer forecasting horizons than the Long Short-Term Memory (LSTM) and Bidirectional LSTM models.
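For concreteness, the following is a minimal sketch of the kind of architecture the abstract describes: a Transformer encoder that applies self-attention across the time steps of a multivariate (PM\(_{2.5}\) plus meteorological) sequence, preceded by a data-embedding step and followed by a regression head. All names, layer sizes, and design choices here (TPPM25Sketch, d_model, horizon, the learned positional embedding, and so on) are illustrative assumptions written in PyTorch, not the authors' implementation, which is available from the corresponding author on request.

import torch
import torch.nn as nn

class TPPM25Sketch(nn.Module):
    def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4,
                 n_layers: int = 2, max_len: int = 128, horizon: int = 1):
        super().__init__()
        # Data embedding: project each time step's raw feature vector into
        # the model dimension, and add a learned positional embedding so the
        # encoder can distinguish time steps (stands in for the paper's
        # various data embedding techniques).
        self.embed = nn.Linear(n_features, d_model)
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Regression head: map the last encoded time step to the forecast.
        self.head = nn.Linear(d_model, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_features)
        h = self.encoder(self.embed(x) + self.pos[:, : x.size(1)])
        return self.head(h[:, -1])  # predicted PM2.5 value(s)

# Example: batches of 24 hourly steps, PM2.5 plus 5 meteorological features.
model = TPPM25Sketch(n_features=6)
y_hat = model(torch.randn(8, 24, 6))  # -> shape (8, 1)

Because the encoder attends over all time steps jointly rather than recurring through them, such a model can, in principle, relate a forecast to influential conditions many steps in the past, which is consistent with the abstract's claim of maintaining accuracy over longer horizons than LSTM-style models.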


Data Availability

The datasets generated and/or analysed during the current study are not publicly available but are available from the corresponding author on reasonable request.

Code Availability

The code generated and/or analysed during the current study is not publicly available but is available from the corresponding author on reasonable request.


Funding

All authors are supported by the Office of Research, Georgia Southern University.

Author information

Contributions

All authors contributed to the study conception and design. Data collection, experiments, and analysis were performed by Jordan Limperis and Weitian Tong. The first draft of the manuscript was written by Jordan Limperis and Weitian Tong. All authors (i.e., Jordan Limperis, Weitian Tong, Felix Hamza-Lup, and Lixin Li) commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Weitian Tong.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

Communicated by: H. Babaie.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Limperis, J., Tong, W., Hamza-Lup, F. et al. PM\(_{2.5}\) forecasting based on transformer neural network and data embedding. Earth Sci Inform 16, 2111–2124 (2023). https://doi.org/10.1007/s12145-023-01002-x

