Strengths and weaknesses of the FAIRMODE benchmarking methodology for the evaluation of air quality models

  • A. Monteiro
  • P. Durka
  • C. Flandorfer
  • E. Georgieva
  • C. Guerreiro
  • J. Kushta
  • L. Malherbe
  • B. Maiheu
  • A. I. Miranda
  • G. Santos
  • J. Stocker
  • E. Trimpeneers
  • F. Tognet
  • M. Stortini
  • J. Wesseling
  • S. Janssen
  • P. Thunis

Abstract

The Forum for Air Quality Modelling in Europe (FAIRMODE) was launched in 2007 to bring together air quality modellers and users in order to promote and support the harmonised use of models by EU Member States, with emphasis on model application under the European Air Quality Directive. In this context, a methodology for evaluating air quality model applications has been developed. This paper presents an analysis of the strengths and weaknesses of the FAIRMODE benchmarking approach, based on users' feedback. European-wide, regional and urban-scale model applications, developed by different research groups across Europe, have been taken into account. The analysis focuses on the main pollutants regulated under the Air Quality Directive, namely PM10, NO2 and O3. The different case studies are described and analysed with respect to the methodologies applied for model evaluation and quality assurance. This model evaluation intercomparison demonstrates the potential of a harmonised evaluation and benchmarking methodology. A SWOT analysis of the FAIRMODE benchmarking approach is performed based on feedback from users of the tool. This analysis helps to identify the main advantages and value of this model evaluation benchmarking approach compared with other methodologies, in addition to highlighting requirements for future development.
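At the core of the benchmarking methodology is the model quality objective (MQO): the root mean square error between modelled and observed concentrations is compared against the measurement uncertainty, and the MQO is met when the resulting model quality indicator (MQI) does not exceed 1. A minimal sketch of this check in Python, assuming the published FAIRMODE formulation of the expanded measurement uncertainty; the parameter values used here (u_r = 0.24, RV = 200 µg/m³, α = 0.20, roughly those quoted for hourly NO2) and the toy data are illustrative assumptions, not taken from this paper:

```python
import numpy as np

def expanded_uncertainty(obs, u_r=0.24, rv=200.0, alpha=0.20):
    """Expanded measurement uncertainty U95(O_i) for each observation.

    Assumed FAIRMODE form: U95 = u_r * sqrt((1 - alpha^2) * O_i^2 + alpha^2 * RV^2),
    with u_r the relative uncertainty at the reference value RV.
    Defaults are illustrative values for hourly NO2 (assumption)."""
    obs = np.asarray(obs, dtype=float)
    return u_r * np.sqrt((1.0 - alpha**2) * obs**2 + alpha**2 * rv**2)

def model_quality_indicator(obs, mod, **unc_params):
    """MQI = RMSE(model, obs) / (2 * RMS of measurement uncertainty).

    The MQO is considered fulfilled when MQI <= 1."""
    obs = np.asarray(obs, dtype=float)
    mod = np.asarray(mod, dtype=float)
    rmse = np.sqrt(np.mean((mod - obs) ** 2))
    rms_u = np.sqrt(np.mean(expanded_uncertainty(obs, **unc_params) ** 2))
    return rmse / (2.0 * rms_u)

# Toy hourly NO2 series (µg/m³), purely for illustration.
obs = np.array([40.0, 55.0, 30.0, 80.0, 60.0])
mod = np.array([35.0, 60.0, 28.0, 70.0, 65.0])
mqi = model_quality_indicator(obs, mod)
print(f"MQI = {mqi:.2f} -> MQO {'met' if mqi <= 1.0 else 'not met'}")
```

Normalising by the measurement uncertainty, rather than by the observations themselves, is the key design choice: a model is not penalised for errors smaller than what the monitoring network itself can resolve.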

Keywords

Air quality modelling · Model evaluation · DELTA tool · Benchmarking · FAIRMODE · MQO

Notes

Acknowledgements

Thanks are due for the financial support to CESAM (UID/AMB/50017 - POCI-01-0145-FEDER-007638), to FCT/MCTES through national funds (PIDDAC), and to the co-funding by FEDER within the PT2020 Partnership Agreement and Compete 2020. This work was partly performed within FAIRMODE (http://fairmode.ew.eea.europa.eu/), and the community members are acknowledged for their contributions.

References

  1. Adriaenssens S, Trimpeneers E (2015) Transnational model intercomparison and validation exercise in North-West Europe. Interregional Environment Agency Belgium (IRCEL). Final report of the Joaquin EU-Interreg IVB project
  2. Alexandrov GA, Ames D, Bellocchi G, Bruen M, Crout N, Erechtchoukova M, Hildebrandt A, Hoffman F, Jackisch C, Khaiter P, Mannina G, Matsunaga T, Purucker ST, Rivington M, Samaniego L (2011) Technical assessment and evaluation of environmental models and software: letter to the editor. Environ Model Softw 26(3):328–336
  3. AQD (2008) Directive 2008/50/EC of the European Parliament and of the Council of 21 May 2008 on Ambient Air Quality and Cleaner Air for Europe (No. 152), Official Journal
  4. ASTM Standard D6589 (2005) Standard guide for statistical evaluation of atmospheric dispersion model performance (No. D6589). ASTM International, West Conshohocken, PA
  5. Borrego C, Monteiro A, Ferreira J, Miranda AI, Costa AM, Carvalho AC, Lopes M (2008) Procedures for estimation of modelling uncertainty in air quality assessment. Environ Int 34:613–620
  6. Carnevale C, Finzi G, Pederzoli A, Pisoni E, Thunis P, Turrini E, Volta M (2014) Applying the DELTA tool to support the air quality directive: evaluation of the TCAM chemical transport model. Air Qual Atmos Hlth 7(3):335–346
  7. Carnevale C, Finzi G, Pederzoli A, Pisoni E, Thunis P, Turrini E, Volta M (2015) A methodology for the evaluation of re-analyzed PM10 concentration fields: a case study over the Po Valley. Air Qual Atmos Hlth 8(6):533–544
  8. Denby B (2010) Guidance on the use of models for the European air quality directive (ETC/ACC version 6.2). A working document of the Forum for Air Quality Modelling in Europe (FAIRMODE)
  9. Dennis R, Fox T, Fuentes M, Gilliland A, Hanna S, Hogrefe C, Irwin J, Rao ST, Scheffe R, Schere K, Steyn D, Venkatram A (2010) A framework for evaluating regional-scale numerical photochemical modeling systems. Environ Fluid Mech 10:471–489
  10. Derwent D, Fraser A, Abbott J, Willis P, Murrells T (2010) Evaluating the performance of air quality models (Issue 3). Department for Environment, Food and Rural Affairs
  11. Georgieva E, Syrakov D, Prodanova M, Etropolska I, Slavov K (2015) Evaluating the performance of WRF-CMAQ air quality modelling system in Bulgaria by means of the DELTA tool. Int J Environ Pollut 57(3/4):272–284
  12. Irwin JS, Civerolo K, Hogrefe C, Appel W, Foley K, Swall J (2008) A procedure for inter-comparing the skill of regional-scale air quality model simulations of daily maximum 8-h ozone concentrations. Atmos Environ 42:5403–5412
  13. Jakeman AJ, Letcher RA, Norton JP (2006) Ten iterative steps in development and evaluation of environmental models. Environ Model Softw 21(5):602–614
  14. Janssen S, Dumont G, Fierens F, Deutsch F, Maiheu B, Celis D, Trimpeneers E, Mensink C (2012) Land use to characterize spatial representativeness of air quality monitoring stations and its relevance for model validation. Atmos Environ 59:492–500
  15. Kaminski JW, Neary L, Struzewska J, McConnell JC, Lupu A, Jarosz J, Toyota K, Gong SL, Côté J, Liu X, Chance K, Richter A (2008) GEM-AQ, an on-line global multiscale chemical weather modelling system: model description and evaluation of gas phase chemistry processes. Atmos Chem Phys 8(12):3255–3281
  16. Kracht O (2018) Spatial representativeness of air quality monitoring sites—outcomes of the FAIRMODE/AQUILA intercomparison exercise. JRC Technical Report (in press)
  17. Martin F, Fileni L, Palomino I, Vivanco MG, Garrido JL (2014) Analysis of the spatial representativeness of rural background monitoring stations in Spain. Atmos Pollut Res 5:779–788
  18. Pernigotti D, Thunis P, Belis C, Gerboles M (2013) Model quality objectives based on measurement uncertainty. Part II: PM10 and NO2. Atmos Environ 79:869–878
  19. Ribeiro I, Monteiro A, Miranda AI, Fernandes AP, Monteiro AC, Lopes M, Borrego C (2014) Air quality modelling as a supplementary assessment method in the frame of the European air quality directive. Int J Environ Pollut 54(2/3/4):262–270
  20. Solomon PA (2012) Introduction: addressing air pollution and health science questions to inform science and policy. Air Qual Atmos Hlth 5(2):149–150
  21. Stidworthy A, Jackson M, Johnson K, Carruthers D, Stocker J (2017) Evaluation of local and regional air quality forecasts for London. In: Proc 18th conference on harmonisation within atmospheric dispersion modelling for regulatory purposes, Bologna, 9–12 October 2017
  22. Stortini M, Agostini C, Maccaferri S, Amorati R (2017) Applying the FAIRMODE tools to support the Air Quality Directive: the experiences of ARPAE. In: Proc 18th international conference on harmonisation within atmospheric dispersion modelling for regulatory purposes, Bologna, Italy, 9–12 October 2017. Submitted to the IJEP special issue
  23. Thunis P, Georgieva E, Pederzoli A (2012a) A tool to evaluate air quality model performances in regulatory applications. Environ Model Softw 38:220–230
  24. Thunis P, Pederzoli A, Pernigotti D (2012b) Performance criteria to evaluate air quality modeling applications. Atmos Environ 59:476–482
  25. Thunis P, Pernigotti D, Gerboles M (2013) Model quality objectives based on measurement uncertainty. Part I: ozone. Atmos Environ 79:861–868
  26. Thunis P, Pisoni E, Degraeuwe B, Kranenburg R, Schaap M, Clappier A (2015) Dynamic evaluation of air quality models over European regions. Atmos Environ 111:185–194
  27. Veldeman N, Maiheu B, Lefebvre W et al (2016) Activity report for 2015 reference task on air quality modelling in Flanders. VITO report no. 2016/RMA/R/0582 (in Dutch)

Copyright information

© Springer Science+Business Media B.V., part of Springer Nature 2018

Authors and Affiliations

  • A. Monteiro (1)
  • P. Durka (2)
  • C. Flandorfer (3)
  • E. Georgieva (4)
  • C. Guerreiro (5)
  • J. Kushta (6)
  • L. Malherbe (7)
  • B. Maiheu (8)
  • A. I. Miranda (1)
  • G. Santos (5)
  • J. Stocker (9)
  • E. Trimpeneers (10)
  • F. Tognet (7)
  • M. Stortini (11)
  • J. Wesseling (12)
  • S. Janssen (8)
  • P. Thunis (13)
  1. CESAM, Department of Environment and Planning, University of Aveiro, Aveiro, Portugal
  2. Institute of Environmental Protection – National Research Institute, Warsaw, Poland
  3. Section Environmental Meteorology, Zentralanstalt für Meteorologie und Geodynamik (ZAMG), Vienna, Austria
  4. National Institute of Meteorology and Hydrology, Bulgarian Academy of Sciences, Sofia, Bulgaria
  5. Norwegian Institute for Air Research (NILU), Kjeller, Norway
  6. The Cyprus Institute, Energy, Environment and Water Research Centre, Nicosia, Cyprus
  7. INERIS, Verneuil-en-Halatte, France
  8. VITO, Mol, Belgium
  9. Cambridge Environmental Research Consultants (CERC), Cambridge, UK
  10. Belgian Interregional Environment Agency (IRCEL), Brussels, Belgium
  11. Regional Agency for Prevention, Environment and Energy (ARPAE), Bologna, Italy
  12. Centre for Environmental Quality, National Institute for Public Health and the Environment, Bilthoven, The Netherlands
  13. Joint Research Centre (JRC), Directorate for Energy, Transport and Climate, Air and Climate Unit, European Commission, Ispra, Italy
