
Acta Oceanologica Sinica, Volume 32, Issue 9, pp 74–81

Development of a skill assessment tool for the Korea operational oceanographic system

  • Kyoung-Ho Cho
  • Jin-Yong Choi
  • Sang-Hun Jeong
  • Jung-Woon Choi
  • Jae-Il Kwon
  • Kwang-Soon Park

Abstract

A standard skill assessment (SA) tool was developed and implemented to evaluate the performance of operational forecast models in the Korea operational oceanographic system. The SA tool provided a robust way to assess model skill in the system by comparing predictions with observations, and involved the computation of multiple skill metrics, including correlation and error metrics. User- and system-based acceptance criteria for the skill metrics were applied to determine whether predictions were acceptable for the system. To present the multiple skill scores effectively, the tool produced a time-series comparison plot, a skill score table, and an advanced summary diagram. The SA was conducted to evaluate both atmospheric and hydrodynamic forecast variables. For the atmospheric variables, acceptable error criteria were preferable to acceptable correlation criteria over short timescales, since the mean square error overwhelmed the observation variance. Conversely, for the hydrodynamic variables, acceptable root mean square percentage error (pe_rms) criteria were preferable to acceptable error (e_rms) criteria, owing to the spatially variable tidal intensity around the Korean Peninsula. Furthermore, the SA indicated that the predetermined acceptance error criteria were appropriate to satisfy a target central frequency (f_c), the fraction of errors falling within the specified limits (i.e., f_c = 70%).
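
As an illustration of the kinds of metrics described above (not the authors' implementation), the Python sketch below computes a linear correlation coefficient, the root mean square error e_rms, a root mean square percentage error pe_rms (here normalized by the observed range, which is only one plausible convention), and the central frequency f_c of errors falling within an acceptance limit. The function name, the synthetic tidal series, and the 0.15 m acceptance limit are hypothetical examples, not values from the paper.

```python
import numpy as np

def skill_metrics(obs, pred, error_limit):
    """Illustrative skill metrics for matched observation/prediction series.

    obs, pred   : 1-D arrays of observations and model predictions
    error_limit : acceptance limit used for the central frequency f_c
    """
    obs = np.asarray(obs, dtype=float)
    pred = np.asarray(pred, dtype=float)
    err = pred - obs

    r = np.corrcoef(obs, pred)[0, 1]            # linear correlation coefficient
    e_rms = np.sqrt(np.mean(err ** 2))          # root mean square error
    # RMS percentage error: here normalized by the observed range (an assumption)
    pe_rms = 100.0 * e_rms / (obs.max() - obs.min())
    # central frequency: percentage of errors within +/- error_limit
    f_c = 100.0 * np.mean(np.abs(err) <= error_limit)

    return {"r": r, "e_rms": e_rms, "pe_rms": pe_rms, "f_c": f_c}


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(0.0, 48.0, 0.5)                    # 48 h of half-hourly values
    obs = 1.5 * np.sin(2 * np.pi * t / 12.42)        # synthetic semidiurnal tide (m)
    pred = obs + rng.normal(0.0, 0.1, obs.size)      # model with ~0.1 m noise

    scores = skill_metrics(obs, pred, error_limit=0.15)
    # e.g., accept the forecast if f_c >= 70%, mirroring the target in the abstract
    print(scores, "accepted:", scores["f_c"] >= 70.0)
```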

Key words

skill assessment tool; operational forecast system; Korea operational oceanographic system



Copyright information

© The Chinese Society of Oceanography and Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Kyoung-Ho Cho¹
  • Jin-Yong Choi¹
  • Sang-Hun Jeong¹
  • Jung-Woon Choi¹
  • Jae-Il Kwon¹
  • Kwang-Soon Park¹
  1. Coastal Disaster Research Center, Korea Institute of Ocean Science and Technology, Ansan, Republic of Korea
