Invalidation of Models and Fitness-for-Purpose: A Rejectionist Approach

  • Keith Beven
  • Stuart Lane
Chapter
Part of the Simulation Foundations, Methods and Applications book series (SFMA)

Abstract

This chapter discusses the issues associated with the invalidation of computer simulation models, taking environmental science as an example. We argue that invalidation is concerned with labelling a model as not fit-for-purpose for a particular application, drawing an analogy with the Popperian idea of falsification of hypotheses and theories. Model invalidation is a good thing in that it implies that some improvements are required, either to the data, to the auxiliary relations or to the model structures being used. It is argued that as soon as epistemic uncertainties in observational data and boundary conditions are acknowledged, invalidation loses some objectivity. Some principles for model evaluation are suggested, and a number of potential techniques for model comparison and rejection are considered, including Bayesian likelihoods, implausibility and the GLUE limits of acceptability approaches. Some problems remain in applying these techniques, particularly in assessing the role of input uncertainties on fitness-for-purpose, but the approach allows for a more thoughtful and reflective consideration of model invalidation as a positive way of making progress in science.
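The GLUE limits-of-acceptability idea mentioned above can be illustrated with a minimal rejectionist loop: sample parameter sets from a prior, run the simulator, and retain as behavioural only those sets whose predictions fall within observation-error bounds at every evaluation point. The sketch below is illustrative only — the linear "model", the prior ranges, and the acceptability half-width are all invented for the example, not taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical simulator: y = a * x + b, standing in for any environmental model.
def model(params, x):
    a, b = params
    return a * x + b

# Synthetic observations; the limits of acceptability are set from an
# assumed observation-error half-width (value chosen for illustration).
x_obs = np.linspace(0.0, 1.0, 11)
y_obs = 2.0 * x_obs + 1.0 + rng.normal(0.0, 0.05, x_obs.size)
half_width = 0.2
lower, upper = y_obs - half_width, y_obs + half_width

# GLUE-style rejection: sample parameters from a uniform prior and keep
# only sets whose predictions lie within the limits at every point.
n_samples = 5000
priors = rng.uniform([0.0, 0.0], [4.0, 2.0], size=(n_samples, 2))
behavioural = np.array([
    p for p in priors
    if np.all((model(p, x_obs) >= lower) & (model(p, x_obs) <= upper))
])

print(f"{len(behavioural)} of {n_samples} parameter sets are behavioural")
```

Typically many distinct parameter sets survive the rejection step, which is the equifinality the abstract refers to: the data cannot single out one "true" model, only a behavioural ensemble.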

Keywords

Epistemic uncertainty · Model equifinality · Bayes · GLUE · Limits of acceptability · Behavioural models

Acknowledgements

The discussions on which this chapter is based were initiated while KB was supported by the Fondation Herbette as visiting professor at the University of Lausanne. We thank Claus Beisbart, Nicole Saam and an anonymous referee for their comments on an earlier draft of this chapter.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Lancaster Environment Centre, Lancaster University, Lancaster, UK
  2. Institute of Earth Surface Dynamics, Université de Lausanne, Lausanne, Switzerland