Uncertainties and Approximations


Abstract

Uncertainty, as the word itself suggests, is the condition of not being sure that something is true. Approximation, again as the word suggests, is the condition of having gotten near to what you think is the truth. Scientists, physical or social, would like to work with concepts that are a little more precise than that. The means to this end has turned out to be probability theory.
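
To see how probability theory makes both notions precise, here is a minimal sketch, not from the chapter: repeated noisy measurements yield an approximation (the sample mean) together with a quantified uncertainty (a standard error). The true value, noise level, and sample size below are invented for illustration.

```python
import numpy as np

# Minimal sketch: probability theory turns "not sure" into a number.
# Simulate repeated measurements of a quantity whose (made-up) true value
# is 10.0, corrupted by Gaussian noise, then report mean +/- standard error.
rng = np.random.default_rng(seed=0)
true_value = 10.0
measurements = true_value + rng.normal(0.0, 0.5, size=100)

mean = measurements.mean()                                       # the approximation
std_err = measurements.std(ddof=1) / np.sqrt(len(measurements))  # the uncertainty

# Under normality, mean +/- 1.96 * std_err covers the truth about 95% of the time.
print(f"estimate: {mean:.3f} +/- {1.96 * std_err:.3f}")
```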


Notes

  1. Ch. 1 of David A. Freedman, Statistical Models and Causal Inference (New York: Cambridge University Press, 2010), describes the relevant recent issues, best exemplified by his quoting Heraclitus: “You can’t step into the same river twice.” That implies that dovetailing at its best is a matter of making the best of an imperfect world.

  2. John Taylor, An Introduction to Error Analysis, 2nd ed. (Sausalito, CA: University Science Books, 1997), describes the world of dovetailing hypothesis and observation from a physicist’s perspective. A small sketch of the error-propagation rules at the heart of that world appears after these notes.

  3. Robert Nisbet, John Elder, and Gary Miner, Handbook of Statistical Analysis and Data Mining Applications (Amsterdam: Academic Press/Elsevier, 2009), 4–7, provide a (very) short history of data analysis.

  4. The elementary physics and astrophysics textbooks in this comparo are David Halliday, Robert Resnick, and Jearl Walker, Fundamentals of Physics (New York: Wiley, 2005);

  5. Bradley Carroll and Dale Ostlie, An Introduction to Modern Astrophysics (Boston: Addison-Wesley, 2007);

  6. and David Griffiths, Introduction to Quantum Mechanics, 2nd ed. (Upper Saddle River, NJ: Pearson, 2005).

  7. The elementary economics textbooks are N. Gregory Mankiw, Principles of Economics, 6th ed. (Mason, OH: Southwestern, 2012);

  8. Paul Krugman and Robin Wells, Economics, 3rd ed. (New York: Worth, 2013);

  9. and Robert Gordon, Macroeconomics, 12th ed. (Boston: Addison-Wesley, 2012).

  10. Spectroscopy as an astrophysical tool is analyzed in Carroll and Ostlie’s ch. 5. A quantum mechanical theory of the process occurs in David J. Tannor, Introduction to Quantum Mechanics (Sausalito, CA: University Science Books, 2007), ch. 13.

  11. Mass spectrometry may soon be as ubiquitous in our lives as GPS is today, as recently predicted by Alan Dove, “Mass Spectrometry Raises the Bar,” Science 328, no. 5980 (2010): 920–922.

  12. L. Laloux, P. Cizeau, J. P. Bouchaud, and M. Potters, “Noise Dressing of Financial Correlation Matrices,” Physical Review Letters 83, no. 7 (August 1999): 1467ff. The eigenvalue comparison at the heart of the paper is sketched after these notes.

  13. Paul Ruud, An Introduction to Classical Econometric Theory (New York: Oxford University Press, 2000), xxiii.

  14. Nisbet et al. (2009) is a textbook on the nature and use of contemporary data-mining software. J. Scott Armstrong, ed., Principles of Forecasting (Boston: Kluwer, 2001), provides a sort of multiauthor manual for business and economic forecasters. Compared with the ways statistical analysis is taught in economics departments, two things stand out in these presentations: (1) a deep involvement in the data; and (2) a hands-loose attitude toward data analysis procedures. In other words: there are all kinds of ways to study a factual environment, and restricting oneself to just one of them (econometrics is one) is, to say the least, suboptimal.

  15. Ruud is a classic presenter of classical econometrics. Nate Silver, The Signal and the Noise (New York: Penguin Press, 2012), offers a really fine nontechnical account of the other, or Bayesian, approach to statistical analysis. I agree with Freedman (2010), ch. 1, that it’s easy to exaggerate the practical differences between the two. Both are used in physics simultaneously, classical for error measurement and Bayesian for statistical mechanics. It seems to work out OK. A toy version of the classical-versus-Bayesian comparison appears after these notes.

  16. This paragraph oversimplifies a tangled web of theories. An important achievement of mathematical economists was to untangle that web with their models. However, the key issue—how, how much, even whether, you can tease out real predictable economic change through monetary or fiscal actions—remains with us. It is one of those long-run unresolved controversies. It’s also a typical one because it consists of a lot of models, each one possibly correct, each roughly dovetailing with data that are limited in both quantity and quality. Mark Blaug, Economic Theory in Retrospect, 3rd ed. (Cambridge: Cambridge University Press, 1978), 645–79, does a good job of explicating the tangle.

  17. Niall Adams, “Perspectives on Data Mining,” International Journal of Market Research 52, no. 1 (2010): 11–18, offers a useful survey of data-mining techniques used in a field that is strongly focused on useful (i.e., predictive) results, namely, marketing.

  18. As usual in my library, if you want to know where and how data are collected, the only secondary source that takes this question seriously is Edward Leamer, Macroeconomic Patterns and Stories: A Guide for MBAs (Berlin: Springer, 2010). For example, his p. 52 is a table of just this for labor force data collection; it is surrounded by discussion of uses and relevance.

  19. Barton Zwiebach, A First Course in String Theory (Cambridge: Cambridge University Press, 2009), 5, gives this count, the current number of particles that are force-bearing or matter-bearing according to the standard model.

  20. George Akerlof and Robert Shiller, Animal Spirits (Princeton, NJ: Princeton University Press, 2009), ch. 5, recommend stories, the ordinary narratives that humans tell each other, as relevant to how people form views about policy (what behaviorists call referencing).

  21. Benjamin Ward, What’s Wrong with Economics (New York: Basic Books, 1972), ch. 12, analyzes the storytelling process as a promising research procedure for economics. Freedman’s 1991 essay, “Statistical Models and Shoe Leather,” in Freedman (2010), pp. 45–62, has “shoe leather” playing a similar research role. The idea is out there; it just hasn’t been picked up yet.

  22. Che Li, Zefeng Ren, Xingan Wang, Wenrui Dong, Dongxu Dai, Xiuyan Wang, Dong H. Zhang, Xueming Yang, Liusi Sheng, Guoliang Li, Hans-Joachim Werner, François Lique, and Millard H. Alexander, “Breakdown of the Born-Oppenheimer Approximation in the F + o-D2 → DF + D Reaction,” Science 317 (2007): 1061–64, describe the eighty-year-old Born-Oppenheimer approximation, which is still in widespread use in quantum chemistry today. They also point out its failures, the avoidance of which has required detailed empirical research and theorizing. A comparable and actually used economic approximation is that international interaction with the domestic economy does not change in the short run. That made some sense half a century ago, but no more, and who knows how good it was back then?

  23. Benjamin Ward, The Ideal Worlds of Economics (New York: Basic Books, 1979), offers three accounts of political economy as they might be presented by liberal, conservative, and radical economists of those times. The rules for developing each were that no known fact was to be contradicted and that the moral values expressed were to be generally recognized as decent. The accounts were very different. The radical view has since disappeared from professional economics; however, the other two have proved surprisingly stable and show no signs of converging. This situation is achieved sociologically, not scientifically. People talk past one another, hire people who have similar priors, and apply most of the devices of Chapter 8 to maintain their views.
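
The sketch promised in note 2: the standard quadrature rule for propagating independent uncertainties through a product, the kind of rule Taylor’s book develops. The measured values and uncertainties below are invented for illustration.

```python
import math

# Quadrature rule for independent uncertainties: for q = x * y,
# (dq/q)^2 = (dx/x)^2 + (dy/y)^2. All numbers below are made up.
x, dx = 4.52, 0.02   # a measured quantity and its uncertainty
y, dy = 2.00, 0.10   # another, independent, measurement

q = x * y
dq = q * math.sqrt((dx / x) ** 2 + (dy / y) ** 2)
print(f"q = {q:.2f} +/- {dq:.2f}")
```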
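The sketch promised in note 12: the eigenvalues of an empirical correlation matrix are compared against the Marchenko-Pastur band that pure noise would produce; eigenvalues inside the band are “noise dressed.” This illustrates the technique with simulated i.i.d. returns and made-up dimensions, not the authors’ code or data; with real market returns a few large eigenvalues fall outside the band.

```python
import numpy as np

# Pure-noise benchmark: with i.i.d. returns, essentially all eigenvalues of
# the sample correlation matrix should fall inside the Marchenko-Pastur band.
rng = np.random.default_rng(seed=1)
N, T = 100, 400                          # assets, observations (made-up sizes)
returns = rng.normal(size=(T, N))        # simulated noise "returns"

corr = np.corrcoef(returns, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)

q = N / T                                # aspect ratio of the data matrix
lam_min = (1 - np.sqrt(q)) ** 2          # lower edge of the MP band
lam_max = (1 + np.sqrt(q)) ** 2          # upper edge of the MP band

outliers = eigenvalues[(eigenvalues < lam_min) | (eigenvalues > lam_max)]
print(f"MP band: [{lam_min:.2f}, {lam_max:.2f}]; outliers: {len(outliers)}")
```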
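The toy comparison promised in note 15: a classical maximum-likelihood estimate and a Bayesian posterior mean for the same simulated coin, under an arbitrary Beta(2, 2) prior of my choosing. Once the data accumulate, the two nearly coincide, which is Freedman’s point about exaggerated practical differences.

```python
import numpy as np

# 500 flips of a coin with (made-up) true heads probability 0.6.
rng = np.random.default_rng(seed=2)
flips = rng.binomial(1, 0.6, size=500)
heads, n = int(flips.sum()), len(flips)

classical = heads / n                        # maximum-likelihood estimate
alpha, beta = 2 + heads, 2 + (n - heads)     # Beta(2, 2) prior, updated
bayesian = alpha / (alpha + beta)            # posterior mean

# With this much data the prior washes out and the two nearly coincide.
print(f"classical: {classical:.3f}, Bayesian: {bayesian:.3f}")
```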

Copyright information

© 2016 Benjamin Ward

Cite this chapter

Ward, B. (2016). Uncertainties and Approximations. In: Dionysian Economics. Palgrave Macmillan, New York. https://doi.org/10.1057/9781137597366_11
