More than just tools

A critique of experiments in economics
  • Felix Kersting
  • Robert Lepenies
  • Theresa Neef
Chapter
Part of the Wirtschaft + Gesellschaft book series (WUG)

Abstract

In this chapter, we provide an overview of the criticism of experiments in economics. In the first part, we discuss the debate over the advantages and disadvantages of experimental methods and situate them in relation to the mainstream of economics. In the second part, we argue why methodological critique should be an integral part of a plural economics. Methods are not simply neutral tools with technical pros and cons. Plural economists should therefore critically examine the epistemological and political-normative embedding of methods. We put this demand into practice using the example of experiments: in the third part, we critically examine the use of experiments in social policy in the Global North and in development policy in the Global South.

Keywords

Empirical turn · Development policy · Ethics · Experiments · Methodological critique · Nudging · Philosophy of economics · Randomized controlled trials (RCTs) · Behavioral economics

Copyright information

© Springer Fachmedien Wiesbaden GmbH, part of Springer Nature 2019

Authors and Affiliations

  1. Humboldt-Universität zu Berlin, Berlin, Germany
  2. Helmholtz-Zentrum für Umweltforschung, Leipzig, Germany
  3. Freie Universität Berlin, Berlin, Germany
