
The rise of randomized controlled trials (RCTs) in international development in historical perspective

  • Luciana de Souza Leão
  • Gil Eyal

Abstract

This article brings a historical perspective to bear in explaining the recent dissemination of randomized controlled trials (RCTs) as the new “gold standard” method for assessing international development projects. Although the buzz around RCT evaluations dates from the 2000s, we show that what we are witnessing now is a second wave of RCTs: a first wave began in the 1960s and had ended by the early 1980s. Drawing on content analysis of 123 RCTs, participant observation, and secondary sources, we compare the two waves in terms of the participants in the network of expertise required to carry out field experiments and the characteristics of the projects evaluated. The comparison demonstrates that researchers in the second wave were better positioned to navigate the political difficulties caused by randomization. We explain the differences in the expertise network and in the type of projects evaluated as the result of concurrent transformations in the fields of development aid and the economics profession. We draw on Andrew Abbott’s concept of “hinges,” as well as on Bourdieu’s concept of “homology” between fields, to argue that the similar positions occupied, and the parallel struggles waged, by two groups of actors in the two fields served as the basis for a cross-field alliance, in which RCTs could function as a “hinge” linking the two fields together.

Keywords

Expertise · Fields · Hinges · Philanthro-capitalists · Policy evaluation · Randomistas

Notes

Acknowledgements

The authors wish to thank XingJian Li for her meticulous work on this project, as well as Margarita Rayzberg, Diana Graizbord, Moran Levy, Joan Robinson, Josh Whitford, and Diane Vaughan for their feedback. Previous versions of this article were presented at the 2014 Society for Social Studies of Science Meeting, the 2015 Social Science History Association Meeting, the 2016 and 2018 American Sociological Association Conferences, Columbia University’s SKAT workshop, Sciences Po, and the Federal University of Rio de Janeiro. We also thank the participants at these presentations for their helpful comments and suggestions.

References

  1. Abbott, A. (1988). The system of professions: An essay on the division of expert labor. Chicago: University of Chicago Press.
  2. Abbott, A. (2005). Linked ecologies: States and universities as environments for professions. Sociological Theory, 23(3), 245–274.
  3. Adams, V. (2016). Metrics: What counts in global health. Durham: Duke University Press.
  4. Angrist, J., & Pischke, J. (2010). The credibility revolution in empirical economics: How better research design is taking the con out of econometrics. Journal of Economic Perspectives, 24(2), 3–30.
  5. Angrist, J., Azoulay, P., Ellison, G., Hill, R., & Feng Lu, S. (2017). Economic research evolves: Fields and styles. American Economic Review, 107(5), 293–297.
  6. Babb, S. (2009). Behind the development banks: Washington politics, world poverty, and the wealth of nations. Chicago: University of Chicago Press.
  7. Babb, S., & Chorev, N. (2016). International organizations: Loose and tight coupling in the development regime. Studies in Comparative International Development, 51(1), 81–102.
  8. Banerjee, A. (2007). Making aid work. Cambridge: MIT Press.
  9. Banerjee, A., & Duflo, E. (2011). Poor economics: A radical rethinking of the way to fight global poverty. New York: PublicAffairs.
  10. Banerjee, A., Karlan, D., & Zinman, J. (2015). Six randomized evaluations of microcredit: Introduction and further steps. American Economic Journal: Applied Economics, 7(1), 1–21.
  11. Banerjee, A., Chassang, S., & Snowberg, E. (2016). Decision theoretic approaches to experiment design and external validity. NBER Working Paper 22167.
  12. Barnes, B., Bloor, D., & Henry, J. (1996). Scientific knowledge: A sociological analysis. Chicago: University of Chicago Press.
  13. Bauman, K. (1997). The effectiveness of family planning programs evaluated with true experimental designs. American Journal of Public Health, 87(4), 666–669.
  14. Benko, J. (2013). The hyper efficient, highly scientific scheme to help the world's poor. Wired Magazine. Retrieved May 2, 2017, from https://www.wired.com/2013/11/jpal-randomized-trials/. Accessed 17 Oct 2018.
  15. Berk, R., Boruch, R., Chambers, D., Rossi, P., & Witte, A. (1985). Social policy experimentation: A position paper. Evaluation Review, 9(4), 387–429.
  16. Berndt, C. (2015). Behavioural economics, experimentalism and the marketization of development. Economy and Society, 44(4), 567–591.
  17. Berrios, R. (2000). Contracting for development: The role of for-profit contractors in U.S. foreign development assistance. Westport: Praeger.
  18. Bishop, M., & Green, M. (2008). Philanthrocapitalism: How the rich can save the world. New York: Bloomsbury Press.
  19. Borkum, E., He, F., & Linden, L. (2012). The effects of school libraries on language skills: Evidence from a randomized controlled trial in India. NBER Working Paper 18183. Cambridge: National Bureau of Economic Research.
  20. Boruch, R., McSweeny, J., & Soderstrom, J. (1978). Randomized field experiments for program planning, development, and evaluation: An illustrative bibliography. Evaluation Quarterly, 2(4), 655–695.
  21. Bourdieu, P. (1975). The specificity of the scientific field and the social conditions of the progress of reason. Social Science Information, 14, 19–47.
  22. Bourdieu, P. (1977). Outline of a theory of practice. Cambridge: Cambridge University Press.
  23. Carpenter, D. (2010). Reputation and power: Organizational image and pharmaceutical regulation at the FDA. Princeton: Princeton University Press.
  24. Cohen, J., & Dupas, P. (2010). Free distribution or cost-sharing? Evidence from a randomized malaria prevention experiment. The Quarterly Journal of Economics, 125(1), 1–45.
  25. Cooley, A., & Ron, J. (2002). The NGO scramble: Organizational insecurity and the political economy of transnational action. International Security, 27(1), 4–39.
  26. Cuca, R., & Pierce, C. (1977). Experiments in family planning: Lessons from the developing world. Baltimore: The Johns Hopkins University Press.
  27. Daston, L., & Galison, P. (1992). The image of objectivity. Representations, 40, 81–128.
  28. Deaton, A. (2006). Evidence-based aid must not become the latest in a long string of development fads. In A. Banerjee (Ed.), Making aid work (pp. 60–61). Cambridge: MIT Press.
  29. Deaton, A. (2010). Instruments, randomization, and learning about development. Journal of Economic Literature, 48(2), 424–455.
  30. Deaton, A., & Cartwright, N. (2016). Understanding and misunderstanding randomized controlled trials. NBER Working Paper 22595.
  31. Demortain, D. (2011). Scientists and the regulation of risk: Standardising control. Cheltenham: Edward Elgar.
  32. Dennis, M., & Boruch, R. (1989). Randomized experiments for planning and testing projects in developing countries: Threshold conditions. Evaluation Review, 13(3), 292–309.
  33. Donovan, K. (2018). The rise of the randomistas: On the experimental turn in international aid. Economy and Society, 47(1), 27–58.
  34. Drexler, A., Fischer, G., & Schoar, A. (2014). Keeping it simple: Financial literacy and rules of thumb. American Economic Journal: Applied Economics, 6(2), 1–31.
  35. Duflo, E. (2003). Poor but rational? MIT Working Paper 747.
  36. Duflo, E. (2006). Field experiments in development economics. Lecture delivered at the 2006 World Congress of the Econometric Society. Retrieved June 1, 2017, from https://economics.mit.edu/files/800. Accessed 17 Oct 2018.
  37. Duflo, E. (2010). Social experiments to fight poverty. Lecture delivered at the TED conference, February 2010. Retrieved June 1, 2017, from https://www.ted.com/talks/esther_duflo_social_experiments_to_fight_poverty. Accessed 17 Oct 2018.
  38. Duflo, E. (2011). The power of data in decision making. Lecture delivered at the Center for Effective Philanthropy, May 10–11, 2011. Retrieved June 1, 2017, from http://cep.org/programming/national-conferences/2011-conference/. Accessed 17 Oct 2018.
  39. Duflo, E. (2016). Randomized controlled trials, development economics and policy making in developing countries. Lecture delivered at the World Bank conference “The State of Economics, the State of the World,” June 2016. Retrieved June 1, 2017, from http://pubdocs.worldbank.org/en/394531465569503682/Esther-Duflo-PRESENTATION.pdf. Accessed 17 Oct 2018.
  40. Duflo, E. (2017). The economist as plumber. American Economic Review: Papers and Proceedings, 107(5), 1–26.
  41. Duflo, E., Glennerster, R., & Kremer, M. (2007). Using randomization in development economics research: A toolkit. Handbook of Development Economics, 4, 3895–3962.
  42. Duflo, E., Kremer, M., & Robinson, J. (2011). Nudging farmers to use fertilizer: Theory and experimental evidence from Kenya. American Economic Review, 101(6), 2350–2390.
  43. Easterly, W. (2007). The white man's burden: Why the West's efforts to aid the rest have done so much ill and so little good. New York: Penguin USA.
  44. Edwards, M. (2017). The emperor's new clothes. In M. Moody & B. Breeze (Eds.), The philanthropy reader. New York: Routledge.
  45. Eyal, G. (2000). Anti-politics and the spirit of capitalism: Dissidents, monetarists and the Czech transition to capitalism. Theory and Society, 29(1), 49–92.
  46. Eyal, G. (2013). For a sociology of expertise: The social origins of the autism epidemic. American Journal of Sociology, 118(4), 863–907.
  47. Freeman, H., Rossi, P., & Wright, S. (1980). Evaluating social projects in developing countries. Paris: OECD Development Centre.
  48. Frumkin, P. (2003). Inside venture philanthropy. Society, 40(7), 7–15.
  49. Gates, B. (2011). Small adjustments to aid programs can yield big results. The Gates Notes blog. Retrieved May 2, 2017, from https://www.gatesnotes.com/Books/Poor-Economics. Accessed 17 Oct 2018.
  50. Gates, B. (2014). A cautionary tale from Africa. The Gates Notes blog. Retrieved May 2, 2017, from https://www.gatesnotes.com/Books/The-Idealist-A-Cautionary-Tale-From-Africa. Accessed 17 Oct 2018.
  51. Gates Foundation. (2016). “Grants database” and “Annual letter 2013.” Retrieved October 12, 2016, from http://www.gatesfoundation.org.
  52. GiveWell. (2017). “Top charities” and “DMI.” GiveWell website. Retrieved October 15, 2017, from https://www.givewell.org/charities/DMI. Accessed 17 Oct 2018.
  53. Glennerster, R. (2015). So you want to do an RCT with a government: Things you should know. Running Randomized Evaluations blog. Retrieved June 1, 2017, from http://runningres.com/blog/2015/12/9/so-you-want-to-do-an-rct-with-a-government-things-you-should-know. Accessed 17 Oct 2018.
  54. Glennerster, R., & Takavarasha, K. (2013). Running randomized evaluations: A practical guide. Princeton: Princeton University Press.
  55. Guala, F. (2007). How to do things with experimental economics. In D. MacKenzie, F. Muniesa, & L. Siu (Eds.), Do economists make markets? Princeton: Princeton University Press.
  56. Gueron, J. (2017). The politics and practice of social experiments: Seeds of a revolution. In A. Banerjee & E. Duflo (Eds.), Handbook of field experiments. Oxford: Elsevier.
  57. Haydu, J. (1998). Making use of the past: Time periods as cases to compare and as sequences of problem solving. American Journal of Sociology, 104(2), 339–371.
  58. Heckman, J., Hohmann, N., Smith, J., & Khoo, M. (2000). Substitution and dropout bias in social experiments: A study of an influential social experiment. The Quarterly Journal of Economics, 115(2), 651–694.
  59. Heukelom, F. (2012). Sense of mission: The Alfred P. Sloan and Russell Sage Foundations' behavioral economics program, 1984–1992. Science in Context, 25(2), 263–286.
  60. Heydemann, S., & Kinsey, R. (2010). The state and international philanthropy: The contribution of American foundations, 1919–1991. In H. Anheier & D. Hammack (Eds.), American foundations: Roles and contributions. Washington: Brookings Institution.
  61. Hornick, R., Ingle, H., Mayo, J., McAnany, E., & Schramm, W. (1973). Final report: Television and educational reform in El Salvador.
  62. Humphreys, M. (2015). What has been learned from the deworming replications: A nonpartisan view. Unpublished manuscript. Retrieved February 15, 2016, from http://www.macartan.nyc/comments/worms2/. Accessed 17 Oct 2018.
  63. IADB. (2017). Production, use, and influence of IADB's impact evaluations. Approach Paper Series, Inter-American Development Bank.
  64. JPAL. (2016). The Abdul Latif Jameel Poverty Action Lab website. Retrieved January 13, 2016, from https://www.povertyactionlab.org/evaluations. Accessed 17 Oct 2018.
  65. Karlan, D., & Appel, J. (2011). More than good intentions: How a new economics is helping to solve global poverty. New York: Dutton Press.
  66. Karlan, D., McConnell, M., Mullainathan, S., & Zinman, J. (2014). Getting to the top of mind: How reminders increase saving. Management Science, 62(12), 3393–3411.
  67. Kim, J. (2017). Rethinking development finance. Lecture delivered at the London School of Economics (LSE), April 11, 2017. Retrieved October 15, 2017, from http://www.lse.ac.uk/website-archive/newsAndMedia/videoAndAudio/channels/publicLecturesAndEvents/Home.aspx. Accessed 17 Oct 2018.
  68. Krause, M. (2014). The good project: Humanitarian relief NGOs and the fragmentation of reason. Chicago: University of Chicago Press.
  69. Krueger, A. (1995). Policy lessons from development experience since the Second World War. In J. Behrman & T. N. Srinivasan (Eds.), Handbook of development economics, Volume III. Oxford: Elsevier Science Books.
  70. Krueger, A., Michalopoulos, C., & Ruttan, V. (1989). Aid and development. Baltimore: Johns Hopkins University Press.
  71. Latour, B. (1987). Science in action: How to follow scientists and engineers through society. Cambridge: Harvard University Press.
  72. Levitt, S., & List, J. (2008). Field experiments in economics: The past, the present, and the future. NBER Working Paper 14356.
  73. Marks, H. (1997). The progress of experiment: Science and therapeutic reform in the United States, 1900–1990. Cambridge: Cambridge University Press.
  74. Medvetz, T. (2012). Think tanks in America. Chicago: University of Chicago Press.
  75. Miguel, E., & Kremer, M. (2004). Worms: Identifying impacts on education and health in the presence of treatment externalities. Econometrica, 72(1), 159–217.
  76. Moyo, D. (2009). Dead aid: Why aid is not working and how there is a better way for Africa. New York: Farrar, Straus and Giroux.
  77. Mudge, S., & Vauchez, A. (2012). Building Europe on a weak field: Law, economics, and scholarly avatars in transnational politics. American Journal of Sociology, 118(2), 449–492.
  78. Mullainathan, S., & Thaler, R. (2000). Behavioral economics. NBER Working Paper 7948.
  79. Murray, F. (2010). The Oncomouse that roared: Hybrid exchange strategies as a source of distinction at the boundary of overlapping institutions. American Journal of Sociology, 116(2), 341–388.
  80. OECD. (2011). Measuring aid: 50 years of DAC statistics, 1961–2011. OECD Publications. Retrieved June 1, 2017, from https://www.oecd.org/dac/stats/documentupload/MeasuringAid50yearsDACStats.pdf. Accessed 17 Oct 2018.
  81. Ogden, T. (2016). Experimental conversations: Perspectives on randomized trials in development economics. Cambridge: MIT Press.
  82. Panofsky, A. (2011). Generating sociability to drive science: Patient advocacy organizations and genetics research. Social Studies of Science, 41(1), 31–57.
  83. Parker, I. (2010). The poverty lab. The New Yorker, May 17, 2010. Retrieved May 2, 2017, from http://www.newyorker.com/magazine/2010/05/17/the-poverty-lab. Accessed 17 Oct 2018.
  84. Pinch, T., & Bijker, W. (1984). The social construction of facts and artefacts: Or how the sociology of science and the sociology of technology might benefit each other. Social Studies of Science, 14(3), 399–441.
  85. Population Council. (1986). An experimental study of the efficiency and effectiveness of an IUD insertion and back-up component (English summary of first six-month report, PCPES86). Lima, Peru: Population Council.
  86. Porter, T. (1995). Trust in numbers: The pursuit of objectivity in science and public life. Princeton: Princeton University Press.
  87. Ravallion, M. (2009). Evaluation in the practice of development. World Bank Research Observer, 24(1), 29–53.
  88. Rayzberg, M. (2019). Fairness in the field: The ethics of resource allocation in randomized controlled field experiments. Science, Technology, & Human Values, 44(3), 371–398.
  89. Reckhow, S. (2013). Follow the money: How foundation dollars change public school politics. New York: Oxford University Press.
  90. Riecken, H., & Boruch, R. (1975). Social experimentation: A method for planning and evaluating social intervention. New York: Academic Press.
  91. Rodrik, D. (2006). Goodbye Washington Consensus, hello Washington confusion? A review of the World Bank's Economic Growth in the 1990s: Learning from a Decade of Reform. Journal of Economic Literature, 44(4), 973–987.
  92. Rodrik, D. (2008). The new development economics: We shall experiment, but how shall we learn? HKS Faculty Research Working Paper RWP08-055.
  93. Rotemberg, M. (2009). Why academic involvement in RCTs is important. IPA blog. Retrieved June 1, 2017, from http://www.poverty-action.org/node/2156. Accessed 17 Oct 2018.
  94. Sachs, J. (2005). The end of poverty: How we can make it happen in our lifetime. New York: Penguin USA.
  95. Santos, A. C. (2011). Behavioural and experimental economics: Are they really transforming economics? Cambridge Journal of Economics, 35, 705–728.
  96. Searle, B. (1985). Evaluation in World Bank education projects: Lessons from three case studies. World Bank Discussion Paper EDT5.
  97. Sommer, J. (1977). Beyond charity: U.S. voluntary aid for a changing Third World. Washington: Overseas Development Council.
  98. Stampnitzky, L. (2013). Disciplining terror: How experts invented “terrorism.” Cambridge: Cambridge University Press.
  99. Swidler, A., & Watkins, S. (2017). A fraught embrace: The romance and reality of AIDS altruism in Africa. Princeton: Princeton University Press.
  100. Teele, D. (2014). Field experiments and their critics: Essays on the uses and abuses of experimentation in the social sciences. New Haven: Yale University Press.
  101. Thaler, R., & Sunstein, C. (2009). Nudge: Improving decisions about health, wealth and happiness. London: Penguin Books.
  102. USAID. (2009). Trends in development evaluation theory, policies and practices. Washington: United States Agency for International Development.
  103. Wacquant, L. (1992). Toward a social praxeology: The structure and logic of Bourdieu's sociology. In P. Bourdieu & L. Wacquant, An invitation to reflexive sociology. Chicago: University of Chicago Press.
  104. Watkins, S., Swidler, A., & Hannan, T. (2012). Outsourcing social transformation: Development NGOs as organizations. Annual Review of Sociology, 38, 285–315.

Copyright information

© Springer Nature B.V. 2019

Authors and Affiliations

  1. Department of Sociology, Columbia University, New York, USA
