Abstract
This article brings a historical perspective to explain the recent dissemination of randomized controlled trials (RCTs) as the new “gold standard” method to assess international development projects. Although the buzz around RCT evaluations dates from the 2000s, we show that what we are witnessing now is a second wave of RCTs, while a first wave began in the 1960s and ended by the early 1980s. Drawing on content analysis of 123 RCTs, participant observation, and secondary sources, we compare the two waves in terms of the participants in the network of expertise required to carry out field experiments and the characteristics of the projects evaluated. The comparison demonstrates that researchers in the second wave were better positioned to navigate the political difficulties caused by randomization. We explain the differences in the expertise network and in the type of projects as the result of concurrent transformations in the fields of development aid and the economics profession. We draw on Andrew Abbott’s concept of “hinges,” as well as on Bourdieu’s concept of “homology” between fields, to argue that the similar positions and parallel struggles conducted by two groups of actors in the two fields served as the basis for a cross-field alliance, in which RCTs could function as a “hinge” linking together the two fields.
Notes
Since Abbott (1988, p. 343) himself suggested that his concept of “task area” is similar to Bourdieu’s “field,” and since the distinction between “ecologies” and “task areas” is immaterial for our purposes, we use “ecology” and “field” here interchangeably. We follow Mudge and Vauchez (2012) in incorporating Abbott’s concepts into field analysis and treating the two approaches as compatible and complementary.
The term “philanthro-capitalists” was coined by Bishop and Green (2008) to describe new, large foundations, established by successful individuals from the worlds of finance and technology, who “apply business techniques and ways of thinking about their philanthropy” (ibid., p. 6). The often-cited example is the Bill and Melinda Gates Foundation (created in 2000), though the term captures a broader movement in the philanthropy sector (Reckhow 2013).
We also conducted interviews and exchanged emails with key academic figures from the first wave, as well as with Trials Search Coordinators from the Campbell Collaboration and the Cochrane Library to look for possibly missing repositories and reviews from the period.
Experimental designs that did not clearly explain how the control group was assigned, or that used average data from the population as control, were considered quasi-experimental.
A list with all the studies in our two samples is available upon request.
There is a large number of missing values (unknown disciplinary affiliation) in our first wave data (28% of authors), but we do not think this casts doubt on the finding. Given the decades that have passed, it is not surprising that it is more difficult to identify the disciplinary affiliation of first wave authors. The unidentified, however, are unlikely to be US-based economists because this is the category easiest to identify. We suspect that most were local researchers.
Of 32 development papers in the top 5 economics journals in 2015, 10 were RCTs, up from zero in 2000 (Duflo 2016). No first wave studies, in contrast, appeared in high-visibility disciplinary journals. The main venue was Studies in Family Planning, an interdisciplinary journal published by The Population Council.
Another necessary component is the policy beneficiaries themselves, but we do not possess data that would allow us to compare them.
These replicas of a unit from one ecology (a profession, a discipline, a political party) within another are what Abbott (2005, pp. 265–269) calls “avatars.” For example, applied economics is an avatar of academic economics in the “policy and advice arena.” While an avatar is “an institutionalized hinge,” over time it can become independent and even compete with its original creators.
The existing estimates are not yet reliable enough to determine the relative share of development aid coming from private foundations. The Development Assistance Committee (DAC) estimates that, since 2002, aid to developing countries from Private Voluntary Organizations has been three times as large as ODA. This number, however, includes not just direct development investments but also private bank lending and remittances, so it is an overestimate. More accurate estimates are likely in the future: in 2010, the Gates Foundation became the first private aid donor to report to the DAC, encouraging other foundations to do the same (OECD 2011, p. 4).
The problems faced by the Inter-American Development Bank (IADB), which works primarily with governments, underscore this point, demonstrating how hard it is for governments to randomize benefits. Despite explicitly requiring all loans to undergo impact evaluations, IADB was able to conduct RCTs for only 26% of its loans and had to resort to a quasi-experimental design in the remainder. IADB (2017) representatives reported that RCTs were seen as “imposed on country governments, which are reluctant to appropriate RCTs by themselves.”
Pritchett makes the same point, though perhaps more bluntly: “The only people for which the RCT movement is in fact a tool for the job are philanthropists…. From the charity perspective, there’s a nice confluence between the methodological demand for statistical power and of being able to tweak at the individual level. I can give this person food, but not that person. (…) I’m not trying to affect the government; I’m not trying to affect national development processes.” (in Ogden 2016, p. 142)
For example, total donations to the four charities shortlisted by GiveWell increased from $3,000,000 in 2010 to $110,000,000 in 2015. Conversely, when RCT evidence is inconclusive, projects lose their recommended status and funding, as happened to Development Media International (GiveWell 2017).
References
Abbott, A. (1988). The system of professions: An essay on the division of expert labor. Chicago: University of Chicago Press.
Abbott, A. (2005). Linked ecologies: States and universities as environments for professions. Sociological Theory, 23(3), 245–274.
Adams, V. (2016). Metrics: What counts in global health. Durham: Duke University Press.
Angrist, J., & Pischke, J. (2010). The credibility revolution in empirical economics: How better research design is taking the con out of econometrics. Journal of Economic Perspectives, 24(2), 3–30.
Angrist, J., Azoulay, P., Ellison, G., Hill, R., & Feng Lu, S. (2017). Economic research evolves: Fields and styles. American Economic Review, 107(5), 293–297.
Babb, S. (2009). Behind the development banks: Washington politics, world poverty, and the wealth of nations. Chicago: The University of Chicago Press.
Babb, S., & Chorev, N. (2016). International organizations: Loose and tight coupling in the development regime. Studies in Comparative International Development, 51(1), 81–102.
Banerjee, A. (2007). Making aid work. Cambridge: MIT Press.
Banerjee, A., & Duflo, E. (2011). Poor economics: A radical rethinking of the way to fight global poverty. United States: Public Affairs Book.
Banerjee, A., Karlan, D., & Zinman, J. (2015). Six randomized evaluations of microcredit: Introduction and further steps. American Economic Journal: Applied Economics, 7(1), 1–21.
Banerjee, A., Chassang, S., & Snowberg, E. (2016). Decision theoretic approaches to experiment design and external validity. NBER Working Paper 22167.
Barnes, B., Bloor, D., & Henry, J. (1996). Scientific knowledge: A sociological approach. Chicago: University of Chicago Press.
Bauman, K. (1997). The effectiveness of family planning programs evaluated with true experimental designs. American Journal of Public Health, 87(4), 666–669.
Benko, J. (2013). The hyper efficient, highly scientific scheme to help the world's poor. Wired Magazine. https://www.wired.com/2013/11/jpal-randomized-trials/. Accessed 17 Oct 2018.
Berk, R., Boruch, R., Chambers, D., Rossi, P., & Witte, A. (1985). Social policy experimentation: A position paper. Evaluation Review, 9(4), 387–429.
Berndt, C. (2015). Behavioural economics, experimentalism and the marketization of development. Economy and Society, 44(4), 567–591.
Berrios, R. (2000). Contracting for development: The role of for-profit contractors in U.S. foreign development assistance. Westport: Praeger.
Bishop, M., & Green, M. (2008). Philanthrocapitalism: How the rich can save the world. New York: Bloomsbury Press.
Borkum, E., He, F., & Linden, L. (2012). The effects of school libraries on language skills: Evidence from a randomized controlled trial in India. In NBER Working Paper 18183. Cambridge: National Bureau of Economic Research.
Boruch, R., McSweeny, J., & Soderstrom, J. (1978). Randomized field experiments for program planning, development, and evaluation: An illustrative bibliography. Evaluation Quarterly, 2(4), 655–695.
Bourdieu, P. (1975). The specificity of the scientific field and the social conditions of the progress of reason. Social Science Information, 14, 19–47.
Bourdieu, P. (1977). Outline of the theory of practice. Cambridge: Cambridge University Press.
Carpenter, D. (2010). Reputation and power—Organizational image and pharmaceutical regulation at the FDA. Princeton: Princeton University Press.
Cohen, J., & Dupas, P. (2010). Free distribution or cost-sharing? Evidence from a randomized malaria prevention experiment. The Quarterly Journal of Economics, 125(1), 1–45.
Cooley, A., & Ron, J. (2002). The NGO scramble: Organizational insecurity and the political economy of transnational action. International Security, 27(1), 4–39.
Cuca, R., & Pierce, C. (1977). Experiments in family planning: Lessons from the developing world. Baltimore: The Johns Hopkins University Press.
Daston, L., & Galison, P. (1992). The image of objectivity. Representations, 40, 81–128.
Deaton, A. (2006). Evidence-based aid must not become the latest in a long string of development fads. In A. Banerjee (Ed.), Making aid work (pp. 60–61). Cambridge: MIT Press.
Deaton, A. (2010). Instruments, randomization, and learning about development. Journal of Economic Literature, 48(2), 424–455.
Deaton, A., & Cartwright, N. (2016). Understanding and misunderstanding randomized controlled trials. NBER Working Paper Series, 22595.
Demortain, D. (2011). Scientists and the regulation of risk: Standardising control. Cheltenham: Edward Elgar.
Dennis, M., & Boruch, R. (1989). Randomized experiments for planning and testing projects in developing countries—Threshold conditions. Evaluation Review, 13(3), 292–309.
Donovan, K. (2018). The rise of Randomistas: On the experimental turn in international aid. Economy and Society, 47(1), 27–58.
Drexler, A., Fischer, G., & Schoar, A. (2014). Keeping it simple: Financial literacy and rules of thumb. American Economic Journal: Applied Economics, 6(2), 1–31.
Duflo, E. (2003). Poor but rational? MIT Working Paper 747.
Duflo, E. (2006) Field experiments in development economics. Lecture delivered at the 2006 world congress of the econometric society. Retrieved June 1, 2017, https://economics.mit.edu/files/800. Accessed 17 Oct 2018.
Duflo, E. (2010) Social experiments to fight poverty. Lecture delivered at TED conference in February 2010. Retrieved June 1, 2017, https://www.ted.com/talks/esther_duflo_social_experiments_to_fight_poverty. Accessed 17 Oct 2018.
Duflo, E. (2011). The power of data in decision making. Lecture delivered at the Center for Effective Philanthropy on May 10–11, 2011. Retrieved June 1, 2017, http://cep.org/programming/national-conferences/2011-conference/. Accessed 17 Oct 2018.
Duflo, E. (2016). Randomized controlled trials, development economics and policy making in developing countries. Lecture delivered at the World Bank conference “The state of economics, the state of the world” in June 2016. Retrieved June 1, 2017, http://pubdocs.worldbank.org/en/394531465569503682/Esther-Duflo-PRESENTATION.pdf. Accessed 17 Oct 2018.
Duflo, E. (2017). The economist as plumber. American Economic Review: Papers and Proceedings, 107(5), 1–26.
Duflo, E., Glennerster, R., & Kremer, M. (2007). Using randomization in development economics research: A toolkit. Handbook of Development Economics, 4, 3895–3962.
Duflo, E., Kremer, M., & Robinson, J. (2011). Nudging farmers to use fertilizer: Theory and experimental evidence from Kenya. American Economic Review, 101(6), 2350–2390.
Easterly, W. (2007). The white man's burden: Why the West's efforts to aid the rest have done so much ill and so little good. New York: Penguin USA.
Edwards, M. (2017). The Emperor’s new clothes. In M. Moody & B. Breeze (Eds.), The philanthropy reader. New York: Routledge.
Eyal, G. (2000). Anti-politics and the Spirit of capitalism: Dissidents, monetarists and the Czech transition to capitalism. Theory and Society, 29(1), 49–92.
Eyal, G. (2013). For a sociology of expertise: The social origins of the autism epidemic. American Journal of Sociology, 118(4), 863–907.
Freeman, H., Rossi, P., & Wright, S. (1980). Evaluating social projects in developing countries. Paris: OECD Development Centre.
Frumkin, P. (2003). Inside venture philanthropy. Society, 40(7), 7–15.
Gates, B. (2011) Small adjustments to aid programs can yield big results. The Gates notes blog. Retrieved May 2, 2017, https://www.gatesnotes.com/Books/Poor-Economics. Accessed 17 Oct 2018.
Gates, B. (2014). A cautionary tale from Africa. The Gates notes blog. Retrieved May 2, 2017, https://www.gatesnotes.com/Books/The-Idealist-A-Cautionary-Tale-From-Africa. Accessed 17 Oct 2018.
Gates Foundation. (2016). “Grants database” and “Annual letter 2013.” Retrieved October 12, 2016, http://www.gatesfoundation.org.
GiveWell. (2017). “Top charities” and “DMI.” GiveWell Org. website. Retrieved October 15, 2017, https://www.givewell.org/charities/DMI. Accessed 17 Oct 2018.
Glennerster, R. (2015) So you want to do an RCT with a government: Things you should know. Running Randomized Evaluations Blog. Retrieved June 1, 2017, http://runningres.com/blog/2015/12/9/so-you-want-to-do-an-rct-with-a-government-things-you-should-know. Accessed 17 Oct 2018.
Glennerster, R., & Takavarasha, K. (2013). Running randomized evaluations: A practical guide. Princeton: Princeton University Press.
Guala, F. (2007). How to do things with experimental economics. In D. MacKenzie, F. Muniesa, & L. Siu (Eds.), Do economists make markets? Princeton: Princeton University Press.
Gueron, J. (2017). The politics and practice of social experiments: Seeds of a revolution. In A. Banerjee & E. Duflo (Eds.), Handbook of field experiments. Oxford: Elsevier.
Haydu, J. (1998). Making use of the past: Time periods as cases to compare and as sequences of problem solving. American Journal of Sociology, 104(2), 339–371.
Heckman, J., Hohmann, N., Smith, J., & Khoo, M. (2000). Substitution and dropout Bias in social experiments: A study of an influential social experiment. The Quarterly Journal of Economics, 115(2), 651–694.
Heukelom, F. (2012). Sense of mission: The Alfred P. Sloan and Russell Sage Foundations' behavioral economics program, 1984–1992. Science in Context, 25(2), 263–286.
Heydemann, S., & Kinsey, R. (2010). The state and international philanthropy: The contribution of American foundations 1919-1991. In H. Anheier & D. Hammack (Eds.), American foundations: Roles and contributions. Washington: Brookings Institute.
Hornick, R., Ingle, H., Mayo, J., McAnany, E., & Schramm, W. (1973). Final report: Television and educational reform in El Salvador.
Humphreys, M. (2015). What has been learned from the deworming replications: A nonpartisan view. Unpublished manuscript. Retrieved February 15, 2016, http://www.macartan.nyc/comments/worms2/. Accessed 17 Oct 2018.
IADB. (2017) Production, use, and influence of IADB’s impact evaluations. Approach Paper Series, Inter-American Development Bank publication.
JPAL. (2016) The Abdul Latif Jameel poverty action lab website. Retrieved January 13, 2016, https://www.povertyactionlab.org/evaluations. Accessed 17 Oct 2018.
Karlan, D., & Appel, J. (2011). More than good intentions: How a new economics is helping to solve global poverty. New York: Dutton Press.
Karlan, D., McConnell, M., Mullainathan, S., & Zinman, J. (2014). Getting to the top of mind: How reminders increase saving. Management Science, 62(12), 3393–3411.
Kim, J. (2017) Rethinking development finance. Lecture delivered at the London School of Economics (LSE) on April 11, 2017. Retrieved October 15, 2017, http://www.lse.ac.uk/website-archive/newsAndMedia/videoAndAudio/channels/publicLecturesAndEvents/Home.aspx. Accessed 17 Oct 2018.
Krause, M. (2014). The good project: Humanitarian relief NGOs and the fragmentation of reason. Chicago: University of Chicago Press.
Krueger, A. (1995). Policy lessons from development experience since the second world war. In J. Behrman & T. N. Srinivasan (Eds.), Handbook of development economics, Volume III. Oxford: Elsevier Science Books.
Krueger, A., Michalopoulos, C., & Ruttan, V. (1989). Aid and development. Baltimore: Johns Hopkins University Press.
Latour, B. (1987). Science in action: How to follow scientists and engineers through society. Cambridge: Harvard University Press.
Levitt, S., & List, J. (2008) Field experiments in economics: The past, the present and the future. NBER Working Paper 14356.
Marks, H. (1997). The progress of experiment: Science and therapeutic reform in the United States, 1900–1990. Cambridge: Cambridge University Press.
Medvetz, T. (2012). Think tanks in America. Chicago: The University of Chicago Press.
Miguel, E., & Kremer, M. (2004). Worms: Identifying impacts on education and health in the presence of treatment externalities. Econometrica, 72(1), 159–217.
Moyo, D. (2009). Dead aid: Why aid is not working and how there is a better way for Africa. New York: Farrar, Straus and Giroux.
Mudge, S., & Vauchez, A. (2012). Building Europe on a weak field: Law, economics, and scholarly avatars in transnational politics. American Journal of Sociology, 118(2), 449–492.
Mullainathan, S., & Thaler, R. (2000) Behavioral economics. NBER Working Papers 7948.
Murray, F. (2010). The Oncomouse that roared: Hybrid exchange strategies as a source of distinction at the boundary of overlapping institutions. American Journal of Sociology, 116(2), 341–388.
OECD. (2011) Measuring aid: 50 years of DAC statistics: 1961–2011. OECD publications. Retrieved June 1, 2017, https://www.oecd.org/dac/stats/documentupload/MeasuringAid50yearsDACStats.pdf. Accessed 17 Oct 2018.
Ogden, T. (2016). Experimental conversations: Perspectives on randomized trials in development economics. Cambridge: MIT Press.
Panofsky, A. (2011). Generating sociability to drive science: Patient advocacy organizations and genetics research. Social Studies of Science, 41(1), 31–57.
Parker, I. (2010) The Poverty Lab. Published in The New Yorker, on May 17th, 2010. Retrieved May 2, 2017, http://www.newyorker.com/magazine/2010/05/17/the-poverty-lab. Accessed 17 Oct 2018.
Pinch, T., & Bijker, W. (1984). The social construction of facts and artefacts: Or how the sociology of science and the sociology of technology might benefit each other. Social Studies of Science, 14(3), 399–441.
Population Council. (1986) An experimental study of the efficiency and effectiveness of an IUD insertion and back-up component (English summary of first six-month report, PCPES86). Lima, Peru: Population Council.
Porter, T. (1995). Trust in numbers: The pursuit of objectivity in science and public life. Princeton: Princeton University Press.
Ravallion, M. (2009). Evaluation in the practice of development. World Bank Research Observer, 24(1), 29–53.
Rayzberg, M. (2019). Fairness in the field: The ethics of resource allocation in randomized controlled field experiments. Science, Technology, & Human Values, 44(3), 371–398.
Reckhow, S. (2013). Follow the money—How foundations dollars change public schools politics. New York: Oxford University Press.
Riecken, H., & Boruch, R. (1975). Social experimentation: A method for planning and evaluating social intervention. New York: Academic Press.
Rodrik, D. (2006). Goodbye Washington consensus, hello Washington confusion? A review of the World Bank's Economic Growth in the 1990s: Learning from a Decade of Reform. Journal of Economic Literature, 44(4), 973–987.
Rodrik, D. (2008). The new development economics: We shall experiment, but how shall we learn? HKS Faculty Research Working Paper Series, RWP08-055.
Rotemberg, M. (2009). Why academic involvement in RCTs is important. IPA blog. Retrieved June 1, 2017, http://www.poverty-action.org/node/2156. Accessed 17 Oct 2018.
Sachs, J. (2005). The end of poverty: How we can make it happen in our lifetime. New York: Penguin USA.
Santos, A. C. (2011). Behavioural and experimental economics: Are they really transforming economics? Cambridge Journal of Economics, 35, 705–728.
Searle, B. (1985) Evaluation in World Bank education projects: Lessons from three case studies. World Bank Discussion Paper EDT5.
Sommer, J. (1977). Beyond charity: U.S. voluntary aid for a changing Third World. Washington: Overseas Development Council.
Stampnitzky, L. (2013). Disciplining terror: How experts invented “terrorism”. Cambridge: Cambridge University Press.
Swidler, A., & Watkins, S. (2017). A fraught embrace: The romance and reality of AIDS altruism in Africa. Princeton: Princeton University Press.
Teele, D. (2014). Field experiments and their critics—Essays on the uses and abuses of experimentation in the social sciences. New Haven: Yale University Press.
Thaler, R., & Sunstein, C. (2009). Nudge: Improving decisions about health, wealth and happiness. London: Penguin Books.
USAID. (2009). Trends in development evaluation theory, policies and practices. Washington: United States Agency for International Development Publications.
Wacquant, L. (1992). Toward a social praxeology: The structure and logic of Bourdieu's sociology. In P. Bourdieu & L. Wacquant, An invitation to reflexive sociology. Chicago: University of Chicago Press.
Watkins, S., Swidler, A., & Hannan, T. (2012). Outsourcing social transformation: Development NGOs as organizations. Annual Review of Sociology, 38, 285–315.
Acknowledgements
The authors want to thank XingJian Li for her meticulous work in this project, as well as Margarita Rayzberg, Diana Graizbord, Moran Levy, Joan Robinson, Josh Whitford, and Diane Vaughan for their feedback. Previous versions of this article were presented at the 2014 Society for Social Studies of Science Meeting, the 2015 Social Science History Association Meeting, the 2016 and 2018 American Sociological Association Conference, Columbia University’s SKAT workshop, Sciences Po, and the Federal University of Rio de Janeiro. We also thank the participants at these presentations for their helpful comments and suggestions.
Cite this article
de Souza Leão, L., Eyal, G. The rise of randomized controlled trials (RCTs) in international development in historical perspective. Theor Soc 48, 383–418 (2019). https://doi.org/10.1007/s11186-019-09352-6