
The rise of randomized controlled trials (RCTs) in international development in historical perspective

Published in Theory and Society.

Abstract

This article brings a historical perspective to explain the recent dissemination of randomized controlled trials (RCTs) as the new “gold standard” method to assess international development projects. Although the buzz around RCT evaluations dates from the 2000s, we show that what we are witnessing now is a second wave of RCTs, while a first wave began in the 1960s and ended by the early 1980s. Drawing on content analysis of 123 RCTs, participant observation, and secondary sources, we compare the two waves in terms of the participants in the network of expertise required to carry out field experiments and the characteristics of the projects evaluated. The comparison demonstrates that researchers in the second wave were better positioned to navigate the political difficulties caused by randomization. We explain the differences in the expertise network and in the type of projects as the result of concurrent transformations in the fields of development aid and the economics profession. We draw on Andrew Abbott’s concept of “hinges,” as well as on Bourdieu’s concept of “homology” between fields, to argue that the similar positions and parallel struggles conducted by two groups of actors in the two fields served as the basis for a cross-field alliance, in which RCTs could function as a “hinge” linking together the two fields.

Fig. 1


Notes

  1. Since Abbott (1988, p. 343) himself suggested that his concept of “task area” is similar to Bourdieu’s “field,” and since the distinction between “ecologies” and “task areas” is immaterial for our purposes, we use “ecology” and “field” here interchangeably. We follow Mudge and Vauchez (2012) in incorporating Abbott’s concepts into field analysis and treating the two approaches as compatible and complementary.

  2. The term “philanthro-capitalists” was coined by Bishop and Green (2008) to describe new, large foundations, established by successful individuals from the worlds of finance and technology, who “apply business techniques and ways of thinking about their philanthropy” (ibid., p. 6). The often-cited example is the Bill and Melinda Gates Foundation (created in 2000), though the term captures a broader movement in the philanthropy sector (Reckhow 2013).

  3. We also conducted interviews and exchanged emails with key academic figures from the first wave, as well as with Trials Search Coordinators from the Campbell Collaboration and the Cochrane Library to look for possibly missing repositories and reviews from the period.

  4. Experimental designs that did not clearly explain how the control group was assigned, or that used average data from the population as control, were considered quasi-experimental.

  5. A list with all the studies in our two samples is available upon request.

  6. There is a large number of missing values (unknown disciplinary affiliation) in our first wave data (28% of authors), but we do not think this casts doubt on the finding. Given the decades that have passed, it is not surprising that it is more difficult to identify the disciplinary affiliation of first wave authors. The unidentified, however, are unlikely to be US-based economists because this is the category easiest to identify. We suspect that most were local researchers.

  7. Of 32 development papers in the top 5 economics journals in 2015, 10 were RCTs, up from zero in 2000 (Duflo 2016). No first wave studies, in contrast, appeared in high-visibility disciplinary journals. The main venue was Studies in Family Planning, an interdisciplinary journal published by The Population Council.

  8. Another necessary component is the policy beneficiaries themselves, but we do not possess data that would allow us to compare them.

  9. These replicas of a unit from one ecology (a profession, a discipline, a political party) within another are what Abbott (2005, pp. 265–269) calls “avatars.” For example, applied economics is an avatar of academic economics in the “policy and advice arena.” While an avatar is “an institutionalized hinge,” over time it can become independent and even compete with its original creators.

  10. The existing estimates are not yet reliable enough to determine the relative share of development aid coming from private foundations. The Development Assistance Committee (DAC) estimates that, since 2002, aid to developing countries from Private Voluntary Organizations has been three times as large as ODA. This number, however, includes not just direct development investments, but also private bank lending and remittances, so it is an overestimate. More accurate estimates are likely in the future: in 2010, the Gates Foundation became the first private aid donor to report to the DAC, encouraging other foundations to do the same (OECD 2011, p. 4).

  11. The problems faced by the Inter-American Development Bank (IADB), which works primarily with governments, underscore this point, demonstrating how hard it is for governments to randomize benefits. Despite explicitly requiring all loans to undergo impact evaluations, the IADB was able to conduct RCTs in only 26% of its loans and had to resort to quasi-experimental designs in the remainder. IADB (2017) representatives reported that RCTs were seen as “imposed on country governments, which are reluctant to appropriate RCTs by themselves.”

  12. Pritchett makes the same point, though perhaps more bluntly: “The only people for which the RCT movement is in fact a tool for the job are philanthropists…. From the charity perspective, there’s a nice confluence between the methodological demand for statistical power and of being able to tweak at the individual level. I can give this person food, but not that person. (…) I’m not trying to affect the government; I’m not trying to affect national development processes.” (in Ogden 2016, p. 142)

  13. For example, total donations to the four charities shortlisted by GiveWell increased from $3,000,000 in 2010 to $110,000,000 in 2015. Conversely, when RCT evidence is inconclusive, projects lose their recommended status and funding, as happened to Development Media International (GiveWell 2017).

References

  • Abbott, A. (1988). The system of professions: An essay on the division of expert labor. Chicago: University of Chicago Press.

  • Abbott, A. (2005). Linked ecologies: States and universities as environments for professions. Sociological Theory, 23(3), 245–274.

  • Adams, V. (2016). Metrics: What counts in global health. Durham: Duke University Press.

  • Angrist, J., & Pischke, J. (2010). The credibility revolution in empirical economics: How better research design is taking the con out of econometrics. Journal of Economic Perspectives, 24(2), 3–30.

  • Angrist, J., Azoulay, P., Ellison, G., Hill, R., & Feng Lu, S. (2017). Economic research evolves: Fields and styles. American Economic Review, 107(5), 293–297.

  • Babb, S. (2009). Behind the development banks: Washington politics, world poverty, and the wealth of nations. Chicago: The University of Chicago Press.

  • Babb, S., & Chorev, N. (2016). International organizations: Loose and tight coupling in the development regime. Studies in Comparative International Development, 51(1), 81–102.

  • Banerjee, A. (2007). Making aid work. Cambridge: MIT Press.

  • Banerjee, A., & Duflo, E. (2011). Poor economics: A radical rethinking of the way to fight global poverty. New York: PublicAffairs.

  • Banerjee, A., Karlan, D., & Zinman, J. (2015). Six randomized evaluations of microcredit: Introduction and further steps. American Economic Journal: Applied Economics, 7(1), 1–21.

  • Banerjee, A., Chassang, S., & Snowberg, E. (2016). Decision theoretic approaches to experiment design and external validity. NBER Working Paper 22167.

  • Barnes, B., Bloor, D., & Henry, J. (1996). Scientific knowledge: A sociological approach. Chicago: University of Chicago Press.

  • Bauman, K. (1997). The effectiveness of family planning programs evaluated with true experimental designs. American Journal of Public Health, 87(4), 666–669.

  • Benko, J. (2013). The hyper efficient, highly scientific scheme to help the world's poor. Wired Magazine. https://www.wired.com/2013/11/jpal-randomized-trials/. Accessed 17 Oct 2018.

  • Berk, R., Boruch, R., Chambers, D., Rossi, P., & Witte, A. (1985). Social policy experimentation: A position paper. Evaluation Review, 9(4), 387–429.

  • Berndt, C. (2015). Behavioural economics, experimentalism and the marketization of development. Economy and Society, 44(4), 567–591.

  • Berrios, R. (2000). Contracting for development: The role of for-profit contractors in U.S. foreign development assistance. Westport: Praeger.

  • Bishop, M., & Green, M. (2008). Philanthrocapitalism: How the rich can save the world. New York: Bloomsbury Press.

  • Borkum, E., He, F., & Linden, L. (2012). The effects of school libraries on language skills: Evidence from a randomized controlled trial in India. NBER Working Paper 18183. Cambridge: National Bureau of Economic Research.

  • Boruch, R., McSweeny, J., & Soderstrom, J. (1978). Randomized field experiments for program planning, development, and evaluation: An illustrative bibliography. Evaluation Quarterly, 2(4), 655–695.

  • Bourdieu, P. (1975). The specificity of the scientific field and the social conditions of the progress of reason. Social Science Information, 14, 19–47.

  • Bourdieu, P. (1977). Outline of a theory of practice. Cambridge: Cambridge University Press.

  • Carpenter, D. (2010). Reputation and power: Organizational image and pharmaceutical regulation at the FDA. Princeton: Princeton University Press.

  • Cohen, J., & Dupas, P. (2010). Free distribution or cost-sharing? Evidence from a randomized malaria prevention experiment. The Quarterly Journal of Economics, 125(1), 1–45.

  • Cooley, A., & Ron, J. (2002). The NGO scramble: Organizational insecurity and the political economy of transnational action. International Security, 27(1), 4–39.

  • Cuca, R., & Pierce, C. (1977). Experiments in family planning: Lessons from the developing world. Baltimore: The Johns Hopkins University Press.

  • Daston, L., & Galison, P. (1992). The image of objectivity. Representations, 40, 81–128.

  • Deaton, A. (2006). Evidence-based aid must not become the latest in a long string of development fads. In A. Banerjee (Ed.), Making aid work (pp. 60–61). Cambridge: MIT Press.

  • Deaton, A. (2010). Instruments, randomization, and learning about development. Journal of Economic Literature, 48(2), 424–455.

  • Deaton, A., & Cartwright, N. (2016). Understanding and misunderstanding randomized controlled trials. NBER Working Paper 22595.

  • Demortain, D. (2011). Scientists and the regulation of risk: Standardising control. Cheltenham: Edward Elgar.

  • Dennis, M., & Boruch, R. (1989). Randomized experiments for planning and testing projects in developing countries: Threshold conditions. Evaluation Review, 13(3), 292–309.

  • Donovan, K. (2018). The rise of the randomistas: On the experimental turn in international aid. Economy and Society, 47(1), 27–58.

  • Drexler, A., Fischer, G., & Schoar, A. (2014). Keeping it simple: Financial literacy and rules of thumb. American Economic Journal: Applied Economics, 6(2), 1–31.

  • Duflo, E. (2003). Poor, but rational? MIT Working Paper 747.

  • Duflo, E. (2006). Field experiments in development economics. Lecture delivered at the 2006 World Congress of the Econometric Society. https://economics.mit.edu/files/800. Accessed 17 Oct 2018.

  • Duflo, E. (2010). Social experiments to fight poverty. Lecture delivered at the TED conference, February 2010. https://www.ted.com/talks/esther_duflo_social_experiments_to_fight_poverty. Accessed 17 Oct 2018.

  • Duflo, E. (2011). The power of data in decision making. Lecture delivered at the Center for Effective Philanthropy, May 10–11, 2011. http://cep.org/programming/national-conferences/2011-conference/. Accessed 17 Oct 2018.

  • Duflo, E. (2016). Randomized controlled trials, development economics and policy making in developing countries. Lecture delivered at the World Bank conference “The state of economics, the state of the world,” June 2016. http://pubdocs.worldbank.org/en/394531465569503682/Esther-Duflo-PRESENTATION.pdf. Accessed 17 Oct 2018.

  • Duflo, E. (2017). The economist as plumber. American Economic Review: Papers and Proceedings, 107(5), 1–26.

  • Duflo, E., Glennerster, R., & Kremer, M. (2007). Using randomization in development economics research: A toolkit. Handbook of Development Economics, 4, 3895–3962.

  • Duflo, E., Kremer, M., & Robinson, J. (2011). Nudging farmers to use fertilizer: Theory and experimental evidence from Kenya. American Economic Review, 101(6), 2350–2390.

  • Easterly, W. (2007). The white man's burden: Why the West's efforts to aid the rest have done so much ill and so little good. New York: Penguin USA.

  • Edwards, M. (2017). The emperor's new clothes. In M. Moody & B. Breeze (Eds.), The philanthropy reader. New York: Routledge.

  • Eyal, G. (2000). Anti-politics and the spirit of capitalism: Dissidents, monetarists and the Czech transition to capitalism. Theory and Society, 29(1), 49–92.

  • Eyal, G. (2013). For a sociology of expertise: The social origins of the autism epidemic. American Journal of Sociology, 118(4), 863–907.

  • Freeman, H., Rossi, P., & Wright, S. (1980). Evaluating social projects in developing countries. Paris: OECD Development Centre.

  • Frumkin, P. (2003). Inside venture philanthropy. Society, 40(7), 7–15.

  • Gates, B. (2011). Small adjustments to aid programs can yield big results. The Gates Notes blog. https://www.gatesnotes.com/Books/Poor-Economics. Accessed 17 Oct 2018.

  • Gates, B. (2014). A cautionary tale from Africa. The Gates Notes blog. https://www.gatesnotes.com/Books/The-Idealist-A-Cautionary-Tale-From-Africa. Accessed 17 Oct 2018.

  • Gates Foundation. (2016). “Grants database” and “Annual letter 2013.” http://www.gatesfoundation.org. Accessed 12 Oct 2016.

  • GiveWell. (2017). “Top charities” and “DMI.” GiveWell website. https://www.givewell.org/charities/DMI. Accessed 17 Oct 2018.

  • Glennerster, R. (2015). So you want to do an RCT with a government: Things you should know. Running Randomized Evaluations blog. http://runningres.com/blog/2015/12/9/so-you-want-to-do-an-rct-with-a-government-things-you-should-know. Accessed 17 Oct 2018.

  • Glennerster, R., & Takavarasha, K. (2013). Running randomized evaluations: A practical guide. Princeton: Princeton University Press.

  • Guala, F. (2007). How to do things with experimental economics. In D. MacKenzie, F. Muniesa, & L. Siu (Eds.), Do economists make markets? Princeton: Princeton University Press.

  • Gueron, J. (2017). The politics and practice of social experiments: Seeds of a revolution. In A. Banerjee & E. Duflo (Eds.), Handbook of field experiments. Oxford: Elsevier.

  • Haydu, J. (1998). Making use of the past: Time periods as cases to compare and as sequences of problem solving. American Journal of Sociology, 104(2), 339–371.

  • Heckman, J., Hohmann, N., Smith, J., & Khoo, M. (2000). Substitution and dropout bias in social experiments: A study of an influential social experiment. The Quarterly Journal of Economics, 115(2), 651–694.

  • Heukelom, F. (2012). Sense of mission: The Alfred P. Sloan and Russell Sage Foundations' behavioral economics program, 1984–1992. Science in Context, 25(2), 263–286.

  • Heydemann, S., & Kinsey, R. (2010). The state and international philanthropy: The contribution of American foundations, 1919–1991. In H. Anheier & D. Hammack (Eds.), American foundations: Roles and contributions. Washington: Brookings Institution.

  • Hornick, R., Ingle, H., Mayo, J., Mcanany, E., & Schramm, W. (1973). Final report: Television and educational reform in El Salvador.

  • Humphreys, M. (2015). What has been learned from the deworming replications: A nonpartisan view. Unpublished manuscript. http://www.macartan.nyc/comments/worms2/. Accessed 17 Oct 2018.

  • IADB. (2017). Production, use, and influence of IADB's impact evaluations. Approach Paper Series. Washington: Inter-American Development Bank.

  • JPAL. (2016). The Abdul Latif Jameel Poverty Action Lab website. https://www.povertyactionlab.org/evaluations. Accessed 17 Oct 2018.

  • Karlan, D., & Appel, J. (2011). More than good intentions: How a new economics is helping to solve global poverty. New York: Dutton Press.

  • Karlan, D., McConnell, M., Mullainathan, S., & Zinman, J. (2014). Getting to the top of mind: How reminders increase saving. Management Science, 62(12), 3393–3411.

  • Kim, J. (2017). Rethinking development finance. Lecture delivered at the London School of Economics (LSE), April 11, 2017. http://www.lse.ac.uk/website-archive/newsAndMedia/videoAndAudio/channels/publicLecturesAndEvents/Home.aspx. Accessed 17 Oct 2018.

  • Krause, M. (2014). The good project: Humanitarian relief NGOs and the fragmentation of reason. Chicago: University of Chicago Press.

  • Krueger, A. (1995). Policy lessons from development experience since the Second World War. In J. Behrman & T. N. Srinivasan (Eds.), Handbook of development economics, Volume III. Oxford: Elsevier Science Books.

  • Krueger, A., Michalopoulos, C., & Ruttan, V. (1989). Aid and development. Baltimore: Johns Hopkins University Press.

  • Latour, B. (1987). Science in action: How to follow scientists and engineers through society. Cambridge: Harvard University Press.

  • Levitt, S., & List, J. (2008). Field experiments in economics: The past, the present and the future. NBER Working Paper 14356.

  • Marks, H. (1997). The progress of experiment: Science and therapeutic reform in the United States, 1900–1990. Cambridge: Cambridge University Press.

  • Medvetz, T. (2012). Think tanks in America. Chicago: The University of Chicago Press.

  • Miguel, E., & Kremer, M. (2004). Worms: Identifying impacts on education and health in the presence of treatment externalities. Econometrica, 72(1), 159–217.

  • Moyo, D. (2009). Dead aid: Why aid is not working and how there is a better way for Africa. New York: Farrar, Straus and Giroux.

  • Mudge, S., & Vauchez, A. (2012). Building Europe on a weak field: Law, economics, and scholarly avatars in transnational politics. American Journal of Sociology, 118(2), 449–492.

  • Mullainathan, S., & Thaler, R. (2000). Behavioral economics. NBER Working Paper 7948.

  • Murray, F. (2010). The oncomouse that roared: Hybrid exchange strategies as a source of distinction at the boundary of overlapping institutions. American Journal of Sociology, 116(2), 341–388.

  • OECD. (2011). Measuring aid: 50 years of DAC statistics, 1961–2011. OECD Publications. https://www.oecd.org/dac/stats/documentupload/MeasuringAid50yearsDACStats.pdf. Accessed 17 Oct 2018.

  • Ogden, T. (2016). Experimental conversations: Perspectives on randomized trials in development economics. Cambridge: MIT Press.

  • Panofsky, A. (2011). Generating sociability to drive science: Patient advocacy organizations and genetics research. Social Studies of Science, 41(1), 31–57.

  • Parker, I. (2010). The poverty lab. The New Yorker, May 17, 2010. http://www.newyorker.com/magazine/2010/05/17/the-poverty-lab. Accessed 17 Oct 2018.

  • Pinch, T., & Bijker, W. (1984). The social construction of facts and artefacts: Or how the sociology of science and the sociology of technology might benefit each other. Social Studies of Science, 14(3), 399–441.

  • Population Council. (1986). An experimental study of the efficiency and effectiveness of an IUD insertion and back-up component (English summary of first six-month report, PCPES86). Lima, Peru: Population Council.

  • Porter, T. (1995). Trust in numbers: The pursuit of objectivity in science and public life. Princeton: Princeton University Press.

  • Ravallion, M. (2009). Evaluation in the practice of development. World Bank Research Observer, 24(1), 29–53.

  • Rayzberg, M. (2019). Fairness in the field: The ethics of resource allocation in randomized controlled field experiments. Science, Technology, & Human Values, 44(3), 371–398.

  • Reckhow, S. (2013). Follow the money: How foundation dollars change public school politics. New York: Oxford University Press.

  • Riecken, H., & Boruch, R. (1975). Social experimentation: A method for planning and evaluating social intervention. New York: Academic Press.

  • Rodrik, D. (2006). Goodbye Washington consensus, hello Washington confusion? A review of the World Bank's Economic growth in the 1990s: Learning from a decade of reform. Journal of Economic Literature, 44(4), 973–987.

  • Rodrik, D. (2008). The new development economics: We shall experiment, but how shall we learn? HKS Faculty Research Working Paper RWP08-055.

  • Rotemberg, M. (2009). Why academic involvement in RCTs is important. IPA blog. http://www.poverty-action.org/node/2156. Accessed 17 Oct 2018.

  • Sachs, J. (2005). The end of poverty: How we can make it happen in our lifetime. New York: Penguin USA.

  • Santos, A. C. (2011). Behavioural and experimental economics: Are they really transforming economics? Cambridge Journal of Economics, 35, 705–728.

  • Searle, B. (1985). Evaluation in World Bank education projects: Lessons from three case studies. World Bank Discussion Paper EDT5.

  • Sommer, J. (1977). Beyond charity: U.S. voluntary aid for a changing Third World. Washington: Overseas Development Council.

  • Stampnitzky, L. (2013). Disciplining terror: How experts invented “terrorism”. Cambridge: Cambridge University Press.

  • Swidler, A., & Watkins, S. (2017). A fraught embrace: The romance and reality of AIDS altruism in Africa. Princeton: Princeton University Press.

  • Teele, D. (2014). Field experiments and their critics: Essays on the uses and abuses of experimentation in the social sciences. New Haven: Yale University Press.

  • Thaler, R., & Sunstein, C. (2009). Nudge: Improving decisions about health, wealth and happiness. London: Penguin Books.

  • USAID. (2009). Trends in development evaluation theory, policies and practices. Washington: United States Agency for International Development.

  • Wacquant, L. (1992). Toward a social praxeology: The structure and logic of Bourdieu's sociology. In P. Bourdieu & L. Wacquant, An invitation to reflexive sociology. Chicago: University of Chicago Press.

  • Watkins, S., Swidler, A., & Hannan, T. (2012). Outsourcing social transformation: Development NGOs as organizations. Annual Review of Sociology, 38, 285–315.


Acknowledgements

The authors want to thank XingJian Li for her meticulous work in this project, as well as Margarita Rayzberg, Diana Graizbord, Moran Levy, Joan Robinson, Josh Whitford, and Diane Vaughan for their feedback. Previous versions of this article were presented at the 2014 Society for Social Studies of Science Meeting, the 2015 Social Science History Association Meeting, the 2016 and 2018 American Sociological Association Conference, Columbia University’s SKAT workshop, Sciences Po, and the Federal University of Rio de Janeiro. We also thank the participants at these presentations for their helpful comments and suggestions.

Author information

Corresponding author

Correspondence to Luciana de Souza Leão.



Cite this article

de Souza Leão, L., Eyal, G. The rise of randomized controlled trials (RCTs) in international development in historical perspective. Theor Soc 48, 383–418 (2019). https://doi.org/10.1007/s11186-019-09352-6

