The Copenhagen Consensus is one of the few organizations outside the EA community that conduct cause prioritization research on a global scale.
Nearly everything on their "Post-2015 Consensus" list, which covers every cause they've looked at, fits into "global development"; they don't examine animal causes or global catastrophic risks aside from climate change (though they do discuss population ethics in the case of demographic interventions).
Still, given the depth of the research and the sheer number of experts who worked on this project, their list seems well worth reading. On the page I linked, you can find links to all of the different cause areas they examined; here's a PDF with just the cost-effectiveness estimates for every goal across all of their causes.
I didn't have time to examine a full report for any of the cause areas, but I wanted to open a thread by noting numbers and priorities that I found interesting or surprising:
- The most valuable types of intervention, according to CC:
  - Reduce restrictions on trade (10-20 times as valuable per dollar as anything else on the list)
  - Increase access to contraception (CC says "universal" access, but I don't see why we wouldn't get roughly the same value per dollar, if not more, by closing half the gap between current access levels and universal access)
  - Aspirin therapy for people at the onset of a heart attack
  - Increase immunization rates (their estimates of the value of this don't seem too far off from GiveWell's, if I compare them to GiveWell's numbers on malaria)
  - "Make beneficial ownership info public" (making it clear who actually owns companies, trusts, and foundations, which would make it harder to transfer money illegally between jurisdictions). Notably, CC makes a reasonable case for reducing hidden information all the way to zero, since "a partial solution to the transparency issue would simply allow alternative jurisdictions to continue to be used".
  - Allow more migration
  - Two interventions within food security: working to reduce child malnutrition (a common EA cause) and research into increasing crop yields (something EA has barely touched on, though The Life You Can Save does recommend One Acre Fund)
- Areas that CC found weaker than I'd expected:
  - Cut outdoor air pollution (about 3% as valuable as cutting indoor air pollution)
  - Data collection on how well UN Millennium Development Goals are being met (measurement is very expensive, and could cost more than actual development assistance)
  - Social protection system coverage (helping more people access government benefits); CC estimates that this is less than one-fifth as valuable as cash transfers
Reading the full position papers for some interventions could be a really valuable exercise for anyone who cares a lot about global development (particularly if you think EA may be neglecting certain opportunities in that space). If you spot anything interesting (and/or anything that seems wrong), leave a comment!
Thanks! My understanding of why CC is controversial: Lomborg was once a member of Greenpeace, then became disillusioned with popular environmentalism and wrote the extremely controversial The Skeptical Environmentalist, which argues against most popular environmental causes. The Economist and the Wall Street Journal celebrated it as a fresh new look, while Scientific American lambasted Lomborg as wrong and even scientifically dishonest. One Danish government commission accused Lomborg of fabricating data and plagiarism, while another criticized the first commission's investigations.
I've read the book and tried to form my own view, but the rabbit hole is too deep. If anyone's interested in the object-level question, try Slate Star Codex.
In any case, Lomborg is highly controversial in some circles and has had a great deal of reputation at stake in the environmentalism debate for over a decade. So on second thought, I largely agree with Jan: EA should be wary of close association with controversial (not to mention possibly unethical) figures.
Separately, I think this gets at a more central question about EA's nature: Will we always demand truth at all costs, or is good enough really good enough? Will we work with pragmatic allies who don't share all of our underlying motivations? EA evolved in very high-fidelity, academic-style circles where truth-seeking and intelligence are paramount. But if doing good is the sole objective, then while truth is clearly extremely important, so is influence. GiveWell claims to have moved ~$500m at this point; CC is working in an arena with tens of billions at stake. Should we accept lower intellectual rigor if it means we can increase our scale 100x?
I default to a commitment to truth, if only because lowering your standards is always possible at a later date, while regaining intellectual rigor likely is not. But it's certainly a question worth discussing.