The philosopher Nick Beckstead has distinguished between two different ways of influencing the long-term future: broad interventions, which "focus on unforeseeable benefits from ripple effects", and narrow (or targeted) interventions, which "aim for more specific effects on the far future, or aim at a relatively narrow class of possible ripple effects."[1]
However, interventions with many causal steps may have few causal paths, and interventions with many causal paths may have few causal steps. It is therefore convenient to have separate terms for each of these dimensions of variation. Some effective altruists reserve the terms "narrow" and "broad" for interventions with few or many causal paths, and use the terms "direct" and "indirect" for interventions with few or many causal steps.[2]
A number of arguments in favor of either broad or narrow interventions have been offered.[3] A commonly given consideration in favor of broad interventions concerns their apparently superior historical track record. This point has been made independently by a number of authors at around the same time.[4] Beckstead himself writes:[5]
Similarly, Brian Tomasik writes:[6]
And Gwern Branwen writes:[7]
In response to these claims, Toby Ord argues that comparisons with previous centuries may be misleading, because the bulk of the existential risk to which humanity is currently exposed is anthropogenic in nature, and originates in technologies developed only since around the mid-20th century. Narrow interventions aimed specifically at mitigating the risks posed by such technologies should thus be expected to accomplish much more than similar efforts in previous centuries. Ord also points out that broad interventions receive tens of thousands of times more funding than do narrow interventions, so even people who reasonably disagree about the relative merits of broad and narrow interventions should favor the latter, given their much higher neglectedness.[8]
1. Beckstead, Nick (2013) On the Overwhelming Importance of Shaping the Far Future, Doctoral thesis, Rutgers University.
2. Cotton-Barratt, Owen (2015) Comment on “What is a ‘broad intervention’ and what is a ‘narrow intervention’? Are we confusing ourselves?”, Effective Altruism Forum, December 19.
3. For example, Nick Beckstead (2013) How to compare broad and targeted attempts to shape the far future, July 13.
4. The philosopher J. J. C. Smart made this point decades earlier: "Could Jeremy Bentham or Karl Marx (to take two very different political theorists) have foreseen the atom bomb? Could they have foreseen automation? Can we foresee the technology of the next century?" (Smart, J. J. C. (1973) An outline of a system of utilitarian ethics, in J. J. C. Smart & Bernard Williams (eds.) Utilitarianism: For and Against, Cambridge: Cambridge University Press, pp. 1–74, p. 64)
5. Beckstead, Nick (2013) On the Overwhelming Importance of Shaping the Far Future, p. 145.
6. Tomasik, Brian (2013) Charity cost-effectiveness in an uncertain world, Center on Long-Term Risk, October 28.
7. Branwen, Gwern (2014) Optimal existential risk reduction investment, Gwern.net, July 17.
8. Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing, ch. 6.