It seems to me that many people have intuitions in the direction of "it's extremely hard to know with any confidence anything about the eventual consequences of our actions". Whatever is driving these intuitions lends some support to at least two problems for trying to do good in the world:
- (1) Maybe we just have so little idea of what the consequences will be that even in principle the project of choosing actions aiming at good eventual consequences is misguided.
- (2) The massive amounts of uncertainty around consequences mean that doing good is a very hard problem, and that a key part of pursuing it well is finding strategies which are somewhat robust to this uncertainty.
In some sense (2) is a weaker version of concern (1), and addressing it only looks attractive conditional on concern (1) not biting.
What should these be called? I think (1) is almost always called cluelessness, and (2) is sometimes called cluelessness, but it seems like it would be helpful to have distinct terms for them. Also, on my view (1) is a reasonable thing to worry about but the concern ultimately doesn't stand up, whereas (2) is perhaps the central problem for the effective altruist project, so I'm particularly interested in having a good name for (2).
There are trivial examples of things we fundamentally cannot predict, like when a given uranium atom will decay, but it seems likely that there are also macroscopic phenomena which are irreducibly uncertain over long timescales.
For instance, it's probably the case that long-term weather prediction is fundamentally impossible past some point. Current models simulate atmospheric dynamics on grids on the order of 10 kilometres, and have decent accuracy out to about two weeks. Suppose instead that we knew the positions, velocities, and temperatures of every particle in the atmosphere as of today to some very high (but finite) precision, alongside future fluctuations in solar energy input, the temperature of the earth, and so on. We could in theory simulate the atmosphere in full detail to know what things would be like in, say, a month. But we would lose precision over time, and because weather is a chaotic system, more than a couple of months into the future the loss of precision would be so severe that we would have essentially no information. And at some point, the precision needed to extend the prediction horizon runs into hard limits from quantum uncertainty, at which point we have fundamental reasons to think it's impossible to know more.
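To make the "loss of precision" point concrete, here is a minimal sketch (my own illustration, not anything from the weather-modelling literature) using the logistic map, a standard toy chaotic system, as a stand-in for atmospheric dynamics. The function names and parameter values are arbitrary choices for demonstration. Because errors in a chaotic system grow roughly exponentially, each extra order of magnitude of initial-condition precision buys only a fixed number of additional prediction steps, which is why a hard floor on precision (e.g. from quantum uncertainty) implies a hard ceiling on how far ahead we can predict:

```python
# Illustrative sketch: exponential error growth in a chaotic system,
# using the logistic map in place of real atmospheric dynamics.

def logistic_step(x, r=4.0):
    """One step of the logistic map x -> r * x * (1 - x); chaotic at r = 4."""
    return r * x * (1.0 - x)

def divergence_time(x0, initial_error, tolerance=0.1, max_steps=500):
    """Steps until two trajectories, initially `initial_error` apart,
    differ by more than `tolerance` (i.e. the forecast becomes useless)."""
    a, b = x0, x0 + initial_error
    for step in range(max_steps):
        if abs(a - b) >= tolerance:
            return step
        a, b = logistic_step(a), logistic_step(b)
    return max_steps

# Each factor-of-100 improvement in initial precision adds only a roughly
# constant number of usable steps, because the horizon scales like
# log(1 / initial_error).
for exponent in range(2, 14, 2):
    eps = 10.0 ** (-exponent)
    print(f"initial error 1e-{exponent:02d}: usable for ~{divergence_time(0.4, eps)} steps")
```

Running this shows the prediction horizon creeping up by only a handful of steps each time the initial error shrinks by a factor of a hundred; the same logarithmic scaling is what makes month-scale weather forecasts hopeless even with absurdly precise measurements.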