Epistemic status: a random idea I had in the shower; I would love to hear whether other people have thought about this more seriously.
Key idea: Long-term forecasts are relevant to many longtermist interventions, but we lack evidence about how accurate such forecasts are. Could we help future researchers better understand how good or bad long-term forecasting is by creating a database of long-term predictions today that they could evaluate in, say, 500 years?
The starting point of my thoughts was this article, which is skeptical of longtermist interventions because they rest on very speculative forecasts. The argument is roughly this: we cannot really be confident in the forecasts that many longtermist interventions are based on, either explicitly or implicitly. It seems reasonable to assume that our predictions about the very long run are pretty bad. But it is not just that they are bad. Maybe even more importantly, we simply do not know how bad they are.
One way to presumably get around this uncertainty is to resort to conservative order-of-magnitude estimates: something along the lines of “even if my predictions are wrong by a factor of 1,000, cause xyz would still be worth doing”. The problem is that the predictions may not just be wrong by 3 orders of magnitude, but by 5. Or 10? Or 50? Really, who knows? We essentially have no empirical track record of how far off long-term forecasts actually are.
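To make this worry concrete, here is a minimal sketch in Python, with entirely made-up numbers, of how quickly an "even if I'm off by a factor of 1,000" argument can collapse once the true error factor is larger than the one assumed:

```python
# Toy sensitivity check with entirely made-up numbers: suppose an intervention is
# estimated to be 10,000x more valuable than some benchmark. At what forecast
# error factor does the "even if I'm wrong by a factor of X" argument break down?

claimed_advantage = 10_000  # hypothetical estimate: value relative to a benchmark

for error_factor in (1e3, 1e5, 1e10, 1e50):
    # Pessimistic case: the long-term forecast overstated the value by error_factor.
    adjusted = claimed_advantage / error_factor
    verdict = "still better than the benchmark" if adjusted > 1 else "no longer better"
    print(f"wrong by a factor of {error_factor:.0e}: {verdict}")
```

The numbers are arbitrary; the point is only that the robustness argument depends entirely on the size of the error factor, which is exactly what we have no data on.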
Yet clarity on this would be extremely helpful for evaluating how realistic longtermist interventions are. This made me think of just how nice it would be if some people 500 years ago had recorded a large number of predictions about the world of today. We could then compare those forecasts to the real world, and the comparison would give us a baseline estimate of how feasible it is to make reasonably accurate forecasts about long-term trends.
Unfortunately, to my knowledge no such project was undertaken 500 years ago, so there is nothing for us to evaluate today. But what about creating a database today in which skilled forecasters try to predict trends over the next 50-1,000 years? Ideally, this would put researchers in 500 or 1,000 years in a much better position to evaluate long-term forecasting accuracy, because they could work with real evidence.
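To make "database of long-term predictions" slightly more concrete, here is a hedged sketch (in Python, with hypothetical field names and an invented example question) of what a single record in such a database might contain. The one design point it tries to illustrate is that each forecast would need to carry its own self-contained resolution criteria, so that it remains interpretable even if today's measures and institutions are gone; how well that can actually be done is one of the open questions below.

```python
from dataclasses import dataclass


@dataclass
class LongTermForecast:
    """One hypothetical record in a long-term forecasting database."""
    question: str             # what is being forecast, in plain language
    resolution_criteria: str  # self-contained definition of how to judge the
                              # question, so it stays interpretable even if
                              # today's statistics (e.g. GDP) are no longer kept
    resolution_year: int      # when future researchers should evaluate it
    probability: float        # the forecaster's credence that it resolves true
    forecaster_id: str        # who made the forecast, for calibration analysis


example = LongTermForecast(
    question="Will more than half of all humans live outside Earth in 2525?",
    resolution_criteria=(
        "Count biological humans by primary residence; 'outside Earth' means "
        "any habitat not on Earth's surface or in its atmosphere."
    ),
    resolution_year=2525,
    probability=0.05,  # made-up number, purely illustrative
    forecaster_id="forecaster-0001",
)
```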
Projects like gjopen.com seem to have contributed a lot to our understanding of how forecasts on timelines of maybe 1-2 years perform. Has anyone thought about creating a similar project, just with timelines of, say, 50-1,000 years? Does it perhaps already exist?
I could imagine such a project being beneficial, but I also have quite a number of uncertainties about its usefulness, feasibility, and implementation:
- Let’s assume a best case: such a platform is actually built, it does not get destroyed, there are still researchers interested in long-term forecasting in 500 years, and they are able to access and interpret it. How useful would the data actually be? How would one even evaluate this in theory?
- What kind of forecasting questions would be most relevant to include?
- How do we make sure that the measures in which we state forecasts today remain interpretable in the far future? E.g., does it even make sense to forecast something like GDP, or might that concept be irrelevant, and thus no longer measured, in 500 years?
- How likely is it that such a database would actually survive long enough to be used for its intended purpose?
- Will human long-term forecasting still be of any interest in 500 years, or do we expect all forecasting to be done by AI by then anyway?
- ...
Ironically, some of these questions themselves also require some sort of forecast about the far future :p
Currently, I do not have any concrete plans to actually implement this, but I would love to hear whether people think it would be worth considering.
The Long Now Foundation started something in this direction: "Long Bets".