Metaculus is an online platform where users make and comment on forecasts. It has recently been particularly notable for its forecasting of various aspects of the pandemic on a dedicated subdomain. As well as displaying summary statistics of the community prediction, Metaculus uses a custom algorithm to produce an aggregated "Metaculus prediction". More information on forecasting can be found in this interview with Philip Tetlock on the 80,000 Hours podcast.
Questions on Metaculus are submitted by users, and a thread exists on the platform where people can suggest questions they'd like to see but do not have the time/skill/inclination to construct themselves. Question construction is non-trivial, not least because for forecasting to work, clear criteria need to be set for what counts as positive resolution. A useful intuition pump here is "if two people made a bet on the outcome of this question, would everyone agree who had won?"
Although there is already a significant overlap between the EA community and the Metaculus userbase, I think it is likely that there exist many forecasting questions which would be very useful from an EA perspective, but that have not yet been written. As such, I've written this question as both a request and an offer.
The request:
Have a think about whether there are any forecasts that you think could have a large impact on decision-making within the EA community.
The offer:
If you do think of one, post it below and I'll write it up for you and submit it to the site. The closer it is to "fully formed", the more quickly this is likely to happen, but please don't feel the need to spend ages choosing resolution criteria; I'm happy to help with this. I intend to choose questions based on some combination of the number of upvotes a suggestion has and how easy the question is to operationalise.
Examples of my question-writing on Metaculus are here, and I also recently became a moderator on the platform.
Some examples of EA-adjacent questions already on the platform:
How many reviews will Toby Ord's book The Precipice have on Amazon on January 1st 2021?
If you're interested in having someone make a forecast about a question that's more personal to you, and/or something that you wouldn't expect the Metaculus community as a whole to have the right combination of interest in and knowledge of, I'd recommend checking out this offer from amandango.
I'd also be interested in forecasts on these topics.
It seems to me that there'd be a risk of self-fulfilling prophecies.
That is, we'd hope that these forecasts would help the community update more quickly towards well-founded beliefs...
...But what might instead happen is that the community moves towards the forecasted beliefs partly just because they were forecasted.
(Perhaps this is like a time-travelling information cascade?)
I'm not saying the latter scenario is more likely than the former, nor that this means we shouldn't solicit these forecasts. But the latter scenario seems likely enough to perhaps be an argument against soliciting these forecasts, and to at least be worth warning readers about clearly and repeatedly if these forecasts are indeed solicited.
Also, this might be especially bad if EAs start noticing that community beliefs are indeed moving towards the forecasted future beliefs, and don't account sufficiently well for the possibility that this is just a self-fulfilling prophecy, and thus increase the weight they assign to these forecasts. (There could perhaps be a feedback loop.)
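To make the feedback-loop worry concrete, here's a toy sketch (my own illustration, with entirely made-up numbers and update rules, not anything measured from Metaculus or the EA community): if the community shifts its belief some fraction of the way towards the forecast each period, the forecast looks increasingly accurate, which can in turn increase the weight the community puts on such forecasts.

```python
# Toy illustration of a self-confirming forecast (all parameters are invented).
def simulate(belief=0.3, forecast=0.7, trust=0.2, rounds=5):
    """Each round the community moves its belief partway towards the forecast;
    the forecast then looks more accurate, so trust in forecasts grows."""
    for t in range(1, rounds + 1):
        belief += trust * (forecast - belief)      # update towards the forecast
        accuracy = 1 - abs(forecast - belief)      # forecast now appears more accurate
        trust = min(1.0, trust + 0.1 * accuracy)   # ...which raises trust in future forecasts
        print(f"round {t}: belief={belief:.2f}, trust={trust:.2f}")

simulate()
```

The point is just the qualitative shape: belief converges on the forecast and trust rises, even though nothing about the underlying question has changed.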
I imagine there's always some possibility that forecasts will influence reality in a way that makes the forecasts more or less likely to come true than they would've been otherwise. But this seems more-than-usually-likely when forecasting EA community beliefs (compared to e.g. forecasting geopolitical events).