Metaculus is an online platform where users make and comment on forecasts. It has recently been particularly notable for its forecasting of various aspects of the pandemic, on a dedicated subdomain. As well as displaying summary statistics of the community prediction, Metaculus uses a custom algorithm to produce an aggregated "Metaculus prediction". More information on forecasting can be found in this interview with Philip Tetlock on the 80,000 Hours podcast.
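(To make "aggregated prediction" concrete: the sketch below is a toy illustration only, not Metaculus's actual proprietary algorithm. One of the simplest ways to combine many users' probability forecasts on a binary question is to take the median, which is robust to a few extreme predictions.)

```python
# Toy illustration of forecast aggregation (NOT Metaculus's actual algorithm):
# combine individual probability forecasts by taking their median,
# which is robust to outlier predictions.
def aggregate_forecasts(probabilities):
    """Return the median of a list of probability forecasts (values in [0, 1])."""
    ordered = sorted(probabilities)
    n = len(ordered)
    mid = n // 2
    if n % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

# Example: five users' probabilities for a binary question
print(aggregate_forecasts([0.2, 0.35, 0.4, 0.6, 0.9]))  # → 0.4
```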
Questions on Metaculus are submitted by users, and a thread exists on the platform where people can suggest questions they'd like to see but do not have the time/skill/inclination to construct themselves. Question construction is non-trivial, not least because for forecasting to work, clear criteria need to be set for what counts as positive resolution. A useful intuition pump here is "if two people made a bet on the outcome of this question, would everyone agree who had won?"
Although there is already a significant overlap between the EA community and the Metaculus userbase, I think it is likely that there exist many forecasting questions which would be very useful from an EA perspective, but that have not yet been written. As such, I've written this question as both a request and an offer.
The request:
Have a think about whether there are any forecasts that could have a large impact on decision-making within the EA community.
The offer:
If you do think of one, post it below and I'll write it up for you and submit it to the site. The closer it is to "fully formed", the more quickly this is likely to happen, but please don't feel the need to spend ages choosing resolution criteria; I'm happy to help with this. I intend to choose questions based on some combination of the number of upvotes a suggestion has and how easy the question is to operationalise.
Examples of my question-writing on Metaculus are here, and I also recently became a moderator on the platform.
Some examples of EA-adjacent questions already on the platform:
How many reviews will Toby Ord's book The Precipice have on Amazon on January 1st 2021?
If you're interested in having someone make a forecast about a question that's more personal to you, and/or something that you wouldn't expect the Metaculus community as a whole to have the right combination of interest in and knowledge of, I'd recommend checking out this offer from amandango.
To lay out my tentative position a bit more:
I think forecasts about what some actor (a person, organisation, community, etc.) will overall believe in future about X can add value compared to just having a large set of forecasts about specific events that are relevant to X. This is because the former type of forecast can also account for:
On the other hand, forecasts about what some actor will believe in future about X seem more at risk of causing undesirable feedback loops and distorted beliefs than forecasts about specific events relevant to X do.
I think forecasting the donation split of the EA Funds[1] would be interesting, and could be useful. This seems to be a forecast of a specific event that's unusually well correlated with an actor's overall beliefs. I think that means it would have more of both the benefits and the risks mentioned above than the typical forecast of a specific event would, but less than a forecast that's directly about an actor's overall belief would.
This also makes me think that another thing potentially worth considering is predicting the beliefs of an actor which:
Some spitballed examples, to illustrate the basic idea: Paul Christiano, Toby Ord, a survey of CEA staff, a survey of Open Phil staff.
This would still pose a risk of the EA community updating too strongly on erroneous forecasts of what this actor will believe. But it seems to at least reduce the risk of self-fulfilling prophecies and feedback loops, which somewhat blunts that concern.
I'm pretty sure this sort of thing has been done before (e.g., sort-of, here). But this is a rationale for doing it that I hadn't thought of before.
But this is just a list of considerations and options. I don't know how to actually weigh it all up to work out what's best.
[1] I assume you mean EA Funds rather than the EA Forum or the Effective Altruism Foundation - lots of EAFs floating about!
[2] I only give this criterion because of the particular context and goals at hand; there are of course many actors outside the EA community whose beliefs we should attend to.