A couple of years ago I was wondering why all the focus is on Superforecasters when really we should be emphasizing the best arguments or the best persuaders. Knowing who is best at forecasting is less useful to me than knowing what (or who) would persuade me to change my mind (since I only care about forecasts insofar as they change my mind, anyway).
The incentive system for this seems simple enough. Imagine that instead of an upvote button, each comment has an "update your forecast" button. Comments that are persuasive get boosted by the algorithm. Authors who create convincing arguments gain prestige. Authors whose convincing arguments, on balance, lead to people making better forecasts gain even more prestige.
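One way to make that last step concrete: once a question resolves, credit a comment by how much readers' forecasts improved after clicking "update your forecast" on it. Here is a minimal sketch of that scoring rule using the Brier score; all names are illustrative, and nothing here is an existing platform's API.

```python
def brier(prob, outcome):
    """Brier score of a probability forecast: lower is better."""
    return (prob - outcome) ** 2

def comment_score(updates, outcome):
    """Hypothetical credit for a comment, once the question resolves.

    updates: list of (prob_before, prob_after) pairs, one per reader
             who updated their forecast after reading the comment
    outcome: 1.0 if the event happened, 0.0 if it did not

    Positive score means the comment, on balance, moved readers
    toward the truth; negative means it was persuasive but wrong.
    """
    return sum(brier(before, outcome) - brier(after, outcome)
               for before, after in updates)

# Two readers moved toward the truth, one slightly away from it.
score = comment_score([(0.5, 0.8), (0.3, 0.7), (0.6, 0.55)], outcome=1.0)
# score > 0: on balance this comment improved forecasts
```

A design note: scoring persuasion only against eventual resolution is what separates "convincing" from "convincing and correct," which is the distinction the prestige system above turns on.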
It could even be a widget that you embed at the beginning and end of off-site articles. That way we could find the "super-bloggers" or "super-journalists" or whatever you want to call them.
Heck, you could even create another incentive system for the people who are best at finding arguments worth updating on.
The point is, you need to incentivize more than good forecasts. You need an entire knowledge generation economy.
There are probably all kinds of ways this gets gamed. But it seems at least worth exploring. Forecasts by themselves are just not that useful. Explanations, not probabilities, are what expert decision-makers rely on. At least that is the case within my field of Naturalistic Decision Making, and it also seems true in Managerial Decision Making: managers don't seem to use probabilities to do Expected Utility calculations, but rather to try to understand the situation and its uncertainties.
This is the conclusion Dominic Cummings came to during the pandemic as well. Summarized here:
> During the pandemic, Dominic Cummings said some of the most useful stuff that he received and circulated in the British government was not forecasting, it was qualitative information explaining the general model of what’s going on, which enabled decision-makers to think more clearly about their options for action and the likely consequences. If you’re worried about a new disease outbreak, you don’t just want a percentage probability estimate about future case numbers, you want an explanation of how the virus is likely to spread, what you can do about it, how you can prevent it. Not the best estimate for how many COVID cases there will be in a month, but why forecasters believe there will be X COVID cases in a month.

https://www.samstack.io/p/five-questions-for-michael-story