Hi everyone! We, Ought, have been working on Elicit, a tool for expressing beliefs as probability distributions. This is an extension of our previous work on delegating reasoning. We’re experimenting with breaking the forecasting reasoning process down into smaller steps and building tools that support and automate those steps.
In this specific post, we’re exploring the dynamics of Q&A with distributions by offering to make a forecast for a question you want answered. Our goal is to learn:
- Whether people would appreciate delegating predictions to a third party, and what types of predictions they want to delegate
- Whether a distribution can convey information more efficiently (or convey different kinds of information) than text-based interaction
- Whether conversing in distributions isolates disagreements or assumptions that may be obscured in text
- How to translate the questions people care about or think about naturally into more precise distributions (and what gets lost in that translation)
We also think that making forecasts is quite fun. In that spirit, you can ask us (mainly Amanda Ngo and Eli Lifland) to forecast any continuous question that you want answered. Just make a comment on this post with a question, and we’ll make a distribution to answer it.
Some examples of questions you could ask:
- When will I be able to trust a virtual personal assistant to make important decisions for me?
- I live in the US. How much happier will I be if I move to Germany?
- How many EA organizations will be founded in 2021?
- I live in New York. When will I be able to go to the gym again?
- In 2021, what percentage of my working hours will I spend on things that I would consider to be forecasting or forecasting-adjacent?
We’ll spend at most an hour on each question, so you should expect about that much rigor and information density. If there’s context on you or the question that we won’t be able to find online, include it in your comment to help us out.
We’ll answer as many questions as we can from now until Monday 8/3. We expect to spend about 10-15 hours on this, so we may not get to all the questions. We’ll post our distributions in the comments below. If you disagree or think we missed something, you can respond with your own distribution for the question.
We’d love to hear people’s thoughts and feedback on outsourcing forecasts, providing beliefs as probability distributions, or Elicit generally as a tool. If you’re interested in more of what we’re working on, you can also check out the competition we’re currently running on LessWrong to amplify Rohin Shah’s forecast on when the majority of AGI researchers will agree with safety concerns.
Here’s my Q1 2021 prediction, with more detailed notes in a spreadsheet here.

I started by estimating the size of the market, to get reference points. Based on very rough estimates of CEA subscriptions, the number of people Effective Altruism Coaching has worked with, and the number of people who have gone through a CFAR workshop, I estimated the number of EAs interested enough in productivity to pay for a service at ~8,000. The low number of people who have done Effective Altruism Coaching (I estimated 100, but this is an important assumption that could be wrong, since I don’t think Lynette has published this number anywhere) suggests a range for your course (which is more expensive) of ~10 to 45 people in Q1. Some other estimates, detailed in the spreadsheet linked above, gave me a revenue range of $8,000 to $42,000.

I didn’t have enough time to properly look into 2021 as a whole, so I just applied a flat 10% growth rate across all the numbers and got this prediction. Interestingly, I notice a pressure to err on the side of optimism when publicly evaluating people’s companies/initiatives.
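The growth adjustment above is simple enough to sketch. The input figures are the ones quoted in the comment; treating "a flat 10% growth rate across all the numbers" as a single multiplier applied to the Q1 figures is my reading of the method, not necessarily the exact spreadsheet calculation.

```python
# Back-of-the-envelope sketch of the estimate described above.
# Figures are taken from the comment; the growth step is an assumed
# interpretation of "flat 10% growth across all the numbers".

q1_customers_low, q1_customers_high = 10, 45        # ~10 to 45 people in Q1
q1_revenue_low, q1_revenue_high = 8_000, 42_000     # Q1 revenue range, USD
growth = 1.10                                       # flat 10% growth

revenue_2021_low = q1_revenue_low * growth
revenue_2021_high = q1_revenue_high * growth

print(f"Adjusted range: ${revenue_2021_low:,.0f} to ${revenue_2021_high:,.0f}")
```

Applying the multiplier stretches the $8,000–$42,000 range to roughly $8,800–$46,200; the real spreadsheet presumably does something more granular per input.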
Your detailed notes were very helpful in this. I noticed that I wanted more information on:
Do these estimates align with what you're currently thinking? Are there any key assumptions I made that you disagree with? (here are blank distributions for Q1 and 2021 if you want to share what you're currently projecting).