Hi all - we’re the management team for the Long-Term Future Fund. This post is where we're hosting the AMA for you to ask us about our grant making, as Marek announced yesterday.
We recently made this set of grants (our first since starting to manage the fund), and are planning another set in February 2019. We are keen to hear from donors and potential donors about what kind of grant making you are excited about us doing, what concerns you may have, and anything in between.
Please feel free to start posting your questions now. We will be available here and actively answering questions between roughly 2pm and 6pm PT (with some breaks) on December 20th.
Please ask different questions in separate comments, for discussion threading.
edit: Exciting news! The EA Foundation has just told us that donations to the Long-Term Future Fund are eligible for the matching drive they're currently running. See the link for details on how to get your donation matched.
edit 2: The "official" portion of the AMA has now concluded, but feel free to post more questions; we may be able to respond to them over the coming week or two. Thanks for participating!
We generally won’t have access to work that isn’t shared with the public, though individual fund members may incidentally learn of such work through private conversations with researchers. Thus far, we’ve evaluated organizations based on the quality of their past research and the quality of their team.
We may also evaluate private research by assessing the quality of its general direction and of the team carrying it out. For example, I think the discourse around AI safety could use a lot of deconfusion. I also recognize that such deconfusion work could pose an infohazard, but I nevertheless want it to be carried out, and I think MIRI is one of the most competent organizations around to do it.
If our decision about whether to fund an organization hinges on the content of its private research, we’ll probably reach out and ask whether they’re willing to disclose it.