I'm planning to spend time on the afternoon (UK time) of Wednesday 2nd September answering questions here (though I may get to some sooner). Ask me anything!
A little about me:
- I work at the Future of Humanity Institute, where I run the Research Scholars Programme, which is a 2-year programme to give space for junior researchers (or possible researchers) to explore or get deep into something
- (Applications currently open! Last full day we're accepting them is 13th September)
- I've been thinking about EA/longtermist strategy for the better part of a decade
- A lot of my research has approached the question of how we can make good decisions under deep uncertainty; this ranges from the individual to the collective, and the theoretical to the pragmatic
- e.g. A bargaining-theoretic approach to moral uncertainty; Underprotection of unpredictable statistical lives compared to predictable ones; or Defence in depth against human extinction
- Recently I've been thinking around the themes of how we try to avoid catastrophic behaviour from humans (and how that might relate to efforts with AI); how informational updates propagate through systems; and the roles of things like 'aesthetics' and 'agency' in social systems
- I think my intellectual contributions have often involved clarifying or helping build more coherent versions of ideas/plans/questions
- I predict that I'll typically have more to say to relatively precise questions (where broad questions are more likely to get a view like "it depends")
Ahh, I think I was interpreting your general line of questioning as being:
A) Absent ability to get sufficient mentorship within EA circles, should people go outside to get mentorship?
... whereas this comment makes me think you were more asking:
B) Since research mentorship/management is such a bottleneck, should we get people trying to skill up a lot in that?
I think that some of the most important skills for research mentorship from an EA perspective include transferring intuitions about what is important to work on, and I expect this will be hard to learn properly outside an EA context (although there are probably some complementary skills one can effectively learn elsewhere).
I do think that if the questions were in the vein of B), I'm more wary in my agreement: I think research mentorship is a valuable skill to look for opportunities to practise, but a little hard to justify as >50% of what someone focuses on? So I'm closer to encouraging people doing research that seems valuable to look for opportunities to do this as well. I am positive on people practising mentorship generally, or e.g. reading a lot of different pieces of research and forming inside views on what makes some pieces seem more valuable. I think the demand for these skills will become slightly less acute but remain fairly high for at least a decade.