I'm a moderator for the EA Forum, and an AI Systems Specialist for 80,000 Hours.
I ran the EA Forum team for 2 years, and was the original developer working on the EA Forum 2.0 for 5 years before that.
Prior to CEA, I was a data engineer at an aerospace startup. I got into EA through reading the entire archive of Slate Star Codex in 2015. I found EA naturally compelling, and donated to AMF, then GFI, before settling on my current cause prioritization of meta-EA, with AI x-risk as my object-level preference. I try to have a wholehearted approach to morality, rather than thinking of it as an obligation or opportunity. You can see my LessWrong profile here.
In my personal life, I hang out in the Boston EA and Gaymer communities, enjoy houseplants, table tennis, and playing co-op games with my partner, who has more karma than me.
That seems like the reverse, right? Given candidates, find me the best ones? So you'd want different prompts, though potentially some of the same logic carries over. In any case, given the primitive nature of my prompts and the complexity of the approach, I would probably advise someone to start from scratch. I'm generally a fan of open source though, and I could imagine releasing it.
Quick ask: I’m working on an AI tool that takes your resume/LinkedIn and shows you the most relevant opportunities from the 80k Job Board, then lets you discuss them with an AI.
I'd like to talk to 5-10 people for quick user interviews (30 mins). You might be interested if you:
I agree they're generally useful. I claim[1] they're especially useful in EA. But that's not enough to make this interesting.
There are many generally useful traits whose correlation with general mental ability is stronger than their correlation with each other. So being a good writer is good, and being good at solving math problems is also good. And they're positively correlated. But most things are, just through general intelligence.
What becomes interesting is identifying a cluster of traits that are more correlated with each other than with intelligence, and also predictive of success. These are rare, and usually wrong.[2] If such a cluster can be identified, then knowing about it can help you identify talent that will do well.
If such a cluster is also trainable, well then, you've got a real prize.
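The distinction above is the statistical idea of a partial correlation: two traits form an interesting cluster only if they stay correlated after you control for general ability. Here's a minimal simulated sketch (all numbers and the "writing"/"math" framing are hypothetical, chosen just to illustrate the structure):

```python
import random
import statistics

random.seed(0)

# Hypothetical data: general ability g drives both traits, but a shared
# "cluster" factor also links them above and beyond g.
n = 10_000
g = [random.gauss(0, 1) for _ in range(n)]
cluster = [random.gauss(0, 1) for _ in range(n)]
writing = [0.6 * g[i] + 0.5 * cluster[i] + random.gauss(0, 0.6) for i in range(n)]
math_skill = [0.6 * g[i] + 0.5 * cluster[i] + random.gauss(0, 0.6) for i in range(n)]

def corr(x, y):
    """Pearson correlation of two equal-length lists."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (statistics.pstdev(x) * statistics.pstdev(y))

def partial_corr(x, y, z):
    """Correlation between x and y after controlling for z."""
    rxy, rxz, ryz = corr(x, y), corr(x, z), corr(y, z)
    return (rxy - rxz * ryz) / ((1 - rxz**2) ** 0.5 * (1 - ryz**2) ** 0.5)

print(corr(writing, math_skill))             # raw correlation
print(partial_corr(writing, math_skill, g))  # what's left after removing g
```

If the partial correlation stays well above zero, as it does here by construction, you've found a cluster that isn't just general intelligence in disguise; if it drops to roughly zero, the traits only looked related because smart people tend to have both.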
Fair, I was probably too loose there. I believe specifically that posts which were copied from Google Docs[1] failed to wrap at the screen width. But I wasn't much of a mobile reader at the time.
Also thank you! I didn't realize you were the one who added mobile support.
IIRC, maybe it was some other cause that affected a subset of posts.
Nice, I like this. Have you considered crossposting the full content? Usually those get a lot more people reading them, and more visibility, though do note the CC-BY restriction.
Nice, I hope you train this bundle well! Was it linked somewhere in the In-Depth Program?
Here are 4 hypotheses for what could be going on:
1 & 2 are kinda interesting, but you don't need a post on the EA Forum to tell you about intelligence and something like "competence for intellectual work."
3 starts to get interesting, because then you can take someone's ability both to speak fluently about fish welfare and to use probabilities as a sign that they will be able to understand and improve your organization's strategy, and even to be unusually cooperative.[1]
If that last clause sounds dangerous to you, I don't disagree. I think over-reliance on the cooperativeness of other people sharing these traits has caused more than one problem. I nevertheless think this is one of the things that makes EA extremely powerful as an idea.
4 is where you might start getting really hyped about projects like an In-Depth Fellowship or an EA Forum. 🙂
That's what correlation means. Learning about correlations means you can update your Bayesian prior about one fact when you learn about a correlated fact.
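To make that update concrete, here's a minimal sketch assuming the two facts are jointly normal (the trait names and numbers are hypothetical, picked only to show the mechanics):

```python
def conditional_normal(mu_x, sigma_x, mu_y, sigma_y, rho, y_obs):
    """Posterior (mean, sd) of X after observing Y = y_obs,
    assuming X and Y are bivariate normal with correlation rho."""
    mean = mu_x + rho * (sigma_x / sigma_y) * (y_obs - mu_y)
    sd = sigma_x * (1 - rho**2) ** 0.5
    return mean, sd

# Prior on trait X is standard normal; we observe a correlated trait Y
# (rho = 0.5) two standard deviations above its mean.
mean, sd = conditional_normal(0, 1, 0, 1, rho=0.5, y_obs=2.0)
print(mean, sd)  # posterior mean shifts up by rho * 2 SDs; uncertainty shrinks
```

The size of the update scales directly with the correlation: at rho = 0 observing Y tells you nothing about X, and the posterior equals the prior.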