Bio

I currently lead EA funds.

Before that, I worked on improving epistemics in the EA community at CEA (as a contractor), as a research assistant at the Global Priorities Institute, on community building, and on global health policy.

Unless explicitly stated otherwise, opinions are my own, not my employer's.

You can give me positive and negative feedback here.

Comments
448

Topic contributions
6

We do fund a small amount of non-AI/bio work, so it seems bad to rule those areas out.

It could be worth bringing more attention to the breakdown of our public grants if the application distribution is very different from the funded one; I'll check internally next week to see if that's the case.

Answer by calebp5

We evaluate grants in other longtermist areas, but you're correct that it's rare for us to fund things that aren't AI or bio (and, more recently, biosecurity grants have been relatively rare). We occasionally fund work in forecasting, macrostrategy, and fieldbuilding.

It's possible that we'll support a broader array of causes in the future, but until we make an announcement, I think the status quo will persist: investigating a range of areas in longtermism and then funding the things that seem most promising to us (as represented by our public reporting).

> Roles unlocking funds should ideally be paid more up to the point where increasing earnings by $1 only increases funds by $1.


Do you think in real life that's a sensible expectation, or are you saying that's how you wish it worked?

I think I follow and agree with the "spirit" of the reasoning, but I don't think it's very cruxy. I don't have cached takes on what it implies for the people replying to the EA survey.

Some general confusions I have that make this exercise hard:
* not sure how predictive choice of org to work at is of choice of org to donate to: lots of people I know donate to the org they work at because they think it's the best, while some donate to orgs they think are less impactful (at least on utilitarian grounds) than the place they work (e.g. see CEA's giving season charity recs). You seem to think that the orgs people donate to are better than the orgs they work at, but I don't know if that's true
* a bit confused about the net effects of joining an org on its capital, e.g. lots of hires unlock more funding via fundraising capacity, credibility, etc.
* most people earning to give (at least people that I meet) aren't (imo) salary-maxing (i.e. earning way more than they would in direct work roles). If we were to restrict e2g to the top earners (e.g. startup founders, AI company employees, lawyers, hedgies, etc.), then I think it's much easier to consider the hypothetical. If you buy value drift claims, maybe donations from direct workers go up from being surrounded by EAs?
* replacement arguments are confusing, it actually matters what the person you would have otherwise hired goes on to do (and so on)
* it's not super clear to me that rough ex-ante impact distributions are as extremely skewed as ex-post ones are
* I don't know how to value the effects of collecting information being much easier in direct work than in e2g (hopefully, EA Funds and similar make this a little less important)

I don't really like my comment here; I feel like I'm pulling away from the actual question, but I don't think a myopic response is very helpful for discourse. The above considerations are actual cruxes for me in the real sense (I could imagine my overall take changing if I changed my mind on them).

> But if they're really sort of at all different, then you should really want quite different people to work on quite different things.

 

I agree, but I don't know why you think people should move from direct work (or skill building) to e2g. Is the argument that the best things require very specialised labour, so on priors, more people should e2g (or raise capital in other ways) than do direct work?

I don't understand why this is relevant to the question of whether there are enough people doing e2g. Clearly there are many useful direct-impact or skill-building jobs that aren't at EA orgs, e.g. working as a congressional staffer.

I wouldn't find it surprising at all if most EAs are a good fit for good non-e2g roles. In fact, earning a lot of money is quite hard; I expect most people won't be a very good fit for it.

I think we're talking past each other when we say "EA job". If you mean a job at an EA org, I'd agree there aren't enough roles for everyone, but most useful direct work/skill-building roles aren't at EA orgs, so it doesn't seem very relevant. If you mean a directly impactful job, or one useful for skill building, your claim seems wrong: there seem to be many jobs that will be better fits for people than e2g (imo).

> This is because I think that we are not able to evaluate what replacement candidate would fill the role if the employed EA had done e2g.


I don't know; I feel like you can get a decent sense of this from running hiring rounds with lots of work tests. I think many talented EAs are looking for EA jobs, but often it's a question of "fit" over just raw competence.

> My understanding is that many non-EA jobs provide useful knowledge and skills that are underrepresented in current EA organizations, albeit my impression is that this is improving as EA organizations professionalize

This seems plausible, though I personally think it's somewhat overstated on the forum. I agree that more EAs should be "skill maxing" over direct work or e2g, but I don't think we should use e2g as shorthand for optimising for developing valuable skills in the short term.

calebp
30% ➔ 50% disagree

The percentage of EAs earning to give is too low


(I wasn't going to comment, but right now I'm the only person who disagrees.)

Some reasons to think the current proportion of e2g'ers isn't too low:
* There aren't many salient examples of people doing direct work whom I'd want to switch to e2g.
* Doing direct work gives you a lot more exposure to great giving opportunities.
* Many people doing direct work I know wouldn't earn dramatically more if they switched to e2g.
* Most people doing e2g aren't doing super ambitious e2g (e.g. putting themselves in a position to donate >>$1M/year).
* E2g is often less well optimised for learning useful object-level knowledge and skills than direct work.

* Some EAs were early at AI companies and now have net worths of >>$100M; they will likely spend some of this on EA-aligned philanthropy.
* There are already billions of dollars in philanthropic capital for EA-aligned projects, and basically all funders I've spoken to feel that there aren't enough very exciting fundable projects. So, directionally, I'd be a bit surprised if more people should be following paths that are less optimised for directly working on exciting projects.



On the other hand, if someone has even a very small chance of donating as much as Dustin Moskovitz has, then it's very plausible they should do that; I certainly wouldn't discourage people from earning to give if they are succeeding at it.

This is really cool. I suspect you'd find users a lot more easily if neither side of the bet needed to be familiar with crypto. How hard would it be to accept Venmo/PayPal/bank transfers?

calebp
60% agree

Depopulation is Bad


Though I don't think it's as big a deal as x-risk or factory farming. The main crux is probably the effect on factory farming, as is the case with many interventions that influence economic growth.
