This is a special post for quick takes by Vaidehi Agarwalla. Only they can create top-level comments. Comments here also appear on the Quick Takes page and All Posts page.
Some comments are truncated due to high volume.

I found this post by Rob Bensinger of anonymous comments on EA from 2017, with the question prompt: 

If you could magically change the effective altruism community tomorrow, what things would you change? [...] If possible, please mark your level of involvement/familiarity with EA[.]

Many still resonate today. I recommend reading the whole list, but there are a lot - so I've chosen a few highlights and comment exchanges I thought were particularly interesting. I've shortened a few for brevity (indicated by ellipses).

I don't agree with many of these comments, but it's interesting to see how people perceived things back then. 


On supporting community members

Related: Should the EA community be cause-first or member-first?  

#28 - on people dismissing those who start as "ineffective altruists" (top voted comment with 23 karma)

I have really positive feelings towards the effective altruism community on the whole. I think EA is one of the most important ideas out there right now.

However, I think that there is a lot of hostility in the movement towards those of us who started off as 'ineffective altruists,' as opposed to coming from the more typical Silicon

…
This meme about ‘being the ones who show up’ is not something I’d heard before, but it explains a lot.

Suggestion for EA Forum posts: First Draft

Create a new type of post - a "First Draft" post, with its own section "WIP". Basically like the current collaborative draft-mode editing, but public. 

This could be an expansion / continuation of the "amnesty day" posts, but more ongoing and more focused on changing the culture of posting. 

  • Looks like a google doc with easy commenting on specific sections, maybe more voting options that have to do with feedback (e.g. needs more structure etc.)
  • You can give suggestions on what people can post e.g. "Idea bunny" "post outline" "unpolished draft" "polished draft" and give people options on the kinds of feedback they could seek e.g. "copyediting / grammar" or "tone" or "structure" or "factchecking" etc.  
  • Maybe Karma-free, or separate karma score so people don't worry about how it'll be taken
  • Maybe people who give comments and feedback can get some kind of "helper karma" and be automatically tagged when the post is published and get credit of some kind for contributing to common knowledge
  • Potentially have it be gated in some way or have people opt-in to see it (e.g. so more engaged people opt-in, so it becomes like the Facebook peer-e
…
JP Addison
Hi Vaidehi! You’ve written us quite the feature spec there. I’m not opposed to ambitious suggestions (at all! for real! though it is true that they’re less likely to happen), but I would find this one more compelling if it were written in the style of a user problem. I am unembarrassed to ask you for this extra product work because I know you’re a product manager. (That said, I’d understand if you didn’t want to spend any time on it without a stronger signal from us of how likely we are to act on it.)
Vaidehi Agarwalla
Broader statement / use case I could imagine (all claims you could disagree with):

  • Many EAs primarily experience EA online (both initially and as they progress on their EA journeys).
  • There are limited opportunities for people to practice EA principles online.
  • The forum is visited by many people.
  • The forum should be a place where people can actively practice EA principles.
    • Specifically, it could be a place where collaborative truthseeking happens, but it isn't really a place for that today. Instead, it's more often a place to share the results of collaborative truthseeking.
  • Truthseeking involves:
    • Being wrong
    • Saying dumb / naïve things
    • Making mistakes
    • Appearing less intelligent than you are
    • Asking questions of people
    • Challenging people (of higher status / position than you)
    • Saying you were wrong publicly
    • Not getting defensive and being open to criticism
  • The forum doesn't feel like a place where people can do those things today without some degree of reputational / career harm (or unless they invest a lot of time in legibly explaining themselves / demonstrating they've updated).
  • There are poor incentives for people to help each other collaboratively truth-seek on the Forum today. The forum can sometimes feel competitive or critical, rather than collaborative and supportive.
    • I've commented previously, during the era of Forum competition posts, that it would be nice to recognize people helping each other.
    • Edo makes the nice comment that the strategy session is one of the few forum posting events that's not explicitly competitive.
    • Nathan proposes Community posts: The Forum needs a way to work in public, which is somewhat similar in terms of changing incentives towards collaborative truthseeking.
  • This is a practice users already engage in (in product terms: are you replacing an existing practice or introducing a new one? It's easier to replace an existing practice because the new behavior is more likely to be used).
Aaron Bergman
Not OP but here are some "user problems" either I have or am pretty sure a bunch of people have:

  • Lots of latent, locked-up insight/value in drafts.
  • Implicitly high standards discourage posting these as normal posts, which is good for avg post quality and bad for total quality.
  • Would want to collaborate on either an explicit idea or something tbd, but making this happen as is takes a bunch of effort.
  • Reduces costs to getting and giving feedback.
  • Currently afaik there's no market where feedback buyers and sellers can meet - just ad hoc Google doc links.
  • In principle you can imagine posts almost being written like a Wikipedia page: lots and lots of editors and commenters contributing a bit here and there.

Here's a post of mine that should be a public draft, for example. But as things stand I'd rather it be a shitty public post than a probably-perpetual private draft (if anyone wants to build on it, go to town!)
Vaidehi Agarwalla
+1 to all of this also. 
Vaidehi Agarwalla
I've heard a new anecdote from someone who's actively working on an AI research project and feels disconnected from relevant people in their domain who could give feedback on it. 
Vaidehi Agarwalla
(also would love to hear what doubts & hesitations you have / red-team this idea more - I think devil's definitely in the details and there are lots of interesting MVP's here)
Vaidehi Agarwalla
hehe you know i like to give y'all some things to do! would be interested to know how likely you'd be to act on it, but also happy to expand since it's not a big lift. Not linking to all the stuff I mention to save some time. 

Here's my hypothesis of the user problem:

Personal statement (my own case)

  1. I often want to write a post, but struggle to get it over the line, and so it remains in a half-baked state and not shared with people.
    1. I want to find the "early adopters" of my posts to give me lots of feedback when my personal circles may not have the right context and/or I don't know who from my personal circles is available to look at a draft.
      1. (There's a big cognitive load / ugh field / aversion in general here, e.g. whenever you have to go through a list of people to curate, for e.g. inviting them to an event or asking people for favors.)
  2. Sometimes it can be good to release things into the world even if they are low quality because then they're out of your system and you can focus on other, more valuable ideas.
  3. Personal experiment: I've posted a draft on Twitter and get some of this effect (where people not on my radar read and engage with stuff). This is mostly really good.
    1. But, as might be obvious, it's really not a good forum for sharing thoughts longer than 2 paragraphs.
    2. Sometimes I don't know what posts or ideas will resonate with people, and it's nice to find that out early. Also, I am better able to take feedback when I haven't invested a bunch of time in polishing and editing a draft.
    3. I also just want to share a lot of thoughts that I don't think are post-level quality but are also a bit more thought through than shortforms (without losing the benefit of in-line commenting, suggest mode - essentially, the UX of google docs).
JP Addison
Sadly I've been informed this is a pathological case for the pricing model of our collaborative editor SaaS tool.
Vaidehi Agarwalla

(Pretty confident about the choice, but finding it hard to explain the rationale)

I have started using "member of the EA community" vs "EAs" when I write publicly.

Previously I cared a lot less about using these terms interchangeably, mainly because referring to myself as an EA didn't seem inaccurate, it's quicker, and I don't really see it as tying my identity closely to EA. But over time I have changed my mind for a few reasons:

Many people I would consider "EA" in the sense that they work on high-impact causes, socially engage with other community members etc. don't consider themselves EA, but I think they would likely consider themselves community members. I wonder if they read things about what "EAs" should do and don't think it applies to them.

Using the term "an EA" contributes to the sense that there is one (monolithic?) identity that's very core to a person's being. E.g. if you leave the community do you lose a core part of your identity?

Finally, it also helps me be specific about the correct reference class. E.g. compare terms like "core EAs" with "leaders of EA-aligned organisations", "decision makers at leading EA meta organisations", or "thought leaders of the EA community". (There is also a class of people who don't directly wield power but have influence over decision makers; I'm not sure what a good phrase to describe this role is.)

Interested in thoughts!

Gemma Paterson
I started defaulting to saying people trying to do EA - less person focused more action focused
  This is reasonable, but I think the opposite applies as well, i.e. people can be EA (committed to the philosophy, taking EA actions) but not a member of the community. Personally, this seems a little more natural than the reverse, but YMMV (I have never really felt the intuitive appeal of believing in EA and engaging in EA activities but not describing oneself as "an EA"). 
Vaidehi Agarwalla
There are people who I would consider "EA" who I wouldn't consider a "community member" (e.g. if they were not engaging much with other people in the community professionally or socially), but I'd be surprised if they label themselves "EA" (maybe they want to keep their identity small, or don't like being associated with the EA community).  I think there's actually one class of people I've forgotten - which is "EA professionals" - someone who might professionally collaborate or even work at an EA-aligned organization, but doesn't see themselves as part of the community. So they would treat an EAG as a purely professional conference (vs. a community event). 

There are people who I would consider "EA" who I wouldn't consider a "community member" (e.g. if they were not engaging much with other people in the community professionally or socially), but I'd be surprised if they label themselves "EA" (maybe they want to keep their identity small, or don't like being associated with the EA community). 


Fwiw, I am broadly an example of this category, which is partly why I raised the example: I strongly believe in EA and engage in EA work, but mostly don't interact with EAs outside professional contexts. So I would say "I am an EA", but would be less inclined to say "I am a member of the EA community" except insofar as this just means believes in EA/does EA work.

Aaron Gertler
I also try not to use "EA" as a noun. Alternatives I've used in different places:

  • "People in EA" (not much better, but hits the amorphous group of "community members plus other people who engage in some way" without claiming that they'd all use a particular label)
  • "People practicing EA" (for people who are actually taking clear actions)
  • "Community members"
  • "People" (for example, I think that posts like "things EAs [should/shouldn't] do" are better as "things people [should/shouldn't] do" — we aren't some different species, we are just people with feelings and goals)

Reflecting on the question of CEA's mandate, I think it's challenging that CEA has always tried to be both of the following, and this has not worked out well:

1) a community org

2) a talent recruitment org

When you're 1) you need to think about the individual's journey in the movement. You invest in things like community health and universal groups support. It's important to have strong lines of communication and accountability to the community members you serve. You think about the individual's journey and how to help address the issues they face. (Think your local Y, community center or church.)

When you're 2) you care about finding and supporting only the top talent (and by extension actors that aid you in this mission). You care about having a healthy funnel of individuals who are at the top of their game. You care about fostering an environment that is attractive (potentially elite), prestigious and high status. (Think Y Combinator, Fulbright or Emergent Ventures Fellows.)

I think these goals are often overlapping and self-reinforcing, but also at odds with each other. 

It is really hard to thread that needle well - it requires a lot of nuanced, high-fidelity communication - which in turn requires a lot…

This has been discussed regarding intro fellowships:
I think the combination of 1 and 2 is such that you want the people who come through 1 to become people who are talented and noted down as 2. We should be empowering one another to be more ambitious. I don't think I would have gotten my emergent ventures grant without EA. 

Curious if people have tried to estimate the cost of burnout. 

The things I think we should care about, but without numbers or estimates: 

  1. How much does burnout directly reduce productivity?
    1. E.g. 6-12 months on average; the longer the grind, the longer the burnout
  2. The long-term reduction in total max capacity over time or something 
    1. let's say you had 60K hours before burnout; after, you have like 40K because you just can't work as hard.
  3. How much does burnout increase the likelihood the person doesn't pursue a high-impact career (i.e. leaves direct work roles)?
  4. What effect does burnout of a person have on their EA network (e.g. their colleagues, friends etc.)?
    1. E.g. if they're on a team, it could marginally increase the chance other team members burn out because they now have more work (+ creating negative associations with work) 
    2. e.g. their friends & local community might have a more negative view of the community as one where your friends burn out
  5. others?
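One way to turn these questions into numbers would be a small Monte Carlo sketch. Every range below is a placeholder assumption of mine, purely to show the structure such an estimate could take:

```python
import random

# Illustrative BOTEC structure only; all parameter ranges are invented placeholders.
N = 10_000

def sample_burnout_cost_hours():
    # (1) Direct productivity loss: 6-12 months at an assumed ~160 productive hours/month
    direct = random.uniform(6, 12) * 160
    # (2) Long-term capacity reduction (e.g. 60K -> 40K career hours), with only an
    #     assumed 10-50% of that gap attributed to a single burnout episode
    capacity_loss = random.uniform(0, 20_000) * random.uniform(0.1, 0.5)
    # (3) Chance the person leaves direct work entirely; count an assumed 10K hours if so
    leaves = 10_000 if random.random() < 0.1 else 0
    # (4) Network effects on colleagues/friends omitted for brevity
    return direct + capacity_loss + leaves

mean_cost = sum(sample_burnout_cost_hours() for _ in range(N)) / N
print(f"mean cost per burnout: ~{mean_cost:,.0f} hours")
```

Even with made-up inputs, a structure like this makes it obvious which of the four questions above dominates the answer.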

Resources for feeling good about prioritising

Note: This is collected from a number of people in an EA facebook group that I found in my google drive. I figured it was worth posting up as a shortform in case others find it valuable. 


  • Delegation to those with more time/suitability
  • Don't have un-prioritised tasks on your to-do list. Put them somewhere else out of sight - if they're a good idea, in a 'some day' pile; if they're not very important, bin them (or delegate)
    • I keep my actual "To Do" list v. small these days and don't agree to do anything outside of it, but I do put a lot of stuff on a list I call "Ideas" (which is still roughly prioritised). So then I have achievable goals and anything else feels like a bonus.
  • On saying no but not wanting to offend: I think that saying I'm too busy for what they've requested and offering something much smaller instead has helped me. Seems to be a good deal in terms of time saved and how offended they are. (This was when I gave up on my plan of just caring much less about upsetting people...too difficult!)
  • For regular things, I see two angles to work on: saying no more before committing, and letting things you've already got on go / be less sticky
…
Thank you so much for this - I found it surprisingly comprehensive given its brevity. I especially appreciate you outlining the various ways in which you address the motivations behind something being hard to let go, which feel more concrete than some advice I've come across. I would be really interested in taking a peek at the form. : ) Delegating more is something I'm working on and I feel like I'm slowly becoming better at it, but clearly still not good enough since I continue to burn out. 
Vaidehi Agarwalla
Hey Miranda! Actually this was a collection of other people's (pretty cool) responses, so sadly I don't have the form :(. Agreed that delegating is hard! Let me see if I can ask the original commenter - I definitely think it would be valuable!
oh my bad, I must've misread. Thank you!

(very) Quick Reflections on EAG 2021 

Most of these are logistical / operational things on how I can improve my own experience at EAG

  • Too much talking / needing to talk too loudly
    • Carry an info card / lanyard which has blurbs on my role, organisation & the projects I want to talk to people about and ask them to read it before we start our 1-1. (This is probably a little suboptimal if you need to actively get people super pumped about your ideas)
    • More walking conversations in quiet areas. This year the EAG had a wonderful second location with an indoor conservatory that was peaceful, quiet and beautiful. Everyone I brought there really liked it because it was a refreshing break from the conference. If there isn't something like this in future EAGs I'll try to find a good ~25 minute walking route on the first day of the conference.
  • Shop talk vs socializing during 1-1s
    • I feel quite conflicted on this
      • On one hand, it makes sense especially when doing more open-ended networking (e.g. this person does climate stuff, I feel it's useful to meet them but am not sure how exactly it would be useful). Hopefully, the info card saves some time. On the other hand, shop talk is absolutely exhausting and I feel that I…

Status: This was a post I'd drafted 4 years ago on climate change in EA. Not sure I stand by all of it, but I thought it might be worth sharing. 

let's make room for climate change

What this post is NOT saying:

*   deprioritise other cause areas
*   redirect significant resources from other cause areas
*   the average EA should go into climate change or become a climate scientist/advocate
*   the average EA has comparative advantage in climate change

What this post IS saying

*   having an EA org or projects related to climate change w…

Collection of Constraints in EA

Vaidehi Agarwalla
Related posts:
  • The Case for the EA Hotel by Halffull. Kind of a summary of the above constraints, explaining how the EA Hotel could fill the need created by the lack of mobility in the middle (what OP calls a "chasm"), and trying to explain the vetting and talent constraints in the EA community. The first part is especially useful for outlining this underlying model.
  • Which community building projects get funded? by AnonymousEAForumAccount. It raises an important question, but I (Vaidehi) think the analysis misses the important questions. I've built off the original spreadsheet with categories here.

I was going to post something for careers week but it was delayed for various reasons (including the mandatory last minute rewrite). I plan to post it in the next couple of weeks.

Vaidehi Agarwalla
Update: It's posted!

Offer: if you've been affected by the Future Fund situation and are actively job hunting, I'd like to help: I can give feedback on resumes (am most familiar with US / ops / consulting / business resumes). Just DM me a gdoc of your resume & LMK what kinds of jobs you're applying for.

Mini Collection - Non-typical EA Movement Building

Basically, these are ways of spreading EA ideas, philosophies or furthering concrete EA goals in ways that are different from the typical community building models that local groups use.

Suggestions welcome!

Vaidehi Agarwalla
This quote from Kelsey Piper:

Quick BOTEC of person-hours spent on EA Job Applications per annum.

I created a Guesstimate model to estimate a total of ~14,000 to 100,000 person-hours or ~7 to 51 FTE spent per year (90% CI). This comes to an estimated USD $320,000 to $3,200,000 of unpaid labour time. 

  • All assumptions for my calculations are in the Guesstimate
  • The distribution of effort spent by candidates is heavy-tailed; a small percentage of candidates may spend 3 to 10x more time than the median candidate.
  • I am not very good at interpreting the Guesstimate outputs, so if someone can state this better / more accurately, that would be helpful
  • Keen to get feedback on whether I've over/underestimated any variables.
  • I'd expect this to grow at a rate of ~5-10% per year at least.

Sources: My own experience as a recruiter, applying to EA jobs and interviewing staff at some EA orgs.


Edited the unpaid labour time to reflect Linch's suggestions.
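The shape of such a model can be sketched in a few lines of Python. Every distribution below is a stand-in assumption of mine, not the actual inputs of the Guesstimate:

```python
import random

# Monte Carlo sketch of the BOTEC's structure; parameter ranges are placeholders,
# NOT the real Guesstimate inputs.
N = 10_000
totals = []
for _ in range(N):
    applicants_per_year = random.uniform(2_000, 6_000)  # assumed
    apps_per_applicant = random.uniform(2, 5)           # assumed
    hours_per_app = random.lognormvariate(1.0, 0.8)     # heavy-tailed, as the post notes
    totals.append(applicants_per_year * apps_per_applicant * hours_per_app)

totals.sort()
lo, hi = totals[int(0.05 * N)], totals[int(0.95 * N)]
print(f"90% CI: {lo:,.0f} to {hi:,.0f} person-hours per year")
print(f"≈ {lo / 2000:.0f} to {hi / 2000:.0f} FTE at 2,000 hours/FTE")
```

The lognormal on hours-per-application is what produces the heavy tail mentioned in the bullet points: a small share of candidates contributes a large share of total hours.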

I think a normal distribution between $20-30 is too low; many EA applicants counterfactually have upper-middle-class professional jobs in the US. I also want to flag that you are assuming the time is unpaid, but many EA orgs do in fact pay for work trials. A "trial week" especially should almost always be paid. 
Vaidehi Agarwalla
Hi Linch, thanks for the input! I'll adjust the estimate a bit higher. In the Guesstimate I do discount the hours to say that 75% of the total hours are unpaid (trial week hours come to 5% of the total hours).
Josh Jacobson
I did not review the model, but only 75% of hours being unpaid seems much too low based on my experience having gone through the job hiring process (including later stages) with 10-15 EA orgs.
Vaidehi Agarwalla
Okay, so I used a different method to estimate the total person-hours and my new estimate is something like 60%. I basically assumed that 50% of Rounds 2-4 in the application process are paid, and 100% of the work trial. I expect that established / longtermist orgs are disproportionately likely to pay for work tests, compared to new or animal / GH&D orgs. 
Aaron Gertler
I think Josh was claiming that 75% was "too low", as in the total % of unpaid hours being more like 90% or something. When I applied to a bunch of jobs, I was paid for ~30 of the ~80 hours I spent (not counting a long CEA work trial — if you include that, it's more like 80 out of 130 hours). If you average Josh and me, maybe you get back to an average of 75%?

*****

This isn't part of your calculation, but I wonder what fraction of unique applicants to EA jobs have any connection to the EA community beyond applying for one job? In my experience trying to hire for one role with ~200 applicants, ~1/3 of them neither had any connection to EA in their resumes nor provided further information in their applications about what drew them to EA. This doesn't mean there wasn't some connection; a lot of people just seemed to be looking for any job they could find. (The role was more generic than some and required no prior EA experience, so maybe drew a higher fraction of outside applicants.)

Someone having no other connection to the EA community doesn't mean we should ignore the value of their time, and the people who apply to the most jobs are likely to have the strongest connections, so this factor may not be too important, but it could bear consideration for a more in-depth analysis.

Reflections on committing to a specific career path 

Imagine Alisha is making a decision whether to pursue job X or job Y. She is currently leaning in favor of job X 55% to 45%, so decides to pursue job X. Over the next couple years, Alisha gains knowledge and expertise as an Xer, and is passionate and excited by her job. She's finding new opportunities and collaborations, and things are going well. But she often wonders if things would have gone even better if she went with job Y. 

I believe that you get a lot more value from committing to o…

Project: More expert object-level observations & insights 

Many subject-matter experts with good object-level takes don't post on the forum because they perceive the bar to be too high:

  • Other examples that I know personally: founders of impactful charities don't post regular updates on the progress their organizations are making, lessons they are learning, theory-of-change updates, how others can help, etc. 
  • People who aren't naturally writers (e.g. they are doers and more on the ground building partnerships etc)
  • People who don't realise they could add v
…

PROPOSAL: EA Career Coaches - quick evaluation

Experimenting to see what kind of feedback this gets and whether it's useful to share very early stage thoughts publicly. If anyone is interested or already exploring this topic, feel free to reach out; I have written up a (slightly more in-depth) proposal I can share.

Problem: There might be many people in EA that could benefit from career coaching.

Size: I estimate ~300 to 1,000 (60% CI) people might be underconfident or less ambitious than they could be. 80K frequently mentions underconfidence. These are people with basic intro knowledge who are unable to make career progress due to miscalibration, lack of knowledge, negative feelings associated with networking / applying, etc.

Tractability: Career coaches are very common & help people become confident and land dream jobs / improve their current career situation.

Neglectedness: Seems unlikely to me that existing coaches cover this need. I am also fairly confident that existing groups / CBs do not cover this need.

Proposal: An EA career coach who's: improving clients' calibration on good-fit positions (both EA & non-EA); giving practical advice & guidance (e.g. resumes, interviews, …

Some ideas for improving or reducing the costs of failure transparency 

This is an open question. The following list is intended to be a starting point for conversation. Where possible I've tried to make these examples as shovel-ready as possible. It would be great to hear more ideas, or examples of successfully implemented things. 

Thanks to Abi Olvera, Nathan Young, Ben Millwood, Adam Gleave & Arjun Khandelwal for many of these suggestions.

Create a range of space(s) to discuss failure of any size.

  • I think the explicit intention of helping the commu
…

Idea: EA Library

Many colleges and universities have access via their libraries to a number of periodicals, papers, journals etc. But once you graduate, you lose access.

With something like sci-hub we don't really need access to many things on the academic side. 

But it seems valuable for EAs to not be pay-walled out of various journals or news outlets (e.g. Harvard Business Review or the Wall Street Journal) if they want to do research (if there's a sci-hub for stuff like that, that could also work!)

We could probably develop this in …

How valuable is building a high-quality (for-profit) event app for future EA conferences?

There are 6 EAG(x) conferences a year. This number will probably increase over time, and more conferences will come up as EA grows - I'd expect somewhere between 80-200 EA-related conferences and related events in the next 10 years. This includes cause-area-specific conferences, like Catalyst, and other large events.

A typical 2.5-day conference with on average ~300 attendees spending 30 hours each = 9,000 man-hours per conference, which comes to a range of 720,000-1,800,000 man-hours over 10 years. Of this time, I'd expect 90% to be taken up doing meetings, attending events, eating etc., leaving 72,000-180,000 hours where an app could help. Saving 10% of that - i.e. 1% of the total, or 7,200-18,000 hours - seems pretty useful!

For reference, 1 year of work (a 40 hours work-week for 50 weeks) = 2000 hours.
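The arithmetic above can be written out explicitly (same numbers as the comment):

```python
# Re-deriving the man-hours figures from the estimate above.
attendees, hours_each = 300, 30
hours_per_conf = attendees * hours_each               # 9,000 man-hours per conference
confs_low, confs_high = 80, 200                       # conferences over 10 years
total_low = confs_low * hours_per_conf                # 720,000
total_high = confs_high * hours_per_conf              # 1,800,000

# 90% of time goes to meetings, events, eating; an app plausibly touches the rest.
savings_low = total_low * 0.01                        # saving 1% of total time
savings_high = total_high * 0.01
print(f"{savings_low:,.0f}-{savings_high:,.0f} hours ≈ "
      f"{savings_low / 2000:.1f}-{savings_high / 2000:.1f} work-years")
```

which matches the 7,200-18,000 hour range, or roughly 3.6-9 work-years at 2,000 hours/year.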

Vaidehi Agarwalla
Pricing estimate if we pay for an event conferencing app: Swapcard, recently used by CEA for EAGx events, costs approximately USD $7 per user. Using my previous estimate, the total cost over 10 years would be between USD $168,000-420,000 without any discounting. Discounting 50% for technology becoming cheaper and charity discounts, we could conservatively say $84,000-$210,000 total cost. Not sure what to do with this information, or how to compute the value of the money saved (assuming our benevolent EA ally / app creator gives us access for a heavily discounted price; otherwise the savings are not that important).
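As a sanity check on these figures (same inputs as the comment):

```python
# Sanity-checking the Swapcard cost estimate (same inputs as the comment above).
price_per_user = 7                   # USD per user
attendees = 300                      # average attendees per conference
confs_low, confs_high = 80, 200      # conferences over 10 years
cost_low = price_per_user * attendees * confs_low    # 168,000
cost_high = price_per_user * attendees * confs_high  # 420,000
# 50% discount for cheaper tech + charity pricing
discounted = (cost_low * 0.5, cost_high * 0.5)
print(discounted)  # (84000.0, 210000.0)
```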
Vaidehi Agarwalla
Given the pandemic, I would actually upgrade the potential cost effectiveness of this, because we can now add Student Summits and EAGxVirtuals as potentially regular events, bringing the total in a non-COVID year to up to 8 events. 
Hm I think Swapcard is good enough for now, and I like it more than the Grip app. I think this comes down to what specific features people want in the conference app and why this would make things easier or better. Of course it would be good to centralize platforms in the future (i.e. maybe the EA Hub also becomes a Conference platform), but I don't see that being a particularly good use of time.
+1 the math there. How does building an app compare to throwing more resources at finding better pre-existing apps?  I'll just add I find it kind of annoying how the event app keeps getting switched up. I thought Grip was better than whatever was used recently for EAGxAsia_Pacific (Catalyst?). 
Vaidehi Agarwalla
I think CEA has looked at a number of apps - it would definitely be worth checking with them to see how many apps they've considered out of the total number available, and possibly follow the 37% rule. 
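For context, the "37% rule" referenced here is the classic secretary-problem heuristic: review the first ~37% (1/e) of options without committing, then take the first option better than everything seen so far. A quick simulation (with made-up uniform scores, purely illustrative) shows why it works:

```python
import math
import random

# Secretary-problem simulation: observe the first ~1/e of candidates without
# committing, then pick the first one better than everything seen so far.
def run_trial(n):
    scores = [random.random() for _ in range(n)]
    cutoff = int(n / math.e)
    best_seen = max(scores[:cutoff]) if cutoff else float("-inf")
    for s in scores[cutoff:]:
        if s > best_seen:
            return s == max(scores)   # did we pick the overall best?
    return scores[-1] == max(scores)  # forced to take the last option

n, trials = 50, 20_000
win_rate = sum(run_trial(n) for _ in range(trials)) / trials
print(f"picked the best app in {win_rate:.0%} of trials")
```

With 50 candidate apps this picks the overall best roughly 37% of the time, far better than the 2% you'd get from choosing at random.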
It seems plausible, though overall not that likely, to me that maybe the LessWrong team should just build our own conference platform into the forum. We might look into that next year as we are also looking to maybe organize some conferences.
Vaidehi Agarwalla
That would be interesting! I'd be interested to see if that happens - I think there are probably benefits from integration with the LW/EA Forum. In what scenario do you think this would be the most likely?
I think it's most likely if the LessWrong team decides to run a conference, and then after looking into alternatives for a bit, decides that it's best to just build our own thing.  I think it's much more likely if LW runs a conference than if CEA runs another conference, not because I would want to prioritize a LW conference app over an EAG app, but because I expect the first version of it to be pretty janky, and I wouldn't want to inflict that on the poor CEA team without being the people who built it directly and know in which ways it might break. 

An incomplete list of movement building terms

I plan to keep adding and refining this list over time, I've just put my current list here in case anyone else has ideas. 

Movement Building: Any work to build a movement including but not limited to: community infrastructure, community building, cause, field or discipline development and outreach. Movement building does not need to involve using or spreading the EA brand.

Movement building is a subset of "Meta EA" or "meta altruism".

Types of Movement Building:

Community Infrastructure: The development of community-wide products and services that help develop the community. Online examples include building wikis, forums, tools and websites. Offline examples include conferences, community houses, and regional networks.

Note: Some community infrastructure may be limited to certain subgroups within the community, such as events and services for leaders or affiliated organisations. Such events might still provide benefits to the wider community, especially when they improve coordination and communication, and where relevant should be considered as infrastructure.  

Community Building: Influencing individuals to take actions based o... (read more)

How important is it to measure the medium term (5-50 years) impact of interventions?

I think that taking the medium-term impact into account is especially lacking in the meta space, since building out infrastructure is exactly the kind of project that could take several years to set up with little progress before gains are made. 

I'd also be interested in how many/which organisations plan to measure their impact on this 5-50 year timescale. I think it would be very interesting to see the impact of various GH&D charities on a 5- or 10-year timescale.  

A Typology of EA Careers Advice

The Local Career Advice Network recently completed a pilot workshop to help group organisers develop and implement robust career 1-1 strategies. During this process we compiled all existing EA careers advice & strategy, and found several open questions. This post provides an overview of the different kinds of careers research one could do. We will write more posts trying to explain the value of the different kinds of research. 

Movement-level research 

  • This research identifies bottlenecks in top causes and makes re
... (read more)
Vaidehi Agarwalla
I think movement-level advice is most useful for setting movement-level strategy, rather than informing individual actions because personal fit considerations are quite important. However, I think this has the consequence that some paths are much more clearly defined than others, making it difficult for people who don't have those interests to define a path.

Meta-level thought:

When asking about resources, a good practice might be to mention resources you've already come across and why those sources weren't helpful (if you found any), so that people don't need to recommend the most common resources multiple times.

Also, once we have an EA-relevant search engine, it would be useful to refer people to that even before they ask a question in case that question has been asked or that resource already exists.

The primary goal of both suggestions would be to make questions more specific, in-depth and hopefully either expanding movement knowledge or identifying gaps in knowledge. The secondary goal would be to save time!

Reasons for/against Facebook & plans to migrate the community out of there

Epistemic Status: My very rough thoughts. I am confident of the reasons for/against, but the last section is mostly speculation, so I won't attempt to clarify my certainty levels.

Reasons for moving away from Facebook

  • Facebook promotes bad discussion norms (see Point 4 here)
  • Poor movement knowledge retention
  • Irritating to navigate: It's easy to not be aware that certain groups exist (since there are dozens) and it's annoying to filter through all the other stuff in Facebook to get
... (read more)
Neel Nanda
I want to emphasise this point, since I think it applies to both new and more experienced members. I personally find it quite high mental load to actively pay attention to communities on a new platform. Some of these are start-up costs (learning a new interface, etc.), but there are also ongoing costs of needing to check the new site. And it is much easier to add something to an existing place I already check.
Aaron Gertler
I don't think the Forum is likely to serve as a good "group discussion platform" at any point in the near future. This isn't about culture so much as form; we don't have Slack's "infinite continuous thread about one topic" feature, which is also present on Facebook and Discord, and that seems like the natural form for an ongoing discussion to take. You can configure many bits of the Forum to feel more discussion-like (e.g. setting all the comment threads you see to be "newest first"), but it feels like a round peg/square hole situation. On the other hand, Slack seems reasonable for this!
There is also a quite active EA Discord server, which serves the function of "endless group discussions" fairly well, so another Slack workspace might have negligible benefits.
Another possible reason against might be: In some countries there is a growing number of people who intentionally don't use Facebook. Even if their reasons for their decision may be flawed, it might make recruiting more difficult. While I perceive this as quite common among German academics, Germany might also just be an outlier. I think the EA Hub is in a good position to grow and replace some of the functions that Facebook is currently being used for in the community.

What are the low-hanging fruit or outliers of EA community building?

(where community building is defined as growing the number of engaged EAs who are likely to take medium-to-large-sized actions in accordance with EA values and/or the EA framework. It could include group activities, events, infrastructure building, resources)

  • the EA community talks a lot about low-hanging fruit and about outlier interventions that are 100x or 1000x better than the next best intervention
  • it seems plausible that either of these exists for community building


Low-hanging fruit

  • from
... (read more)
I think introductory fellowships are extreme outlier interventions. EA Philippines' 8-week Intro to EA Discussion Group (patterned after Stanford's Arete fellowship) in May-July 2020 was by far our best activity yet. 31 signed up and 15 graduated, and out of the graduates, I believe we've created the following counterfactual impact:
1. One became the president of our student chapter EA Blue
2. Another became a core team member of EA Blue
3. Two have since taken the GWWC pledge
4. Three have become new volunteers (spending ~1-2 hrs/week) for EA Philippines (we actually got two more volunteers aside from these three, but those two I would say were not counterfactual ones)
5. Helped lead to a few career plan changes (I will write a separate impact report about EA PH's 2020, and can talk about this more there).
EA Blue is now doing an Introductory Fellowship similar to ours with 26 participants, which I'm a facilitator for, and I think we're having similarly good results!
I don't have an answer, but I'm curious - why don't you publish it as a proper post?
Vaidehi Agarwalla
This is a very rough post and I don't know how much I would stick to this framing of the question if I spent more time thinking it over!
Makes sense, even though it feels alright to me as a post :)  I'd really like to see more answers to this question! 

Could regular small donations to Facebook Fundraisers increase donations from non-EAs?

The day before Giving Tuesday, I made a donation to an EA Facebook charity fundraiser that had seen no donations in a few weeks. After I donated, about 3 other people donated within the next 2 hours (well before the Giving Tuesday start time). From what I remember, the total amount increased by more than the minimum amount, and the individuals appeared not to be affiliated with EA, so it seems possible that this fundraiser might have somehow been raised to their attention. (Of cour

... (read more)

CGD launched a Global Skills Partnership program to reduce brain drain and improve migration (

It would be interesting to think about this from the perspective of EA groups, where brain drain is quite common. Part of their solution is to offer training and recognized certifications to a broader group of people in the home country to increase the overall pool of talent.

I will probably add more thoughts in the coming days when I have time to read the case studies in more depth.

Collection of anecdotal evidence of EA career/impact frustrations

After one year of applying for EA jobs: It is really, really hard to get hired by an EA organisation by EA Applicant. The most upvoted post on the Forum; it sparked a lot of recent discussion on the topic. 8 commenters resonated with the OP on the time investment and/or disappointment (1,2,3,4,5,6,7,8). There were 194 unique upvotes. 

My mistakes on the path to impact by Denise Melchin. Another highly upvoted post talking about the emphasis on working at EA organisations and direct EA work. There we... (read more)

You could add this recent post to the list:
Vaidehi Agarwalla
EA’s Image Problem by Tom Davidson. 4 years old but the criticisms are still relevant. See also many comments. 

I brainstormed a list of questions that might help evaluate how promising climate change adaptation efforts would be.

Would anyone have any additions/feedback or answers to these questions?

Is anyone aware of/planning on doing any research related to the expected spike in interest for pandemic research due to COVID? 

It would be interesting to see how much new interest is generated, and for which types of roles (e.g. doctors vs researchers). This could be useful to a) identify potential skilled biosecurity recruits b) find out what motivated them about COVID-19 c) figure out how neglected this will be in 5-10 years 

I'd imagine doing a survey after the pandemic starts to die down might be more valuable than right now (maybe after the

... (read more)
Having done some research on post-graduate education in the past, it's surprisingly difficult to access application rates for classes of programs. Some individual schools publish their application/admission rates, but usually as advertising, so there's a fair bit of cherry-picking. It's somewhat more straightforward to access completion rates (at least in the US, where universities report this to the government). However, that MVP would still be interesting with just a few data points: if any EAs have relationships with a couple of relevant programs (in, say, biosecurity or epidemiology), it may be worth reaching out directly in 6-12 months!

A more general point, which I've seen some discussion of here, is how near-miss catastrophes prepare society for a more severe version of the same catastrophe. This would be interesting to explore both theoretically (what's the sweet spot for a near-miss to encourage further work but not dissuade prevention policies?) and empirically. One historical example: does a civilization that experienced a bad famine go on to experience fewer famines in the period following it? How long is that period? In particular, that makes me think of MichaelA's excellent recent post, Some history topics it might be very valuable to investigate.
In the UK could you access application numbers with a Freedom of Information request?

Some thoughts on stage-wise development of moral circle

Status: Very rough, I mainly want to know if there's already some research/thinking on this.

  • Jean Piaget, an early childhood psychologist from the 1960s, suggested a stage-sequential model of childhood development: we progress through different stages of development, and each stage is necessary to reach the next.
  • Perhaps we can make a similar argument for moral circle expansion. In other words: you cannot run when you don't know how to walk. If you ask someone to believ
... (read more)
My sense is that the idea of sequential stages for moral development is exceedingly likely to be false, and in the case of the most prominent theory of this kind, Kohlberg's, completely debunked in the sense that there was never any good evidence for it (I find the social intuitionist model much more plausible), so I don't see much appeal in trying to understand cause selection in these terms.

That said, I'm sure there's a rough sense in which people tend to adopt less weird beliefs before they adopt more weird ones, and I think that thinking about this in terms of more/less weird beliefs is likely more informative than thinking about it in terms of more/less distant areas in a "moral circle". I don't think there's a clear non-subjective sense in which causes are more or less weird, though. For example, there are many EAs who value the wellbeing of non-actual people in the distant future but not suffering wild animals, and vice versa, so which is weirder or more distant from the centre of this posited circle? I hear people assume conflicting answers to this question from time to time (people tend to assume their area is less weird).

I would also agree that getting people to agree to beliefs which are less far from what they currently believe can make them more positively inclined to subsequently adopt related beliefs which are further from their current beliefs. It seems like there are a bunch of non-competing reasons why this could be the case, though. For example:
  • Sometimes belief x1 itself gives a person epistemic reason to believe x2
  • Sometimes believing x1 increases your self-identity as a person who believes weird things, making you more likely to believe weird things
  • Sometimes believing x2 increases your affiliation with a group associated with x1 (e.g. EA), making you more likely to believe x3, which is also associated with that group
Notably, none of these require that we assume anything about moral circles or general sequences of bel
Vaidehi Agarwalla
Yeah, I think you're right. I didn't need to actually reference Piaget (it just prompted the thought). To be clear, I wasn't trying to imply that Piaget's or Kohlberg's theories were correct or sound, but rather applying the model to another issue. I didn't make that very clear. I don't think my argument really requires the empirical implications of the model (especially because I wasn't trying to imply a moral judgement that one moral circle is necessarily better/worse). However, I didn't flag this. [meta note: I also posted this pretty quickly and didn't think it through much, since it's a short form]

I broadly agree with all your points. I think my general point of x, 10x, 100x makes more sense if you're looking along one axis (e.g. a class of beings like future humans) rather than all the ways you can expand your moral circle - which I also think might be better thought of as a sphere or a more complex shape, to account for different dimensions/axes. I was thinking about the more concrete cases where you go from cats and dogs -> pigs and cows, or people in my home country -> people in other countries.

Re the other reasons you gave: I think this is kind of what I was trying to say, where there can be some important incremental movement here. (Of course, if x2 is very different from x1 then maybe not.) This is an interesting point I haven't thought much about. I think this is probably the strongest non-step-wise reason. 
If longtermism is one of the latest stages of moral circle development, then your anecdotal data suffers from major selection effects.