This is a special post for quick takes by Lorenzo Buonanno🔸. Only they can create top-level comments. Comments here also appear on the Quick Takes page and All Posts page.

Bill Gates just endorsed GiveWell!

I don't know how much we should update on this, but I'm now personally a bit less concerned about the "self-recommending" issues of EA resources being mostly recommended by people in the EA social community.[1]

I think this is a good sign for the effective giving ecosystem, and will make my relatives much less worried about how I spend my money.

  1. ^

    Not that I was super concerned after digging deeper into things in the past year, but I remember being really concerned about it ~2 years ago, and most people don't have that much time to look into things.

I am confused as to why he doesn't give $1bn a year to GiveWell, but I guess let's wait a year and see if that happens.

I don't think his foundation really does marketing for donations. My guess is that, from his point of view, he prefers it that way, but thinks GiveWell could be a good fit as a donation target for many audience members.

That's fantastic!! Great to see, thanks for sharing!

Lorenzo Buonanno🔸
Moderator Comment

Postmortem on mod action on the April Fools' post “A modest proposal for women in EA”

Summary

Last Sunday, a bad post (definitely violating multiple forum norms) from a new anonymous user was published on the Forum. It got a lot of views and we removed it 11 hours and 30 minutes after it went up. I think we made some mistakes and wanted to write a quick postmortem.

Timeline (all times are for Sunday, April 2, 2023, in CEST, although the mod team is spread across timezones):

10:49 AM: The post “A modest proposal for women in EA” from a new user “ozaiscometh” goes up
1:03 PM: I notice the first user report on the forum ("maybe this is well intentioned satire, but it misses on the rather sensitive issue of sexual assault") and start a discussion in the moderation Slack
1:57 PM: I notice a second user report on the forum
4:56 PM: JP proposes banning ozaiscometh for the post
9:47 PM: We notice a lot of other reports (~6 from memory), and that the post (despite being heavily downvoted) is getting a lot more views than expected (~190 at the time)
9:55 PM: A user suggests deleting the post in a comment
10:06 PM: We ban ozaiscometh and post a comment mentioning that
10:18 PM: We remove the post


 Key mistake: I allowed the post to be visible on the Forum

I read it as a bad-taste (satirical) criticism arguing something like "EA doesn't listen to my reports on sexual assault", which in hindsight is also not ok written like that. I was worried about censoring criticism, and didn't examine closely enough whether the post was just straightforwardly norm-violating. I didn't realize other readers might interpret it very differently, and didn't consider carefully enough how sensitive the topic is.
We currently approve almost all new users, except in extreme cases (e.g. spam or calls to violence). Our general approach is to let karma take care of hiding low-quality content, and to actively remove content only very rarely.

Learnings

  1. Before approving users whose first post/comment is dubious or in bad taste, I’ll default to double-checking with other moderators.
  2. Either the moderation team is not synced up on which posts to approve, or our criteria for approving posts are wrong
    1. We’ll think more about this in the future.

Other considerations

  1. The post went up on a Sunday morning and was removed less than 12 hours later. The offender was banned at the same time. Overall, we think this is an ok response time, especially given that the post was so downvoted that it was not showing up for people who weren't actively looking for it (at least 84% of views came from Twitter, and we think that many came after the post was removed).
  2. We think the karma system did its job. The post was at -185 karma with 55 downvotes less than 12 hours after it was posted. If I remember correctly, it was at -10 karma after less than 10 minutes.
  3. I’m particularly grateful to the (many) users who reported the post on the Forum.
  4. We’re doing a quick check to see if this account is an alt of another user.
    Edit: the forum team checked and there is no other user using the same IP address (see the sketch below for what such a check can involve).
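
For readers curious what an alt-account check like this can involve: below is a minimal sketch in TypeScript, assuming a hypothetical LoginEvent record that stores the IP address used for each session. The names and data shape are invented for illustration and are not the Forum's actual schema or process.

```typescript
// Hypothetical schema: one record per login event, noting user and IP.
interface LoginEvent {
  userId: string;
  ipAddress: string;
}

// Collect the IDs of other users who have logged in from any IP
// address the suspect account has used. An empty result corresponds
// to what the forum team found here: no other account on the same IPs.
function findSharedIpAccounts(
  suspectUserId: string,
  events: LoginEvent[],
): Set<string> {
  const suspectIps = new Set(
    events
      .filter((e) => e.userId === suspectUserId)
      .map((e) => e.ipAddress),
  );
  const matches = new Set<string>();
  for (const e of events) {
    if (e.userId !== suspectUserId && suspectIps.has(e.ipAddress)) {
      matches.add(e.userId);
    }
  }
  return matches;
}
```

As a caveat, shared-IP checks are weak evidence in both directions: a VPN can hide a real alt account, while a shared network (an office or university) can link unrelated accounts.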

I read it as a bad-taste (satirical) criticism arguing something like "EA doesn't listen to my reports on sexual assault", which in hindsight is also not ok written like that. I was worried about censoring criticism, and didn't examine closely enough whether the post was just straightforwardly norm-violating.

I think this is a reasonable class of concerns for you to have had; there has been a lot of media criticism of schools and libraries for removing offensive books about sex or violence, including satire, though this post does seem to have been unusually bad. It seems like you guys have been focused on making sure the forum works well for people engaging with it organically, where, as you pointed out, the karma system worked well. People on Twitter selectively highlighting content that would not be visible to ordinary users is a very different threat model, one that seems both less crucial and harder to address.

I agree that this is a reasonable class of concerns to have in general, and I agree that moderators probably shouldn't optimize against the threat of people selectively highlighting content, though it does seem to be stretching the bounds of charity to count this specific post as a good-faith criticism/satire.

If the issue was that deliberation time left the post up longer than, in retrospect, you feel it should have been, then I'd suggest adding a feature to temporarily remove a post. This would be similar to deleting and undeleting a post, except that it would show a message saying that the post is currently under mod consideration.

I would generally hate to see situations where mods feel like they have to rush a decision as I suspect that this would lead to worse decisions.

Thanks for the suggestion.
There is already a similar feature (we moved the post to the user's drafts, and could theoretically undraft it), but it doesn't show a special message.

I personally think that the response time, ~12 hours on a Sunday, was ok (but I might be wrong on this) and that the key mistake was allowing the post in the first place.
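
To make the suggested feature concrete: here is a minimal sketch in TypeScript, assuming a hypothetical moderationHold flag on posts. The field and function names are invented for illustration; as noted above, the Forum currently implements removal by moving the post back to the author's drafts, without a special message.

```typescript
// Hypothetical post shape; the actual Forum moves removed posts
// back to the author's drafts rather than flagging them like this.
interface Post {
  id: string;
  body: string;
  moderationHold: boolean; // true while moderators deliberate
}

// Temporarily hide a post pending a moderation decision.
function placeOnHold(post: Post): Post {
  return { ...post, moderationHold: true };
}

// Restore the post if moderators decide it can stay up.
function releaseHold(post: Post): Post {
  return { ...post, moderationHold: false };
}

// What ordinary readers would see while the hold is active.
function renderForReaders(post: Post): string {
  return post.moderationHold
    ? "This post is temporarily unavailable while the moderation team reviews it."
    : post.body;
}
```

Keeping the hold as a reversible flag, rather than a deletion, would let moderators deliberate without the time pressure described above.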

(Didn't see this chain when I commented below, sorry!)

I agree 12 hours on a Sunday does seem reasonable, and that the initial decision to let the post through is the less reasonable part. I do think that to whatever extent you expect to be less active with moderation decisions on a weekend, the bar for allowing posts should probably be higher as a result. I do recognize this was probably made much more difficult by April Fools' though; thanks for all the work you do for the forum!

Thanks for this postmortem.

I didn't see the post when it first came up, so I agree the karma system was working well. There might be some hindsight bias here, but the post seems to fall clearly within what I would have hoped the mod check would screen out.

I think another commenter's suggestion on the original post about deanonymizing the main account of this user, if one exists, is also worth considering given how clearly egregious and malicious this post was. I'd also be happy for this person not to be at future EAGs or CEA-organized events, especially given the caliber of people I have met in EA spaces who have missed out.

deanonymizing the main account of this user, if one exists, is also worth considering given how clearly egregious and malicious this post was.

Added an update to the end of the postmortem: the forum team checked, and there is no other user using the same IP address.

There might be some hindsight bias here, but the post seems to fall clearly within what I would have hoped the mod check would screen out.

Agree, it was a mistake.
