This is a crosspost from my new Substack Power and Priorities where I’ll be posting about power grabs, AI governance strategy, and prioritization, as well as some more general thoughts on doing useful things.
Tl;dr
I argue that maintaining nonpartisan norms on the EA Forum, in public communications by influential community members, and in funding decisions may be more costly than people realize. Lack of discussion in public means that people don’t take political issues as seriously as they should, research which depends on understanding the political situation doesn’t get done, and the community moves forward with a poor model of probably the most consequential actor in the world for any given cause area - the US government. Importantly, I don’t mean to say most community members shouldn’t maintain studious nonpartisanship! I merely want to argue that we should be aware of the downsides and do what we can to mitigate them.
Why nonpartisan norms in EA are a big deal
Individual politicians (not naming names) are likely the most important single actors affecting the governance of AI. The same goes for most of the cause areas EAs care about. While many prominent EAs think political issues may be a top priority, and politics is discussed somewhat behind closed doors, there is almost no public discussion of politics. I argue the community’s lack of a public conversation about the likely impacts of these political actors and what to do in response to them creates large costs for how the community thinks about and addresses important issues (i.e. self-censorship matters actually). Some of these costs include:
- Perceived unimportance: I suspect a common, often subconscious, thought is, 'no prominent EAs are talking about politics publicly so it's probably not as big of a deal as it seems'. Lack of public conversation means social permission is never granted to discuss the issue as a top priority, it means the topic comes up less & so is thought about less, and it means institutions don't feel pressured to do anything.
- There is a culture of deferring to experts in EA (i.e. people base their opinions on those of senior figures in the community) and, while I think EA is better than most communities about deference & that deference is often a reasonable strategy, here it backfires in a big way.
- Newcomers looking to download the wisdom of EA will look around (e.g. at the EA Forum or the 80k website) and see that no one seems to care about politics even a little. I expect many will think either, ‘woah, EA is kinda stupid. Seems like they’re missing something big’ or ‘I guess politics is another one of those things that seems important but actually doesn’t matter’.
- Nonpartisanship forces self-censorship & erodes epistemic honesty: Many EAs and rationalists believe it best to present a relatively unfiltered and authentic version of their beliefs to the public. Aiming for nonpartisanship in your public communications means you cannot convey your honest opinions to the world or maintain a more authentic presence (which does matter, I think people are good at noticing inauthenticity and they do care about it).
- Failure to develop accurate models of (political) reality: I fear that, because of the above factors and because many people in the community aren't really keeping up with what's happening, many don't really know what's going on (e.g. the DOGE cuts were a total surprise to most but are probably the single biggest event in global health this decade). When I tell people facts about the current situation, they're often surprised and update their models at least somewhat. Given this lack of understanding, some people act as if we're in a business-as-usual world where the US can be modeled as acting roughly as it did under Biden but with some Republican flavoring. I think this is incorrect and could hurt decision-making in crucial ways.
- Research gap: There is only a tiny amount of funding in the EA space available for work which is necessarily political in nature. This is partly because of a desire by researchers & funders to be nonpartisan, and partly because most donors either strongly prefer or are legally obligated to donate only to nonpartisan nonprofits (501(c)3s).
- Lack of knowledge transfer: It would appear that most organizations do essentially no analysis of the political situation; those that do will likely never share their analyses publicly and, I suspect, don't even share them privately. When no one publishes, there's no build-up of knowledge, and the frontier of understanding advances dramatically more slowly than it would under an uncensored information-sharing regime.
- Being nonpartisan closes down many options for impact: While maintaining nonpartisanship is a way to keep options open, it is more costly than I think many realize. Maintaining a nonpartisan stance means you miss many potentially impactful opportunities (e.g. getting involved in political campaigns is a common way to get roles in future administrations) and I worry people are overweighting the value of preserving option value.
- Failure to prioritize and act on high-impact opportunities: All of the above culminate in a failure to consider political interventions and to engage in real thinking about one of the most important leverage points in determining the trajectory of the future. If the single most important thing to do right now were to intervene in the US political situation, it feels unlikely the EA community would recognize that and act on it.
Why EAs tend not to engage much with politics
There are plenty of excellent reasons why being openly political can be a bad idea. By engaging publicly with political issues, organizations and individuals risk:
- Antagonizing political opponents
- Losing influence with influential people
- Losing or being unable to get funding (e.g. 501(c)3s aren't allowed to engage in political advocacy[1])
- Polarizing the EA community and top-priority cause areas (especially if there’s an imbalance in support for certain candidates or parties)
- Being canceled by the other side
There are also some more cultural and temperamental reasons why EAs tend not to engage with politics as much as might be ideal. EAs tend to focus mostly on issues where there isn’t a ton of public attention, tend to dislike the adversarial atmosphere of partisan politics, and prefer to work on the margins (this recent piece goes into more detail). I also suspect the meme that ‘politics is the mindkiller’[2] has been influential in the community and pushes many toward believing it’s not worth spending time trying to be rational about such an emotionally charged space.
Many should remain nonpartisan but improvements can be made
While there are large costs, I believe most of the community should remain largely nonpartisan. What I am saying is that, given the importance of politics, we should find more ways to ensure political analysis and action are happening. How might we go about doing this?
- Some community members could talk more openly about their analysis of the current political situation. That is, someone should be willing to name names & spell out how politics affects the cause areas they care about (e.g. Kelsey Piper does this well). Importantly, it’s best to avoid making claims on behalf of a given cause area or EA as a whole, as I think the broader community should remain largely nonpartisan.
- More advocacy movements should aim for bipartisanship rather than nonpartisanship by having partisan advocates, operatives, and funders on both sides of the aisle, as is often the case with successful advocacy and lobbying (e.g. corporations, AIPAC, crypto, etc).
- In an ideal world, there would be many Republicans and many Democrats who care about nonpartisan issues like mitigating existential risks and ensuring a flourishing future. These priorities and causes are likely to get more buy-in when politicians and others involved in party politics see that people with these priorities are also allied on other causes they already care about.
- For this to happen we need to have more members of the community who are openly partisan on both sides. Admittedly the community is likely to lean more toward one side than the other and I’m not sure what to do about that.
- Some community members should get involved directly with politics (with both parties).
- Organizations could have internal discussions and commission research to keep track of what’s happening and consider potential interventions.
- There could be a few more funders open to funding research which is necessarily political.
- There could be broader awareness of just how consequential the political situation is to ~everything the EA community cares about.
- It could be useful for events, organizations, and places where discussion happens to make it more clear that the reason people don’t talk about politics in these spaces is not because they think it’s unimportant but because partisan speech is restricted or disincentivized.
[1] Though this is easier to circumvent than I think many realize and there are clear, albeit moderately costly, ways to get around these restrictions. After all, Charlie Kirk runs a 501(c)3 (along with a 501(c)4) and has no qualms about endorsing candidates. It seems quite common for advocacy groups to have both & do very explicit partisan activities with purely nominal separation.
[2] There’s an old Yudkowsky post called ‘Politics is the Mindkiller’. The title has become a meme in the EA & Rationalist community and I think it’s contributed to the community’s general allergy to politics. I’ve certainly heard the phrase ‘politics is the mindkiller’ said, often jokingly, in EA circles. While the post doesn’t actually advocate against talking about politics (it’s pretty narrowly saying not to use politically charged examples when making an unrelated point), I think the meme it’s created has pushed many away from engaging with the topic.
This aspect of EA is massively alienating to me in this moment and I would be curious how common this experience is.
Not uncommon, and I'm happy to chat about efforts to change this. (This offer is open to other forum readers too, please feel free to DM me).
This post makes an argument that politics is important, but I don't really see any argument that it is neglected or tractable?
I share your take, and would add that this example is even more central than you seem to suggest. Not foreseeing this is a clear, traceable mistake. These decisions caused such a death toll that even moderate (bipartisan) efforts on that front would have been justified in terms of EV.
As a forum user, I think it's possible to discuss politics coldly (as long as moderation is stringent on tribal dynamics) and would appreciate seeing this done more often (by the right people).
How much could the DOGE cuts reasonably have been prevented? And would the prevention there even have been politics-shaped (vs. e.g. being friends with Musk)?
I have some sense that getting a bunch of congresspeople to feel more positively about foreign aid would have done ~0 to block these cuts (unless the congressperson was really willing to die on the foreign aid hill), but I'm not sure.
In ~September of last year I made a prediction that foreign aid would be cut significantly (see below), so it seems there's some degree to which at least some cuts were predictable. I think the intervention I advocated for at the time, stopping Trump from being elected, would have been the most straightforward action to take (and I think the EV of doing that, even from a pure global health perspective, looked alright).
I didn't predict specific DOGE cuts. That said, if one had, then trying to get the message to Musk that this matters could have been a reasonable action to take (and would have been usefully informed by better political analysis). Plausibly there's some messaging stuff one could do?
Otherwise, the best thing to do might have been some contingency planning for large aid cuts? I'm not sure how much counterfactual value having those plans would have had, but it seems possible that with some preparation beforehand one could have kept larger parts of PEPFAR alive. Certainly the sector seemed really overwhelmed upon learning of the USAID cuts, and that seems like some indication that more preparation would have been useful.
Overall, I don't feel too strongly that knowing the DOGE cuts were coming would have been super high leverage, but I think it exemplifies politics as something which can have extremely far-reaching impacts on cause areas that EAs care about - even far beyond anything happening internal to the field. The same seems true of AI as well as animal welfare and pandemic prevention. In the extreme, the end of democracy in the US would seem pretty likely to have a bigger negative impact (by far) on all EA cause areas than basically anything one could do internal to a cause area.
Exact prediction about aid cuts (which I made after maybe 1-2 hours of looking into this):
"If [Harris] spends at the same level as Biden (and Trump reverts to his prior spending), getting her into office would lead to ~$16 billion going to international aid [over the full term] that otherwise wouldn’t have. "
Is there any reason to believe that the election would have been a tractable cause area? As @Jason noted, "the pre-eminent EA funder was one of the top ten donors in the 2024 US elections cycle"
My sense is that even if the full weight of EA were thrown towards preventing Trump from getting elected, it would still have not been enough, and also it would have antagonized Trump.
I helped work on this piece along with some other research attempting to assess tractability. I think it wasn't super obvious that it was the best way to spend money but it was probably cost-competitive with many top donation opportunities in expectation. There also may have been ways we could have influenced things early on if we'd been putting in the effort in, say, 2023 (e.g. trying to get Biden out faster).
Politics work is basically always going to be very low probability, high reward. In this election I'd argue the expected impacts counterbalanced the low probability of success.
I agree. I don't think any amount of political donations or support would've made "We should give lots of taxpayer money to Africa" politically palatable in 2024. Enough voters were in an isolationist mood.
This would absolutely be a bad message to use; voters don't care about aid at all. You'd just use the best message-tested stuff available, which generically moves the needle in the direction you want it to go.
There may have been a time when EAs should have stayed out of politics. This isn't it.
There may have been times when we should have separated our EA discussions from political opinions - times when, even if we felt strongly about political questions, we should have kept those opinions away from our EA discussions.
Today, we do not have that luxury. We need to get our hands dirty.
Many of us care deeply about the world, yet for fear of being called "partisan" do not dare to point out the obvious FACT that there is one party which is currently standing for everything that EAs oppose.
I have written this before, and I got a lot of downvotes, but I will say it again.
By far the most effective, impactful thing the EA movement could do would be to find a way to stop Donald Trump and his cronies destroying so much.
I fully accept that EAs should include and listen to both Democrats and Republicans, liberals and conservatives. But Republicans, even more than Democrats, should be putting their necks on the line to stop Trump destroying their party in addition to destroying the US.
There is no coherent way that anyone could be an EA and a Trump supporter.
Stopping Trump destroying the US, destroying AI Governance, destroying global aid, destroying climate-action, ... is the single most important task in the world right now.
Those of us outside the US need our US colleagues, EAs and non-EAs, to do what you can. We need to push our own politicians not to be such pathetic walk-over appeasers too, and we're working on that.
So yes, we need to start engaging in politics, at least until this emergency is over.
I have observed social media communication of and about politics in general, and I realize there's a big gap between the kind of communication that works well with many voters and the kind that highly educated content creators usually tend to use.
There's a lot of "communication in newspaper headline style": emotional points that feel intuitive to many and need an attention span of only a few seconds. Then comes the fact checker, making a 3-minute video explaining why this headline-message has its flaws and what is problematic about it. When you are just scrolling with the current average attention span of 8.25 seconds (source: Microsoft study), you won't even get to the overall message of the second video, and it won't trigger any emotion causing you to react or share the content, so it won't get spread and won't reach many views.
So if we want to change the recognition of EA topics not only in more intellectual circles but among a broad public, we have to adapt to these communication styles and create content for this target group, as this is the majority of voters. I don't want to say there shouldn't also be deep-dive content; I think it's best to have content (perhaps even on different channels) adjusted to different audiences and different levels of knowledge of the concepts, for a broad reach.
How does this affect politicians? Public movements have influence - and if a big enough group of voters is actively campaigning for certain actions, it puts pressure on the government. At some point, it becomes hard to ignore. Ideally, we have action on the street and behind closed doors in parallel, so the voices of the street lend weight to the actions behind closed doors.
I think many people following effective altruism principles are focusing on politics, but don't write it in places like the EA forum, because the EA brand is toxic in many circles, and/or has a significant chance of becoming toxic in the future.
See e.g. NIST staffers revolt against expected appointment of ‘effective altruist’ AI researcher to US AI Safety Institute
Yes, in many circles the EA brand is toxic.
But sometimes we stick our heads in the sand as if that were something we couldn't control.
Or maybe some EAs kind of like this feeling of being outsiders and being the minority. I don't know.
Every other group I've ever worked with accepts that PR is part of the world. Companies know that they will get good press and bad press, and that it won't always reflect reality, but they hire people to make it as positive as possible. Politicians are the same. They run focus groups and figure out what words to use to share their message with the public, to maximise support.
Too often we act like we're above all that. We're right and that's enough. If people can't accept that, that's their loss.
But it's not their loss. It's our loss. It's the world's loss.
Public perception of EAs outside the EA community is often "a bunch of 'rationalist' tech guys who like to argue about abstract concepts and believe that AI should have rights," or something along those lines. This is totally at odds with the vast majority of EAs, who are among the most generous, caring people in the world and who want to help people and animals who are suffering.
A world run by EAs, or on EA principles, would be so wonderful. This should be our vision if we're truly sincere. But if we want to make this happen, we need to be willing to get our hands dirty, do the PR, challenge the newspaper articles that mischaracterize us, and learn to communicate in 15-second tweets as well as long-form essays so that more people can be exposed to EA ideas rather than stereotypes.
If you ask anyone outside the EA community to name an EA, they have probably only heard of SBF. If you push them, they might wonder if Elon Musk is also an EA. It's no wonder they don't trust EAs. But it's up to us to proactively change that perception.
It's true that some EAs work in government, but I think this piece lays out pretty well what that actually looks like, and it doesn't typically involve politics - it's more civil service type things. I'm pretty sure I know most EAs who work on direct political work (e.g. elections) and it's quite a small number.
That said, yeah, it's good that there are some people working in government and that does help broader EA understand the political situation a little better.
Not opposed to the idea as an intellectual exercise. Given the massive amount of money, lobbying, and attention paid to politics in mainstream US society, I'm skeptical that EA involvement would move the needle much. It's a pretty saturated field.
I wrote up some arguments for tractability on my forum post about the tractability of electoral politics here. I also agree with this take about neglectedness being an often unhelpful heuristic for figuring out what's most impactful to work on. People I know who have worked on electoral politics have repeatedly found surprising opportunities for impact.
You might not care about politics, but politics cares about you[1].
Honestly, the social justice wave is what made this quite clear to me.
There are situations when disengagement with politics is viable and situations when it is not.
At least some of the time.
I think your link is broken btw. Should be: https://powerandpriorities.substack.com/
Thank you! Fixed
Some potential ideas, coming from a politically active university:
- Sister organizations which can be politically focused on a cause area. One example is FAI, which used to be funded by OpenPhil.
- I wonder if EAs should be openly partisan instead of just hiding their political viewpoints. Of course, to prevent the EA Forum from becoming a Reddit thread, the amount of posting should be roughly equal across both political sides.
-- Example: Suppose EA is made up of 80% X and 20% Y. Members of Y should then post at a frequency four times higher than members of X, so we get a roughly 50/50 split (a small sketch of this arithmetic is below).
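To make that balancing arithmetic concrete, here is a minimal sketch; the 80/20 split and the baseline posting rate are just the illustrative numbers from the example above, not real data about the community.

```python
# Balancing total post volume between a majority side X and a minority side Y.
# All numbers are illustrative (taken from the 80/20 example above).
share_x, share_y = 0.8, 0.2   # fraction of members on each side
rate_x = 1.0                  # posts per X member per unit time (arbitrary baseline)

# For equal total volume, Y members must post at (share_x / share_y) times X's rate.
multiplier = share_x / share_y          # 4.0
rate_y = multiplier * rate_x

posts_x = share_x * rate_x              # 0.8
posts_y = share_y * rate_y              # 0.8
print(posts_x, posts_y)                 # equal totals -> a roughly 50/50 split
```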
Some other things: To reduce polarization, EA could deprioritize some areas which are seen as very partisan and not as effective. A concrete example: my university is funded by factory farms and we're proud of it. We also hold the worldview that, since humans are made in the image of God, humans are infinitely more valuable than animals (animals have only instrumental value). Thus, advocating for abandoning factory farming would be reputational suicide, as it would amount to destroying the foundation my university is built on.
You are absolutely never going to get an equal split of X and Y. I'm not super deep in the community, but EA seems very politically homogeneous.
You could imagine some kind of quota, where for any partisan political issue, you can only post about it on side X if there is any X quota remaining (which can only happen if posts are added from the perspective of side Y).
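Here is a minimal sketch of how such a quota might work mechanically; the class name, the two-sided symmetry, and the small seed allowance that avoids an initial deadlock are my own assumptions rather than part of the proposal above.

```python
# Hypothetical sketch: a post from one side consumes that side's quota and
# grants a slot to the other side, keeping partisan post volumes roughly equal.
class PartisanQuota:
    def __init__(self, seed: int = 1):
        # A small seed allowance so the very first post isn't blocked.
        self.quota = {"X": seed, "Y": seed}

    def try_post(self, side: str) -> bool:
        """Allow a post from `side` only if that side has quota remaining."""
        other = "Y" if side == "X" else "X"
        if self.quota[side] > 0:
            self.quota[side] -= 1
            self.quota[other] += 1  # posting opens up a slot for the other side
            return True
        return False

q = PartisanQuota()
print(q.try_post("X"))  # True: uses X's seed slot, grants Y an extra slot
print(q.try_post("X"))  # False: X must wait until a Y-perspective post appears
print(q.try_post("Y"))  # True: and this restores X's ability to post
```

Whether anything like this is desirable is a separate question; the point is only that the bookkeeping itself would be simple.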
But, what would we do if (as seems likely) everything gets polarized? Animal welfare, AI, even GHD are starting to show signs of political polarization. Would everything then be subject to quota?
Agree this is a very thorny problem and I am unsure of how to deal with it. Suspect there's some degree to which you can balance it usefully and that it's worth paying some cost of looking partisan but ultimately it's not really viable to coordinate everyone's level of partisanship.
I think a big part of mitigating the costs is just trying to avoid the sense that you're speaking on behalf of EA or AI safety when talking about partisan stuff.