
A few months ago I posted an advertisement for various EA infrastructure projects, on the grounds that there are now many such free or discounted services and there's no ongoing way to bring them to the attention of newcomers, or to remind people who've been around longer and may have forgotten about them.

The trouble is, that post had the same issue all attempts to broadcast ideas in this space do: it sat on the front page for a few hours and then fell away - with the organisations I edited in later getting almost no exposure. So this post is essentially a repost, albeit with the later-added suggestions included from the start. Ideally I would hope a post like this could be pinned to the top of the forum, but in lieu of that, I'm wondering about posting something like this every N months. My reservations are a) the hassle and b) it being too spammy (or, conversely, that it ends up being a cheap way of karma farming - to counteract this I've left a comment below for counterbalancing downvotes if you upvote this post). Please let me know your thoughts in the comments: assuming nothing better is implemented, should I continue to do this? If so, for what value of N, and under what conditions?

Meanwhile, without further ado, here are the projects that you should check out (edit: bolded entries are new since originally posting):

Coworking/socialising

  • EA Gather Town - An always-on virtual meeting place for coworking, connecting, and having both casual and impactful conversations
  • EA Anywhere - An online EA community for everyone
  • EA coworking Discord - A Discord server dedicated to online coworking

Free or subsidised accommodation

  • CEEALAR (formerly the EA Hotel) - Provides free or subsidised serviced accommodation and board, and a moderate stipend for other living expenses.
  • Nonlinear's EA house database - An experiment by Nonlinear to connect EAs who have extra space with EAs who could do good work if they didn't have to pay rent (or could pay less rent).

Professional services

  • WorkStream Business Systems - a service dedicated to EAs, helping you improve your workflow, boost your bottom line and take control of your business
  • cFactual - a new, EA-aligned strategy consultancy with the purpose of maximising its counterfactual impact
  • Good Governance Project - helps EA organizations create strong boards by finding qualified and diverse professionals
  • Altruistic Agency - provides discounted tech support and development to organisations 
  • Tech support from Soof Golan
  • Legal advice from Tyrone Barugh - a practice under consideration with the primary aim of providing legal support to EA orgs and individual EAs, probably based in the UK.
  • SEADS - Data science services for EA organizations
  • User-Friendly - an EA-aligned marketing agency
  • Anti Entropy - offers operations-related services for EA organizations
  • Arb - Our consulting work spans forecasting, machine learning, and epidemiology. We do original research, evidence reviews, and large-scale data pipelines.
  • Pineapple Operations - Maintains a public database of people who are seeking operations or Personal Assistant/Executive Assistant work (part- or full-time) within the next 6 months in the Effective Altruism ecosystem

Coaching

Financial and other material support

  • Nonlinear productivity fund - A low-barrier fund paying for productivity-enhancing tools for top longtermists. Supported services and products include coaching, therapy, sleep coaching, medication management, personal/research/virtual assistants, tutors (e.g. ML, CS, language), productivity apps (Asana, FocusMate, Zapier, etc.), SAD lamps, and household items such as A/C units and dishwashers.
  • Effective Altruism Funds - Whether an individual, organisation or other entity, we’re eager to fund great ideas and great people.
  • Nonlinear fund - We incubate longtermist nonprofits by connecting founders with ideas, funding, and mentorship
  • Survival and Flourishing Fund - A “virtual fund”: we organize application submission and evaluation processes to help donors decide where to make donations. 
  • Open Philanthropy Project - a research and grantmaking foundation that aims to share its findings openly
  • Berkeley Existential Risk Initiative - Supports university research groups working to reduce x-risk, by providing them with free services and support.

As before, please let me know in the comments if I missed any. The guidelines remain as follows (though if there's something you think is essential to let people know about which doesn't strictly fit, feel free to suggest it - e.g. the house database doesn't technically qualify, but seemed too valuable to omit):

  • The resource should be free to use, or available at a substantial discount to relatively poor EAs
  • It should be aimed specifically at EAs
  • It should be for the direct benefit of the people using it, not just to 'enable them to do more good'
  • It should be available to people across the world (ie. not just a local EA group)
  • It should be a service or product that someone is putting ongoing work into (ie not just a list of tips, or Facebook/Discord/Slack groups with no purpose other than discussion of some EA subtopic)

Also, let me know if I should remove or edit any of the descriptions. I've lightly curated it and removed some broken links (FTX Foundation has disappeared for some reason) but haven't done any substantial checking.

Comments

Some low effort thoughts:

  • If this is meant as a living resource, maybe move the first 2-3 paragraphs to the bottom of the post, and leave just a one line explainer at the top, to make it easier to skim ("There are now more free or discounted services available to EAs and EA orgs. Here is an updated list, which is mostly a repost of [this].")
  • Maybe worth linking to your anti karma farming comment in the post so ppl can find it easier?

Other things that might belong here:

My impression is AISS does a bunch of things outside of the health consulting thing fwiw, like maintaining this and this.

If this is meant as a living resource, maybe move the first 2-3 paragraphs to the bottom of the post, and leave just a one line explainer at the top, to make it easier to skim

I think it would be confusing to do that at this stage, but if I do periodically make new versions of this post I'll try harder not to bury the lede :)

Maybe worth linking to your anti karma farming comment in the post so ppl can find it easier?

Good idea! Done.

I'll have a look through the other resources you mention when I have a bit more time.

I have an organization, WorkStream, focused on helping orgs be more effective. See my post about it here. We are currently running both management and operations fellowship programs (see more details here), designed to build community and provide access to relevant education to our org leaders. We start the programs on a quarterly basis, so it's an open application for anyone who wants to apply. It's not a free resource, but we do try to give scholarships when possible.

Could be worth adding these to the Community Infrastructure wiki description as well.


I like the idea!  Upvoted!

What is the reasoning for having "The resource should be free to use, or available at a substantial discount to relatively poor EAs" as a guideline (and limiting factor)?

It might be a bad guideline! But the thought was that there are thousands of useful services that aren't preferentially available to EAs, most of which can advertise themselves in the normal way. So this a) keeps the list much smaller and b) helps out organisations who are too targeted and/or not profit-seeking enough to advertise themselves.


Gotcha. I agree (like 60%) with this in the sense that I do want to see the "subsidized" services more heavily promoted. However, if I were the person seeing the list I'd also like to see (for-profit or "not subsidized") services that already work with EA projects or can in some way accommodate needs of EA projects really well and then I'd use my own judgement to decide what service is better suited for me given my own constraints (budget, time, etc.). Hope that makes sense.

(karma upvoted, agreement downvoted)

Can you think of any specific examples of such services? I really want this list not to get too long, both for the sake of my own time and because I think it becomes less useful the less focused it is, but could be persuaded that some company is just so amazingly useful that it should be on here against the guidelines (hence me wanting them to be guidelines rather than strict criteria).

I definitely wouldn't mind being reminded of this list once a quarter!

I don't necessarily disagree but from an organizational perspective: Is this not the sort of thing we have a wiki for? 

For me the difference is simply visibility. The forum has a lot of corners, especially for a newcomer (and come to that, for me, and I've been posting since it launched!), so having a bit of information tucked away is very different from having it appear in people's frontpage feed.

Let’s promote the wiki and make it more visible!

Yeah I say keep doing this, once a month tops/once a quarter minimum :)

I found this incredibly helpful. Along these lines, I've been searching for existing earning-to-give coaches. I know you can join the GWWC and One for the World communities and attend their events. The Life You Can Save also has a feature where you can reach out to an advisor. But I'm more curious whether there is a formal coaching program for making a pledge, similar to what 80,000 Hours has with their career coaching program.

There's a newer version of this post here. I suggest linking to it at the top of this post so that people don't spend a bunch of time reading through this only to realize there is a more up-to-date list of resources.

This isn't necessarily an infrastructure project, but it seems relevant to the audience of this post: a compilation of resources for dealing with mental health issues around the alignment problem.

Thanks for collating and sharing this.

I'm not sure if this is the right place to brainstorm possible things missing from this list, but one thing that comes to mind is tax or investment advice[1], particularly for those who are earning to give. At least in the US[2], earning to give presents some unusual risks and opportunities, and it's safe to say most accounting and investment professionals have not worked with clients donating 30% or more of their income. Even if there weren't a discount, just maintaining a list of professionals with experience in specific jurisdictions could be helpful.


  1. Note that tax advice is different from tax preparation; the former is about arranging affairs so as to minimize taxes. ↩︎

  2. The situation is sadly even more complex in certain parts of the US, particularly New York. ↩︎

It sounds like a reasonable idea, though I'm worried that the post will stop functioning as advertising if the criteria are too broad - it already feels like a lot to go through. Maybe we could split it into multiple subjects? That does feel like it would be a step in the spammy direction though (not to mention an increasing headache to curate).
