James Herbert

Co-director @ Effective Altruism Netherlands
2173 karma · Joined · Working (6-15 years) · Amsterdam, Netherlands
effectiefaltruisme.nl

Bio


I'm currently a co-director at EA Netherlands (with Marieke de Visscher). We're working to build and strengthen the EA community here.

Before this, I worked as a consultant on urban socioeconomic development projects and programmes funded by the EU. Before that, I studied liberal arts (in the UK) and then philosophy (in the Netherlands).

Hit me up if you wanna find out about the Dutch EA community! :)

Comments
306

Thanks for taking the time to provide this context! 

I don't have a super strong view on which set of guiding principles is better - I just thought it was odd for them to be changed in this way. 

If pushed, I prefer the old set, and a significant part of that preference stems from the amount of jargon in the new set. My ideal would perhaps be a combination of the old set and the 2017 set.

Expanding our moral circle

We work to overcome our natural tendency to care most about those closest to us. This means taking seriously the interests of distant strangers, future generations, and nonhuman animals - anyone whose wellbeing we can affect through our choices. We continuously question the boundaries we place around moral consideration, and we're willing to help wherever we can do the most good, not just where helping feels most natural or comfortable.

Prioritisation

We do the hard work of choosing where to focus our limited time, money, and attention. This means being willing to say "this is good, but not the best use of marginal resources" - and actually following through, even when it means disappointing people or turning down appealing opportunities. We resist scope creep and don't let personal preferences override our considered judgments about where we can have the most impact.

Scientific mindset

We treat our beliefs as hypotheses to be tested rather than conclusions to be defended. This means actively seeking disconfirming evidence, updating based on data, and maintaining genuine uncertainty about what we don't yet know. We acknowledge the limits of our evidence, don't oversell our findings, and follow arguments wherever they lead - even when the conclusions are uncomfortable or threaten projects we care about.

Openness

We take unusual ideas seriously and are willing to consider approaches that seem weird or unconventional if the reasoning is sound. We default to transparency about our reasoning, funding, mistakes, and internal debates. We make our work easy to scrutinise and critique, remain accessible to people from different backgrounds, and share knowledge rather than hoarding it. We normalise admitting when we get things wrong and create cultures where people can acknowledge mistakes without fear, while still maintaining accountability.

Acting with integrity

We align our behaviour with our stated values. This means being honest even when it's costly, keeping our commitments, and treating people ethically regardless of their status or usefulness to our goals. How we conduct ourselves - especially toward those with less power - reflects our actual values more than our stated principles. We hold ourselves and our institutions to high standards of personal and professional conduct, recognising that being trustworthy is foundational to everything else.

This is a useful write-up, thanks for sharing! 

As I said in my last comment, I'd go to Zurich simply for the friends thing. 

Thoughts on Amsterdam: 

  1. Probably the worst on this list for finding housing? New rental laws have restricted the market a lot. Once you find something you're golden though.
  2. There's the 30% ruling which would significantly impact your donation abilities BUT I think maybe you live too close to the NL border to qualify?
  3. I think it's better connected than you give it credit for. Easy trains to London, Paris, Brussels. Good night trains to Switzerland and Central Europe. And Schiphol is one of the best-connected airports in Europe.
  4. You're probably already aware of this but since I don't think you mentioned it, Amsterdam has a pretty strong queer scene. Probably not on a par with Berlin but I'd estimate not far behind. 

There are a few Dutch EAs who have worked at quant firms and done E2G - let me know if you'd like an introduction. @Imma🔸 might also be interesting to chat with. She's a software engineer who moved from NL to CH for E2G reasons (IIRC) but then moved back. 

Random suggestion: Dunno if you've already got a master's degree but the UK has just expanded the number of universities that, if you have a degree from one of them, will give you access to their 'high potential individual' visa. AFAIK this is easier than getting sponsorship. Unfortunately, Ghent didn't quite make the cut, but lots of other EU ones did (including Amsterdam, which has a solid AI safety scene). So you could sample a new city, get a 1-year master's, and then you've gained the option of an easier move to London. 

P.S. Great to see you're coming to EAGxAmsterdam - hope you have a great time!

Thanks for clarifying! 

But at what level should that standardised set of outcome-related indicators operate? 

As you mention, we already have indicators for ultimate impact (QALYs, etc). And the indicators at the opposite end of the spectrum are pretty simple (completion rates, NPS, etc.). 

It feels like you're looking for indicators that occupy the space in between? Something like 80k's old DIPY metric or AAC's ICAP?

I thiiiiink both organisations tried these metrics and then discontinued them because they weren't so useful? 

I notice the 'guiding principles' in the introductory essay on effectivealtruism.org have been changed. It used to list: prioritisation, impartial altruism, open truthseeking, and a collaborative spirit. It now lists: scope sensitivity, impartiality, scout mindset, and recognition of trade-offs.  

As far as I'm aware, this change wasn't signalled. I understand lots of work has been recently done to improve the messaging on effectivealtruism.org -- which is great! -- but it feels a bit weird for 'guiding principles' to have been changed without any discussion or notice. 

As far as I understand, back in 2017 a set of principles was chosen through a somewhat deliberative process, and then organisations were invited to endorse them. This feels like a more appropriate process for such a change. 

Thanks for doing this! 

Could you define TFOs? Based on your backgrounds, I'm guessing you mean community building organisations like EA Sweden, EA Netherlands, etc., and coaching/training/advising organisations like Successif, 80k, Talos, AIM, Tarbell, etc.?

While both of these sets of organisations are ultimately about helping talent make a difference, I think they have quite different theories of change, and therefore require different M&E systems. 

See my proposal below for how I think community building organisations should do things. 

An M&E Framework for EA Community Building Based on Community Capital

The proximate objective: Community capital

EA community building organisations are ultimately aiming for impact. But measuring final impact directly is nearly impossible for community builders. How do you attribute a career transition to AI safety, or a crucial research insight, or a new organisation being founded, to your intro fellowship or community event?

Instead, the proximate objective should be increasing community capital, defined as:

Community Capital = (Sum of individual career capital) × (Coordination ability)

This formula captures something important about how EA communities actually create value. Career capital - the skills, knowledge, credentials, and connections that enable someone to have impact - matters enormously. But a collection of capable individuals who don't coordinate is far less valuable than a community that can pool knowledge, collaborate on projects, and leverage each other's expertise. The multiplication relationship reflects that coordination acts as a multiplier: high coordination ability means individual career capital gets leveraged far more effectively.

For an EA national group like EA Netherlands, this means success looks like: growing the number of people with relevant career capital, increasing the average career capital per person, and strengthening the community's ability to coordinate effectively. Do this well, and impact should follow (via EA's broader theory that career capital directed at priority problems matters).

Measuring community capital: The annual survey approach

I think this can be best measured with an annual community survey that collects data on both components simultaneously.

Individual career capital can be measured through self-assessment questions:

  • Specific questions about skills, credentials, etc.
  • "How capable do you feel of doing high-impact work in your priority cause area?" (1-7 scale)
  • "How much have your skills/knowledge relevant to impact grown in the past year?"
  • "Are you on a career path you consider high-impact?"

Coordination ability is best measured through network questions borrowed from social capital research:

  • "List up to 10 EA NL members you've had meaningful interaction with in the past year"
  • "Of the people you listed, how many could you collaborate with on a project?"
  • "How many EA NL members do you trust to give you good advice on your work?"

These network questions serve multiple purposes. First, they give you objective data about who's connected to whom, rather than just subjective feelings about connectedness. You can map the actual network structure, identify clusters, and measure density. Second, they differentiate between mere acquaintance and genuine collaboration-readiness - knowing someone versus being able to work with them effectively.
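To make the network-mapping idea concrete, here's a minimal sketch of how you could turn answers to the name-generator question into a density measure. All names and numbers are hypothetical, and treating "either person named the other" as a tie is just one modelling choice:

```python
# Hypothetical name-generator responses: each respondent lists the members
# they've had a meaningful interaction with in the past year.
responses = {
    "anna":  ["bram", "chris", "dana"],
    "bram":  ["anna", "dana"],
    "chris": ["anna"],
    "dana":  ["anna", "bram"],
}

# Everyone who responded or was named counts as an identified member.
members = set(responses) | {m for named in responses.values() for m in named}

# An undirected tie exists if either person named the other.
edges = {frozenset((person, other))
         for person, named in responses.items() for other in named}

# Network density: observed ties as a share of all possible ties.
possible = len(members) * (len(members) - 1) / 2
density = len(edges) / possible
print(f"{len(members)} members, {len(edges)} ties, density {density:.2f}")
```

The same edge set also lets you spot clusters and isolates, which is hard to get from purely subjective "how connected do you feel?" questions.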

Estimating community size

We could then estimate total community size using capture-recapture methods. If survey respondents collectively name 150 unique people, but only 60 of those actually took the survey, the overlap pattern tells you what proportion of the community you're reaching. This lets you estimate:

  • Total active community size (N)
  • The engaged core (people who both responded and got named multiple times)
  • The periphery (people named but who didn't respond)
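The simplest version of this is the two-sample Lincoln-Petersen estimator, where "sample 1" is survey respondents and "sample 2" is the people named in the network questions. A minimal sketch, with the respondent count (80) being a hypothetical number added for illustration:

```python
def lincoln_petersen(n_respondents, n_named, n_overlap):
    """Estimate total community size via two-sample capture-recapture.

    n_respondents: people who took the survey (sample 1)
    n_named: unique people named in network questions (sample 2)
    n_overlap: named people who also took the survey
    """
    if n_overlap == 0:
        raise ValueError("No overlap between samples: cannot estimate size")
    return n_respondents * n_named / n_overlap

# Using the figures from the text: 150 unique names, 60 of whom responded,
# and (hypothetically) 80 respondents in total.
print(lincoln_petersen(80, 150, 60))  # -> 200.0
```

The key assumption is that being named is roughly independent of having responded, which won't fully hold for a tight-knit core, so treat the result as a rough upper-bound-ish estimate rather than a precise count.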

Combined with career capital measures, you now have all three components of the formula:

  • Sum of career capital ≈ N × average career capital from survey
  • Coordination ability ≈ function of network density, trust levels, collaboration-readiness
  • Community capital ≈ the product of these
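Putting the three components together might look like the following sketch. All figures are hypothetical, and averaging density with trust into a single coordination score is just one simple way to combine them, not a claim about the right functional form:

```python
# Hypothetical survey outputs.
career_capital_scores = [4.5, 3.0, 5.5, 2.0]  # 1-7 self-assessments
network_density = 0.67                         # from the network questions
trust_share = 0.5                              # share naming trusted advisors

# Sum of individual career capital (here just the respondents' scores;
# in practice you'd scale the average up by the estimated community size N).
total_career_capital = sum(career_capital_scores)

# One simple way to collapse the network measures into a 0-1 multiplier.
coordination = (network_density + trust_share) / 2

# Community Capital = (sum of career capital) x (coordination ability).
community_capital = total_career_capital * coordination
print(community_capital)
```

The useful property is that the product drops sharply if either factor is weak, which matches the intuition that capable-but-disconnected (or connected-but-uncapable) communities underperform.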

Programme-level indicators

The annual survey tells you whether you're winning overall, but you need more frequent feedback on whether specific programmes are working. Programme-level indicators provide this:

  • Fellowships: completion rates, participant satisfaction, post-programme surveys
  • Events: attendance, quality ratings, new connections formed
  • Organiser support: organiser activity levels, events run
  • Digital infrastructure: Chat engagement, information sharing
  • Marcom: brand awareness/recall/sentiment, conversion rates, etc

These don't measure community capital directly, but they're leading indicators that tell you if you're on track between annual surveys.

Attributing changes to your programmes

Measuring community capital is one thing; showing that your programmes actually contribute to it is another. Three complementary approaches:

1. Cohort tracking: Survey fellowship participants before and after the programme. Survey attendees of major events like EAGx before and after. Track how their career capital and network connections change. This gives you programme-specific deltas, though you can't fully prove causation without control groups.

2. Attribution questions in the annual survey: Simply ask people which EAN programmes most increased their career capital or helped them build connections. This relies on self-reported attribution, which isn't perfect, but people generally have decent intuitions about what helped them.

3. Qualitative contribution analysis: Interview a sample of community members annually and ask them to tell the story of how they became more connected or capable. Code their responses for whether EAN programmes feature in their causal narratives. This captures unexpected pathways and avoids leading them toward giving you credit. We're experimenting with the QUIP methodology at the moment.

Realistically, you'd use all three: cohort tracking for major programmes, attribution questions in the annual survey, and some qualitative interviews.

Connecting community capital to impact

This is the hardest link in the chain. You can measure community capital and show your programmes contribute to it, but does community capital actually produce impact?

The honest answer: you can't measure final impact (lives saved, existential risk reduced) directly. You're relying on a theory of change with two key assumptions:

  1. EA's general theory that career capital directed at priority problems leads to impact
  2. Your theory that coordination multiplies individual effectiveness

What you can do is track intermediate outcomes that validate this theory:

  • Career transitions
  • Collaborative projects launched
  • Grants secured from EA funders
  • Research/writing produced
  • Organisations started

Then correlate these with community capital levels: do people with higher career capital and better networks achieve these outcomes more frequently? Do collaborative projects require the coordination infrastructure you've built?

What this system gives you

This M&E approach offers several advantages:

Practical: One annual survey gives you the core metrics, supplemented by programme-level data you're probably already collecting.

Actionable: The formula highlights where to invest. If career capital is low but coordination is high, focus on upskilling and recruitment. If career capital is high but coordination is low, invest in events and infrastructure.

Honest about limitations: It doesn't pretend you can measure final impact. Instead, it measures the proximate objective you actually control, while acknowledging the remaining uncertainty.

Theory-driven: It's based on an explicit model of how communities create value, not just a collection of metrics. This makes it easier to explain to funders and board members why you're measuring what you're measuring. 

  1. ^

    Hot take: right now I think most regions have high coordination but low career capital, yet unfortunately are spending waaaaaay more on coordination

Good suggestion! We're planning to have large parts of the programme for EAGxAmsterdam published by next week. 

Agreed. 

But now I have to undermine myself by pointing out that the evidence suggests 'the best charities are 100 times more impactful' is a more effective tagline than 'the best charities can save a child's life for $3,000' ;)

If any funders are reading this, please let us know if you'd be willing to fund orgs that want to spend more on marketing and communications - EA Netherlands could do with your help. Right now, it's relatively easy for us to get funding for things that serve the existing community (EAGx, co-working space), but we haven't yet been successful in getting funding for marketing and communications work. CEA wants to go for growth, but this is hard without the funding. 

“Two modalities of meta-EA. Are you recruiting for roles at orgs or are you building a community? Per my sports idea above, I think the community building (for its own sake) is neglected.”


Yes. Nicely put. 


Also, if someone from the forum team reads this, I can’t figure out how to format my quote as a quote whilst using safari on iOS.

I think the Infrastructure Fund is your best bet, or maybe chat with CEA about donating directly to the Groups Team or the CBG programme?

And of course, you can donate directly to movement builders like EA Netherlands ;)

This quarter, we're conducting a Qualitative Impact Protocol study with Bath SDR to assess, learn from, and demonstrate the impact of our work, but if you can't wait for those results, feel free to book a call with me to chat about how we do things.
