
Henry Stanley 🔸

1605 karma

Comments (153)

This is wonderful – thank you so much for writing it.

> Mutual dedication to one another’s ends seems like a thing commonly present in religious and ethnic communities. But it seems quite uncommon to the demographic of secular idealists, like me. Such idealists tend to form and join single-focus communities like effective altruism, which serve only a subset of our eudaemonic needs.

Agree about secular, single-purpose communities – but I'm not sure EA is quite the same.

I've found my relationships with other EAs tend to blossom into something more than just EA; those principles provide a good set of shared values from which to build other things, like a sense of community, shared houses, group meals, playing music together, and just supporting each other generally. Then again, I don't consider EA to be the core of my identity, so YMMV.

You've mentioned your experience with burnout in a previous post - I wondered if you'd be willing to share more about that, and how it influenced your approach to EtG, if at all.

Very impressive - I don't think I have the stomach (so to speak) to put myself through this kind of suffering. Thanks for doing something so selfless and unpleasant for the benefit of anonymous others ❤️

Hooray - this is awesome work. Fight the good fight.

I donated to THL last year because of this case; I was advised by Founders Pledge that without funding the appeal might fall through, and I was keen for that not to happen. I wonder what other time-sensitive efforts in this space have room for more funding?

Thanks for all the hard work that went into building/rebuilding/maintaining EA Hub!

It's always sad to see old projects get shuttered, especially ones that were a labour of love, so kudos on recognising that it's the right time to do this.

Not a wholly unserious suggestion. SWP could do a tie-in with the artist creating these fun knock-offs, capitalise on the Swift madness, and rehabilitate shrimp as cute in the process.

> In addition, the "not funding a company that will make the world worse" constraint on this program likely makes unicorn status substantially less likely

Citation needed on this - I’m not sure what net-bad unicorns you’re thinking of (and I’d be interested to know), but I think at the outset they probably mostly looked like not-making-the-world-worse ideas, and by the time they’re reaching unicorn status the original incubator has very little influence over what they do.

> I actually think the overlap between those interested in EA and founding unicorns is weak

Perhaps, although as the post says, you don’t need to found a unicorn to have a very good exit and give a large amount to charity. I’m one of the people who became a software engineer back when 80K recommended it as a promising career path. I think more of us should start EtG startups - and I suspect there are many like me who don’t feel they're a good fit for direct work (or just like tech) and would make good founders.

(I ask because I think burnout is a serious problem, and one whose seriousness is generally under-appreciated in this community.)

It's bizarre, isn't it?

Very much hoping the board makes public some of the reasons behind the decision.
