
publius

31 karma · Joined

Posts
1


Comments
7

There are two different EA messages: EA is about (a) doing good better, or (b) doing the most good. Despite their surface similarity, they are worlds apart. (b) is a very Singerian notion. I do hope the messaging can move toward doing good better.

I still object somewhat to identifying as an "EA". In worlds where EA's epistemic rigour breaks down, I think lots of people identifying as "EAs" without thinking through what that entails is a prevalent cause. I do reckon that for some, EA might give life meaning – but don't make it the core of your identity. First and foremost, you are a human who wants to do good. EA as a research field can help you find answers, and EA as a community can help you find friends and social motivation.

I've been reading outsiders' takes on EA on Twitter for the past few days, and I can't help feeling that they are mostly bad (faith) takes. Obviously, I know most people have suboptimal epistemic practices – but is it really this bad? I won't even quote some of the takes here; surely they were made in bad faith.

I was quite worried about EA's future when the FTX news broke, especially given the public's reaction – but over the past few days I have instead been strengthened in my conviction that EA is a force for good. No other community has the curiosity, moral virtue, or insightful discussion that EA has.

While I am outraged at EA leadership for allowing this to happen, I am incredibly thankful for the EA community at this time. Last Thursday, I was unsure whether EA would survive this. For now, I think it will.

It's quite evident that people follow discussions of utilitarianism but fail to understand the importance of integrity within a utilitarian framework, especially if they are unfamiliar with Kant. If the public finds SBF's system of moral beliefs to blame for his actions, it will most likely be for being too utilitarian rather than for not being utilitarian enough – a misunderstanding that will be difficult to correct.

There's a case that such distinctions are too complex for a not insignificant proportion of the public, and that utilitarianism should therefore not be promoted to a wider audience at all: all the textbooks filled with nuanced discussion will collapse, in some minds, to a simple heuristic such as 'the ends justify the means' (which is obviously false).

Steelmanning the argument that crypto has utility beyond speculation or resale value:

Baudrillard would call these crypto tokens a pure signifier without a signified. I disagree. I believe what tokens signify is hope. Undoubtedly, disenchantment has left a nihilistic void at the heart of society. Crypto, qua symbol, forms part of the re-enchantment of the world – a return to mythology. When someone invests in crypto, they receive utilons in return in the form of hope for a better future. They, too, can become a multi-millionaire overnight and transform their life. Stories are important; the stories we believe in shape our quality of life. Crypto is the source of a story that helps people cope with difficult material circumstances.

Thoughts on effective altruism and semantic drift.

Effective altruism – is it a question, a movement, an answer, a professional network, or a community? Below are the top three definitions from a quick search:

  • 'Effective altruism is the use of evidence and reason to determine the most effective ways to benefit others'
  • 'Effective altruism is a research field and practical community that aims to find the best ways to help others, and put them into practice.'
  • 'Effective altruism (EA) is a philosophical and social movement that advocates using evidence and reason to figure out how to benefit others as much as possible, and taking action on that basis'

The organising idea behind 'effective altruism' – that we can do good better – is quite self-evident. I agree with this principle. But to what extent should it be the foundation of a cohesive group identity?

In my view, effective altruism's raison d'être is to serve as a vehicle for its underlying ideas. EA ought to be promoted to the extent that it aids the adoption of these ideas – scope sensitivity, curiosity, epistemic rigour, and a will for a better world, alongside the other core principles of honesty, integrity, and so on. Should 'EA' as an identity be promoted, or should we strive to keep our identity small?