Talk to me about cost-benefit analysis!
One last thing - if the reason you want to join a totalizing community is to gain structure, you don't need to join an EA cult to do this!
- Join more groups unrelated to EA. Make sure to maintain a connection to this physical world and remember how beautiful it is. Friendship, community and love are extremely motivating.
- I say this as a non-spiritual lifelong atheist: you might also consider adding a faith practice like Hinduism or Buddhism. I find a lot of Hindu texts and songs extremely beautiful, and although I don't believe in any of the magic stuff, the ideas of reincarnation and karma and the accompanying art/rituals can motivate me to do the best I can for this world.
Feel free to DM me if you want.
I agree for the most part with Michael's answers to your questions on LW, so I'll just go over some slight differences.
1- This movement should not be centralized at all, IMO. EA should be a library. It's also pretty gross that it's centralized yet has no political system beyond a token donation election. I'm pretty sure Nick Beckstead, Will MacAskill, et al. would have been fired into the moon after FTX if there were a democratic voting process for leaders.
https://forum.effectivealtruism.org/posts/8wWYmHsnqPvQEnapu/?commentId=6JduGBwGxbpCMXymd
https://forum.effectivealtruism.org/posts/MjTB4MvtedbLjgyja/?commentId=iKGHCrYTvyLrFit2W
3- I agree about why the team is the way it is, but they have more of an obligation to correct this than your average HR department (conditional on the team's demographics actually being an important dimension of success; that's believable but not a no-brainer). My experience working in a corporate job is that HR works for the man - don't trust them at all. CEA's community team is actually trying to solve problems for all members of the community, not just the top dogs (well, at least you would hope).
5- Agree with Michael that they are. However, you're picking up on a real thread of arrogance, and often a smug unwillingness to engage with non-top-5 cause areas even when the flow-through effects could get more money to the causes they care about. I think local EA groups should focus more on fixing issues in their own cities - not because that work is as important, but because they would gain a lot of recognition they could leverage to fundraise more for their causes down the line. Likewise, orgs should be more willing to compromise on their work if it means getting way more money. A few years ago my parents asked me to help them research which homeless shelters in Chicago to donate to, and I told them they should give the money to (insert EA flavor-of-the-month charity) instead. They got super triggered, and I think if I had just answered their question I would have had more sway over the other donations they made.
8- I found this post, though I'll say I find the concept of an EA club not having EA in its name bizarre. I dislike the name "effective altruism", but that is the name of the movement, so yeah, I would say they overcooked here.
I'll start by saying I absolutely think it's a terrible idea to try to destroy humanity. I am 100% not saying we should do that. OK, now that that's out of the way: if you decide to commit your life to x-risk reduction because there are "trillions of units of potential value in the future", you are in a bit of a sticky situation if someone credibly argues that the expected value of the future is lower if humans become grabby than if they don't. And that's OK! It's still probably one of the highest-EV things you can do.
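To make that sticky situation concrete, here's a toy sketch of the sign-flip worry. Every number in it is a purely illustrative assumption, not an estimate of anything:

```python
# Toy expected-value calculation for the sign-flip worry.
# All numbers below are illustrative assumptions, not estimates.

def ev_of_xrisk_reduction(survival_gain: float, value_of_future: float) -> float:
    """EV of an intervention that raises humanity's survival odds by
    `survival_gain`, given the net value of the future it preserves."""
    return survival_gain * value_of_future

# If the long-run future is net positive, x-risk reduction looks enormous:
print(ev_of_xrisk_reduction(0.001, 1e12))    # 1000000000.0

# But if someone credibly argues that a grabby future is net negative,
# the identical intervention flips sign with the same magnitude:
print(ev_of_xrisk_reduction(0.001, -1e12))   # -1000000000.0
```

The point is just that the case for x-risk reduction inherits the sign of your estimate of the future's value; the intervention itself doesn't change.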
And I'll say it again years later: https://forum.effectivealtruism.org/posts/KDjEogAqWNTdddF9g/long-termism-vs-existential-risk
This ^ post is not great. The entire thing basically presupposes that human society is net positive, that aliens do not exist, and that animals would not re-evolve intelligence if we died out. I wouldn't bring this up if it weren't one of the most upvoted posts on the forum ever (top 5 if you don't include posts about EA drama).
So we get to use cold, hard rationality to tell most people that the stuff they are doing is relatively worthless compared to x-risk reduction, but when that same rationality argues that x-risk reduction is actually incredibly high-variance and may very well be harming trillions of future people, we get to be humanists?
It actually goes even more giga-brain than this - since aliens are in the picture, and life on our planet could maybe even re-evolve into interstellar intelligence. You might be interested in talking to @Arepo; he's a crucial considerer. I'd especially recommend his post "A proposed hierarchy of longtermist concepts".
Shameless self-plugs that might also lead you to some related reading (I'm narcissistic enough to somehow remember almost all my comments on the subject):
https://forum.effectivealtruism.org/posts/zDJpYMtewowKXkHyG/alien-counterfactuals
https://forum.effectivealtruism.org/posts/zLi3MbMCTtCv9ttyz/formalizing-extinction-risk-reduction-vs-longtermism
https://forum.effectivealtruism.org/posts/zuQeTaqrjveSiSMYo/?commentId=7s2vrDuxonBqoGrou
https://forum.effectivealtruism.org/posts/Pnhjveit55DoqBSAF/?commentId=wTkFestNWNorB5mG4
https://forum.effectivealtruism.org/posts/YnBwoNNqe6knBJH8p/?commentId=HPsgdWEbdEZH3WN6j
https://forum.effectivealtruism.org/posts/WebLP36BYDbMAKoa5/?commentId=cJdqyAAzwrL74x2mG
Either way, don't be down on yourself. I know exactly how you feel; there is way too much stuff to know. The fact that you are writing this and reflecting means you are one of the best humans alive right now, regardless of whether x-risk is important. Keep up the good work.
Do you disagree with this? In my opinion, it seems responsible to gather enough data to feel confident about the health effects of something before we let the entire population start doing it.