I am creating a comparative analysis of cross-posted posts on LW and EAF. Make your bets!
I will pull all the posts that were posted on both LW and EAF and compare how different topics get different amounts of karma and comments (and maybe the sentiment of comments) as a proxy for how interested people are and how much they agree with different claims. Make your bets and see if it tells you anything new! I suspect that LW users are much more interested in AI safety and less vegan. They care less about animals and are more skeptical of utility maximization. Career-related posts will fare better on EAF, while rationality posts (rationality as an art) will do better on LW. Productivity posts will get more engagement on EAF.
It is not possible to check all the bets since the number of cross-posted posts is not that big and they are limited to specific topics.
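To make the comparison concrete, here is a minimal sketch of what I have in mind, assuming the posts from each forum have already been exported to CSV files with karma, comment counts and a topic label. The file names, column names and the title-matching heuristic are illustrative assumptions, not the actual pipeline.

```python
# Sketch only: assumes lw_posts.csv and eaf_posts.csv exist with columns
# "title", "karma", "comment_count" and "topic" (hypothetical exports).
import pandas as pd

lw = pd.read_csv("lw_posts.csv")
eaf = pd.read_csv("eaf_posts.csv")

# Treat a post as cross-posted if the same title appears on both forums.
both = lw.merge(eaf, on="title", suffixes=("_lw", "_eaf"))

# For each topic, compare average karma and comment counts across the forums.
summary = (
    both.groupby("topic_lw")[
        ["karma_lw", "karma_eaf", "comment_count_lw", "comment_count_eaf"]
    ]
    .mean()
    .assign(karma_ratio=lambda d: d["karma_lw"] / d["karma_eaf"])
    .sort_values("karma_ratio", ascending=False)
)
print(summary)
```

Topics with a karma ratio well above 1 would be the ones LW cares about more, and vice versa; comment sentiment would need a separate pass.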
Do you take your friendships seriously? Do you care about your friends? People often care about their friends and do it for non-moral reasons. I've been thinking about how to live, have friends and so on while holding an impartial perspective on beings. If you are impartial, you are supposed to act as if you care about everyone's X equally, where X is some value that your moral system promotes.
But I care about my friends and family much more than about other people. I could tell myself that I need them to stay productive, so that in the end I can take better care of everyone, but I feel awful about it. Is this a reason to drop impartiality?
Either I have to drop impartiality or drop moral tyranny.
By moral tyranny I mean something like there being only one moral value or set of rules that I am supposed to live by and maximize. Under moral tyranny, e.g. art is only supposed to be created for the sake of the moral value X.
If I drop moral tyranny I can keep a neat moral system, but it would reduce how helpful my moral system is for making decisions.
Can you clarify your view on suffering for me? Are you saying that suffering is undesirable simply because we made it so? I would say there is something more to it, since all animals try to avoid it, not only humans. Humans mostly try to avoid it, and when they don't, they sometimes come up with elaborate ideas about how to justify suffering, e.g. saying it's a catalyst for self-development.
I really like the website. It has a nice design.
I would try to talk to people at EAG or EAGx and ask for total honesty. My guess is that most people think it's low impact.
I spent about 15 minutes coming up with some reasons why people might be reluctant to support you: in general, EAs don't focus that much on climate change. Out of all the things you can do for climate change, planting trees doesn't seem to be the most effective. You can read about it here. Also, there are already a bunch of organizations doing that (and they also plant roughly 1 tree per dollar).
I think that badges with names on EAGx and EAGs are a bad idea. There are some people who would rather not be connected to the EA movement - some animal advocates or AI safety people. I feel like I'm speculating here, but I imagine a scenario like this:
The only use cases for names on badges I can see are that you can:
I see people using badges for the first two things from time to time, but I don't think it's a huge use case. Some alternatives for the third use case:
I think there should at least be an option to have badges without names, and having badges like that should be normalized. It's not obvious to some people that they can cover their badge. Other options include:
What is the difference between net negative and negative in expectation?