Expanding our deeply flawed society would only mean replicating our mistakes, our failures, and our acts of cruelty on a much larger scale.
The problem is that [optimistic longtermism is] based on the assumption that life is an inherently good thing, and looking at the state of our world, I don’t think that’s something we can count on. Right now, it’s estimated that nearly a billion people live in extreme poverty, subsisting on less than $2.15 per day. Right now, there are at least five major ongoing military clashes involving nearly 30 countries, from civil war in Myanmar to the Israeli-Palestinian conflict. I could go on and on.
Human-caused suffering multiplies when we bring animals into the equation. We force dogs to fight each other, we race horses to death, and we trap elephants in zoos. We conduct sadistic experiments on more than 115 million animals each year. We raise and slaughter 80 billion land animals and trillions of sea animals annually for food on factory farms—large-scale industrial agricultural facilities that confine animals under torturous conditions to produce cheap meat, eggs, and milk.
Read the rest in Fast Company.
Thanks for your comment.
I don’t think this is a compelling argument. Being less immoral than the worst doesn’t lead me to conclude we should increase the immorality further. I do think it should lead us to have compassion, insofar as being human makes it very difficult not to be immoral; it’s an evolutionary problem.
That’s true! But still very bad for many. And of course, I’m concerned about all sentient beings, not just humans; the math looks truly horrible when non-humans are included. I do credit humans for unintentionally reducing wild animal suffering by being so drawn to destroying the planet, but I expect the opposite will happen in space colonization scenarios (i.e., we will seed wildlife, create more digital minds, etc.).
I’m a longtermist in this sense. I’m concerned about us torturing non-humans not just in the next several decades, but eons after. This could look like factory farming animals, seeding wild animals, creating digital minds, bringing pets with us, and so on.
Is that transhumanism taken to the max? I need to learn more about those who endorse this philosophy; I imagine there is some diversity among them. In their minds, would the immorality in us be eradicated under ideal circumstances (setting aside the s-risks and x-risks of AI acceleration)? They sound like a different kind of utopian.