This is a crosspost for The Case for Insect Consciousness by Bob Fischer, which was originally published on Asterisk in January 2025.
[Subtitle.] The evidence that insects feel pain is mounting, however we approach the issue.
For years, I was on the fence about the possibility of insects feeling pain — sometimes, I defended the hypothesis;[1] more often, I argued against it.[2]
Then, in 2021, I started working on the puzzle of how to compare pain intensity across species. If a human and a pig are suffering as much as each one can, are they suffering the same amount? Or is the human’s pain worse? When my colleagues and I looked at several species, investigating both the probability of pain and its relative intensity,[3] we found something unexpected: on both scores, insects aren’t that different from many other animals.
Around the same time, I started working with an entomologist with a background in neuroscience. She helped me appreciate the weaknesses of the arguments against insect pain. (For instance, people make a big deal of stories about praying mantises mating while being eaten; they ignore how often male mantises fight fiercely to avoid being devoured.) The more I studied the science of sentience, the less confident I became about any theory that would let us rule insect sentience out.
I’m a philosopher, and philosophers pride themselves on following arguments wherever they lead. But we all have our limits, and I worry, quite sincerely, that I’ve been too willing to give insects the benefit of the doubt. I’ve been troubled by what we do to farmed animals for my entire adult life, whereas it’s hard to feel much for flies. Still, I find the argument for insect pain persuasive enough to devote a lot of my time to insect welfare research. In brief, the apparent evidence for the capacity of insects to feel pain is uncomfortably strong.[4] We could dismiss it if we had a consensus-commanding theory of sentience that explained why the apparent evidence is irrelevant. But no such theory exists.
Hey all, I'm Miguel.
I've been reading about Effective Altruism for years, but only recently joined the Forum. I'm 26, from Venezuela, and I have a Bachelor's in Philosophy, though I work mostly in operations and HR. I live in Spain (Madrid-based) and do some volunteering at a local LGBTQ nonprofit that helps refugees persecuted for their sexual orientation or gender identity obtain asylum in Spain.
I signed up recently because I've been thinking about a project that could have a high impact on the world through its impact on the EA community, and I plan to write a post about it at some point. Inspired by the work of the Friendship Bench, which I read about in this thread by Michael Plant, I've been thinking of building a similar project aimed at EAs who have mental health issues, who are going through a hard time, or who simply struggle with their emotions. I'm not a psychologist (though I am considering studying Psychology in the future, depending on the viability of this project), but I don't think a degree in mental health should be needed for this.

At first it would just be me: setting up a Calendly and offering to talk, as a friend, to anyone in the EA community who needs it (especially founders, who are by default under particular strain given their circumstances), and creating a few moderated support groups of EAs who are struggling with similar problems, where newcomers can talk with people who have overcome similar issues or are working on overcoming them. It would also require setting up a referral network of volunteer psychologists and psychiatrists for people with severe issues whom we (at first, just me) can't help. We have something somewhat similar to all of this in the Effective Altruism Peer Support Group, which is really cool, but I don't think the majority of people in the community know about it, nor do I think it suffices for most people's needs. There is a difference between writing in a group and actually talking one-on-one with someone, and I think most people need the latter.
I'm guessing the two major issues would be ensuring confidentiality for people who book a chat (confidentiality of what we talk about, of course) and, over time, managing the number of people who need to talk so that there isn't a long wait for anyone who wants another call. Maybe that means bringing other people on board? We'd have to see. In any case, I think that helping people in the EA community take care of themselves and express their emotions in a safe space, with someone who listens but can also talk back, would directly contribute to the safety and continued existence of the community itself, which in turn affects the endurance of EA ideas and, ultimately, their impact on the world.
Anyway, there's more to be said about this, and there are several problems with the idea that I haven't addressed here, because this isn't the place. I've got solutions for some, but not yet for others. For now, I just wanted to introduce myself and explain what has driven a long-time lurker to speak up on the Forum. Hope you're all having a beautiful day!