I'm fairly new to EA and have been greatly enjoying the 80,000 Hours podcast series on ten global problems. I've been pondering the EA philosophy of using resources to do the most good, and therefore to have the greatest impact numerically.
So I'm wondering: taken to its logical conclusion, is this not effectively a well-intentioned version of survival of the fittest? What if your cause is niche, or the people affected are few in number? How is their validity built into the model? How does EA value diversity of issues?
To give a concrete example: less than 1% of the world's population has type 1 diabetes. Maybe your money would be better spent on type 2 diabetes, which affects around 8% of the world's population. Does this mean those with type 1 are unimportant or unworthy of funding?
Within EA, would the solution be to look for the most impactful way to 'solve' type 1 (whether through advocacy for affordable insulin and supplies or via a cure), or would you simply focus on the larger population (type 2) and fund that for greater impact?
The model's apparent lack of room for a diversity of smaller causes troubles me, but I'm here to learn and am very interested to hear your views!
Thanks so much for your answers; it's really interesting to hear the multiple perspectives. I don't think I'll be giving up on type 1 any time soon, since my daughter has the condition and I volunteer with a type 1 advocacy charity (T1International).
I do think it's great to reflect on how best to use funds, though, and pursuing large-scale systemic change and advocating for policy change (which we do) seems, in our case, the right approach. The impact of advocacy is non-linear and much harder to measure, but when the wins come, they tend to be big and create positive change for lots of people.
I don't want to give up on advocating for health equity for the type 1 population when so many people have to spend so much of their monthly income just to stay alive because of the price of insulin, but I like the idea of thinking about appropriate causes in percentage terms.