This is Part 1 of a multi-part series, shared as part of Career Conversations Week. The views expressed here are my own and don't reflect those of my employer.
TL;DR:
Building an EA-aligned career starting from an LMIC comes with specific challenges that shaped how I think about career planning, especially around constraints:
* Everyone has their own "passport"—some structural limitation that affects their career more than their abilities. The key is recognizing these constraints exist for everyone, just in different forms. Reframing these from "unfair barriers" to "data about my specific career path" has helped me a lot.
* When pursuing an ideal career path, it's easy to fixate on what should be possible rather than what actually is. But those idealized paths often require circumstances you don't have—whether personal (e.g., visa status, financial safety net) or external (e.g., your dream org hiring, or a stable funding landscape). It might be helpful to view the paths that work within your actual constraints as your only real options, at least for now.
* Adversity Quotient matters. When you're working on problems that may take years to show real progress, the ability to stick around when the work is tedious becomes a comparative advantage.
Introduction
Hi, I'm Rika. I was born and raised in the Philippines and now work on hiring and recruiting at the Centre for Effective Altruism in the UK.
This post might be helpful for anyone navigating the gap between ambition and constraint—whether facing visa barriers, repeated setbacks, or a lack of role models from similar backgrounds. Hearing stories from people facing similar constraints helped me feel less alone during difficult times. I hope this does the same for someone else, and that you'll find lessons relevant to your own situation.
It's also for those curious about EA career paths from low- and middle-income countries—stories that I feel are rarely shared. I can only speak to my own experience, but I hop
EA-aligned news curation service - prototype online
Actually Relevant is a news curation service that evaluates stories based on how relevant they are for humanity and its long-term future. The website is a first prototype. I hope that Actually Relevant will promote EA causes and perspectives to more people, and that it will free EA-aligned readers from the headaches of irrelevant news.
I'm looking for
This story on the frontpage was rated a conservative 4/5 for importance, which your style guide says means
But the article text only states that it directly affects "Over 70,000" people. There are also some speculative comments that this could lead to a general "reevaluation of international legal norms and systems around land rights", but that seems quite unlikely to me. I would expect that you could write multiple stories a year about similar occurrences.
Thanks so much for taking a deeper look at one of the articles! I think you're right: a somewhat lower rating seems more appropriate in this case.
I believe that two things are true of the algorithm behind Actually Relevant: 1) almost all of its posts are more important for humanity than 90% of news articles by other outlets, so in that sense it's already useful; 2) many relevance analyses are still off by at least one grade on the rating scale, meaning that some posts get a "major" or "critical" tag that they shouldn't. The idea is to use community and expert feedback to fine-tune the prompts and get even better results in the future. I also want to involve a human editor who could double-check and adjust dubious cases.
In the post you referenced, the AI says: "The eviction has affected over 70,000 people and risks cultural extinction for the Maasai people. It also highlights the need for a reevaluation of international legal norms and systems around land rights. In certain scenarios, this situation could lead to a broader movement for indigenous land rights in Tanzania and beyond, making it an issue that is far more relevant for humanity than the number of directly affected people would suggest." I think it's a good sign that the algorithm realized that the extinction of an entire culture and developments around indigenous land rights should lead to a higher rating than the number of directly affected people would suggest. It might still be off in this case, but I'm optimistic that additional fine-tuning can get us there.
Looking for partners in crime to explore a "scope sensitive news provider"
I would like to find out if there is a market for a news provider that selects stories based on how much they matter to sentient life in the universe.[1] Specifically, I would like to run a few experiments following the Lean Startup approach, like pretending that the service already exists to see how many people would subscribe.
Please reach out
I took this idea more seriously when I read the post "What happens on the average day". rosehadshar mentions "scope sensitivity" as their first criterion for an ideal news provider and defines it as "a serious, good faith attempt to tell the stories that matter most to the most sentient life."