Hello EA Community,

I'm seeking advice and support during a challenging phase in my career and life. My name is Tareq, and my journey has been shaped by growing up in Syria during the conflict. Despite the unimaginable turmoil and risks, I completed a degree in dentistry and worked in various capacities in peacebuilding and health, which, as you probably know, are crucial fields in the Syrian context.

In 2022, I was determined to expand my impact by pursuing an MSc in Global Health, Social Justice, and Public Policy at King's College London.
However, as a Syrian, I found it extremely difficult to secure sponsorship for my education. Even platforms like GoFundMe were not accessible to me, but with perseverance and the generosity of many, I successfully crowdfunded my studies through an alternative platform. I also overcame significant visa challenges to arrive in the UK and proudly completed my degree in early 2024.

Despite these achievements, I now find myself at a crossroads. My current visa allows me to remain in the UK for a limited period while I look for work, but securing a job that offers visa sponsorship has proven extremely difficult.

I recently attended two EA events in London, where I was inspired by the exchange of ideas and the passion for impactful work. I enjoyed these experiences and would love to keep contributing to this community. I am posting here on the Forum at the recommendation of a couple of fellow EAs.

I would greatly appreciate any advice, connections, or recommendations that could help me navigate the job market. I am mainly looking for opportunities in Global Health, Conflict, Development, and Policy in London, but I am open to other sectors and to relocating. I have a wide range of experience and skills, built on my academic degrees and the more than 80 online courses I completed while in Syria. I am also a TEDx speaker and occasionally lecture at universities in the UK and internationally.

For more context on my background, I am more than happy to jump on a quick call to discuss further.

Thank you for taking the time to read my story, and I look forward to any guidance or support you can provide.

Tareq
 
 

Comments (2)



Are you familiar with Probably Good and their 1-on-1 career advising? This seems like a natural fit!

I am not, but I will check it out right away. Thanks a lot for sharing!
