I have work experience in HR and Operations. I read a lot, I enjoy taking online courses, and I do some yoga and some rock climbing. I enjoy learning languages, and I think that I tend to have a fairly international/cross-cultural focus or awareness in my life. I was born and raised in a monolingual household in the US, but I've lived most of my adult life outside the US, with about ten years in China, two years in Spain, and less than a year in Brazil.
As far as EA is concerned, I'm fairly cause agnostic/cause neutral. I think that I am a little bit more influenced by virtue ethics and stoicism than the average EA, and I also occasionally find myself thinking about inclusion, diversity, and accessibility in EA. Some parts of the EA community that I've observed in person seem not very welcoming to outsiders, or somewhat gatekept. I tend to care quite a bit about how exclusionary or welcoming communities are.
I was told by a friend in EA that I should brag about how many books I read because it is impressive, but I feel uncomfortable being boastful, so here is my clunky attempt to brag about that.
Unless explicitly stated otherwise, opinions are my own, not my employer's.
I'm looking for interesting and fulfilling work, so if you know of anything that you think might be a good fit for me, please do let me know.
I'm looking for a place to be my home. If you have recommendations for cities, for neighborhoods within cities, or for specific houses/communities, I'd be happy to hear your recommendations.
I'm happy to give advice to people who are job hunting regarding interviews and resumes, and I'm happy to give advice to people who are hiring regarding how to run a hiring round and how to filter/select best-fit applicants. I would have no problem running you through a practice interview and then giving you some feedback. I might also be able to recommend books to read if you tell me what kind of book you are looking for.
Here are some of the non-fiction books I've enjoyed most this year:
Sure. The silly and simplified cliché is something like this: a comment describes someone's feelings (or internal state) and then gets some agree votes and disagree votes, as if Person A says "this makes me happy" and Person B wants to argue that point.
(to be clear, this is a very small flaw/issue with the EA Forum, and I wouldn't really object if the people running the forum decide that this is too minor of an issue to spend time on)
A few little examples:
The only thing that comes to mind for me regarding "make it better" would be to change the wording on the tooltips for voting to clarify (or to police?) what they are for. I somewhat regularly see people agree-vote or disagree-vote on comments that don't contain any claims or arguments.
I suspect that the biggest altruistic counterfactual impact I've had in my life was merely because I was in the right place at the right time: a moderately heavy cabinet/shelf thing was tipping over and about to fall on a little kid (I don't think it would have killed him. He probably would have had some broken bones, lots of bruising, and a concussion). I simply happened to be standing close enough to react.
It wasn't as a result of any special skillset I had developed, nor of any well thought-out theory of change; it was just happenstance. Realistically, I can't really take credit for it any more than I can take credit for being born in the time and place that I was. It makes me think about how we plan for things in expectation, but there is such a massive amount of random 'noise' in the world. This isn't exactly epistemic humility or moral cluelessness, but it seems vaguely related to those.
I had a lovely conversation with someone currently working for Mercy for Animals about food systems in Southeast Asia, including some aspects of the logistics for transporting vegetables as opposed to animal-based food that I wasn't aware of. I think I might have learned more from this conversation than from any other conversation I've had at an EAG or EAGx event.
it's not clear there's more than 5 people who will read to this point
I just want to say that in late 2023, a few years after you wrote this, at least one person is still reading and appreciating your comments. :)
TLDR: I am glad that I am not the only one who thinks "I'd like to help, but I want to make sure I have enough financial runway first."
(The rest of this comment is just my musings and explorations.)
I'm glad you wrote this. I've had vaguely similar thoughts bouncing around in my head during the past few days of seeing so much talk about giving.
I've donated a small amount of money over the years (≈8% of my income one year, and ≈1% another year), but I've never taken a pledge, and for most of my adult life I've not felt as financially secure as I would like. I feel a vague sense of pressure that effective giving is something I should do, and that it is something other people will think more highly of me for. But I've crunched numbers, and I have a spreadsheet, and I know a decent amount about personal finance. My rough narrative is something like this: there are bad things that have a reasonable chance of occurring in my life. If I have enough money available, then most of these things will be an annoyance, or a minor setback, or a negligible cost to me. If I do not have enough money available, most of these things will be a major setback, or will irrevocably alter my life path for the worse, or will make other things much harder for me.
I would also like to retire someday. I assume that I will eventually reach an age where I lack the energy or motivation to do stuff, and I don't want to be in a situation in which I am forced to choose between doing a miserable job and not being able to afford decent food, clothing, and shelter.
If I had an in-demand skill that could easily get me a well-paying job, or if I had family wealth to rely on, or if I had an excellent professional network from a well-reputed university... well, the more of these things one has, the less risk-averse one has to be.[1] But in general I feel a great sense of financial precarity, and currently I am not confident that I will have enough money in the future to provide a modestly comfortable life for myself.
In brief, I want to make sure that I am taken care of. Any money above and beyond that I am happy to get rid of (such as donating it at the end of my life, or donating after a few years of earning good money).
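To give a sense of the kind of number-crunching I mean, here is a rough sketch with made-up figures (none of these are my actual finances; the numbers are purely illustrative):

```python
# Rough runway and retirement math with made-up numbers (purely illustrative).
monthly_expenses = 2_500      # rent, food, insurance, etc.
savings = 30_000              # liquid savings
emergency_buffer = 10_000     # kept aside for the "bad things" mentioned above

runway_months = (savings - emergency_buffer) / monthly_expenses
print(f"Runway if income stopped today: {runway_months:.1f} months")

# Very crude retirement check: save a fixed amount each year at a modest
# real (after-inflation) return, then see how many years of expenses that covers.
annual_savings = 6_000
real_return = 0.03
years_until_retirement = 30

nest_egg = 0.0
for _ in range(years_until_retirement):
    nest_egg = nest_egg * (1 + real_return) + annual_savings

annual_retirement_spend = monthly_expenses * 12
years_covered = nest_egg / annual_retirement_spend
print(f"Nest egg at retirement: ${nest_egg:,.0f}, "
      f"covering ~{years_covered:.0f} years of expenses")
```

A real spreadsheet has many more line items, but the basic exercise is the same: only once the runway and retirement numbers look safe does "money above and beyond that" exist to give away.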
For legal citizens, it’s less risky to drive above the speed limit. For married people with two incomes that share expenses, it's less risky to quit one job to start another. For people with a highly in-demand skillset, it is less risky to take time off work to travel. For students not relying on scholarships, it’s less risky to skip class. For people with non-abusive parents, it's less risky to live with family. And so on.
I've been thinking about small and informal ways to build empathy[1]. I don't have big or complex thoughts on this (and thus I'm sharing rough ideas as a quick take rather than as a full post). This is a tentative and haphazard musing/exploration, rather than a rigorous argument.
Why bother? Well, I have a vague and not well-reasoned intuition that being more empathetic makes you a better person. Will it actually increase your impact? I have no idea. Maybe you would have higher impact and you would make the world a better place if you just kept your head down and worked on your project.
A polished article would have some sort of conclusion or a nice takeaway, but for this short form I'll just end it here.
I'm using "empathy" in a pretty sloppy sense. Something like "caring for other people who are not related/connected to you" or "developing something of an emotional understanding of the suffering people go through, rather than merely an intellectual one." I'm thinking about this in a very suffering-focused sense.
Half of a Yellow Sun is one of the books that I think made me a little bit more empathetic. It is a book about the Nigerian Civil War, something that I assume most of my fellow North Americans know almost nothing about. I certainly knew nothing about it.
And to echo writings from many other people in and around the EA community: if you think that is bad, remember that there is a similar level of suffering happening every day for millions of people.
Although you can read accounts from transgender people; the rough summary would be something like "I am stunned at how differently people treat me when they see me as a man/woman."
Note that the Stoic interpretation here isn't to build empathy, but rather to make yourself unafraid of hardship. And the trouble with using these for building empathy is that you aren't really in the situation; you can stop pretending whenever you like. For anyone who is curious, here is the relevant excerpt from The Daily Stoic that turned me on to this idea:
What if you spent one day a month experiencing the effects of poverty, hunger, complete isolation, or any other thing you might fear? After the initial culture shock, it would start to feel normal and no longer quite so scary.
There are plenty of misfortunes one can practice, plenty of problems one can solve in advance. Pretend your hot water has been turned off. Pretend your wallet has been stolen. Pretend your cushy mattress was far away and that you have to sleep on the floor, or that your car was repossessed and you have to walk everywhere. Pretend you lost your job and need to find a new one. Again, don’t just think about these things, but live them. And do it now, while things are good. As Seneca reminds us: “It is precisely in times of immunity from care that the soul should toughen itself beforehand for occasions of greater stress. . . . If you would not have a man flinch when the crisis comes, train him before it comes.”
(not well thought-out musings. I've only spent a few minutes thinking about this.)
In thinking about the focus on AI within the EA community, the Fermi paradox popped into my head. For anyone unfamiliar with it and who doesn't want to click through to Wikipedia, my quick summary of the Fermi paradox is basically: if there is such a high probability of extraterrestrial life, why haven't we seen any indications of it?
On a very naïve level, AI doomerism suggests a simple solution to the Fermi paradox: we don't see signs of extraterrestrial life because civilizations tend to create unaligned AI, which destroys them. But I suspect that the AI-relevant variation would actually be something more like this:
Like many things, I suppose the details matter immensely. Depending on the morality of the creators, an aligned AI might spend resources expanding civilization throughout the galaxy, or it might happily putter along maintaining a globe's agricultural system. Depending on how an unaligned AI is unaligned, it might be focused on turning the whole universe into paperclips, or it might simply kill its creators to prevent them from enduring suffering. So on a very simplistic level it seems that the claim of "civilizations tend to make AI eventually, and it really is a superintelligent and world-changing technology" is consistent with the reality that "we don't observe any signs of extraterrestrial intelligence."