Hi, I'm an 18-year-old starting college in a week. I'm studying computer engineering and mathematics. Since I have a technical interest, and AGI has a much higher probability of ending humanity this century (1/10, I think) than other causes I would rather work on (e.g., I'd put biorisks at 1/10,000), would the utility-positive thing to do be to force myself to get an ML-alignment-focused PhD and become a researcher?
I am at a mid-tier university. I think I could force myself to do AI alignment since I have a little interest in it, but not as much as the average EA, and I wouldn't find it as engaging. I also have an interest in starting a for-profit company, which most likely couldn't happen in AGI alignment. I would rather work on a hardware/software combo for virus detection (biorisks), climate change, products for the developing world, other current problems, or problems that will emerge in the future.
Is it certain enough that AI alignment is so much more important that I should forgo what I think I will be good at/like to pursue it?
Edit: my original wording confused some people into thinking I was posing a false dichotomy between "pursuing my passion" and doing EA alignment work. I've removed that comment.
You ask, "Is it certain enough that AI alignment is so much more important that I should forgo what I think I will be good at/like to pursue it?"
One argument against pursuing AI alignment is that it's very unlikely to work. So long as humans are in any way involved with AI, weaknesses of the human condition will be a limiting factor which will prevent AI from ever being a safe technology.
If I was in your position, smart, educated, with a long life ahead of me, and really wanted to have a meaningful impact, I would focus on the machinery which is generating all these threats, the knowledge explosion.
Speaking from the perspective of near old age, I would advise you not to follow the "experts" who are focused on managing the products of an ever-accelerating knowledge explosion one by one by one, as that effort is doomed to failure.
Perhaps you might consider this thought experiment. Imagine yourself working at the end of a factory assembly line. The products are coming down the line to you faster, and faster, and faster. You can keep up for a while by working hard and being smart, but at some point you will be overwhelmed unless you can take control of the assembly line and slow it to a pace you can manage.
That's the challenge which will define the 21st century. Will we learn how to control the knowledge explosion? Or will it control us?