I'm posting this to tie in with the Forum's Draft Amnesty Week (March 11-17) plans, but it is also a question of more general interest. The last time this question was posted, it got some great responses.
This post is a companion to “What posts are you thinking about writing?”
When answering in this thread, I suggest putting each idea in a different answer, so that comment threads don't get too confusing.
If you think someone has already written the answer to a user's question, consider lending a hand and linking it in the comments.
A few suggestions for possible answers:
- A question you would like someone to answer: “How, historically, did AI safety become an EA cause area?”
- A type of experience you would like to hear about: “I’d love to hear about the experience of moving from consulting into biosecurity policy. Does anyone know anyone like this who might want to write about their experience?”
If you find yourself with loads of ideas, consider writing a full "posts I would like someone to write" post.
Draft Amnesty Week
If you see a post idea here which you think you might be positioned to answer, Draft Amnesty Week (March 11-17) might be a great time to post it. In Draft Amnesty Week, your posts don't have to be fully thought through, or even fully drafted. Bullet-points and missing sections are allowed, so you can have a lower bar for posting.
I feel like consciously pushing a button to bring about those consequences, versus the consequences of button A or B simply occurring on their own without anyone pushing either button, are slightly different cases. I'm not 100% sure whether Epicurus would push button B or be indifferent, as in his words:
However, I still believe that one should be indifferent to which outcome occurs in order to remain consistent with this view, and I do feel that this view would probably lean towards being indifferent to which button one chooses to push.
Having said this, I too would push button B, though this is due to my deep-rooted biases about my own life and death, however irrational they may be. Maybe I would be better off changing this stance, since according to Epicurus:
Also, I just want to add, on your point that annihilation would be bad because it prevents future flourishing: for whom would this be bad? It can't be bad for counterfactual non-existent beings, since they don't exist to perceive the badness of missing out on (the good bits of) life. Or am I misunderstanding your claim? And what exactly do you mean by instrumental reasons in this case? Could you give some examples?
Epicurus, Letter to Menoeceus [4], translated by Cyril Bailey (1926)
Epicurus, Letter to Menoeceus [3], translated by Cyril Bailey (1926)