At the most recent Effective Altruism Global in San Francisco, I presented CFAR's Double Crux technique for resolving disagreements.
For the "practice" part of the talk, I handed out a series of prompts on EA topics, to generate disagreements to explore. Several people liked the prompts a lot and asked that I make them generally available.
So here they are! Feel free to use these at EA events, as starting points for discussions or Double Cruxes.
(For Double Crux practice at least, I recommend giving each person time to consider their own answers before talking with their partner. I find that when I don't do this, participants are often influenced by each other's beliefs and the conversation is less fruitful.)
Prompts:
You have a million dollars to donate to the organization or charity of your choice. To which organization do you give it?
If you could make one change to EA culture on a single dimension (more X, or less Y), what change would you make? (Another framing on this question: what annoys you about EA or EAs?)
Should EA branding ever lie or mislead in some way, in order to get people to donate to effective charities? (For instance: exaggerating the cheapness with which a human life can be saved might get people more excited about donating to the Against Malaria Foundation and similar.)
You’re in charge of outreach for EA. You have to choose one demographic to focus on for introducing EA concepts to, and bringing into the movement. What single demographic do you prioritize?
Are there any causes that should not be included as a part of EA?
Should EA be trying to grow as quickly as possible?
This is the printout that I typically use when teaching Double Crux to EAs, which includes instructions and some additional binary, non-EA questions.
I strongly agree that more EAs doing independent thinking is really important, and I'm very interested in interventions that push in that direction. In my capacity as a CFAR instructor and curriculum developer, figuring out ways to do this is close to my main goal.
Strongly agree.
I think this misses the point a little. People at EAG have some implicit model that they're operating from, even if it isn't well-considered. The point of the exercise in this context is not to get all the way to the correct belief, but rather to engage with what they think and what would cause them to change their minds.
This Double Crux is part of the de-confusion and model building process.