At the most recent Effective Altruism Global in San Francisco, I presented CFAR's Double Crux technique for resolving disagreements.
For the "practice" part of the talk, I handed out a series of prompts on EA topics, to generate disagreements to explore. Several people liked the prompts a lot and asked that I make them generally available.
So here they are! Feel free to use these at EA events, as starting points for discussions or Double Cruxes.
(For Double Crux practice at least, I recommend giving each person time to consider their own answers before talking with their partner. I find that when I don't do this, participants are often influenced by each other's beliefs and the conversation is less fruitful.)
Prompts:
You have a million dollars to donate to the organization or charity of your choice. To which organization do you give it?
If you could make one change to EA culture on a single dimension (more X, or less Y), what change would you make? (Another framing of this question: what annoys you about EA or EAs?)
Should EA branding ever lie or mislead in some way, in order to get people to donate to effective charities? (For instance: exaggerating the cheapness with which a human life can be saved might get people more excited about donating to the Against Malaria Foundation and similar.)
You’re in charge of outreach for EA. You have to choose a single demographic to focus on: introducing them to EA concepts and bringing them into the movement. Which demographic do you prioritize?
Are there any causes that should not be included as a part of EA?
Should EA be trying to grow as quickly as possible?
This is the printout that I typically use when teaching Double Crux to EAs, which includes instructions and some additional binary, non-EA questions.
I'm not sure I follow. The question asks what the participants think is most important, which may or may not be diversity of perspectives. At least some people think that diversity of perspectives is a misguided goal that erodes core values.
Are you saying that this implies that "EA wants more of the same" because some new EA (call him Alex) will be paired with a partner (Barbra) who gives one of the above answers, and then Alex will presume that what Barbra said was the "party line" or "the EA answer" or "what everyone thinks"?
EA skews young, white, male, and quantitative. Imagine you’re someone who doesn’t fit that profile but has EA values, and is trying to decide “is EA for me?” You go to EA Global (where the audience is not very diverse) and attend a Double Crux workshop. If most of the people talk about prioritizing adding AI researchers and hedge fund people (fields that skew young, male, and quanty), it might not feel very welcoming.
Basically, I think the question is framed so that it produces a negative externality for the community. And you could probably tweak the framing…