I asked if EA has a rational debate methodology in writing that people sometimes use. The answer seems to be “no”.
I asked if EA has any alternative to rationally resolve disagreements. The answer seems to be “no”.
If the correct answer to either question is actually “yes”, please let me know by responding to that question.
My questions were intended to form a complete pair: do you use X to be rational, and if not, do you use anything other than X?
Does EA have some other way of being rational which wasn’t covered by either question? Or is something else going on?
My understanding is that rationality is crucial to EA’s mission (of basically applying rationality, math, evidence, etc., to charity – which sounds great to me) so I think the issue I’m raising is important and relevant.
My summary of the answers I got to my questions:
This is intended to be a statement that EAers would agree with. Please let me know if you disagree.
I think this is a pretty good summary! One minor addition: the phrase "the community as a whole tries to encourage rationality" suggests this is a nice-to-have that hasn't received much effort, whereas in reality the community has invested non-trivial resources in individual and collective rationality (e.g., funding LessWrong, Metaculus, Manifold Markets, QURI, etc.).
Also relevant.