Eliezer Yudkowsky has an excellent post on "Evaporative Cooling of Group Beliefs".
https://www.lesswrong.com/posts/ZQG9cwKbct2LtmL3p/evaporative-cooling-of-group-beliefs
Essentially, when a group goes through a crisis, those who hold the group's beliefs least strongly leave, while those who hold them most strongly stay.
This can leave the remaining group less able to identify weaknesses in its beliefs or to course-correct, i.e. "steer".
The FTX collapse, along with the bad press and bad optics of the Wytham Abbey purchase, probably means that this is happening in EA right now.
I'm not really sure what to do about this, but one suggestion is for community building to move away from the model of trying to produce highly engaged EAs and instead aim to produce moderately engaged EAs, who might be better placed to offer helpful criticism and steer the movement towards doing the most good.
I'm extremely sceptical that the evaporative cooling model applies. As far as I'm aware, its only empirical support is the three anecdotes in the original post. Almost all social science is wrong, so I basically don't update at all on the model's predictions.
Thanks for clarifying further; some of that rationale does make sense (e.g. it's important to look critically at the assumptions in models and at how the data was collected).
I still think your conclusion/dismissal is too strong, particularly given that social science is very broad (much broader than the economics examples given here), that some things are inherently harder to model accurately than others, and that if experts in a given field favour certain approaches, the first question I would ask is "why?".
It's better to approach these things with humility and an open mind, ...