Eliezer Yudkowsky has an excellent post on "Evaporative Cooling of Group Beliefs".
https://www.lesswrong.com/posts/ZQG9cwKbct2LtmL3p/evaporative-cooling-of-group-beliefs
Essentially, when a group goes through a crisis, those who hold the group's beliefs least strongly leave, and those who hold the group's beliefs most strongly stay.
This can leave the remaining group less able to identify weaknesses in its own beliefs, course-correct, or "steer".
The FTX collapse, the bad press, and the bad optics of the Wytham Abbey purchase probably mean this is happening in EA right now.
I'm not really sure what to do about this, but one suggestion would be for community building to move away from the model of trying to produce highly engaged EAs and towards producing moderately engaged EAs, who may be better placed to offer helpful criticism and help steer the movement towards doing the most good.
Thanks for clarifying further, and some of that rationale does make sense (e.g. it's important to look critically at the assumptions in models and at how the data were collected).
I still think your conclusion/dismissal is too strong, particularly given that social science is very broad (much more so than the economics examples given here), that some things are inherently harder to model accurately than others, and that if experts in a given field favour certain approaches, the first question I would ask is "why?".
It's better to approach these things with humility and an open mind, particularly given how important the problems are that EA is trying to tackle.
I've just commented on your EA Forum post; there's quite a lot of overlap, and further comments seemed more relevant there than here: https://forum.effectivealtruism.org/posts/WYktRSxq4Edw9zsH9/be-less-trusting-of-intuitive-arguments-about-social?commentId=GATZcZbh9kKSQ6QPu