Eliezer Yudkowsky has an excellent post on "Evaporative Cooling of Group Beliefs".
https://www.lesswrong.com/posts/ZQG9cwKbct2LtmL3p/evaporative-cooling-of-group-beliefs
Essentially, when a group goes through a crisis, those who hold the group's beliefs least strongly leave, and those who hold the group's beliefs most strongly stay.
This might leave the remaining group less able to identify weaknesses within group beliefs or course-correct, or "steer".
The FTX collapse, along with the bad press and poor optics of the Wytham Abbey purchase, probably means this is happening in EA right now.
I'm not really sure what to do about this, but one suggestion would be for community building to move away from the model of trying to produce highly-engaged EAs and instead aim to produce moderately-engaged EAs, who might be better placed to offer helpful criticism and help steer the movement towards doing the most good.
Thanks for the summary and repost. I do think this saga also has lessons for the EA community. I have seen many instances where we overemphasise EA alignment over subject-matter expertise, especially when that expertise is practical and mission-critical, for example in operations and risk management. This supports your point that "this might leave the remaining group less able to identify weaknesses within group beliefs or course-correct, or 'steer'".