Eliezer Yudkowsky has an excellent post on "Evaporative Cooling of Group Beliefs".
https://www.lesswrong.com/posts/ZQG9cwKbct2LtmL3p/evaporative-cooling-of-group-beliefs
Essentially, when a group goes through a crisis, those who hold the group's beliefs least strongly leave, and those who hold the group's beliefs most strongly stay.
This might leave the remaining group less able to identify weaknesses in its own beliefs, course-correct, or "steer".
The FTX collapse, and the bad press and bad optics of the Wytham Abbey purchase, probably mean that this is happening in EA right now.
I'm not really sure what to do about this, but one suggestion might be for community building to move away from trying to produce highly-engaged EAs and towards producing moderately-engaged EAs, who might be better placed to offer helpful criticisms and to help steer the movement towards doing the most good.
I don't think red teaming is a good replacement for the kind of diversity of perspectives and approaches that having more moderately involved EAs would bring. Being a highly involved EA takes a lot of time and mental resources, and I would expect moderately involved EAs to provide this kind of diversity simply because they will have more engagement with non-EA things. They will also be less enmeshed with EA as an identity, so will presumably take a more disinterested approach, which I think will be valuable.
I also think they would be less affected by the perverse incentives that highly engaged EAs face when it comes to criticism and new ideas. Currently, both the way weighted voting works on the Forum and the funding environment (where, based on the writeups for EA Funds, having made a good impression on a grantmaker seems to be a non-trivial factor in getting funded) disincentivize criticisms that might not go over well. Having more people who are less concerned with their reputation and standing in the EA community would be a good way to counteract this.