Regarding the Wytham Abbey purchase, there has been discussion over whether optics should be considered when making decisions.
Some object that optics is hard to predict or understand correctly, and that thinking about optics is prone to motivated reasoning, so optics should be set aside in decision making.
But the same is true for almost every aspect of EA, aside from the highly empirical randomista development wing!
Especially over the longer term, optics affects community building: how many people get into EA and, perhaps more importantly, who gets into EA, i.e., what pre-existing beliefs and opinions they bring with them. Since EAs aim to improve government policy in EA priority areas, EA's optics affects their ability to do so. Optics also affects how EA ideas diffuse outside of EA, and where they diffuse to.
As with every other hard-to-predict, highly uncertain factor that feeds into EA decision making, we should make uncertain estimates of optics anyway, work on constantly refining those estimates, and include optics as a factor when working out the EV of decisions.
(Of course, one might still decide it's worth making big purchases for community building, but optics should be taken into account!)
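To make this concrete, here's a minimal sketch in Python, with entirely made-up numbers and a hypothetical venue-purchase framing, of treating optics as one more wide, uncertain input to an EV estimate rather than a factor to drop:

```python
# A minimal sketch (all distributions and parameters are made up for
# illustration) of treating optics as one more uncertain input to an EV
# estimate, instead of excluding it because it's hard to predict.
import random

random.seed(0)

def simulate_ev(n_samples=100_000):
    """Monte Carlo EV of a hypothetical venue purchase.

    The optics cost is highly uncertain, so we model it as a wide
    distribution rather than dropping it from the calculation.
    """
    total = 0.0
    for _ in range(n_samples):
        community_building_value = random.gauss(100, 20)  # made-up units
        optics_cost = random.gauss(-50, 40)               # wide: hard to predict
        total += community_building_value + optics_cost
    return total / n_samples

print(simulate_ev())  # roughly 50 in these made-up units
```

The point of the sketch is just that uncertainty is a reason to widen the distribution on the optics term, not to delete the term.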
Thanks for copying your comment!
“most naive attempts to do so can easily do more harm than good.”
I agree that factoring in optics can accidentally do harm, but I think if we’re trying to approximately maximise EV, we should be happy to risk doing harm.
I’m sure factoring in optics will sometimes lead to optics being overweighted, but I’m still unclear on why you think optics would be overweighted more often than not, and why ignoring optics is a better solution to overweighting than factoring it in. If we’re worried about overweighting it, can’t we just weight it less?
If I’m interpreting your comment correctly, you’re arguing that systematic biases in how we estimate the value of optics mean we’re better off not factoring optics into EV calculations at all.
But the “wanting to fit in” bias isn’t the only systematic bias that affects EV calculations - self-serving biases might cause EAs to overestimate the value of owning nice conference venues, or the value of time saved through meal delivery services or Ubers. I think consistency would require you to argue that we should not factor the value of these things into EV calculations either - I’d be interested to get your thoughts on this.
(My view is that we should continue to factor everything in and just consciously reduce the weighting of things that we think we might be prone to overweighting or overvaluing.)
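As a minimal sketch of that policy (all factor names and numbers below are hypothetical), consciously reducing a weighting is just a multiplier between 0 and 1 on the bias-prone estimate:

```python
# A minimal sketch (illustrative numbers only) of including bias-prone
# factors in an EV estimate at reduced weight, rather than excluding them.

def ev_with_downweighting(factors, shrinkage):
    """Sum factor estimates, shrinking any we suspect we systematically misjudge.

    factors:   dict of factor name -> naive EV estimate (arbitrary units)
    shrinkage: dict of factor name -> multiplier in [0, 1]; 1.0 = trust fully,
               lower values = consciously down-weight a bias-prone estimate
    """
    return sum(estimate * shrinkage.get(name, 1.0)
               for name, estimate in factors.items())

# Hypothetical venue-purchase example; every number is made up.
factors = {
    "direct_community_building_value": 100,
    "optics_cost": -60,                # hard to predict, maybe overweighted
    "time_saved_for_organisers": 20,   # maybe inflated by self-serving bias
}
shrinkage = {
    "optics_cost": 0.5,                # down-weight it, don't ignore it
    "time_saved_for_organisers": 0.8,
}

print(ev_with_downweighting(factors, shrinkage))  # 100 - 30 + 16 = 86
```

Framed this way, ignoring optics is just the special case where its shrinkage is 0.0 - one point on a continuum of weightings, and it would be surprising if it were the best one.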