"Most expected value is in the far future." Because there are so many potential future lives, the value of the far future dominates the value of any near-term considerations.
Why this needs to be retired: just because a cause has high importance doesn't mean it has high tractability and low crowdedness. It could be the case (and hopefully soon will be) that the best interventions for improving the far future are fully funded and the next-best intervention is highly intractable. Moreover, for optimally allocating the EA budget, what matters is the expected value of the marginal action, not the average expected value.
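To make the marginal-vs-average point concrete, here is a toy calculation (all numbers are hypothetical): suppose a longtermist intervention turns funding $x$ into expected value $f(x) = 100\sqrt{x}$, so returns diminish as funding grows. At $x = 100$:

$$\frac{f(100)}{100} = \frac{1000}{100} = 10 \;\;\text{(average EV per dollar)}, \qquad f'(100) = \frac{50}{\sqrt{100}} = 5 \;\;\text{(marginal EV per dollar)}.$$

If a near-term intervention yields a constant 7 units of expected value per dollar, the next dollar does more good there, even though the longtermist intervention's average return is higher.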
"What matters most about our actions is their very long term effects."
Why this needs to be retired: there are only a small number of actions whose long-term effects we can reasonably hope to estimate, namely actions affecting lock-in events such as extinction or misaligned AGI spreading throughout the universe. For all other actions, estimating long-term effects is nearly impossible. Hence, this is not a practical rule to follow.
Not actively. I buy that doing a few projects with sharper focus and tighter feedback loops can be good for community health & epistemics. I would disagree if it took a significant fraction of funding away from interventions with a clearer path to doing an astronomical amount of good. (I almost added that it doesn't really feel like lead elimination is competing with more longtermist interventions for FTX funding, but there probably is a tradeoff in reality.)