I already made a question post about this during the last thematic week. I suppose my main motivation comes from being surprised that there's not just an absence of consensus on this, but that it even seems sidelined in X-risk discussion (not that no one has ever given an answer to it, of course). It's a question I try to ask in 1:1 conversations with people involved in reducing existential risks, but the answers I get vary widely from person to person, and I still don't have any idea of where "the community" tends to stand on this. Since it seems much "easier" for an existential catastrophe in general to happen than for all animal sentience to be wiped out even temporarily, I expect at least a slight majority of votes to be on the "disagree" side. However, from my limited experience, I've had the impression that people with P(ASI Doom) > 50% over the next century tend to believe that an existential catastrophe (here, ASI) would indeed wipe out all animal life (and perhaps even all biological life).
Some notes: I mean wiping it out in the moment, independently of whether it could evolve again on Earth in the future. And digital sentience is not a consideration here, though I think it matters a lot.
Intuitively, this seems very unlikely.