Very speculative and anecdotal
I think I personally find myself emotionally tugged away from longtermism a little by these events. When there's so much destruction happening "right before my eyes", and on a near enough timescale that it really resonates emotionally, it's as if on some level my brain is telling me: "How could you be worried about AI risk or a future bioengineered pandemic at a time like this? There are people dying right now. This already is a catastrophe!"
And it's somewhat hard to get my emotions to register the fact that a catastrophe on a very different scale, and of a much more permanent kind, could still happen at some point. (Again, I'm not dismissing that the current pandemic really is a catastrophe, and I do believe it makes sense to reallocate substantial effort to it right now.)
On the other hand, this pandemic also seems to validate various things longtermists have been saying for a while: that civilization is perhaps more fragile than people imagine, that we need to improve the speed at which we can develop vaccines, and so on. And it provides an emotionally powerful reminder of just how bad and how real a catastrophe can be, which might make it easier for people to feel how bad it would be to have a catastrophe that's even worse, one that destroys civilization as a whole.
I think I'd tentatively guess that this pandemic will make the general public slightly more "longtermist" in their values. I'd also guess that it'll make the general public substantially more in favour (for present-focused reasons) of things that also happen to be good from a longtermist perspective (e.g., increased spending on future pandemic preparedness in general).
But I'm not sure how it'll affect people who are already quite longtermist. From my sample size of 1 (myself), it seems it won't really change behaviours, but will slightly reduce the emotional resonance of longtermism right now (as opposed to a more general focus on GCRs).