It's been a year, but I finally wrote up my critique of "longtermism" (of the Bostrom / Toby Ord variety) in some detail. I explain why this ideology could be extremely dangerous -- a claim that, it seems, some others in the community have picked up on recently (which is very encouraging). The book is on Medium here and PDF/EPUB versions can be downloaded here.
See my response to AlexHT for some of my overall thoughts. A couple of other things that might be worth quickly sketching:
From my perspective, the real meat of the book was its two contentions: (1) that longtermist ideas, and particularly the idea that the future is of overwhelming importance, may in the future be used to justify atrocities, especially if these ideas become more widely accepted, and (2) that those concerned about existential risk should be advocating that we decrease current levels of technology, perhaps to pre-industrial levels. I would have preferred that the book focus more on arguing for these contentions.
Questions for Phil (or others who broadly agree):
P.S. - If you're feeling dissuaded from checking out Phil's arguments because they're labeled a 'book', and books are long, don't be: it's a bit long for an article, but certainly no longer than many SSC posts, for example. That said, I'm also not endorsing the book's quality.