Thanks Matt and others commenting here. I have independently started worrying about the show being too narrow and repetitive this year, and will be factoring in the issues people have raised here when planning for next year!
(Unfortunately I can't promise we'll get back to being as interesting to an EA Forum audience as we once were, as we're working with a different theory of change now and I think, for better or worse, the times we're living in call for a shift in strategy.)
Thanks for this post, it warmed our hearts! Glad we've been able to help you understand the world better over the years and maybe even have more impact too. ❤️
I threaded the top ten list here: https://x.com/robertwiblin/status/1834613676034113817
(By the way, the next episode we plan to release, one of Luisa's, actually has more pushback on AI and robotics; have a listen and see what you think.)
For what it's worth, SBF put this idea to me in an interview I did with him and I thought it sounded daft at the time, for the reasons you give among others.
He also suggested putting private messages on the blockchain which seemed even stranger and much less motivated.
That said, at the time I regarded SBF as much more of an expert on blockchain technology than I was, which made me reluctant to entirely dismiss it out of hand, and I endorse that habit of mind.
As it turns out, people are now building a Twitter clone on a blockchain and it has some momentum behind it: https://docs.farcaster.xyz/
So my skepticism may yet be wrong — the world is full of wonders that work even though they seem like they shouldn't. Though how a project like that out-competes Twitter given the network effects holding people onto the platform I don't know.
Now that we have data for most of October, know our release schedule, and can see month-by-month engagement, I'd actually forecast that 80k Podcast listening time should grow 15-20% this year (not 5%), for ~300,000 hours of consumption in total.
(If you forecast that Q4 2023 will be the same as Q3 2023 then you get 11% growth, and in fact it's going to come in higher.)
That is indeed still a significant reduction from last year when it grew ~40%.
Let me know if you'd like to discuss in more detail!
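For anyone who wants to sanity-check the arithmetic behind that parenthetical, here's a minimal back-of-envelope sketch in Python. The quarterly figures are made-up placeholders chosen only to make the structure of the calculation clear; they are not our actual listening data.

```python
# Back-of-envelope sketch of the growth calculation above.
# The quarterly listening-hour figures below are illustrative placeholders,
# NOT 80k's actual data; only the shape of the calculation is meant to match.
q_2022 = [60_000, 62_000, 65_000, 68_000]   # hypothetical 2022 quarters (hours)
q_2023 = [68_000, 70_000, 72_000]           # hypothetical 2023 Q1-Q3 (hours)

# Conservative assumption: Q4 2023 merely matches Q3 2023.
q4_forecast = q_2023[-1]
total_2023 = sum(q_2023) + q4_forecast

growth = total_2023 / sum(q_2022) - 1
print(f"Implied year-on-year growth: {growth:.0%}")   # ~11% with these placeholders
```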
Ah OK, I agree it's not that consistent with GiveWell's traditional approach.
I think of high-confidence GiveWell-style giving as just one possible approach one might take in the pursuit of 'effective altruism', and it's one that I personally think is misguided for the sorts of reasons Shruti is pointing to.
High-confidence giving (e.g. GiveWell) and hits-based giving (e.g. Open Phil, all of longtermism) both make up large fractions of the EA-inspired portfolio of giving and careers.
So really I should just say that there's nothing like a consensus around whether EA implies going for high-confidence or low-confidence strategies (or something in the middle I guess).
(Incidentally from my interview with Elie I'd say GiveWell is actually now doing some hits-based giving of its own.)
Sorry, in what sense does Shruti say that EA solutions aren't effective in the case of air pollution? Do you mean that the highest 'EV' interventions are likely to be ones with high uncertainty about whether they work or not?
(I don't think of EA as being about achieving high confidence in impact; if anything I'd associate EA with high-risk, hits-based giving.)
Hey Matt, obviously there's a tonne one could say here; just to offer some quick thoughts: