
Henry Stanley 🔸

1828 karma · Joined
henrystanley.com

Bio


Former CTO and co-founder of earn-to-give fintech Mast.

Comments (199)

Ferrous sulphate is also common, but it's a bit nauseating and poorly absorbed in any case. Ferrous bisglycinate is also sold branded as "gentle iron".

For those very deficient in iron, an iron infusion will give you ~two years’ worth of iron in one go - and skips all the issues with oral bioavailability of iron. You will need to test your iron levels first to avoid iron overload.

I write a bit about iron supplementation in my guide to treating restless leg syndrome (RLS) for which iron deficiency is a common cause: https://henryaj.substack.com/p/how-to-treat-restless-legs-syndrome

This is wonderful – thank you so much for writing it.

Mutual dedication to one another’s ends seems like a thing commonly present in religious and ethnic communities. But it seems quite uncommon to the demographic of secular idealists, like me. Such idealists tend to form and join single-focus communities like effective altruism, which serve only a subset of our eudaemonic needs.

Agree about secular, single-purpose communities – but I'm not sure EA is quite the same.

I've found my relationships with other EAs tend to blossom into more than just EA; those principles provide a good set of shared values from which to build other things: a sense of community, shared houses, group meals, playing music together, and just supporting each other generally. Then again, I don't consider EA to be the core of my identity, so YMMV.

Great post. Two things come to mind:

  1. One way to just be able to do more stuff is to take stimulants. I think there are cases where being on them can dent your intelligence in some subtle ways, but broadly they can drastically increase your ability to do more, work through when you're fatigued, etc. Maybe it's still a sufficiently edgy position that you didn't mention it here, but the absence was interesting. People at college are all taking modafinil for a reason.

  2. I worry that some incredibly ambitious people in the EA world have gone on to pursue paths that have actually been harmful. Early employees at the frontier AI labs seem like the obvious example - Anthropic was founded as an "AI safety lab" with commitments not to push the frontier but they obviously forgot about that along the way, and it seems hard to justify continuing to work there on capabilities imo. I suspect there's a lot of motivated reasoning going on among this group. Perhaps it's a cautionary tale about ambition unmoored from reflection as other people point out here, or that if your ambition leads to filthy lucre then it's very hard to course correct later on.

(Agree with the other commenters here that maybe the rate-limiting step isn't just pushing harder but co-ordination, taking more individual risks, etc)

reposted from my comment on the original Substack article

Is there a risk of boiling the ocean here?

The 'community notes everywhere' proposal seems easy enough to build (I've been hacking away at a Chrome extension version of it). I'm not sure it makes sense to wait for personal computing to change fundamentally before attempting this.

I agree that distribution is an issue, which I'm not sure how to solve. One approach might be to have a core group of users onboarded who annotate a specific subset of pages - like the top 20 posts on Hacker News - so that there's some chance of your notes being seen if you're a contributor. But I suppose this relies on getting that rather large core group of users (e.g. HN readers) to start using the product.

Alternatively you build the thing and hope that it gets adopted in some larger way, say it gets acquired by X if they want to roll out community notes to the whole web.
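One concrete problem any such extension has to solve is deciding when two URLs count as "the same page", so that notes attach to all variants of a link. A minimal sketch of that canonicalisation step in JavaScript (the helper name and the list of stripped parameters are my own assumptions, not from the extension mentioned above):

```javascript
// Notes are keyed by a canonical URL, so fragments and common tracking
// parameters must be stripped before looking notes up.
function canonicalizeUrl(raw) {
  const url = new URL(raw);

  // Drop the fragment: #top and #comments are still the same page.
  url.hash = "";

  // Drop tracking parameters so shared links collapse to one key.
  for (const param of [...url.searchParams.keys()]) {
    if (param.startsWith("utm_") || param === "ref" || param === "fbclid") {
      url.searchParams.delete(param);
    }
  }

  // Normalise the trailing slash on the path.
  url.pathname = url.pathname.replace(/\/+$/, "") || "/";
  return url.toString();
}
```

A content script would then run `canonicalizeUrl(location.href)`, fetch any notes stored under that key, and overlay them on the page. For example, `canonicalizeUrl("https://example.com/post/?utm_source=x#top")` and `canonicalizeUrl("https://example.com/post")` both yield the same key.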

You do address the FTX comparison (by pointing out that it won't make funding dry up), that's fair. My bad.

But I do think you're making an accusation of some epistemic impropriety that seems very different from FTX: getting FTX wrong (by not predicting its collapse) was a catastrophe, and I don't think the same is true of AI timelines. Am I missing the point?

I might be missing the point, but I'm not sure I see the parallels with FTX.

With FTX, EA orgs and the movement more generally relied on the huge amount of funding that was coming down the pipe from FTX Foundation and SBF. When all that money suddenly vanished, a lot of orgs and orgs-to-be were left in the lurch, and the whole thing caused a huge amount of reputational damage.

With the AI bubble popping... I guess some money that would have been donated by e.g. Anthropic early employees disappears? But it's not clear that that money has been 'earmarked' in the same way the FTX money was; it's much more speculative and I don't think there are orgs relying on receiving it.

OpenPhil presumably will continue to exist, although it might have less money to disburse if a lot of it is tied up in Meta stock (though I don't know that it is). Life will go on. If anything, slowing down AI timelines will probably be a good thing.

I guess I don't see how EA's future success is contingent on AI being a bubble or not. If it turns out to be a bubble, maybe that's good. If it turns out not to be a bubble, we sure as hell will have wanted to be on the vanguard of figuring out what a post-AGI world looks like and how to make it as good for humanity as possible.

For effect, I would have pulled in a quote from the Reddit thread on akathisia rather than just linking to it.

Akathisia is a inner restlessness that is as far as I know the most extreme form of mental agitation known to man. This can drive the sufferer to suicide [...] My day today consisted of waking up and feeling like I was exploding from my skin, I had a urge that I needed to die to escape. [...] I screamed, hit myself, threw a few things and sobbed. I can’t get away from it. My family is the only reason why I’m alive. [...] My CNS is literally on fire and food is the last thing I want. My skin burns, my brain on fire. It’s all out survival.

if I had kept at it and pushed harder, maybe the project would have got further... but I don't think I actually wanted to be in that position either!

I think this is a problem with for-profit startups as well. Most of the time they fail. But sometimes they succeed (in the sense of “not failing” rather than breakout success which is far rarer), and in that case you’re stuck with the thing to see it through to an exit.

I enjoyed this, and I miss bumping into you on the stairs at house parties!
