There’s definitely something qualitatively different about really reading a paper vs. getting a summary, even a good one. I’ve noticed that when I rely too much on ChatGPT or similar tools to summarize studies, I sometimes end up with a false sense of confidence — like I “get” it, when actually I’ve missed key caveats or limitations that only become clear when reading the methods section or skimming figures.
I totally agree that tools can help in the “discovery” phase — finding relevant papers faster, generating search terms I hadn’t thought of, or even helping me decide which ones are worth digging into. But I still feel like the deep understanding (and the kind of judgment that comes with it) only happens when I go through the paper myself.
Also, yes to your point about being able to talk confidently about the details — I’ve had that experience too, where being fluent in a study’s methods actually changes how seriously people take an idea. That kind of credibility seems hard to fake with summaries alone.
I didn’t mean to suggest it’s an either/or thing, and I totally agree that it’s possible (and probably healthy) to feel both worry and optimism at the same time. That’s actually where I find myself most days too.
The title was more of a shorthand to capture that tension — not to say we must pick one side, but to get people into the headspace of asking: “What’s actually going on here, and how should we feel about it?”
Totally agree — “surface-level engagement” is exactly the phrase I’ve been circling around without quite naming it. That’s the subtle risk, I think: you feel productive, even insightful, but you haven’t actually done the real thinking yet. It’s like reading a menu and thinking you’ve tasted the meal.
And I really resonate with your point about emotional connection. When I’m too “efficient,” sometimes the work starts to feel oddly transactional — like I’m just slotting in the next block of text or ideas, rather than wrestling with them. I don’t think EA work has to feel emotionally intense all the time, but there’s a danger if it becomes purely mechanical.
That said, I’m with you: AI can absolutely empower people who might otherwise struggle to express their ideas clearly — whether due to language barriers, a lack of confidence, or just inexperience with writing. I’ve seen it give people a kind of voice they didn’t have before, and that feels like a win.
Have you found any specific habits or “guardrails” that help you stay on the deeper-thinking side when using ChatGPT?