A few years back I wrote a post about learning to code. The short version: most people shouldn't, the gold rush is ending, "programmer" is becoming a regular job and losing its shine as a status symbol, and the people at the top of the field are busy coding themselves out of a job. It's here, now.
Simon Willison wrote up something that came out of the Oxide and Friends podcast — a term coined by Adam Leventhal for something that's been hanging in the air: Deep Blue. The psychological ennui — shading into existential dread — that a lot of software developers are feeling right now. Named after the IBM computer that beat world chess champion Garry Kasparov in 1997. If you weren't around for that or don't follow chess: it was treated as a watershed, the moment a machine conclusively beat the best human at the game that was supposed to best represent human strategic intelligence. There was real grief about it in some corners.
Chess didn't die. It's bigger now than it's ever been — streaming, online platforms, a whole new generation of players who got into it through Twitch and The Queen's Gambit. The thing that was supposed to kill it turned out to be just a chapter. The difference for programming is that chess never had the same economic stakes. Losing at chess didn't mean losing your livelihood. That part is real and harder to hand-wave away. But the idea that a field becomes worthless once a machine can outperform humans at its core tasks — chess has already run that experiment, and the result wasn't what people feared.
I'm not a programmer, so I don't feel it the way they do. But I recognize the shape of it. The thing about programming that made it such an appealing life path was that it sat at a weird intersection: meritocratic enough that a smart kid with a laptop and time could bootstrap their way in without credentials or connections, and valuable enough that breaking in meant a real career. It rewarded the kind of people who spent their teenage years taking things apart. That's a rare combination, and people built their identities around it.
The AI coding tools are good now. Not "good enough to help with boilerplate" — actually good. People are watching that identity proposition erode in real time and they're not happy about it. I don't blame them.
Here's the thing I said back then that I still believe: most of what looked like programming value was really force multiplication value. Code was the thing that let one person build something a million people benefit from, or take a tedious task that would eat a week and finish it in an afternoon. The force multiplication was always the point. The syntax was just the interface.
LLMs are a new interface to the same underlying thing.
I'm not a programmer, so for me this isn't a crisis — it's just the thing becoming accessible. I have old music projects sitting in formats I can't easily convert. I had fifty worldbuilding infographics that were never going to get organized. I can now describe what I need in plain language, hand it to a robot, and get something useful back. That's not replacing a skill I had. It's giving me a capability I never had and wasn't going to develop.
Around 2022 I picked up the Humble Tech Book Bundle: Machine Learning and AI from No Starch Press. Sat down with it genuinely intending to learn. Got overwhelmed and moved on. That's a familiar story — the material assumed a foundation I didn't have, and building that foundation wasn't the point for me. What I actually wanted was to be able to do things with it. Now I can, with a prompt. The gap between wanting a capability and having it has collapsed in a way that would've seemed like a pitch for a sci-fi show a few years ago.
The skill question is more interesting for things I actually care about. I still run the show on music and fiction. Not because I'm suspicious of the tools, but because the doing is the point. I'm not trying to produce output. I'm trying to have the experience of making something. The process is load-bearing. Offloading that would be like hiring someone to go on a walk for you.
But for the stuff I never cared about — the scripting, the formatting, the organizing — I'm happy to hand it off. Someone who always cared more about what they could build than the act of building is going to make the same calculation. That seems fine. That seems correct, actually.
What I'd push back on is the idea that you can let everything atrophy and be fine. The LLMs are good, not infallible. You have to know enough to tell when they're wrong, which means staying in contact with your own skills even if you're offloading chunks of the work. A completely passive relationship with these tools is going to produce passive results. We always lose something in the bargain with new technology, and there will come a time when these tools are good enough that you can lean on the machine and spend the freed-up brainpower on something else. Whether you do that, and whether you should, ought to be an intentional choice.
As for where this all lands — I wrote a few years ago that companies would eventually hire for ability to learn rather than ability to perform gatekeeping exercises. I think that's still coming, just faster than I expected. The frontier model game is probably going to be won by Amazon, Google, and Microsoft, not because they're smarter but because they're so profitable from other things that the cost is a rounding error. The open source models are already good enough for most uses and getting better. When you can run a capable model on consumer hardware without chaining rigs together, a lot of assumptions reset.
The people waiting for the bubble to pop and things to go back to normal are going to be waiting a long time. The bubble will pop — they always do. But that's not the same as the technology going away. The dot-com crash didn't kill the internet. It wiped out the overextended speculators and left the infrastructure standing, and then the internet quietly reorganized everything anyway. The dark fiber Google bought up cheap in the wreckage became the backbone for things nobody had imagined yet. This will go the same way. The froth burns off, the consolidation happens, and the technology keeps being a conduit for change whether anyone's in the mood for it or not. Normal isn't coming back. There's just what comes next.