Cool Data. What Do I Do With It?

Another data-driven piece measuring AI productivity in git commits. Another morning reading it and not knowing what to do with the information.

The piece is fine. The data is fine. The problem is that none of my AI use would show up in any of it — not a commit, not a package, not a repository. I use these tools extensively. I use them every day. The charts would show nothing, and that nothing would read as evidence that nothing is happening.

Neither would my artist friends' use show up. Nor my musician friends'. Nor my academic friends'. Their absence from the data looks identical to absence from the practice. It isn't.

Fifty worldbuilding infographics — years of accumulated visual reference — had been sitting in a folder waiting to become something coherent. Not because the work wasn't worth doing. Because organizing it was the kind of task that sits on a list for months before quietly disappearing. The gap between "this should exist" and "I am going to make this exist today" was exactly the size of the friction involved.

One afternoon with NotebookLM closed that gap. Contradictions resolved, sources cross-referenced, the whole thing shaped around what I actually needed it for. A queryable worldbuilding resource, built in an afternoon instead of never.

That project generated something else: a session analyzing the craft of several Star Trek writers — Fontana, Piller, Moore, Ellison — that became a set of behavioral redirects for a fiction skill file. Structural moves worth stealing, patterns worth encoding. I finished it, filed it, and forgot about it.

Months later I put 43% of a five-hour model quota into getting an outline for the prequel to a novel I haven't written yet. Not to write it — to have ground to stand on before writing the thing it precedes. Some stories need the thousand-year picture before the surface events make sense. I've been reading Animorphs for the first time, starting with The Andalite Chronicles, for the same reason: the prequel first, because certain histories have to exist before the present is legible.

The model pulled in the worldbuilding skill and the fiction skill without being asked. Work I'd stopped thinking about converged into something new. Every programmer recognizes the underlying move: pull reusable patterns out of specific work, package them, call them later. Functions, objects, modules. I did the same thing with AI behavior — extracted a fiction analysis into a skill file, extracted a worldbuilding process into another. The difference is the compiler is a language model and I'd forgotten I wrote the libraries.
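The analogy above can be made concrete. A minimal sketch, with entirely hypothetical file names and contents (my actual skill files differ): skill files are plain text on disk, and "calling the libraries" is nothing more than loading the named files and joining them into a prompt preamble, the way a build step links modules.

```python
from pathlib import Path
import tempfile

# Hypothetical skill files: plain-text behavioral instructions,
# each extracted from an earlier, unrelated project.
skills_dir = Path(tempfile.mkdtemp())
(skills_dir / "fiction-analysis.md").write_text(
    "Prefer scene-level conflict; cut throat-clearing openings."
)
(skills_dir / "worldbuilding.md").write_text(
    "Resolve contradictions against the timeline before drafting."
)

def load_skills(directory: Path, names: list[str]) -> str:
    """Concatenate the named skill files into one prompt preamble,
    skipping any that don't exist — linking modules at build time."""
    parts = []
    for name in names:
        path = directory / f"{name}.md"
        if path.exists():
            parts.append(path.read_text())
    return "\n\n".join(parts)

# Two skills written months apart, composed into one session.
preamble = load_skills(skills_dir, ["fiction-analysis", "worldbuilding"])
```

Because the files are plain text in a directory I control, the same composition works with any model or tool — which is the whole point of owning the libraries.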

None of it ships. All of it is real.

The people making the case for AI to my friends aren't making it badly because they're wrong. They're making it badly because they keep reaching for the tool that works on them — data, studies, charts measuring repository activity — and handing it to people for whom that tool does nothing.

Data reaches decision-makers. Stories reach the people who aren't sure yet. My artist friends, musician friends, academic friends aren't withholding belief pending sufficient evidence. They're not seeing themselves in the conversation at all. The researchers sharing PyPI analyses are, in the most literal sense, talking to each other.

The story is: here's the specific thing I did. Here's what it unblocked. Here's what existed afterward that couldn't have existed otherwise. That's the unit of measure that lands with someone who makes things.

There's a CEO making the rounds right now, pitching an expert marketplace. The idea: a creator's process has standalone value, separable from any individual output. Package it, make it available as a tool, let an LLM apply your methodology to someone else's situation. Revenue split. Sounds familiar if you've been building this way already.

He's not wrong that the idea is real. I know it's real because I built it — for myself, outside any marketplace, with no revenue split, because it made my work better. The skill files aren't a product. They're a practice. And they compound in ways no marketplace can offer, because I own them and they live in plain text and I can take them anywhere.

What the pitch can't account for is the session where three things you built in three separate contexts arrive together uninvited, because the problem you brought was exactly the shape of what you'd already solved. That's not a feature on a pricing page. That's what it feels like when the tool is actually yours.

The window for that — for domain experts building tools that serve their own needs on their own terms — isn't permanently open. Big tech noticed. The offer is already on the table: come build on our platform, under our terms, and we'll call it ownership. It's a better offer than most creators have gotten in a long time. It's still enclosure.

The data isn't wrong. It's measuring real activity in a real ecosystem. It's just measuring a room I wasn't in — and neither were most of the people who most need to hear that this is different, that something genuinely changed, that the thing they've been waiting for a computer to do is, imperfectly and with friction, here.

The argument that reaches them won't come with a chart. It'll come from someone who made something, and can say what it was, and what it cost, and what existed afterward that didn't before.