
When AI Eats Itself: Why Your Writing Is About to Be Worth More

By Mark Hankin

Here's a phrase I've been thinking about a lot lately: model autophagy disorder.

It sounds like something you'd hear on a particularly grim episode of House. ("The patient's models are eating their own outputs, Doctor — it's degenerative, and frankly we should have seen this coming.") But it's a real thing, documented by AI researchers in 2024 (a Nature paper calls the phenomenon "model collapse"; the "autophagy" coinage comes from a related paper on self-consuming generative models), and it describes what happens when you train AI language models on text that was itself generated by AI.

The short version: they get worse.

Not catastrophically. Not all at once. But measurably, generation by generation, like a photocopy of a photocopy of a photocopy. The original sharp lines blur. The quirks and rhythms and idiosyncrasies — the things that make human language feel like something a person actually said — get smoothed out into a kind of polite mush.
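You can watch the photocopy effect happen in a toy simulation. This is an illustration of the general mechanism, not the Nature paper's experimental setup: fit a trivially simple "model" (here, just a mean and a standard deviation) to some data, generate fresh data from the fitted model, and train the next generation on that output. The spread of the data shrinks generation by generation, because each fit slightly underestimates the tails of the one before it.

```python
import numpy as np

def train_generation(samples: np.ndarray, n: int, rng) -> np.ndarray:
    """'Train' a toy model on samples (fit mean and std),
    then 'generate' n new samples from the fitted model."""
    mu, sigma = samples.mean(), samples.std()  # biased std: the fit underestimates the tails
    return rng.normal(mu, sigma, size=n)

rng = np.random.default_rng(0)
n = 100
data = rng.normal(0.0, 1.0, size=n)  # generation 0: "human-written" data
stds = [float(data.std())]
for _ in range(2000):                # each generation trains only on the previous one's output
    data = train_generation(data, n, rng)
    stds.append(float(data.std()))

print(f"generation 0 spread:    {stds[0]:.3f}")
print(f"generation 2000 spread: {stds[-1]:.3f}")
```

Run it and the final spread comes out dramatically smaller than the starting one: the quirks at the edges of the distribution are exactly what each round of refitting sands away. Real language models are vastly more complex than a fitted Gaussian, but the documented failure mode is the same shape.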

This matters more than it might first appear. Because we are, very obviously, in the middle of an AI-generated content explosion. Every blog post farm, every SEO-optimised content mill, every "10 best ways to write a book" listicle that nobody wrote and nobody really reads — they're all flooding the open web with text that AI produced, which is exactly the text that the next generation of AI is being trained on.

The internet is becoming an ouroboros. The snake is eating its own tail. And the people who keep churning out AI-generated content at scale aren't just being a bit annoying — they're actively degrading the resource that future AI systems depend on.

So what does this have to do with you and your novel?

Quite a lot, actually.

Three things are happening at once, and they're more connected than they look.

First: the major AI labs — Anthropic, OpenAI, Google, Mistral, the lot — are starting to grapple with where the next decade of training data comes from. The open web is increasingly polluted. Books, magazines, newspapers, and other sources of high-quality human-written prose are becoming disproportionately valuable. There's already a market forming for licensed datasets of "verified human-authored" creative writing. It's small now. It will not be small for long.

Second: readers are getting better at smelling AI-generated content. Maybe not the average reader at first glance, but Goodreads communities, BookTok creators, indie reviewer cohorts, the literary-fiction crowd — they're catching on, and they're vocal about it. Books that read as synthetic get one-star reviewed into the ground. Books that are unmistakably human get celebrated for that reason alone.

Third: distributors and prizes are tightening up. Amazon KDP now requires AI disclosure (I wrote about the practical implications back in April). The Booker, the Pulitzer, the National Book Award — all have human-authorship clauses. The trajectory is unmistakable.

In a world where AI-generated text is cheap, abundant, and quietly poisoning its own future, human writing is going from an assumption to an asset. And not just morally. Commercially. Demonstrably yours, attestably yours, actually-you-wrote-it yours.

A generous word about my competitors

Here I should be fair to the AI-first writing tools that have appeared over the last couple of years. I don't think they're villains.

Sudowrite has built impressive tools for fiction writers who get stuck on the actual sentences — describe-this-scene generators, brainstorm engines, prose-rewriters that can shake a stuck paragraph loose. For some writers, this is the difference between finishing a draft and not. I respect that.

NovelCrafter has built a brilliant codex-and-orchestration system for writers managing long fantasy series with seventeen named characters and forty cultures. The depth of its world-building tools is exceptional. If that's the problem you have, it's a tool worth using.

NovelAI has carved out a niche for storytellers who want generative text without content restrictions — fan fiction, role-play, experimental work. It serves an audience that the big AI providers won't, and there's something honest about that.

Each platform is making a different bet about how AI fits into the writing process. They've each chosen to lean into the generation side — let the AI write longer, faster, freer.

I've made the opposite bet.

Why I went the other way

gowrite uses AI too. Of course it does. (You'd have to be deliberately obtuse to build a writing platform in 2026 without it.) But the AI in gowrite is structurally different in two ways.

First, it's an advisor, not a generator. You select a passage. The AI suggests. You decide whether to accept the suggestion, modify it, or dismiss it entirely. The default is suggestion, not substitution. There's no auto-write button. There's no "generate the next chapter for me." The writer is always making the call, always doing the work, always the author.

Second — and this is the bit nobody else does — gowrite tracks the process. Every accepted suggestion. Every dismissed one. Every AI involvement, recorded. When you export your manuscript, gowrite shows you a verifiable percentage of how much of your finished prose passed through AI. For most writers using the tool as I designed it, that number is reassuringly low. And it's not just reassuring. It's evidence. Attestable, signed, "this is what actually happened" evidence that a real human wrote your book.
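For the curious, here is a minimal sketch of what process-level attestation can look like. The names and schema below are hypothetical, not gowrite's actual implementation, but the principle is the one described above: record every edit event, derive the AI-involvement percentage from the log, and bind that number to a hash of the log it was computed from, so the claim is checkable rather than merely asserted.

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class EditEvent:
    # Hypothetical event shape for illustration only.
    source: str  # "human" or "ai_accepted"
    chars: int   # characters this event contributed to the final text

def ai_involvement(events: list[EditEvent]) -> float:
    """Fraction of the surviving prose that passed through an accepted AI suggestion."""
    total = sum(e.chars for e in events)
    ai = sum(e.chars for e in events if e.source == "ai_accepted")
    return ai / total if total else 0.0

def attest(events: list[EditEvent]) -> dict:
    """Bundle the percentage with a hash of the full event log,
    so the number can be verified against the log that produced it."""
    log = json.dumps([asdict(e) for e in events], sort_keys=True).encode()
    return {
        "ai_fraction": round(ai_involvement(events), 4),
        "log_sha256": hashlib.sha256(log).hexdigest(),
    }

events = [EditEvent("human", 4200), EditEvent("ai_accepted", 180), EditEvent("human", 950)]
print(attest(events))  # ai_fraction: 0.0338, i.e. about 3% of the prose touched AI
```

A production system would sign the digest with a key rather than just hash the log, but even this stripped-down version captures the core move: the percentage is a function of the recorded process, not a self-reported number.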

That evidence is going to matter more, not less, as the next few years unfold. Not because anyone's going to turn up at your door demanding it. But because:

- Distributors will increasingly ask for it
- Awards will increasingly require it
- Readers will increasingly trust it
- Clean, verified human-authored writing is starting to look like a scarce resource in a market that's only just beginning to notice

The other writing platforms could add this kind of attestation. They could. But it would be commercially awkward for them, because attesting to low AI involvement on a tool whose primary feature is "AI generates prose for you" creates an obvious tension. Their job is to show you what AI can do. Mine is to show you what you did.

The slow shift in what writing is for

I realise this all sounds quite practical and commercial. Let me be slightly less practical for a moment.

The thing I keep coming back to is this: writing has always been an act of one person trying to get something specific out of their head and into someone else's. The shape of that act has changed across history — from carved stone to scratched papyrus to printing press to laptop screen — but the core has been stubbornly consistent. One human mind producing language that another human mind will eventually read.

AI-generated text breaks that contract in a way I don't think we've fully reckoned with. Not because it's dishonest (it can be very good), but because the mind on the other end of it is the wrong shape. There's nobody behind the words. Nobody who chose them. Nobody whose specific experience filtered through to produce that exact sentence in that exact rhythm.

When you read a book, what you're doing — at some level — is borrowing the consciousness of the person who wrote it. For as long as you're reading, you are seeing the world through eyes that aren't yours. That is, I'd argue, more or less the point of fiction.

You can't borrow consciousness from a thing that doesn't have one. You can have a perfectly good time with the prose. You can enjoy the plot. But you don't meet anyone, because there's no one there to meet.

This is why human authorship matters, even before we get to model collapse and training-data economics. It's why I built gowrite the way I built it. And it's why, when I think about the kind of writing platform I want to be running in five years, the answer has very little to do with how clever the AI is and almost everything to do with how plainly we can say: yes, the person whose name is on this book is the person who wrote it.

The AI will keep getting better. The internet will keep filling up with synthetic text. Model autophagy disorder will quietly chew through the next generation of language models, and the labs will quietly look for cleaner data.

Writers — actual writers, with real voices and real experiences and real things to say — are about to be the most valuable resource in the room.

You should probably write that book.
