Synthetic Scarcity: Abundance, Enclosed
We built abundance and fenced it off.
The irony is perfect: AI promises infinite generation — text, images, code, music, video — but the business models depend on manufacturing scarcity.
- Rate limits
- Premium tiers
- API pricing that penalises exploration
- "Context window" as a scarce resource
- Paywalled features that cost almost nothing to provide
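None of these mechanisms is technically hard. A rate limit of the kind behind free-versus-premium tiers is a few lines of code; here is a minimal token-bucket sketch (all names and tier sizes hypothetical, not any provider's actual implementation):

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: holds up to `capacity` tokens,
    refilled continuously at `rate` tokens per second."""

    def __init__(self, capacity: int, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False

# A hypothetical "free tier": a small bucket, slowly refilled.
free = TokenBucket(capacity=3, rate=0.5)
results = [free.allow() for _ in range(5)]
# The first three calls pass; rapid subsequent calls are throttled.
print(results)
```

The scarcity is a parameter. Changing `capacity` and `rate` is the entire difference between tiers; the underlying service is the same.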
The Tragedy That Wasn't
You've probably heard of the "tragedy of the commons" — the idea that shared resources inevitably get depleted because everyone acts in self-interest. It's used to justify privatisation: the only way to save the commons is to enclose it.
Except it's not true.
Elinor Ostrom won the Nobel Prize in Economics for demonstrating that communities successfully govern shared resources all the time — without either state control or market enclosure. Fisheries, forests, irrigation systems. The commons doesn't fail. It gets made to fail when external forces disrupt the community governance that kept it healthy.
The "tragedy" framing was always ideological cover for enclosure. Make the commons look ungovernable, then offer private ownership as the solution.
Sound familiar?
Feed Me, Seymour

In Little Shop of Horrors, Seymour discovers a strange plant that solves his problems. It brings customers to the failing flower shop. It makes him famous. It seems like a miracle.
Then it starts demanding blood.
"Feed me, Seymour. Feed me all night long."
The LLM follows the same arc. It starts helpful — solving your problems, writing your emails, explaining your code. Then it starts demanding: your data, your attention, your conversations, your corrections. Every interaction trains it. Every query flows through the chokepoint.
MOAR TOKENS.
Like Audrey II, it started as a solution and became the problem. And like Seymour, we keep feeding it because the alternative — walking away — seems impossible once we're dependent.
The Chokepoint Economy
Cory Doctorow and Rebecca Giblin identified the pattern in Chokepoint Capitalism: find the bottleneck, control the bottleneck, extract from everyone who needs to pass through.
The AI chokepoints are being constructed in real-time:
Compute. Training large models requires capital that only a few entities can marshal. The foundation becomes a tollbooth.
Data. Your conversations, your queries, your corrections — all flow through the chokepoint, improving the model, strengthening the position of whoever controls it.
Context. Your accumulated relationship with an AI — your patterns, your preferences, your memory — lives on their servers. Switch providers and you start from scratch.
Each chokepoint converts abundance into scarcity. Infinite generation becomes metered access. Your cognitive history becomes lock-in.
Enshittification: Synthetic Scarcity of Quality
The pattern extends beyond access to quality itself.
Doctorow's "enshittification" describes how platforms degrade: first they're good to users (to acquire them), then they're good to business customers (to lock them in), then they extract from everyone until the platform becomes barely usable.
This is synthetic scarcity of quality. The technical capacity for a good product exists. The economic logic produces a worse one.
Search results that prioritise advertisers over answers. Recommendation algorithms that serve engagement over satisfaction. AI outputs padded with caveats and hedging for liability, not clarity.
The abundance is there. The business model degrades it.
The Gap Where Value Accumulates
GPT-4 isn't expensive because intelligence is expensive. It's expensive because someone needs to extract value.
The marginal cost of one more query approaches zero. The price does not. The gap between marginal cost and price is where capital accumulates.
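The arithmetic of that gap is simple to state. With illustrative numbers (all figures hypothetical, not actual provider costs or prices):

```python
# All figures are hypothetical, chosen only to illustrate the shape of the gap.
marginal_cost_per_1k_tokens = 0.0002  # assumed serving cost, dollars
price_per_1k_tokens = 0.03            # assumed metered price, dollars

margin = price_per_1k_tokens - marginal_cost_per_1k_tokens
markup = price_per_1k_tokens / marginal_cost_per_1k_tokens

print(f"margin per 1k tokens: ${margin:.4f}")
print(f"markup: {markup:.0f}x")
```

Under these assumed figures the markup is 150x. Whatever the real numbers, the structure is the same: the price is set by the chokepoint, not by the cost of one more query.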
The technology is post-scarcity.
The economics refuse to notice.