The Day My Workflow Got 50x More Expensive

Friday, April 4, 2026. I open the terminal, connect OpenClaw to Claude, and start my workday. Or rather: I try to.

Because that Friday, Anthropic simply flipped the switch. Starting at noon (PT), Claude Pro ($20/month) and Max ($100-$200/month) subscriptions no longer covered third-party tools like OpenClaw. If I wanted to keep using it, I needed to migrate to the API — with per-token billing.

The quick math gave me vertigo. A journalist at The Register calculated that during March, his $20 subscription yielded $236 worth of tokens at list price. Others reported even more absurd ratios — 36x the amount paid. A subscription that cost $20 could now turn into $1,000+ per month on the pay-as-you-go model.

As AI product manager Aakash Gupta put it: “The $20/month all-you-can-eat buffet just closed.”

What Happened (The Real Timeline)

The restriction didn’t come out of nowhere. When I reconstructed the timeline, it was clear this was a gradual strategy:

November 2025: OpenClaw (then called Clawdbot) launches and supports access via Claude subscription OAuth tokens. Everything works perfectly.

January 2026: An Anthropic engineer hints on social media that enforcement will tighten. Few notice.

February 14, 2026: Peter Steinberger, OpenClaw’s creator, announces he’s joining OpenAI. Sam Altman posts that Steinberger will “drive the next generation of personal agents.” OpenClaw is transferred to an open-source foundation.

February 20, 2026: Anthropic updates its legal terms to explicitly prohibit subscription OAuth tokens in third-party tools.

April 4, 2026: Ban takes effect. Boris Cherny, Head of Claude Code at Anthropic, announces the change on X: “Our subscriptions weren’t built for the usage patterns of these third-party tools.”

April 10, 2026: Anthropic temporarily bans Steinberger’s own account for “suspicious activity” — even though he was using the API as required by the new rules. After the post goes viral, the account is restored within hours.

Steinberger’s response couldn’t have been more direct: “First they copy some popular features into their closed harness, then they lock out open source.”

Why Anthropic Did This

The official explanation is technical and legitimate: subscriptions were subsidizing usage they were never sized to support. With over 135,000 OpenClaw instances running, each consuming tokens at rates far beyond normal interactive use, the math simply didn’t work. Anthropic’s proprietary tools (Claude Code, Cowork) are optimized to maximize prompt cache hit rates — reusing previously processed text to save compute. OpenClaw didn’t do this.
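To see why cache hit rate matters so much economically: prompt caching bills tokens that match an already-processed prefix at a steep discount, so a tool that keeps its system prompt and history byte-stable pays far less per request. Here's a toy cost model; the 10% cache-read rate is an illustrative assumption roughly in line with Anthropic's published discount, and the token counts are invented:

```python
# Toy model: how a stable, cached prompt prefix cuts input cost.
# Assumes cache reads bill at 10% of the base input rate (an assumption,
# roughly matching Anthropic's published discount). Numbers are illustrative.

def input_cost(cached_tokens: int, fresh_tokens: int,
               rate_per_mtok: float = 3.00,
               cache_discount: float = 0.10) -> float:
    """Dollar cost of one request's input tokens."""
    return (cached_tokens * cache_discount + fresh_tokens) * rate_per_mtok / 1_000_000

# An agent that resends a 50k-token system prompt + history every call:
no_cache = input_cost(0, 50_000)        # nothing cached
with_cache = input_cost(48_000, 2_000)  # stable prefix cached, 2k new tokens

print(f"uncached: ${no_cache:.4f}  cached: ${with_cache:.4f}")
```

At these assumed rates the cached request costs roughly a seventh of the uncached one, which is why a tool that ignores prefix stability looks so expensive from the provider's side.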

But the strategic explanation is more revealing.

Anthropic committed $100 million to its Claude Partner Network in March 2026, launched an enterprise software marketplace running on Claude, and shipped Claude Dispatch for remote agent control — weeks before cutting off OpenClaw. The pattern is consistent: Anthropic wants the revenue, the data, and the governance that come from owning the customer relationship.

And then, as if we needed more context, the leak happened.

Meet Conway: The Super Agent Nobody Was Supposed to See

On March 31, 2026 — four days before the OpenClaw ban — Anthropic accidentally published 512,000 lines of Claude Code source through a source map file included in an npm package. Security researcher Chaofan Shou discovered it. The post on X accumulated over 28 million views. The code was downloaded, mirrored on GitHub (84,000 stars, 82,000 forks), and analyzed by thousands of developers before Anthropic could issue a DMCA takedown.

And buried in those 512,000 lines, developers found something never announced: Project Conway.

Conway isn’t a Claude upgrade. It’s not an improved chatbot. It’s a persistent agent platform — built to run continuously in the background, responding to external triggers instead of waiting for user input.

The capabilities revealed in the code:

Persistent memory. Conway doesn’t forget context between sessions. It maintains a three-layer memory architecture: a lightweight index (MEMORY.md), on-demand topic files, and raw transcripts searchable via grep — never fully reloaded into context.

Always-on and ambient. Unlike all current Claude usage, Conway isn’t activated by prompts. It runs continuously, monitoring webhooks, schedules, and data changes. It can sleep until a specific external signal fires, complete a task, and go back to sleep — a very different cost model from continuous inference.

Proactive. It makes decisions and takes actions before you ask. Monitor incoming tickets, check deployment status, scan competitor sites, generate briefings before meetings — all automatically.

Proprietary extensions (.cnw). Conway has its own extension format — ZIP files that package tools, UI tabs, and context handlers. This isn’t a wrapper around Claude. It’s an ecosystem with a formal configuration layer.

Chrome integration. Conway connects directly to the browser via a Chrome extension, keeping the agent close to wherever you work on the web.
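That three-layer memory design — a small always-loaded index, topic files on demand, and raw transcripts that are only ever grep-searched — can be sketched in a few lines. This is my reconstruction of the pattern the article describes, not code from the leak; the file layout and function names are invented:

```python
# Sketch of the grep-layer of a three-tier memory store: search raw
# transcripts for matching lines so the full transcript never has to be
# loaded into the model's context. Layout and names are hypothetical.
import re
import tempfile
from pathlib import Path

def recall(query: str, root: Path, max_hits: int = 5) -> list[str]:
    """Return up to max_hits transcript lines matching the query."""
    hits: list[str] = []
    for transcript in sorted((root / "transcripts").glob("*.md")):
        for line in transcript.read_text().splitlines():
            if re.search(query, line, re.IGNORECASE):
                hits.append(f"{transcript.name}: {line.strip()}")
                if len(hits) >= max_hits:
                    return hits
    return hits

# Demo with a throwaway memory store:
root = Path(tempfile.mkdtemp())
(root / "transcripts").mkdir()
(root / "transcripts" / "2026-04-01.md").write_text(
    "discussed the Conway leak\nunrelated grocery note\n")
print(recall("conway", root))
```

The point of the design is that context stays small: only matching lines (plus the lightweight index) are ever injected into a prompt, never the transcripts themselves.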

When you see Conway, the OpenClaw block makes complete sense. Anthropic isn’t just protecting margins. It’s clearing the way for its own persistent agent product — one that directly competes with what OpenClaw offered for free via subscription.
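The always-on trigger model described above — sleep until an external signal fires, do the task, go back to sleep — is also easy to sketch. Again, this is my reconstruction of the pattern, not leaked code; the names are invented:

```python
# Minimal sketch of an always-on trigger loop: the agent blocks (no
# inference running) until a webhook/schedule event arrives, handles it,
# then goes back to sleep. A sentinel value of None shuts it down.
import queue

def agent_loop(triggers: queue.Queue, handle) -> int:
    """Consume events until a None sentinel; return how many were handled."""
    handled = 0
    while True:
        event = triggers.get()   # sleeps here; zero cost while idle
        if event is None:        # sentinel: shut down cleanly
            break
        handle(event)            # wake, do the task, loop back to sleep
        handled += 1
    return handled

# Demo: two webhook-style events, then shutdown.
log = []
q = queue.Queue()
for ev in ("ticket#101", "deploy-ok", None):
    q.put(ev)
count = agent_loop(q, log.append)
print(count, log)
```

This is the "very different cost model" the article mentions: an idle blocked loop costs nothing, whereas polling a model continuously would burn tokens around the clock.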

The “Walled Garden Wars”

Anthropic’s move reflects a brutal trend in 2026: AI companies don’t want to be just the “engine” (the model). They want to be the “complete car” — the interface, the agent, and the entire ecosystem.

Google did something similar in February, blocking third-party tools from using Gemini CLI’s OAuth. The Google engineer’s wording was almost identical to Anthropic’s: “Using third-party software to harvest or piggyback on Gemini CLI’s OAuth authentication is a direct violation of terms.”

Interoperability is dying. Walled gardens are going up.

What to Do Now (Real Alternatives)

If you, like me, depended on this setup, here are the paths I’ve tested:

Anthropic API (pay-as-you-go). Still the best reasoning on the market, in my opinion. But the cost changes dramatically: $3/million input tokens and $15/million output tokens for Sonnet 4.6. For Opus 4.6, $15 and $75 respectively. Cherny published PRs to improve cache hit rate specifically for OpenClaw via API — helpful, but doesn’t solve the price jump.
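To make the price jump concrete, here's the monthly bill at the list prices quoted above. The usage figures (60M input / 4M output tokens per month) are an illustrative assumption for a moderately busy agent, not measured data:

```python
# Monthly API cost at the list prices quoted above, in USD per million
# tokens (input_rate, output_rate). Usage figures are illustrative.
PRICES = {
    "sonnet-4.6": (3.00, 15.00),
    "opus-4.6": (15.00, 75.00),
}

def monthly_cost(model: str, input_mtok: float, output_mtok: float) -> float:
    """Dollar cost for a month of usage, in millions of tokens."""
    in_rate, out_rate = PRICES[model]
    return input_mtok * in_rate + output_mtok * out_rate

print(monthly_cost("sonnet-4.6", 60, 4))  # 240.0
print(monthly_cost("opus-4.6", 60, 4))    # 1200.0
```

At that assumed volume, Sonnet alone lands right around the $236-worth-of-tokens figure The Register reported for a single $20 subscription — and Opus pushes past the $1,000/month mark.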

OpenAI Codex/ChatGPT. Native integration, robust ecosystem, and now with Steinberger leading personal agents. When asked why he still tests with Claude, he replied “working on that” — a clue about what OpenAI is preparing.

Open local models. Google’s Gemma, NVIDIA’s Nemotron, Meta’s Llama. Freedom without token fees, but with clear quality trade-offs on complex tasks. For many “Efficiency Pillar” tasks, they’re more than enough.

AWS Bedrock and Google Vertex AI. Accessing Claude via cloud providers continues working normally — no subscription restriction. If your company is already in those ecosystems, this may be the path of least resistance.

Conclusion: What I Really Think

I understand Anthropic’s position. Financially, the model was unsustainable. With reported annualized revenue of $19 billion and an IPO on the horizon, subsidizing massive agent usage was a real risk.

But the execution was rough. Announcing on a Friday night. Giving one week’s notice. Temporarily banning OpenClaw’s creator. Launching Dispatch weeks before the cut. The optics are of a company that copies features, locks out open source, and then offers a proprietary version.

Steinberger wasn’t wrong when he said: “One welcomed me, the other sent legal threats” — explaining why he chose OpenAI.

The era of using third-party agents for free under cheap subscriptions is ending. If your workflow depends on Claude, prepare your wallet or diversify. Conway will arrive — probably still in 2026 — and it will likely be impressive. But it will have a price.

And at that point, each of us must decide: pay the premium or invest in open models that give you freedom without asking permission.

When the tool you’ve been using for free decides to charge, it’s telling you who it is. Believe it.
