<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Economics on Carles Abarca</title><link>https://carlesabarca.com/tags/economics/</link><description>Recent content in Economics on Carles Abarca</description><generator>Hugo -- gohugo.io</generator><language>en</language><copyright>© 2026 Carles Abarca</copyright><lastBuildDate>Tue, 07 Apr 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://carlesabarca.com/tags/economics/index.xml" rel="self" type="application/rss+xml"/><item><title>The Era of Cheap AI Is Ending</title><link>https://carlesabarca.com/posts/cheap-ai-ending/</link><pubDate>Tue, 07 Apr 2026 00:00:00 +0000</pubDate><guid>https://carlesabarca.com/posts/cheap-ai-ending/</guid><description>Subscription restrictions, harder usage limits, and rising inference costs: AI was never cheap — it was subsidized.</description><content:encoded>&lt;p&gt;&lt;strong&gt;AI felt cheap. It wasn&amp;rsquo;t. It was subsidized.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;A couple of years ago, if someone had told you that the world&amp;rsquo;s most advanced AI would be available for $20 a month, you would have laughed. And yet, for a while, that&amp;rsquo;s exactly what it felt like.&lt;/p&gt;
&lt;p&gt;Flat-rate subscriptions. Increasingly capable models. Usage that, in practice, felt nearly unlimited. The prevailing perception was clear: advanced AI was becoming an abundant, accessible resource.&lt;/p&gt;
&lt;p&gt;But that perception is starting to crack. Not because of a technical failure — because of something more fundamental: economics.&lt;/p&gt;

&lt;h2 class="relative group"&gt;The Signs Are Already Here
 &lt;div id="the-signs-are-already-here" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#the-signs-are-already-here" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h2&gt;
&lt;p&gt;In recent weeks, several converging signals have pointed in the same direction:&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Anthropic&lt;/strong&gt;, the creator of Claude, has started actively restricting certain usage patterns through its subscriptions. In particular, it has blocked automation tools like OpenClaw from channeling requests through subscription accounts. The reason is straightforward: the inference cost of those intensive usage patterns doesn&amp;rsquo;t square with the subscription price.&lt;/p&gt;
&lt;p&gt;This isn&amp;rsquo;t an isolated case. &lt;strong&gt;OpenAI&lt;/strong&gt; has been adjusting the real usage limits of its subscription plans for months, progressively reducing the number of interactions with their most powerful models before downgrading users to a lesser model. What were once generous, fuzzy limits are becoming explicit, harder caps.&lt;/p&gt;
&lt;p&gt;And there&amp;rsquo;s something more unsettling: &lt;strong&gt;even with these adjustments, most major AI operators are still not profitable&lt;/strong&gt;. Anthropic, OpenAI, and virtually every frontier lab are burning capital at a pace that would make any traditional CFO faint.&lt;/p&gt;
&lt;p&gt;The obvious question is: if users are already feeling the restrictions, how can providers still be losing money?&lt;/p&gt;
&lt;p&gt;The answer reveals something important about the real economic structure of this industry.&lt;/p&gt;

&lt;h2 class="relative group"&gt;AI Was Never Cheap. It Was Subsidized.
 &lt;div id="ai-was-never-cheap-it-was-subsidized" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#ai-was-never-cheap-it-was-subsidized" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h2&gt;
&lt;p&gt;What we&amp;rsquo;ve experienced over the past two years was not the real cost of AI. It was an adoption strategy.&lt;/p&gt;
&lt;p&gt;The major labs needed to do three things simultaneously:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Demonstrate capability&lt;/strong&gt; to justify valuations in the tens (or hundreds) of billions.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Drive massive adoption&lt;/strong&gt; to create network effects, lock-in, and usage data.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Capture developers and enterprises&lt;/strong&gt; before the competition did.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;To achieve this, they offered access to frontier models at prices that didn&amp;rsquo;t reflect the real cost of operating them. The $20/month subscriptions were, in practice, a subsidy funded by venture capital.&lt;/p&gt;
&lt;p&gt;And it worked. Adoption soared. Millions of people began using Claude, ChatGPT, and other models daily. Companies of every size started integrating AI into their workflows.&lt;/p&gt;
&lt;p&gt;But now we&amp;rsquo;re entering the next phase. And in this phase, the numbers have to start adding up.&lt;/p&gt;

&lt;h2 class="relative group"&gt;Frontier Models Don&amp;rsquo;t Follow Traditional Software Economics
 &lt;div id="frontier-models-dont-follow-traditional-software-economics" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#frontier-models-dont-follow-traditional-software-economics" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h2&gt;
&lt;p&gt;There&amp;rsquo;s a widespread misconception that needs to be addressed head-on: many people assume that AI follows the same economic logic as conventional software. That is, you develop it once, distribute it at near-zero marginal cost, and margins improve with scale.&lt;/p&gt;
&lt;p&gt;But generative AI — especially frontier models — doesn&amp;rsquo;t work like that. Not at all.&lt;/p&gt;
&lt;p&gt;Every interaction with a large model involves:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Real-time computation&lt;/strong&gt; on very expensive hardware (state-of-the-art GPUs).&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Significant energy consumption&lt;/strong&gt; per request.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Infrastructure costs&lt;/strong&gt; that don&amp;rsquo;t disappear with scale; in many cases, they grow with it.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Latency and availability&lt;/strong&gt; requirements that demand reserved capacity, not just peak capacity.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;And the problem is amplified by the most recent usage trends. AI agents — which execute multiple chained calls to complete complex tasks — multiply inference costs dramatically. A single agent session solving a programming problem can involve dozens of model calls, each with long contexts and active tools.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;More capability doesn&amp;rsquo;t automatically mean lower unit cost. In many frontier AI cases, it means exactly the opposite.&lt;/strong&gt;&lt;/p&gt;

&lt;h2 class="relative group"&gt;The Flat Rate Was a Commercial Strategy, Not a Sustainable Reality
 &lt;div id="the-flat-rate-was-a-commercial-strategy-not-a-sustainable-reality" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#the-flat-rate-was-a-commercial-strategy-not-a-sustainable-reality" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h2&gt;
&lt;p&gt;Let&amp;rsquo;s think about what a flat-rate subscription to a frontier model actually meant in practice:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;A user paid $20 a month.&lt;/li&gt;
&lt;li&gt;They had access to a model whose inference cost, under intensive use, could easily exceed $100 or $200 per month per user.&lt;/li&gt;
&lt;li&gt;The provider absorbed the difference, betting that the average user wouldn&amp;rsquo;t use the model that intensively.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;That model works reasonably well when most users are casual: they ask a few questions a day, use the model for simple tasks, and don&amp;rsquo;t stress the infrastructure. It&amp;rsquo;s the same principle that makes gyms work: they sell more memberships than the gym can simultaneously accommodate, betting that most people won&amp;rsquo;t show up every day.&lt;/p&gt;
&lt;p&gt;But when tools emerge that channel intensive, programmatic use through those subscriptions, the model breaks. It&amp;rsquo;s as if someone found a way to fill the entire gym 24 hours a day. The price no longer covers the cost.&lt;/p&gt;
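&lt;p&gt;To make the arithmetic concrete, here is a toy break-even sketch using the illustrative figures above (a $20 plan against heavy-use inference costs in the $100&#8211;200 range). The usage mix is an assumption for illustration, not provider data:&lt;/p&gt;

```python
# Toy break-even model for a flat-rate AI subscription.
# All figures are illustrative, taken from the discussion above.

PRICE = 20.0  # monthly subscription price in USD

# Assumed usage mix: share of subscribers paired with their monthly
# inference cost to the provider.
usage_mix = [
    (0.80, 5.0),    # casual users: a few questions a day
    (0.15, 30.0),   # regular users
    (0.05, 150.0),  # heavy users: agents, long contexts, active tools
]

avg_cost = sum(share * cost for share, cost in usage_mix)
margin = PRICE - avg_cost
print(f"average inference cost per subscriber: ${avg_cost:.2f}")
print(f"margin per subscriber: ${margin:.2f}")

# If heavy usage grows from 5% to 25% of subscribers (programmatic use
# channeled through subscriptions), the same plan flips to a loss:
shifted_mix = [(0.60, 5.0), (0.15, 30.0), (0.25, 150.0)]
avg_shifted = sum(share * cost for share, cost in shifted_mix)
print(f"margin after the shift: ${PRICE - avg_shifted:.2f} per subscriber")
```

&lt;p&gt;With mostly casual users the plan clears a small margin; shift a quarter of subscribers to heavy programmatic use and the same plan loses money on every account.&lt;/p&gt;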
&lt;p&gt;That&amp;rsquo;s why providers are reacting:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;More explicit usage limits&lt;/strong&gt; per model and per period.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Automatic degradation&lt;/strong&gt; to less expensive models when certain thresholds are reached.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Contractual restrictions&lt;/strong&gt; against unanticipated usage patterns.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Clearer separation&lt;/strong&gt; between consumer, API, and enterprise plans.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;It&amp;rsquo;s not a whim. It&amp;rsquo;s economic survival.&lt;/p&gt;

&lt;h2 class="relative group"&gt;The Great Paradox: More Restrictions, and Still Not Enough
 &lt;div id="the-great-paradox-more-restrictions-and-still-not-enough" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#the-great-paradox-more-restrictions-and-still-not-enough" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h2&gt;
&lt;p&gt;Here&amp;rsquo;s the data point that should give pause to any business leader betting heavily on AI:&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Even after introducing all these restrictions, most frontier AI operators are still losing money.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Anthropic has raised over $10 billion in funding. OpenAI is on a similar trajectory. And neither has yet demonstrated a sustainable economic model at scale without continuous external capital injection.&lt;/p&gt;
&lt;p&gt;This doesn&amp;rsquo;t mean the business is unviable. There&amp;rsquo;s probably an economic equilibrium somewhere, with the right combination of pricing, inference efficiency, enterprise volume, and hardware optimization. But that equilibrium clearly hasn&amp;rsquo;t been reached yet.&lt;/p&gt;
&lt;p&gt;And meanwhile, the industry keeps advancing toward bigger, more capable models with wider context windows, more integrated tools, and more agentic capabilities. All of those improvements are fantastic for users. But each one &lt;strong&gt;increases inference cost&lt;/strong&gt; rather than reducing it.&lt;/p&gt;

&lt;h2 class="relative group"&gt;What This Means for Enterprises
 &lt;div id="what-this-means-for-enterprises" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#what-this-means-for-enterprises" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h2&gt;
&lt;p&gt;If you&amp;rsquo;re a digital transformation leader or a CTO designing your organization&amp;rsquo;s AI strategy, the message is clear:&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Don&amp;rsquo;t build your AI architecture assuming that today&amp;rsquo;s cost is tomorrow&amp;rsquo;s cost. And certainly not assuming it will go down.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;In practice, this means several things:&lt;/p&gt;

&lt;h3 class="relative group"&gt;1. Design for Multiple Model Tiers
 &lt;div id="1-design-for-multiple-model-tiers" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#1-design-for-multiple-model-tiers" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h3&gt;
&lt;p&gt;Not everything needs a frontier model. Many tasks — classification, data extraction, routine summaries, basic assistance — can be handled by smaller, significantly cheaper models. Reserve the most powerful models for tasks that truly require them.&lt;/p&gt;
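&lt;p&gt;One minimal way to encode that discipline is a static task-to-tier map. The tier names, task categories, and per-token prices below are hypothetical placeholders, not any provider&amp;rsquo;s catalog:&lt;/p&gt;

```python
# Hypothetical task-to-tier map: reserve the frontier tier for tasks
# that genuinely need it, and default everything else to cheaper models.
MODEL_TIERS = {
    "small":    {"cost_per_1k_tokens": 0.0002},
    "mid":      {"cost_per_1k_tokens": 0.003},
    "frontier": {"cost_per_1k_tokens": 0.03},
}

TASK_TO_TIER = {
    "classification":    "small",
    "data_extraction":   "small",
    "summarization":     "mid",
    "basic_assistance":  "mid",
    "complex_reasoning": "frontier",
    "agentic_coding":    "frontier",
}

def tier_for(task: str) -> str:
    # Unknown tasks default to the cheapest tier, not the most expensive:
    # escalation should be a deliberate decision, never the fallback.
    return TASK_TO_TIER.get(task, "small")

print(tier_for("classification"))  # small
print(tier_for("agentic_coding"))  # frontier
```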

&lt;h3 class="relative group"&gt;2. Implement Intelligent Routing
 &lt;div id="2-implement-intelligent-routing" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#2-implement-intelligent-routing" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h3&gt;
&lt;p&gt;Mature AI architectures don&amp;rsquo;t send everything to the same model. They route each request to the most efficient model that can resolve it at the required quality level. This can reduce inference costs by 60-80% without degrading the end-user experience.&lt;/p&gt;
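&lt;p&gt;A sketch of the routing idea, with hypothetical model names and a stubbed quality check standing in for whatever evaluation your stack actually uses:&lt;/p&gt;

```python
# Sketch of an escalating router: try the cheapest capable model first,
# escalate only when the response fails a quality check.
# Model names, costs, and the quality check are illustrative stubs.

LADDER = [("small-model", 0.0002), ("mid-model", 0.003), ("frontier-model", 0.03)]

def call_model(name: str, prompt: str) -> str:
    # Stub standing in for a real provider API call.
    return f"[{name}] answer to: {prompt}"

def good_enough(response: str, required_quality: float, rank: int) -> bool:
    # Stub: pretend higher-ranked models clear higher quality bars.
    return (rank + 1) / len(LADDER) >= required_quality

def route(prompt: str, required_quality: float = 0.3) -> tuple:
    for rank, (name, cost) in enumerate(LADDER):
        response = call_model(name, prompt)
        if good_enough(response, required_quality, rank):
            return name, response
    return LADDER[-1][0], response  # fall back to the frontier model

name, _ = route("Classify this support ticket", required_quality=0.3)
print(name)
name, _ = route("Refactor this legacy module", required_quality=0.9)
print(name)
```

&lt;p&gt;The design choice that matters is the escalation order: start cheap and climb only on failure, rather than defaulting to the frontier model and optimizing later.&lt;/p&gt;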

&lt;h3 class="relative group"&gt;3. Measure Cost Per Use Case
 &lt;div id="3-measure-cost-per-use-case" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#3-measure-cost-per-use-case" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h3&gt;
&lt;p&gt;If you don&amp;rsquo;t know how much each AI interaction costs, broken down by task type, model used, and outcome achieved, you&amp;rsquo;re flying blind. AI cost observability should be as standard as it is in cloud infrastructure.&lt;/p&gt;
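&lt;p&gt;A minimal shape for that observability, assuming you already log token counts per call. The price table and use-case names are placeholders to be replaced with your providers&amp;rsquo; actual rates and your own taxonomy:&lt;/p&gt;

```python
# Minimal per-use-case cost ledger. Prices per 1K tokens are placeholders.
from collections import defaultdict

PRICE_PER_1K = {"small-model": 0.0002, "frontier-model": 0.03}

ledger = defaultdict(lambda: {"calls": 0, "cost": 0.0})

def record(use_case: str, model: str, tokens: int) -> None:
    # Attribute every call to a use case, not just to a provider invoice.
    entry = ledger[use_case]
    entry["calls"] += 1
    entry["cost"] += tokens / 1000 * PRICE_PER_1K[model]

# Simulated traffic for two use cases.
record("ticket_triage", "small-model", 1200)
record("ticket_triage", "small-model", 800)
record("code_review", "frontier-model", 6000)

for use_case, entry in sorted(ledger.items()):
    print(f'{use_case}: {entry["calls"]} calls, ${entry["cost"]:.4f}')
```

&lt;p&gt;Even this crude breakdown answers the question most teams cannot: which use cases are cheap at scale, and which ones quietly dominate the bill.&lt;/p&gt;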

&lt;h3 class="relative group"&gt;4. Think of AI as Infrastructure, Not Just Software
 &lt;div id="4-think-of-ai-as-infrastructure-not-just-software" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#4-think-of-ai-as-infrastructure-not-just-software" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h3&gt;
&lt;p&gt;Generative AI has a variable cost component that resembles electricity or cloud compute more than a software license. Plan accordingly: with capacity reserves, variable budgets, and consumption governance.&lt;/p&gt;

&lt;h3 class="relative group"&gt;5. Don&amp;rsquo;t Depend on a Single Provider
 &lt;div id="5-dont-depend-on-a-single-provider" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#5-dont-depend-on-a-single-provider" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h3&gt;
&lt;p&gt;Concentration on a single model provider exposes you directly to their pricing decisions, policy changes, and usage restrictions. A multi-model architecture gives you the flexibility to adapt when — not if — conditions change.&lt;/p&gt;
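&lt;p&gt;One common mitigation is a thin provider-agnostic interface, so that swapping vendors becomes a configuration change rather than a rewrite. The provider functions below are stubs, not real SDK calls:&lt;/p&gt;

```python
# Thin provider-agnostic layer: application code depends on complete(),
# never on any one vendor's SDK. Provider functions are illustrative stubs.

def _provider_a(prompt: str) -> str:
    return f"provider-a: {prompt}"

def _provider_b(prompt: str) -> str:
    return f"provider-b: {prompt}"

PROVIDERS = {"a": _provider_a, "b": _provider_b}
ACTIVE = "a"  # flipped via configuration when pricing or policy changes

def complete(prompt: str) -> str:
    return PROVIDERS[ACTIVE](prompt)

print(complete("summarize this contract"))
```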

&lt;h2 class="relative group"&gt;The End of an Era, Not the End of AI
 &lt;div id="the-end-of-an-era-not-the-end-of-ai" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#the-end-of-an-era-not-the-end-of-ai" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h2&gt;
&lt;p&gt;I want to be clear about something: &lt;strong&gt;nothing I&amp;rsquo;ve described here is negative for the future of AI&lt;/strong&gt;. It is simply the end of a phase.&lt;/p&gt;
&lt;p&gt;The phase that&amp;rsquo;s ending is one of &lt;strong&gt;illusory abundance&lt;/strong&gt;: frontier models at promotional prices, apparently unlimited usage, and a widespread feeling that advanced AI was becoming a commodity.&lt;/p&gt;
&lt;p&gt;The phase that&amp;rsquo;s beginning is more honest. It&amp;rsquo;s the phase where:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Prices more faithfully reflect real costs.&lt;/li&gt;
&lt;li&gt;Providers find sustainable business models.&lt;/li&gt;
&lt;li&gt;Enterprises learn to use AI with economic discipline.&lt;/li&gt;
&lt;li&gt;And the market matures, as cloud, SaaS, and every other technology infrastructure that started with promotional pricing has matured before it.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;In a sense, it&amp;rsquo;s good news. Because an indefinitely subsidized market is a fragile market. And a market that finds its economic equilibrium is a market that can last.&lt;/p&gt;

&lt;h2 class="relative group"&gt;Conclusion
 &lt;div id="conclusion" class="anchor"&gt;&lt;/div&gt;
 
 &lt;span
 class="absolute top-0 w-6 transition-opacity opacity-0 -start-6 not-prose group-hover:opacity-100 select-none"&gt;
 &lt;a class="text-primary-300 dark:text-neutral-700 !no-underline" href="#conclusion" aria-label="Anchor"&gt;#&lt;/a&gt;
 &lt;/span&gt;
 
&lt;/h2&gt;
&lt;p&gt;AI isn&amp;rsquo;t suddenly becoming more expensive. We&amp;rsquo;re simply stopping the pretense that it was cheap.&lt;/p&gt;
&lt;p&gt;And the question that should be on every boardroom table is no longer just &amp;ldquo;What can AI do for us?&amp;rdquo; It&amp;rsquo;s something more uncomfortable and more necessary:&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;How much is it really going to cost us, and are we designing for sustainability?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Whoever has a good answer to that question will have a real competitive advantage. Whoever doesn&amp;rsquo;t will discover that the most expensive AI isn&amp;rsquo;t the one with the best model — it&amp;rsquo;s the one that was deployed without thinking about the economics.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;em&gt;Carles Abarca is VP of Digital Transformation at Tecnológico de Monterrey. He writes about AI, digital strategy, and the future of technology in organizations.&lt;/em&gt;&lt;/p&gt;</content:encoded><media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://carlesabarca.com/posts/cheap-ai-ending/featured.png"/></item></channel></rss>