Big Tech Earnings: The $100B AI Bill Is Coming Due

Daniel Park

Economy & Markets Editor


The number that matters this earnings season isn't revenue growth. It's not user engagement or even profit. The number that will define the next two years for big tech is 350 megawatts. That’s the rumored power draw of just one of Google’s new AI-focused data center campuses in The Dalles, Oregon. Enough to power a city of a quarter-million people.

For the past three years, Wall Street has given tech giants a blank check labeled "AI." Spend whatever it takes. Build the models. Buy the Nvidia chips. Corner the market. We'll reward you with a higher stock price. Now, the bill is coming due, and I suspect investors are about to get a nasty shock when the Q1 2026 numbers start rolling in next month.

The narrative is finally shifting from "How big is your model?" to "What are your unit economics?" And frankly, for most of them, the answer is probably ugly.

The Great AI Spending Hangover

Let’s rewind. 2024 was the year of brute force. Capital expenditure across the top five tech firms surged by an estimated $80 billion, almost all of it funneled into the AI arms race. It was a land grab. Microsoft shoveled cash into its partnership with OpenAI. Google scrambled to prove its Gemini models were competitive. Meta, desperate for a win after its metaverse face-plant, went all-in on open-sourcing Llama to build an ecosystem.

It was a classic gold rush. The problem with a gold rush is that the only people who reliably get rich are the ones selling the shovels. In this case, that was Nvidia, and to a lesser extent, the power companies. For everyone else, the return on that massive investment has been… fuzzy.

Now we’re in 2026. The cost of capital isn't zero anymore, sitting at a stubborn 4.0%. The low-hanging fruit of integrating AI into existing products—summarizing emails, generating ad copy—has been picked. The next phase is about creating entirely new, profitable revenue streams. That’s a much harder task, especially when the cost of goods sold (COGS) for an AI query is orders of magnitude higher than a simple web search.

Why Are Big Tech Stocks Stuttering?

The market is starting to sniff this out. After a blistering run, the Nasdaq has been trading sideways for most of the new year. The reason is simple: uncertainty. Investors are moving from a "growth at any cost" mindset to a "prove it" model. They want to see a clear path from AI features to bottom-line profit, and the roadmaps are getting murkier, not clearer.

Let's look at the numbers that are causing jitters ahead of the Q1 reports:

  • AI Inference Costs: My back-of-the-envelope math, based on conversations with cloud engineers, suggests that a complex AI-powered search query can cost a company between $0.02 and $0.05 in pure compute. A traditional keyword search? It's closer to $0.0003. When you serve billions of queries a day, that difference isn't just a rounding error—it's a margin-killer.
  • Capital Expenditure vs. Revenue: Look for this in the upcoming reports. In Q1 2025, cloud revenue for the big three (AWS, Azure, Google Cloud) grew, on average, around 22%. Their data center capex, however, grew by nearly 40%. That’s an unsustainable divergence. If that gap hasn't narrowed significantly, it's a major red flag.
  • Energy Consumption: Tech companies are notoriously cagey about this, but estimates from utility providers suggest their power consumption in key states like Virginia and Oregon is up 50% since early 2024. That’s an operational expense that hits directly at the gross margin line, and as the recent oil price volatility shows, energy costs are anything but predictable.
  • Enterprise Adoption Plateau: According to a recent Morgan Stanley survey, the percentage of CIOs experimenting with generative AI is still high (around 85%), but the percentage moving from pilot programs to full-scale, eight-figure contracts has flatlined at around 20% in the last two quarters. The "wow" factor is wearing off, and budget-conscious CFOs are now asking for hard ROI.

This is the hangover. The party was fun, but the cleanup is going to be expensive.

How Is Big Tech Actually *Paying* for AI?

This is the trillion-dollar question. Each of the giants is taking a different, and risky, approach to monetization. When I was on Wall Street, we called this "strategic divergence." In plain English, it means nobody is quite sure what works, so they're all trying something different.

The Players and Their Bets

Microsoft: They're the furthest ahead, and their strategy is the most straightforward. They're embedding AI into the products businesses already pay for—Office 365 (now Microsoft 365), Azure, and Dynamics—and charging a premium. The Copilot subscription, at roughly $30/user/month, is a direct attempt to monetize. The key metric to watch in their report is not just Azure growth, but the attach rate of these premium AI services. Are customers paying up, or are they sticking with the basic versions?
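To see why the attach rate is the metric that matters, run the per-seat math. The seat base below is an assumed round number for illustration; only the $30/user/month price comes from the text:

```python
# Copilot revenue sensitivity to attach rate. The seat base is an
# assumption for illustration; $30/user/month is the published price
# point referenced above.

SEATS = 400e6        # assumed Microsoft 365 commercial seat base
PRICE = 30 * 12      # $30/user/month, annualized

for attach in (0.02, 0.05, 0.10):
    arr = SEATS * attach * PRICE  # annual recurring revenue at this attach rate
    print(f"{attach:.0%} attach rate -> ${arr / 1e9:,.1f}B ARR")
```

A few points of attach rate swing the answer by billions of dollars of annual recurring revenue, which is why the question "are customers paying up?" matters more than the headline Azure growth number.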

Alphabet (Google): They're in the toughest spot. Their core search business is a 90%+ gross margin machine. Layering on expensive AI queries risks cannibalizing their own golden goose. Their strategy seems to be a mix of charging for premium AI tiers in Google Workspace and hoping that better AI-powered ads will lift overall ad revenue. It feels defensive. As TechCrunch and others have reported, their Gemini API pricing is aggressive, but they need to convert that into a real threat to Azure's enterprise dominance.

Amazon (AWS): Amazon is playing the long game, positioning AWS as the neutral platform. They offer access to their own Titan models, but also models from Anthropic, Cohere, and others via Bedrock. Their bet is that they don't need to win the model war; they just need to be the place where everyone else runs their models. Watch their AWS margin figures closely. Are they being forced to discount heavily to compete with Microsoft and Google's native offerings?

Meta: The dark horse. By open-sourcing Llama, they commoditized the model layer, forcing competitors to justify their high prices. Meta's own monetization is indirect: use AI to make their ad targeting engine on Facebook and Instagram even more potent, thus charging advertisers more for better conversion. It’s a smart play, but it’s also less transparent. It will be hard to isolate the "AI lift" in their ad revenue numbers.

The Hidden Risk: Margin Compression as a Feature, Not a Bug

Here’s the contrarian thought that I can’t shake: What if high-cost AI isn't a temporary problem to be solved by more efficient chips? What if it's a permanent feature of this technological cycle?

The entire business model of software and the internet for the last 25 years has been built on the magic of near-zero marginal costs. Once you build a search engine or a social network, serving one more user costs next to nothing. This is what created the obscene 80-90% gross margins that made these companies market darlings.

"Generative AI, in its current form, breaks this model. Every query, every image generated, every line of code suggested incurs a real, non-trivial compute cost. It turns a software business into something that looks a lot more like a manufacturing business, with a direct, variable cost attached to every unit sold."

This is the risk the market hasn't fully priced in. We're so focused on the potential revenue uplift that we're ignoring the potential for a permanent, structural decline in gross margins. If Google's margin on search were to fall from 90% to, say, 70% over the next five years due to AI compute costs, it would require a staggering amount of new revenue just to keep net income flat.
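The gross-profit math behind that scenario is simple and worth spelling out. The revenue base below is an assumed round number, not Google's actual search revenue:

```python
# How much extra revenue is needed to hold gross profit flat when
# gross margin falls from 90% to 70%? The revenue base is an assumed
# illustrative figure.

revenue = 175e9                      # assumed annual search revenue, $
old_margin, new_margin = 0.90, 0.70

gross_profit = revenue * old_margin
required_revenue = gross_profit / new_margin  # revenue needed at the lower margin
uplift = required_revenue / revenue - 1

print(f"Gross profit today:      ${gross_profit / 1e9:,.1f}B")
print(f"Revenue needed at 70%:   ${required_revenue / 1e9:,.1f}B")
print(f"Required revenue growth: {uplift:.1%}")
```

The uplift is independent of the revenue base: holding gross profit flat requires revenue of 0.9/0.7 times the original, roughly 29% more sales, before a single dollar of net income growth.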
