The number that matters this earnings season isn't revenue growth. It's not user engagement or even profit. The number that will define the next two years for big tech is 350 megawatts. That’s the rumored power draw of just one of Google’s new AI-focused data center campuses in The Dalles, Oregon. Enough to power a city of a quarter-million people.
For the past three years, Wall Street has given tech giants a blank check labeled "AI." Spend whatever it takes. Build the models. Buy the Nvidia chips. Corner the market. We'll reward you with a higher stock price. Now, the bill is coming due, and I suspect investors are about to get a nasty shock when the Q1 2026 numbers start rolling in next month.
The narrative is finally shifting from "How big is your model?" to "What are your unit economics?" And frankly, for most of them, the answer is probably ugly.
The Great AI Spending Hangover
Let’s rewind. 2024 was the year of brute force. Capital expenditure across the top five tech firms surged by an estimated $80 billion, almost all of it funneled into the AI arms race. It was a land grab. Microsoft shoveled cash into its partnership with OpenAI. Google scrambled to prove its Gemini models were competitive. Meta, desperate for a win after its metaverse face-plant, went all-in on open-sourcing Llama to build an ecosystem.
It was a classic gold rush. The problem with a gold rush is that the only people who reliably get rich are the ones selling the shovels. In this case, that was Nvidia, and to a lesser extent, the power companies. For everyone else, the return on that massive investment has been… fuzzy.
Now we’re in 2026. The cost of capital isn't zero anymore, sitting at a stubborn 4.0%. The low-hanging fruit of integrating AI into existing products—summarizing emails, generating ad copy—has been picked. The next phase is about creating entirely new, profitable revenue streams. That’s a much harder task, especially when the cost of goods sold (COGS) for an AI query is orders of magnitude higher than a simple web search.
Why Are Big Tech Stocks Stuttering?
The market is starting to sniff this out. After a blistering run, the Nasdaq has been trading sideways for most of the new year. The reason is simple: uncertainty. Investors are moving from a "growth at any cost" mindset to a "prove it" one. They want to see a clear path from AI features to bottom-line profit, and the roadmaps are getting murkier, not clearer.
Let's look at the numbers that are causing jitters ahead of the Q1 reports:
- AI Inference Costs: My back-of-the-envelope math, based on conversations with cloud engineers, suggests that a complex AI-powered search query can cost a company between $0.02 and $0.05 in pure compute. A traditional keyword search? It's closer to $0.0003. When you serve billions of queries a day, that difference isn't just a rounding error—it's a margin-killer.
- Capital Expenditure vs. Revenue: Look for this in the upcoming reports. In Q1 2025, cloud revenue for the big three (AWS, Azure, Google Cloud) grew, on average, around 22%. Their data center capex, however, grew by nearly 40%. That’s an unsustainable divergence. If that gap hasn't narrowed significantly, it's a major red flag.
- Energy Consumption: Tech companies are notoriously cagey about this, but estimates from utility providers suggest their power consumption in key states like Virginia and Oregon is up 50% since early 2024. That’s an operational expense that hits directly at the gross margin line, and as the recent oil price volatility shows, energy costs are anything but predictable.
- Enterprise Adoption Plateau: According to a recent Morgan Stanley survey, the percentage of CIOs experimenting with generative AI is still high (around 85%), but the percentage moving from pilot programs to full-scale, eight-figure contracts has flatlined at around 20% in the last two quarters. The "wow" factor is wearing off, and budget-conscious CFOs are now asking for hard ROI.
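The first two bullets can be sanity-checked with a few lines of arithmetic. The sketch below uses the per-query cost estimates and growth rates quoted above; the daily query volume of 3 billion is a hypothetical round number I've chosen for illustration, not a figure from any company's filings.

```python
# Back-of-the-envelope math on AI inference costs and the capex/revenue gap.
# Per-query costs and growth rates come from the estimates in the bullets above;
# the 3B queries/day volume is a hypothetical assumption for illustration.

QUERIES_PER_DAY = 3e9              # assumed daily query volume (hypothetical)
AI_COST_LOW, AI_COST_HIGH = 0.02, 0.05   # USD per complex AI query (estimate)
KEYWORD_COST = 0.0003              # USD per traditional keyword search (estimate)

# Daily compute bill under each model
keyword_daily = QUERIES_PER_DAY * KEYWORD_COST
ai_daily_low = QUERIES_PER_DAY * AI_COST_LOW
ai_daily_high = QUERIES_PER_DAY * AI_COST_HIGH

print(f"Keyword search: ${keyword_daily / 1e6:.1f}M/day")
print(f"AI search:      ${ai_daily_low / 1e6:.0f}M-${ai_daily_high / 1e6:.0f}M/day")
# At 3B queries/day: ~$0.9M/day vs. $60M-$150M/day in pure compute.

# Capex vs. revenue divergence: if capex keeps compounding at ~40%/yr
# while revenue compounds at ~22%/yr, the gap widens every year.
ratio = 1.0
for year in range(1, 4):
    ratio *= 1.40 / 1.22
    print(f"Year {year}: capex has grown {ratio:.2f}x more than revenue")
```

Even at the low end of the cost range, the AI query bill is roughly 60 to 170 times the keyword-search bill at the same volume, and the capex/revenue ratio compounds to about 1.5x within three years. That is the divergence to watch for in the Q1 reports.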
This is the hangover. The party was fun, but the cleanup is going to be expensive.