Sam Altman’s Caloric Deflection: Why AI Isn't a Human

Alex Chen

Senior Tech Editor


The Sandwich vs. The Server

Sam Altman wants to talk about your lunch. Specifically, how much energy it took to make it, and why that makes you just as much of a resource hog as a cluster of H100 GPUs. It is a classic Silicon Valley pivot—taking a massive, systemic problem and reframing it as a relatable, almost philosophical comparison. But let’s be real: comparing the metabolic rate of a human being to the cooling requirements of a 100,000-node data center is like comparing a candle to a forest fire because they both happen to produce heat.

I’ve spent a decade watching these guys dance around the consequences of their "move fast and break things" ethos. Usually, the "breaking" involves taxi regulations or hotel zoning laws. This time, it involves the power grid. As reported by TechCrunch, Altman is essentially arguing that if we’re going to freak out about the wattage required to train GPT-5, we should also look in the mirror at our own biological inefficiencies. It’s bold. It’s weird. It’s Sam.

Why This Matters Right Now

You should care about this because your electricity bill is about to become a battleground. We are currently seeing a massive surge in energy demand that the existing infrastructure simply wasn't built to handle. When the CEO of the world's most influential AI company starts comparing silicon to biology, he isn't just making a "fun fact" observation. He is setting the stage for a policy fight. He wants to normalize the idea that "intelligence"—whether biological or synthetic—has a high energy price tag that we should all just accept as the cost of doing business.

The numbers are staggering. According to Reuters, data centers could consume up to 9% of total U.S. electricity generation by 2030, up from roughly 4% today. We are talking about doubling the footprint of the digital world in less than a decade. I remember sitting in a server room in 2014, complaining about the heat from a few measly racks; today’s engineers are looking at liquid-cooling solutions that look more like industrial chemical plants than computer hardware.

The Data They Don't Want You to Crunch

Let’s look at the "biological" argument through a cold, technical lens. A human brain runs on about 20 watts of power. That is roughly the energy needed to power a dim LED light bulb. In exchange for those 20 watts, you get consciousness, creativity, the ability to drive a car, and—occasionally—the ability to write code that actually compiles. By contrast, training a large language model (LLM) requires megawatts of sustained power for months on end.

  • A single ChatGPT query uses roughly 10 times more electricity than a Google search.
  • Microsoft and Google’s combined energy consumption has already surpassed that of entire countries like Iceland or Ghana.
  • Estimates suggest OpenAI’s "Stargate" supercomputer project could require up to 5 gigawatts of power—enough to light up several million homes.

So, when Altman reminds us that "humans use energy too," he’s ignoring the scale of the delta. I can write a snarky article fueled by nothing more than a ham sandwich and a cup of coffee. To get an AI to do the same, you need to burn enough coal to power a small suburb. The efficiency gap isn't just wide; it's astronomical.
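That delta is easy to sanity-check with the figures already cited above. Here's a minimal back-of-envelope sketch in Python, assuming a 20-watt brain and the reported 5-gigawatt Stargate estimate; all numbers are rough public estimates pulled from this article, not measurements:

```python
# Back-of-envelope comparison using the figures cited above.
# All numbers are rough public estimates, not measured data.

BRAIN_WATTS = 20        # typical human brain power draw
STARGATE_WATTS = 5e9    # reported "Stargate" estimate: 5 GW

# How many human brains could run on Stargate's power budget?
brains_per_stargate = STARGATE_WATTS / BRAIN_WATTS
print(f"{brains_per_stargate:,.0f} brains")  # 250,000,000 brains

# Per-query comparison: a ChatGPT query is often estimated at
# ~3 Wh vs ~0.3 Wh for a Google search (the ~10x cited above).
CHATGPT_WH = 3.0
GOOGLE_WH = 0.3
print(f"{CHATGPT_WH / GOOGLE_WH:.0f}x per query")  # 10x per query
```

A quarter of a billion brains' worth of power for one supercomputer project. That's the delta Altman's sandwich analogy is designed to blur.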

The Angle Everyone is Missing: The "Personhood" Shield

There is a deeper, more cynical pattern at play here. By comparing AI energy use to human energy use, Altman is subtly personifying the machine. This is a brilliant legal and PR strategy. If we start thinking of AI as a "digital person," then denying it energy starts to feel like a human rights violation rather than a corporate regulation issue.

We saw this before with the "corporate personhood" debates in the early 2000s. If a corporation is a person, it has free speech. If an AI is "like a human," it has a right to "eat" (consume power). It’s a way to move the conversation away from ESG goals and toward a debate about the "necessity" of artificial intelligence for the survival of the species. It’s a high-stakes shell game.

Alex’s Take: This isn't an environmental argument; it's a pre-emptive strike against future regulation. Altman knows the EPA and local utility boards are coming for his power cords. By framing energy consumption as a fundamental requirement of "intelligence," he’s trying to make AI consumption unassailable. It’s the ultimate "don't hate the player, hate the game" move.

The Ghost of Crypto Past

The last time we had a conversation this heated about "useless" energy consumption was the Bitcoin mining boom of 2021. Back then, the industry’s defense was that crypto would "incentivize green energy production." We’re hearing the same tune now. Microsoft is literally trying to restart Three Mile Island to feed its AI hunger.

But there’s a difference. Crypto was largely decentralized and, frankly, optional for the average person. AI is being baked into every piece of software we use. You can’t opt out of the energy consumption of the AI that’s now summarizing your emails or generating your slide decks. This is a forced tax on our global energy reserves, packaged as "innovation." For more on how this is affecting the market, check out our analysis, "AI's Power Problem Is a Gold Mine for Batteries."

The Efficiency Myth

The tech industry loves to talk about Jevons Paradox, even if they don't call it that. The theory is that as technology makes a resource more efficient to use, we don't use less of it—we use way more. We see this in every "efficiency" claim from Silicon Valley.
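To make the paradox concrete, here is a minimal sketch with purely hypothetical numbers (a 10x efficiency gain met by a 100x jump in usage; neither figure comes from any real measurement):

```python
# Jevons Paradox, illustrated with made-up numbers:
# per-unit efficiency improves, total consumption still rises.

wh_per_query_before = 3.0   # hypothetical energy per AI query (Wh)
wh_per_query_after = 0.3    # 10x more efficient

queries_before = 1_000_000_000    # hypothetical daily query volume
queries_after = 100_000_000_000   # demand grows 100x as cost falls

total_before = wh_per_query_before * queries_before  # 3e9 Wh/day
total_after = wh_per_query_after * queries_after     # 3e10 Wh/day

# Efficiency improved 10x, yet total energy use ALSO rose 10x.
print(total_after / total_before)  # 10.0
```

That's the trap: every "10x more efficient" model announcement is also an invitation to run 100x more of it.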

Altman might argue that AI will eventually help us solve fusion or optimize the grid. Maybe. But in the meantime, the "bridge" to that future is paved with billions of dollars in natural gas and nuclear contracts. I’ve sat through enough product launches to know that "future efficiency" is usually code for "current excess." We are being asked to gamble our current climate stability on the hope that the machine we’re building will eventually figure out how to fix the mess we made while building it.

The Specific Prediction: The Rise of the "Energy Entitlement" Era

Here is where this is actually going. We are about to enter an era of "Energy Protectionism" for tech giants.

Within the next 36 months, expect to see the "Big Three" (Microsoft, Google, Amazon) move to become their own utility companies. They won't just buy green energy; they will own the generation, the transmission, and the regulation. We are going to see a "de-coupling" of the tech grid from the public grid.

For professionals in the energy and tech sectors, this signals a massive shift: Power is the new compute. If you’re a startup, your bottleneck won't be talent or even capital—it will be your "energy allocation." The downstream effect I'm watching: A "compute-to-calorie" metric becoming a standard part of corporate SEC filings by 2027. Companies will have to justify why their AI "brain" deserves more electricity than the literal brains of the community it sits in.

Sam Altman is right about one thing: humans use a lot of energy. But we use it to live. AI uses it to process. Confusing the two isn't just a mistake—it's a strategy. And if we buy into the "biological defense," we’re essentially giving tech companies a blank check to out-eat the rest of the planet.
