
A question of power

Let’s get it out of the way: training a large language model can consume as much electricity as a small city. Running “AI” applications at scale demands a vast infrastructure of power-hungry data centres and GPU clusters.

These are the facts.

AI is an energy hog - don’t let anyone tell you otherwise.

And as we increasingly face existential anxiety about both climate change and AI, the notion that the rapid advancement of LLM research will come at the cost of skyrocketing carbon emissions feels like…well, a story with stakes.

Tech advocates and AI fanboys are finding themselves on the defensive, scrambling to justify AI’s energy footprint in the face of public scrutiny and environmental backlash.

But the AI energy argument is a red herring.

It’s not a problem with AI. It’s a problem with energy.

Specifically, it’s a problem with how that energy is being generated in the first place. We live in the 21st century, and yet the backbone of our energy infrastructure remains stubbornly, stupidly, ludicrously, irresponsibly, indefensibly stuck in the 20th. Globally, a staggering 80%+ of energy still comes from fossil fuels - an antiquated, polluting technology we’ve known for decades is unsustainable and is driving us towards climate catastrophe.

That’s the real sustainability crisis, not the emergence of power-hungry new technologies. Until we transition our energy systems to clean, renewable sources, hand-wringing over AI’s carbon footprint is an exercise in missing the forest for the steadily vanishing trees.

The only sane path forward is to fix the underlying issue: to rapidly transition our grids from fossil fuels to clean power.

This is not an argument for AI developers and tech companies to ignore efficiency or be given a blank check to waste energy. Minimizing unnecessary energy usage should always be a priority. But at a systemic level, we need to be clear-eyed about the real sustainability bottleneck: the fact that in 2024, we’re still burning hundred-million-year-old carbon sludge to keep the lights on. It’s ridiculous. It’s reckless. And it needs to change as rapidly as possible.

In the meantime, wasting our limited attention and outrage on the energy consumption of specific applications, rather than the energy system as a whole, is counterproductive. It’s a form of displacement - a way to feel like we’re addressing sustainability without grappling with the true scale of the problem.

By all means, let’s have a robust public discourse about LLMs and artificial intelligence. There are vital conversations to be had about the societal impacts, the ethical implications, the economic disruptions. We need clear-eyed frameworks for mitigating risks and protecting IP.

“AI uses too much energy” cannot be the central thrust of the discussion. It’s a shallow take that distracts from the real, underlying crisis. The problem isn’t that AI is using “too much” power from our current grid; it’s that our current grid still overwhelmingly runs on fossil fuels in the first place. Until that changes, we’re just rearranging deckchairs on the Titanic.

Directing outrage at AI’s power usage is like worrying about a leaky faucet while the house is on fire. Yes, we should fix the faucet at some point. But that’s hardly the most pressing concern when the entire structure is about to burn to the ground.
