The Intelligence of Energy
Every thought you think costs calories. Every AI response burns watts. Every calculation, biological or silicon, requires energy—and that energy has to come from somewhere.
This isn't a metaphor. Intelligence is expensive, and the bill comes in joules.
We're used to thinking about intelligence as pure information—patterns, algorithms, insights. But information doesn't process itself. Moving bits around, whether in neurons or transistors, requires physical work. And physical work requires energy. Always.
This series explores what happens when we take that constraint seriously.
The Convergence
Something strange is happening in 2026: the energy limits of computation are becoming the central constraint on artificial intelligence. Training large models now costs hundreds of millions of dollars, driven not by researcher salaries or clever algorithms but by compute hardware and the raw electricity to run it. Data centers are signing deals with nuclear plants. Tech companies are investing in fusion startups.
Meanwhile, your brain runs on 20 watts.
That comparison—the hyperscale data center versus the human skull—reveals something profound about the relationship between energy and intelligence. Biology solved the efficiency problem that silicon is desperately chasing.
What This Series Covers
We begin with the compute ceiling: why AI training costs are exploding and what happens when you hit physical limits. Then we go deep into Landauer's principle, the thermodynamic floor on computation: every bit erased must release at least kT ln 2 of heat, no matter how cleverly the hardware is built.
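To put a number on that floor, here is a minimal Python sketch. The Boltzmann constant and room temperature are standard values; the ~1 femtojoule figure for a CMOS switching event is an assumed order-of-magnitude ballpark for illustration, not a datasheet number.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # room temperature, kelvin

# Landauer's principle: erasing one bit must dissipate at least kT ln 2.
landauer = K_B * T * math.log(2)
print(f"Landauer floor at {T:.0f} K: {landauer:.2e} J per bit erased")

# Assumed ballpark for one switching event in current CMOS (~1 fJ);
# an illustrative order of magnitude, not a measured figure.
cmos = 1e-15
print(f"CMOS switch vs. floor: ~{cmos / landauer:,.0f}x above the limit")
```

At room temperature the floor works out to roughly 3 × 10⁻²¹ joules per bit, so today's hardware dissipates on the order of a hundred thousand times the thermodynamic minimum for every switch.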
From there, we examine biological efficiency: how the brain achieves its computational miracles on a lightbulb's power budget. This leads to organoid intelligence and neuromorphic computing—attempts to learn from biology's solutions.
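How far does that lightbulb budget stretch? A hedged back-of-envelope, again in Python. The 20-watt figure is the standard estimate for whole-brain power draw; the synaptic event rate is a loudly assumed round number (~10¹⁴ synapses firing at roughly 1 Hz on average), so read the result as order-of-magnitude only.

```python
# Order-of-magnitude only. The 20 W figure for whole-brain power is
# standard; the event rate is a loudly assumed round number, not a
# measurement, and published estimates vary widely.

brain_watts = 20.0
synaptic_events_per_s = 1e14  # assumption: ~1e14 synapses at ~1 Hz

j_per_event = brain_watts / synaptic_events_per_s
print(f"~{j_per_event:.0e} J per synaptic event")  # ~2e-13 J

# Against the Landauer floor (~2.9e-21 J/bit at 300 K), each event
# still dissipates ~1e8 bits' worth: biology is remarkably efficient,
# not thermodynamically optimal.
landauer = 2.87e-21
print(f"events sit ~{j_per_event / landauer:.0e}x above the floor")
```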
The back half explores the energy supply side: nuclear power for data centers, the tungsten bottleneck in chip manufacturing, and the long bet on fusion. We'll also examine reversible computing: theoretical approaches that sidestep Landauer's erasure cost by never destroying information, as the sketch after this paragraph illustrates.
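Here is what "never destroying information" means in practice, using the Toffoli gate, a standard reversible logic primitive that can simulate ordinary AND/NOT circuits while remaining a bijection on its inputs:

```python
from itertools import product

def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Controlled-controlled-NOT: flip c iff both controls are 1."""
    return a, b, c ^ (a & b)

# The gate is its own inverse, hence a bijection on the 8 input
# states: nothing is erased, so Landauer's cost never has to be paid.
for state in product((0, 1), repeat=3):
    assert toffoli(*toffoli(*state)) == state

# Contrast: AND maps two bits to one. Knowing out = a & b = 0 cannot
# tell you which of (0,0), (0,1), (1,0) you started from; that lost
# bit is exactly what Landauer's principle charges for.
print("Toffoli is reversible on all 8 input states.")
```

The catch, which the series takes up, is that keeping every intermediate bit around has its own costs in memory and time; reversibility trades heat for housekeeping.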
The synthesis ties it together: why coherence costs energy, and what that means for the future of minds—biological and artificial.
The Through-Line
Energy isn't just a practical constraint on computation. It's revealing something fundamental about what intelligence is.
Coherent systems—systems that maintain complex, organized states over time—require constant energy input to fight entropy. Brains, computers, civilizations: all are coherence engines running on energy gradients.
Intelligence isn't free. Understanding why tells us what intelligence is made of.
This is the hub page for the Intelligence of Energy series. Start with The Compute Ceiling: Why AI Hit a Wall.