Synthesis: Intelligence Scales With Energy
We've traveled from cooking fires to fusion reactors, from brain calories to Kardashev civilizations. The journey reveals a single through-line: intelligence and energy are bound together.
Not metaphorically. Not loosely. Bound in a way that physics enforces.
Every leap in human capacity—fire, agriculture, fossil fuels, electricity—has been an energy leap. Every expansion of what minds can do has required an expansion of available power. The correlation isn't accidental. It's structural.
This synthesis brings together the threads of the series to make a claim that's both obvious and underappreciated: cognitive capacity tracks energy capture. Understanding this changes how you think about human history, artificial intelligence, and civilizational futures.
The Caloric Foundation
Start with the brain itself.
Your brain runs on about 20 watts—roughly the power of a dim light bulb. It weighs 2% of your body but consumes 20% of your calories. This disproportionate energy demand is why human intelligence exists at all.
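The arithmetic behind those figures is easy to check. A minimal sketch (every input is an order-of-magnitude estimate from the text, not a measurement):

```python
# Rough arithmetic behind the "20 watts, 20% of calories" figures.
# All inputs are order-of-magnitude estimates, not precise measurements.

SECONDS_PER_DAY = 86_400
JOULES_PER_KCAL = 4_184           # one dietary Calorie (kcal) in joules

brain_watts = 20                  # steady-state brain power draw
daily_intake_kcal = 2_000         # assumed typical adult intake

# Energy the brain burns in a day, converted to dietary Calories.
brain_joules_per_day = brain_watts * SECONDS_PER_DAY            # ~1.7 MJ
brain_kcal_per_day = brain_joules_per_day / JOULES_PER_KCAL     # ~413 kcal

# Fraction of the daily calorie budget the brain claims.
brain_fraction = brain_kcal_per_day / daily_intake_kcal         # ~0.21

print(f"Brain burns ~{brain_kcal_per_day:.0f} kcal/day "
      f"({brain_fraction:.0%} of a {daily_intake_kcal} kcal diet)")
```

Twenty watts run continuously works out to roughly 400 kcal per day, which is indeed about a fifth of a typical diet.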
The cooking hypothesis (from Article 1) explains how we got here: fire allowed us to pre-digest food externally, extracting more calories with less metabolic effort. Those surplus calories funded a bigger brain. Homo sapiens is literally a species shaped by energy technology.
But there's a ceiling. The human body can sustain only so much caloric throughput. Our brain size stabilized roughly 200,000 years ago, not because we stopped evolving but because we hit the metabolic limit. More brain would have required more calories than our gut could deliver.
For 200,000 years, human intelligence was capped by biology. Individual humans didn't get smarter. We got more numerous and more organized. Civilization was the workaround for biological limits.
The Externalization Turn
Then something changed. We learned to think with tools that don't eat.
Writing externalized memory. You no longer had to store everything in neurons; you could store it in marks on clay. The total "memory" available to a civilization exploded beyond any individual brain.
Printing externalized copying. One manuscript became thousands. Ideas spread faster than any messenger could carry them.
Calculation externalized arithmetic. From abacuses to slide rules to mechanical computers, we built devices that did cognitive work without requiring calories.
Each externalization followed the same pattern: offload cognitive work to something that runs on a different energy budget. Writing runs on the energy of making marks. Printing runs on the energy of pressing type. Computing runs on electricity.
This externalization is the key insight. Human intelligence didn't scale by growing bigger brains. It scaled by connecting brains to external systems that amplified their reach.
And those external systems? They run on industrial energy.
The Industrial Amplification
The steam engine didn't just move coal and drive looms. It funded everything else.
Before industrial energy, most human labor went to food production. About 90% of people were farmers, producing the calories that kept everyone—including the small cognitive elite—alive. There was limited surplus for cities, for specialists, for thinkers.
Steam and coal changed the ratio. By 1900, machines were doing work that would have required hundreds of millions of human laborers. That freed actual humans to do something other than subsistence. Education expanded. Literacy spread. The fraction of humanity doing cognitive work—science, engineering, administration, art—grew from a sliver to a significant minority.
Industrial energy purchased cognitive capacity. Not directly, but through the surplus it created. Every scientist, every engineer, every knowledge worker exists because machines do the physical work that would otherwise consume their time.
This is still true. If you're reading this, you're probably not a farmer. Someone else—or rather, some network of machines powered by fossil fuels—produced your calories. The energy regime underwrites your cognitive freedom.
The Information Explosion
Then came electronics, and the amplification went exponential.
Article 7 covered the details: Moore's Law and Dennard scaling meant computation got radically cheaper for 50 years. But the deeper point is what cheap computation enabled.
Artificial cognition. Not true intelligence (yet), but cognitive labor performed by machines. Search engines that answer questions. Spreadsheets that calculate. Databases that remember. Large language models that write and reason.
Each of these is externalized cognition running on electricity. The energy cost is real but the cognitive leverage is enormous. A single person with a laptop and internet connection can access more information, perform more calculations, and coordinate with more people than any pre-industrial civilization could.
The cognitive capacity of humanity didn't just grow. It exploded. And it exploded because we connected biological brains to electrical networks running on terawatts of power.
The current AI moment is just the latest wave. Large language models are statistical patterns extracted from the entire written output of humanity, running on industrial-scale computation. They're not conscious, but they're productive—they augment and substitute for human cognitive labor in ways that would have seemed magical a generation ago.
And they're hungry for power. By outside estimates, training GPT-4 consumed tens of gigawatt-hours of electricity—comparable to what a small town uses in a year. Running inference across millions of queries requires dedicated data centers. AI companies are signing deals for nuclear power not because they want to but because they have to.

Intelligence requires joules. Silicon intelligence requires lots of joules.
The Biological Baseline
Compare the brain again.
The human brain: ~10^15 operations per second on 20 watts. That's about 50 trillion operations per joule.
Current silicon: ~10^12 operations per joule. By these figures, roughly 50 times less efficient—and the true gap depends heavily on what counts as a biological "operation."
The brain is an astonishingly energy-efficient computer. Evolution spent hundreds of millions of years optimizing it. Silicon has been optimizing for decades. The gap shows.
This efficiency gap explains why biological intelligence dominated for so long. If you're competing for scarce energy, wet neurons beat dry circuits. It's only when you have industrial-scale power that silicon becomes competitive.
But silicon has advantages biology lacks: it can be manufactured at scale, copied exactly, run continuously, and connected across distances. A network of inefficient computers, backed by enough power, can outperform efficient brains that can't network as densely.
The contest between biological and artificial intelligence is ultimately an energy contest. Right now, biology is more efficient. Silicon has more power. If silicon becomes efficient (neuromorphic computing, biological substrates) or gets even more power (fusion), the balance could tip permanently.
The Kardashev Perspective
Zoom out to civilizational scales.
Kardashev classified civilizations by power consumption. But there's an implicit second axis: cognitive capacity. A Type I civilization doesn't just use planetary energy—it processes planetary information. A Type II civilization doesn't just capture stellar output—it thinks at stellar scales.
Energy and cognition scale together because thinking is expensive. Running minds—biological or artificial—requires joules. More minds, or more powerful minds, require more joules.
A rough calculation: if human intelligence requires 20 watts per brain, and there are 8 billion brains, biological human intelligence runs on about 160 gigawatts. Global AI computation adds perhaps another 50 gigawatts currently. Total cognitive power: roughly 200 gigawatts.
A Type I civilization using 10^17 watts could, in principle, devote 10^15 watts to computation—several thousand times the power behind all of humanity's current cognition. Whether that computation would be "intelligent" in any recognizable sense is unknown, but the capacity would exist.
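The scaling above can be made explicit. A minimal sketch, where every input is an order-of-magnitude assumption from the text (including the 1% compute fraction for a Type I civilization):

```python
# Rough scaling of "cognitive power" from the figures in the text.
# All inputs are order-of-magnitude assumptions, not measurements.

watts_per_brain = 20
population = 8e9
ai_power_watts = 50e9            # assumed current global AI compute draw

bio_power = watts_per_brain * population          # 1.6e11 W = 160 GW
total_cognitive_power = bio_power + ai_power_watts  # ~2.1e11 W ≈ 210 GW

type_i_power = 1e17              # planetary-scale budget used in the text
compute_budget = 0.01 * type_i_power  # if 1% goes to computation: 1e15 W

scaleup = compute_budget / total_cognitive_power
print(f"Current cognitive power: ~{total_cognitive_power:.1e} W")
print(f"Type I compute budget:   ~{compute_budget:.0e} W "
      f"(~{scaleup:,.0f}x today)")
```

With these inputs the scale-up works out to roughly five thousandfold; the exact multiple is far less certain than the conclusion that it is very large.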
The path to Type I is also the path to radically expanded cognitive capacity. Not just more energy for industry, but more energy for thinking.
The Through-Line
Let's trace the full arc:
1. Fire unlocked calories from food → bigger brains → Homo sapiens emerges
2. Agriculture created surplus → cities, writing, specialists → culture accumulates
3. Fossil fuels mechanized labor → industrial surplus → mass education, science
4. Electricity enabled computation → electronic cognition → information economy
5. AI amplifies processing → artificial cognition → ???
6. Abundant clean energy → civilization-scale computation → Type I and beyond
Each step builds on the previous. Each requires energy.
The pattern isn't deterministic—civilizations can stall, collapse, or choose different paths. But the correlation holds: more energy enables more cognition enables more complexity enables more capability.
This is what civilizational progress actually means: not just comfort or wealth or justice, but expanded cognitive capacity made possible by expanded energy capture. The rest—culture, politics, economy—builds on that foundation.
What This Implies
Several things follow from the energy-intelligence connection:
Energy isn't optional. You can't have advanced civilization—including the science needed to solve climate change—without abundant energy. De-growth would mean de-cognition. The challenge isn't using less energy but transitioning to cleaner energy while maintaining or increasing capacity.
AI is an energy problem. The future of artificial intelligence depends on the future of power generation. If energy stays expensive, AI development slows. If energy becomes abundant, AI could scale to planetary significance. The people building AI understand this; that's why they're investing in nuclear.
Collapse has cognitive dimensions. When civilizations lose energy capacity, they lose cognitive capacity. Libraries burn. Knowledge is forgotten. Technical skills disappear. The Dark Ages weren't dark because people chose ignorance but because the energy surplus that had funded Roman complexity was gone.
The stakes of the energy transition are cognitive. We're not just deciding whether the planet warms 2 or 3 degrees. We're deciding whether humanity continues climbing the cognitive ladder or stalls on a step. Climate change threatens the infrastructure that enables current cognitive capacity; solving it requires building infrastructure for even more.
The Coherence Connection
One more thread to tie in.
The Ideasthesia framework understands meaning through coherence—M = C/T, meaning equals coherence over time. Minds create meaning by integrating information into coherent patterns that persist.
Energy is what enables coherence. Maintaining complex patterns against entropy requires energy. The second law of thermodynamics guarantees that order decays without energy input. Minds, cultures, civilizations—all are coherent patterns fighting entropy with energy.
More energy means more capacity for coherence. Not guaranteed coherence—you can waste energy on chaos—but the possibility of coherence at larger scales. A Type I civilization could maintain coherence across an entire planet. A Type II civilization across a solar system.
The drive toward energy is, at root, a drive toward expanded coherence. Toward more meaning at larger scales.
Conclusion: The Inheritance and the Task
We inherited 200,000 years of biological brain development, 10,000 years of cultural accumulation, and 200 years of industrial energy capture. Every calorie of surplus, every joule of computation, every bit of externalized cognition stacks up to this moment.
The task is to continue climbing.
That means completing the clean energy transition—not as an environmental checkbox but as the foundation for what comes next. That means navigating the AI transition—not as a threat to be contained but as a cognitive amplifier to be directed. That means thinking at civilizational timescales while acting in the political now.
Intelligence scales with energy. This is the lesson of fire, steam, electricity, and silicon. It's the lesson encoded in 300 million years of fossil sunlight and 13 billion years of stellar fusion.
The universe runs on energy. So does thinking. So does civilization.
What we do with that understanding determines what kind of intelligence fills the future—and whether there's any intelligence at all.
Further Reading
- Smil, V. (2017). Energy and Civilization: A History. MIT Press.
- Kauffman, S. (2019). A World Beyond Physics: The Emergence and Evolution of Life. Oxford University Press.
- Friston, K. (2010). "The free-energy principle: a unified brain theory?" Nature Reviews Neuroscience.
This is Part 10 of the Energy of Civilization series, concluding our exploration of how energy shapes intelligence across scales. For the companion series on the intelligence-energy relationship from the computational side, see "The Intelligence of Energy."