Fire and Cooking: The First Energy Revolution
Around two million years ago, our ancestors did something no other species had done: they harnessed fire. This wasn't just a convenience. It wasn't just warmth and protection from predators. Fire was humanity's first energy technology, and it changed everything—including our bodies.
The cooking hypothesis, developed most fully by primatologist Richard Wrangham, argues that cooking food was the key innovation that made us human. Not tools. Not language. Not even the big brain itself. Cooking came first, and the big brain followed.
This is a radical claim. It puts fire at the center of human evolution. And it reframes how we think about the relationship between energy and intelligence.
The Problem of the Human Brain
The human brain presents an evolutionary puzzle. It's absurdly expensive.
The brain makes up about 2% of body weight but consumes about 20% of metabolic energy—far more, relative to body size, than the brains of other primates. A chimpanzee brain uses about 8-10% of its metabolic budget; a human brain uses roughly twice that share.
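To put those percentages in concrete terms, here is a minimal back-of-the-envelope sketch; the daily calorie totals are assumed round numbers for illustration, not measured values.

```python
# Rough brain energy budgets, using the percentages quoted above.
# The 2,000 and 1,600 kcal/day totals are assumed round numbers,
# not measured values.

human_daily_kcal = 2000        # assumed typical human metabolic budget
chimp_daily_kcal = 1600        # assumed chimpanzee budget, for comparison

human_brain_share = 0.20       # ~20% of metabolism (from the text)
chimp_brain_share = 0.09       # midpoint of the 8-10% range quoted above

human_brain_kcal = human_daily_kcal * human_brain_share   # ~400 kcal/day
chimp_brain_kcal = chimp_daily_kcal * chimp_brain_share   # ~144 kcal/day

print(f"Human brain: ~{human_brain_kcal:.0f} kcal/day")
print(f"Chimpanzee brain: ~{chimp_brain_kcal:.0f} kcal/day")
```

On these assumed budgets, the human brain alone demands a few hundred kilocalories every day, rain or shine, which is the fuel-supply problem the rest of this section turns on.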
Brains are built from neurons, and neurons are metabolically demanding. They need constant glucose and oxygen. They can't be turned off to save energy. Running a big brain requires a reliable, substantial fuel supply.
This creates a constraint. If you're going to evolve a bigger brain, you need to fuel it somehow. The calories have to come from somewhere. And for most of evolutionary history, calories were hard to come by.
The Expensive Tissue Hypothesis
One influential answer came from Leslie Aiello and Peter Wheeler in 1995: the expensive tissue hypothesis. They noticed that while human brains are unusually large, human guts are unusually small. Other organs stayed roughly the same size relative to body mass. Only guts shrank as brains grew.
This suggests a trade-off. Both brains and guts are metabolically expensive tissues. If you're calorie-constrained, growing a bigger brain requires shrinking something else. The human lineage traded gut for gray matter.
But this raises another question: how could human ancestors afford smaller guts? Guts process food. Smaller guts mean less digestive capacity. To survive with smaller guts, you need food that's easier to digest—food that's already partially broken down before it enters your body.
Enter cooking.
The Cooking Hypothesis
Richard Wrangham, in his book Catching Fire: How Cooking Made Us Human, argues that cooking was the technological innovation that enabled brain expansion.
Cooking is external digestion. When you apply heat to food, you break down proteins, gelatinize starches, and rupture cell walls. The food becomes softer, easier to chew, and easier to absorb. Your gut has less work to do because the fire already did some of it.
The efficiency gains are substantial. Cooked food provides about 30% more net calories than the same food raw. The energy you'd spend on digestion becomes available for other purposes—like running a larger brain.
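As a rough illustration of what that 30% buys, the sketch below takes the figure at face value; the 1,000 kcal raw meal is an assumed round number.

```python
# Net energy from the same meal eaten raw versus cooked, taking the ~30%
# net-calorie advantage quoted above at face value. The 1,000 kcal raw
# baseline is an assumed round number for illustration.

raw_net_kcal = 1000                      # assumed net yield of a meal eaten raw
cooking_gain = 0.30                      # ~30% more net calories when cooked

cooked_net_kcal = raw_net_kcal * (1 + cooking_gain)   # ~1,300 kcal, same meal
freed_kcal = cooked_net_kcal - raw_net_kcal           # ~300 kcal freed

print(f"Raw: ~{raw_net_kcal} kcal  Cooked: ~{cooked_net_kcal:.0f} kcal")
print(f"Freed for other uses (e.g., a larger brain): ~{freed_kcal:.0f} kcal")
```

Roughly 300 extra kilocalories per such meal is in the same range as the brain's daily surcharge estimated earlier, which is why the two numbers are often set side by side.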
Wrangham argues this process began with Homo erectus, around 1.8 million years ago. Homo erectus had a smaller gut, smaller teeth, and a larger brain than earlier hominins. These changes make sense if Homo erectus was cooking food.
Cooking externalized part of the digestive process, freeing metabolic resources for cognition. It was an energy technology—a way of accessing more usable calories from the same raw inputs.
Evidence for Early Fire Use
The challenge for the cooking hypothesis is evidence. Fire doesn't fossilize well. How do we know when cooking began?
The earliest undisputed evidence of controlled fire dates to about 400,000 years ago—sites with clear hearths, burned bones, and ash layers. But 400,000 years is too recent to explain the changes in Homo erectus 1.8 million years ago.
Wrangham argues that absence of evidence isn't evidence of absence. Earlier fire use may have left traces too subtle to survive. He points to indirect evidence: the anatomical changes in Homo erectus that seem to require dietary shifts consistent with cooking.
More recent archaeological work has pushed back the dates. The Wonderwerk Cave in South Africa shows evidence of fire use around 1 million years ago. The Gesher Benot Ya'aqov site in Israel has burned seeds and wood from 780,000 years ago. Each new discovery pushes the timeline closer to what the cooking hypothesis predicts.
The debate continues. Some researchers accept that fire use goes back to Homo erectus; others remain skeptical. But the logic of the cooking hypothesis—that brain expansion required an energy source, and cooking provides that source—is compelling regardless of exact dates.
Fire as Energy Technology
Think about what fire actually does in this framework:
Input: Raw food with potential energy locked in chemical bonds, plus wood with stored solar energy.
Process: Combustion releases energy from wood as heat; heat transforms food chemistry.
Output: Food with the same chemical energy but higher bioavailability—more accessible calories per gram consumed.
This is an energy conversion technology. It doesn't create energy (nothing does, per the first law of thermodynamics). It makes existing energy more useful. Fire is, in this sense, humanity's first efficiency technology.
Later energy technologies follow the same pattern. The plow doesn't create energy; it makes animal muscle more useful for food production. The steam engine doesn't create energy; it makes coal's chemical energy accessible for mechanical work. Solar panels don't create energy; they convert sunlight into electricity.
Fire was the first step in a long history of energy transformations that made civilization possible.
The Social Implications
Cooking didn't just change our bodies. It changed our behavior.
Cooked food needs to be prepared. Preparation takes time and attention. This creates a division of labor: some individuals prepare food while others hunt or gather. Ethnographic evidence suggests that in most hunter-gatherer societies, women do most of the cooking while men do most of the hunting. This pattern may be ancient.
Cooked food can be shared. A carcass roasted at a central fire becomes a communal resource in a way that individual foraging isn't. Sharing creates social bonds, obligations, and hierarchies. The hearth becomes a center of social life.
Cooking requires planning. You need fire-starting materials, fuel, water, containers. You need to know when food will be ready so others can return. This encourages forward-thinking and coordination that raw-food diets don't require.
The fire circle may have been humanity's first institution—a spatial center around which social life organized, obligations accumulated, and collective identity formed.
Fire also extended the day. Before fire, humans were diurnal like other primates—active during daylight, asleep in darkness. Fire created usable time after sunset. The hours around the fire became available for activities that daylight hours couldn't accommodate: storytelling, teaching, ritual, bonding. Some anthropologists argue that language itself developed in these firelit hours, when communication could occur without competing with the demands of daytime survival.
The fire circle was a school, a parliament, a temple, and a home—all in one flickering ring of light.
Why We Can't Go Back
Here's a striking fact: humans cannot survive indefinitely on raw food.
Raw foodists in modern societies typically lose significant weight and often experience fertility problems. Women on raw-food diets frequently stop menstruating. The diet provides insufficient calories for normal human function, even with access to modern raw-food preparation techniques like blending and juicing.
This isn't true of our primate relatives. Chimpanzees and gorillas thrive on raw food. Their digestive systems are adapted for it. Ours aren't.
We've evolved to depend on cooking. Our teeth are too small for the extensive chewing raw food requires. Our guts are too short to extract sufficient nutrients from unprocessed plant material. Our faces are too flat to accommodate the powerful jaw muscles raw-food processing demands.
Cooking isn't optional for humans. It's become obligate. We've crossed a threshold where the energy technology is part of what we are.
This is a profound point. Most technologies are optional—we can choose to use cars or walk, choose electricity or candles. But cooking isn't a choice. Our biology requires it. The technology has become woven into what Homo sapiens is. We're a species that evolved in symbiosis with fire.
In this sense, humans were never "natural" in the way other species are. From very early in our evolutionary history, we were technological. The boundary between biology and technology was blurred from the start—and cooking was the first blurring.
The Pattern Begins
The cooking hypothesis establishes a pattern that will recur throughout human history:
1. Energy innovation enables new capabilities.
2. New capabilities reshape bodies, behaviors, and social structures.
3. Reshaping creates dependence on the energy innovation.
4. Dependence drives further innovation when constraints appear.
Fire enabled big brains. Big brains made agriculture possible. Agriculture fed bigger populations. Bigger populations depleted local resources. Resource depletion drove expansion and eventually industrial technology. And so on.
Each step is contingent—it didn't have to happen—but once it did, it became the foundation for everything that followed. We can't un-cook our way back to pre-human physiology. The transformation is irreversible.
This pattern has a name in complex systems theory: path dependence. Early choices constrain later options. The decision to harness fire—if "decision" is even the right word for something that emerged over hundreds of thousands of years—set humanity on a path that led, step by step, to agriculture, cities, industry, and eventually the silicon minds we're building today.
None of it was inevitable. But all of it was enabled by that first transformation: fire making food digestible, freeing calories for cognition, setting the precedent that energy technologies could change what humans are.
Fire as Prototype
The fire revolution was small compared to what came later. A cooking fire produces maybe 1-10 kilowatts of heat. A power plant produces gigawatts. The scales are incomparable.
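To make "incomparable" concrete, a quick order-of-magnitude comparison; the 5 kW and 1 GW figures are assumed round values within the ranges mentioned above.

```python
import math

# Power gap between a cooking fire and a modern power plant.
# 5 kW and 1 GW are assumed round figures within the ranges above.

cooking_fire_watts = 5e3      # ~5 kW of heat from a cooking fire
power_plant_watts = 1e9       # ~1 GW from a large power plant

ratio = power_plant_watts / cooking_fire_watts
print(f"A power plant delivers ~{ratio:,.0f}x the power of a cooking fire")
print(f"That is roughly {math.log10(ratio):.0f} orders of magnitude")
```

Five or so orders of magnitude separate the hearth from the grid—and yet, as the next paragraph argues, the underlying move is identical.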
But the principle is the same. Fire showed that humans could externalize energy processes—that we didn't have to do everything with our own bodies. We could manipulate the physical world to do work for us.
This is the definition of technology: using tools to accomplish what biology alone cannot. Fire was the first such tool, and arguably still the most important. Every subsequent energy technology is fire's descendant—controlled combustion, refined and scaled.
Two million years ago, our ancestors looked at fire and saw possibility. They harnessed it, became dependent on it, and were transformed by it. The journey from campfire to data center began with that first flame.
There's something poetic about this. The same basic process—controlled oxidation releasing stored energy—powers both the ancient hearth and the modern gas turbine. The scale changed by orders of magnitude; the principle stayed the same. We've been mastering fire for two million years. We're still at it.
The story of energy is the story of increasingly sophisticated fire. Understanding where that story began helps us understand where it might go.
Further Reading
- Wrangham, R. (2009). Catching Fire: How Cooking Made Us Human. Basic Books.
- Aiello, L. C., & Wheeler, P. (1995). "The expensive tissue hypothesis." Current Anthropology.
- Gowlett, J. A. J. (2016). "The discovery of fire by humans: a long and convoluted process." Philosophical Transactions of the Royal Society B.
This is Part 2 of the Energy of Civilization series. Next: "Muscle Power: The Limits of Human and Animal Labor."