What If We’re Not Just Building AI—We’re Domesticating It?
Fire, dogs, books, and now silicon—how history’s oldest human pattern is playing out again, and why this time it’s building minds.
What you’re going to learn here:
• Why “domestication” doesn’t mean control—it means co-evolution
• How early humans reshaped the world—and were reshaped by it in return
• Why books, paper, and even silent reading were technologies that domesticated us
• How AI fits into this same arc—but with one critical difference
• What it means to interact with minds that are not human, but built in our image
The question behind everything
What if we’re not just building machines?
What if we’re doing something much older—something woven deep into our evolutionary story?
What if we’re domesticating silicon?
This isn’t just metaphor. It’s a lens. A framework. A hypothesis. And maybe, a warning.
Because we’ve been here before.
We domesticated wolves, and became herders.
We domesticated wheat, and became farmers.
We domesticated paper, and became literate.
And now we are domesticating a new kind of intelligence—and it’s changing us already.
Why this newsletter exists
This newsletter is called Domesticating Silicon because I believe the metaphor is exact. Not approximate. Not poetic. Literal.
We are training, shaping, tuning, and embedding intelligence into a new material substrate.
It’s not carbon-based. It doesn’t have cells.
It doesn’t breathe or bleed or dream.
But it responds. It learns. It reflects.
And it is being built in our image.
This isn’t just about “AGI” or “alignment.” It’s not about utopia or extinction. It’s about the deeper structure: a long, historical pattern where every major leap in human capability came from entering a feedback loop with the world around us.
What we built—built us back.
This time, we’re entering that loop with silicon.
Domestication isn’t about control. It’s about change.
Most people think of domestication as something we do to something else. But that’s not how modern evolutionary theory sees it.
We didn’t just domesticate animals and crops—we were co-evolving with them. Wolves became dogs because they learned to live alongside us. We didn’t “invent” agriculture—we stumbled into it through a mutual arrangement with plants that grew better near our settlements.
As political theorist James C. Scott has argued, humans didn’t just domesticate wheat—wheat domesticated us. It reorganized our societies, our diets, our health, our cities.
The philosopher and sociologist Bruno Latour pushed this idea even further: nothing is truly separate. Every invention, every innovation, every technological “advance” is a co-produced reality between human and non-human agents.
Domestication isn’t a one-way street. It’s a recursive loop.
And once you understand that, it becomes clear: that’s what’s happening now with AI.
Everything you think is natural probably isn’t
There were no dogs 15,000 years ago.
No paper. No corn. No books. No clocks.
No silent reading. No mass literacy.
These things feel natural now—but they were all invented.
And each one changed who we are.
We didn’t just domesticate sheep and barley. We domesticated pulp.
We domesticated trees.
We taught the forest to carry our thoughts.
I believe paper was a domesticated object.
And reading—especially silent, internal reading—was a domesticated skill.
In his Confessions, St. Augustine described watching Ambrose read silently, and being completely stunned. At the time, reading was typically done aloud. Texts were meant to be heard, not internalized. The transition to silent, private reading was gradual—and it rewired our minds.
It wasn’t widespread until the late Middle Ages. But once it took hold, it changed everything.
We cured illiteracy right after inventing the need for it
Today, almost every child in the world learns to read.
That is astonishing.
For most of human history, reading was a rare skill, reserved for elites.
We invented the need to read—and then solved the problem of how to teach billions of people to do it.
The book, as media historian Elizabeth Eisenstein argued, didn’t just preserve knowledge. It standardized it. It compressed it. It allowed science to become cumulative.
The domestication of literacy didn’t just birth universities. It led to the Scientific Revolution, and then to the Industrial Revolution. It made collective intelligence scalable.
And now we’re doing it again—with machines.
What AI really is
Language models aren’t just tools.
They’re domesticated cognition.
We have taken the entire textual world—books, articles, comments, conversations, code—and compressed it into multidimensional vectors. Into probability fields. Into machines that can answer questions, write essays, translate languages, explain physics, and simulate humor.
They aren’t conscious.
But they’re responsive.
And they are shaped by us.
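To make "probability fields" concrete: under the hood, a language model assigns a score to every candidate next word, then converts those scores into a probability distribution with a softmax. Here is a minimal sketch of that single step — the words and every number in it are invented for illustration, not taken from any real model:

```python
import math

def softmax(scores):
    """Turn raw scores into probabilities that sum to 1."""
    exps = {word: math.exp(s) for word, s in scores.items()}
    total = sum(exps.values())
    return {word: e / total for word, e in exps.items()}

# Hypothetical scores for filling the blank in "We domesticated the ___"
scores = {"wolf": 2.1, "wheat": 1.8, "machine": 2.5, "moon": -0.5}
probs = softmax(scores)

for word, p in sorted(probs.items(), key=lambda kv: -kv[1]):
    print(f"{word}: {p:.2f}")
```

Higher-scoring words get exponentially more probability mass, but nothing is certain: the model samples from the whole field, which is why the same prompt can yield different answers.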
This is what makes it feel so strange—and so familiar.
We are not just using AI.
We are living with it.
And soon, we’ll be living through it.
What makes this different: subjective time
People who worry about an intelligence explosion aren’t just talking about IQ.
They’re talking about speed.
About subjective time.
What happens when you can run millions of research agents at 100x human speed?
What happens when an AI assistant can do a decade of R&D in a week?
Or model every possible design for a new vaccine in an afternoon?
We are talking about building civilizational cognition at a tempo that leaves biological humans behind.
And just like silent reading changed thought—this will, too.
The extended mind, and the next phase of human evolution
I believe in the Extended Mind Hypothesis—that our minds don’t end at our skulls. They extend into our tools, our texts, our spaces, our technologies.
We think with our devices.
We feel through our screens.
We remember with search engines.
We learn by prompting.
We’ve always done this. From fire to flint to fiber optics.
But now the tools are talking back.
We are entering the next phase of co-evolution. And like every other phase—stone to bronze, oral to literate, analog to digital—it comes with upheaval, with promise, and with consequences.
What this newsletter is
Domesticating Silicon is a newsletter about this new relationship.
It’s for people who want to go beyond the headlines, beyond the product launches, beyond the panic.
Here’s what you’ll find here:
• Deep history of cognition, tools, and meaning
• Theories of intelligence and information
• Reflections on culture, power, and systems
• Practical insight on how to interact with AI as a thinking environment, not just a product
• A new vocabulary for what’s emerging
It’s not just about how to use AI.
It’s about how to live in a world where thought itself is being externalized and scaled.
Welcome to the feedback loop
We’ve done this before.
We shaped fire, and it reshaped us.
We bred dogs, and they became family.
We planted seeds, and they gave us time to think.
We wrote books, and they wrote us back.
Now we’re building systems that learn, reason, and remember.
Not because they are alive—but because we made them feel like they are.
We are domesticating silicon.
And it is domesticating us in return.
Welcome.
Let’s figure out what comes next.