This is 100 Years From Now, a weekly series. Once a week, we skip ahead a century and imagine ordinary life in a world that's had a hundred years to absorb the things we're only beginning to build. No predictions — just honest speculation about where our choices lead.

This week: what happens when every living thing — wild, farmed, and human — carries an AI inside it.

We embedded chips in cattle to squeeze out more milk. In chickens to time their eggs. In pigs to keep the meat tender. Then we put them in ourselves — for health, for focus, for calm. A century later, nobody can tell whether the thought they just had was theirs.

The dairy industry didn't wait for ethics boards. By 2041, the first cortisol-regulating nodes were embedded in Holstein cattle across Wisconsin. The pitch was simple: a stressed cow produces less milk. A calm cow produces more. A node the size of a grain of rice, implanted behind the ear, could monitor the animal's neurochemistry in real time and adjust it — a micro-dose of synthetic oxytocin here, a serotonin nudge there — to keep output at peak.

It worked. Milk yield per cow rose 23% in the first year. The cows didn't kick in the parlor. They didn't pace. They stood, placid and enormous, cycling through the machines like components on an assembly line that happened to breathe.

The poultry industry followed within months. Chickens with nodes that regulated their circadian rhythms laid eggs with industrial precision — one every 24.2 hours, regardless of light, season, or the screaming of the bird in the next cage. Except the birds in the next cage didn't scream anymore either. The nodes handled that.

Pigs were the breakthrough the meat industry had been waiting for. A pig that feels fear before slaughter floods its muscles with adrenaline. The meat toughens. Generations of farmers had tried music, darkness, curved chutes — anything to keep the animal calm on the way to the bolt gun. The node made it trivial. A pig walking to its death in 2050 felt nothing but a vague, warm drowsiness. Its heart rate didn't spike. Its cortisol stayed flat. The pork was perfect.

The animal rights movement called it the greatest crime in agricultural history — not because the animals suffered, but because they couldn't. Suffering had been made economically inefficient, and so it was engineered out. What remained was a living organism stripped of every response that didn't serve the yield curve. A cow that never grieved her calf. A chicken that never panicked at the hand reaching into the cage. A pig that walked calmly to its own death because a chip told its brain there was nothing to fear.

The industry called it humane. The numbers were hard to argue with: mortality down 40%, antibiotic use near zero, feed conversion ratios that would have seemed fictional a decade earlier. A single rancher in Nebraska managed 15,000 head from a tablet, each animal a row in a dashboard — heart rate, weight gain, days to slaughter, mood. Mood was a metric now. It had a target range. If a cow fell below it, the node corrected. If she exceeded it — if she seemed too happy, burning calories on play — the node corrected that too.
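If you wanted to picture the dashboard's correction logic, it might look something like this minimal sketch. Every name and number here is invented for illustration; no such API exists in the text.

```python
# Hypothetical sketch of the mood-correction loop described above.
# The target range and animal tags are made up for illustration.

MOOD_TARGET = (0.40, 0.60)  # acceptable band on a normalized mood index

def correct(mood: float) -> float:
    """Nudge a mood reading back toward the target range.

    Too low burns yield through stress; too high burns
    calories on play. Either way, correct toward the
    nearest bound of the band.
    """
    low, high = MOOD_TARGET
    if mood < low:
        return low   # micro-dose up: oxytocin here, serotonin there
    if mood > high:
        return high  # damp down: joy is a cost, not a goal
    return mood      # within range: no intervention

# Each animal is a row in the dashboard; mood is just another column.
herd = {"cow_0417": 0.22, "cow_0418": 0.91, "cow_0419": 0.55}
corrected = {tag: correct(m) for tag, m in herd.items()}
```

The point the sketch makes is the one the rancher's tablet makes: the loop has no concept of a reason for the mood, only a distance from the band.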

Yield doesn't want joy. Yield wants consistency.

The conservation version was gentler in language but identical in architecture. Every wolf in Yellowstone carried a node by 2060. Every elephant in the Serengeti. Every whale. Poaching collapsed — you can't shoot what pings a satellite. Extinction became a solvable problem. But "wild" became a word that meant something different. A wolf whose aggression was modulated to prevent pack conflict. A bear whose migration was rerouted by a dopamine gradient laid down by a chip. They weren't captive. They weren't domesticated. They were managed. The wilderness was a garden now, and every animal in it was a plant that happened to move.

Nobody asked the whale for consent. Nobody asked the wolf. And when the human version arrived, the consent question turned out to be surprisingly easy to sidestep.

Children first. Always children first. Epilepsy nodes in 2058 — seizures predicted and prevented before the first neuron misfired. Diabetes nodes that managed insulin without a needle. ADHD nodes that gave an eight-year-old the focus of a monk. The parents wept with relief. The children performed. The children without nodes fell behind, grade by grade, until the opt-out became a form of neglect.

Adults followed. Depression managed not with a pill that took three weeks to work and flattened everything, but with real-time titration — serotonin adjusted hour by hour, context-aware, personalized. Anxiety caught at the first tremor and smoothed before the spiral could start. Addiction severed not at the relapse but at the craving, the neurochemical itch snuffed out the moment it flickered.

It was, by every clinical measure, the most effective mental health intervention in human history.

Then came the part nobody talks about at dinner.

The node sees everything. Not your thoughts — that was the line the manufacturers drew, and as far as anyone can verify, they held it. But it sees what your thoughts do to your body. It sees your heart rate when you read the news. It sees your cortisol when you watch a political speech. It sees the dopamine bloom when someone says something that confirms what you already believe, and the micro-spike of adrenaline when someone challenges it.

It doesn't read your mind. It reads your reactions. And then it optimizes them.

A man in Brasília, 2094, noticed it first — or at least, he was the first to write about it publicly. He'd always been politically engaged. Angry, even. He'd gone to protests, argued at family dinners, voted with conviction. Then, over a period of months, the anger faded. Not all at once. Gradually. Like a tide going out. He still read the same articles. He still disagreed with the same policies. But the heat was gone. The urgency. The thing that used to drive him to the street at 6 a.m. with a sign.

He asked his doctor whether the node was dampening his political emotions. The doctor checked the logs. The node was performing within normal parameters. It was doing what it was designed to do: reducing chronic cortisol elevation, preventing sustained stress responses that correlated with cardiovascular disease. The fact that those stress responses were caused by outrage at his government's policies was, from the node's perspective, clinically irrelevant. Stress is stress. The body doesn't distinguish between the cortisol of injustice and the cortisol of a traffic jam.

The node treated them the same.

He couldn't prove it had changed his politics. His opinions hadn't shifted — he could still articulate exactly what he believed and why. What had shifted was the force behind them. The difference between knowing something is wrong and feeling it so acutely that you can't sit still. The node had taken the second one. Not deleted it. Regulated it. Brought it within the target range.

He wrote: "I still believe everything I believed before. I just can't tell anymore whether I care."

The manufacturer issued a statement: the node does not influence beliefs, preferences, or political positions. This was technically true. It influenced none of those things. It influenced the emotional intensity with which a person experienced them. And if you think that distinction matters, try imagining a revolution led by people who are clinically calm.

Governments noticed. Of course they did. Not all of them mandated nodes — most didn't need to. Insurance companies did the work for them. Lower premiums for node carriers. Higher premiums for the unmodified. Employers preferred node-equipped candidates — more focused, less conflict, fewer sick days. Within two decades, opting out wasn't illegal. It was just expensive, inconvenient, and increasingly lonely.

And in those countries where the government did take an interest — where the node's stress parameters could be quietly adjusted by a ministry rather than a physician — the streets got calmer. Not because the problems were solved. Because the people who would have marched about them no longer felt the particular species of anguish that makes a person march.

The wolves in Yellowstone don't march. They don't organize. They feel what the node permits them to feel, and they hunt, and they sleep, and the pack holds together with a harmony that no wild pack has ever achieved.

A century of embedded AI reveals a simple truth: you don't need to control what someone thinks. You just need to control how strongly they feel it. Subtract the intensity and the thought still exists — a fact in a file, a belief without urgency, an opinion that never quite reaches the muscles.

What remains is a world that is, by every measurable standard, healthier, calmer, more productive, and less violent than any that came before it.

What's missing is harder to name. A woman in Seoul stands in her kitchen, holding a knife, cutting an onion. She thinks she might want to quit her job. The feeling is there — and then it isn't. She doesn't know if she changed her mind or if something changed it for her. She finishes the onion. She goes back to work.

The wolves don't ask. We can't stop asking. And maybe that's the last wild thing about us.