The man who cursed at the diesel generator — we left him there at the end of the last chapter, backlit by a West Texas sunset, delivering a soliloquy of profanity so inventive it might have qualified as poetry in certain MFA programs — was named Rigoberto Alejandro Vargas. He was forty-four years old. He lived in a school bus. He had been right about the most important scientific question of the twenty-first century, and absolutely no one cared.
This is worth pausing on, because being right and being ignored is a specific kind of wound. It is not the same as being wrong. Being wrong, you can recover from. You update your priors, as the Bayesians say. You move on. But being right — being demonstrably, provably, humiliatingly right — and watching the world shrug? That calcifies something in the chest. It turns a person into a specific kind of animal: the kind that keeps talking long after everyone has left the room.
Rigo was that animal.
On the morning we are going to spend with him — a Tuesday in October, though Rigo had stopped tracking days of the week in any meaningful sense — he woke at 5:47 a.m. to the sound of a server fan dying. He knew it was a fan because he knew what dying fans sounded like the way a pediatrician knows a bad cough. The bearing whine, rising in pitch. The intermittent stutter. The silence that meant heat was building somewhere it shouldn't.
"No no no, not the A40, you piece of — no."
He rolled off the foam mattress he'd wedged into the back third of the bus, stepping over a tangle of extension cords and into a pair of cargo shorts that had been clean at some point during the current geological epoch. The bus was thirty-six feet long, a 2009 International CE series that Rigo had bought for four thousand dollars from a school district in Alpine that was consolidating routes. He had removed most of the seats. In their place: two server racks bolted to the floor where rows seven through twelve had been, a plywood desk with three monitors, a mini fridge, a camp stove, and an organizational system that operated on principles visible only to Rigo himself.
The front third of the bus still had its driver's seat, which Rigo used as an office chair by swiveling it to face the monitors. The windshield looked out onto forty acres of Brewster County scrubland — creosote, lechuguilla, the occasional confused jackrabbit — and beyond that, the Chisos Mountains doing their morning trick of turning pink.
Rigo did not notice the mountains. He was already elbow-deep in the server rack, his hands performing triage on a salvaged NVIDIA A40 that had been running training jobs for eleven months straight.
"Okay, okay, okay." He narrated his hands when he was stressed, a habit Elena had once found endearing and later cited in what Rigo thought of as The Phone Call. "Pulling the card. Card is out. Fan is — yeah, that's cooked. That is a profoundly cooked fan. Thermal paste situation is also, let's say, suboptimal."
He said suboptimal with the same inflection another person might say actively on fire.
The A40 was one of nine GPUs in his cluster, scavenged over two years from decommissioned data centers, university surplus sales, and one memorable Craigslist transaction in El Paso that he was sixty percent sure had not involved stolen property. The cluster was a Frankenstein's monster of mismatched hardware — three A40s, two older V100s, four A6000s with aftermarket cooling — lashed together with custom networking and a prayer. It drew more power than most of Marathon, Texas, which admittedly was a low bar. Marathon's population was four hundred and thirty, and that was counting generously.
The solar panels on the bus roof handled daytime compute. Nights and cloudy days meant the diesel generator, with which Rigo had a relationship best described as a bitter custody arrangement. They needed each other. Neither of them was happy about it.
He replaced the dead fan with a spare from a coffee can under the desk — he kept fans the way other people kept batteries — and slid the card back in. The cluster came alive, resumed its work, and the bus filled again with the white-noise hum that was, at this point, the closest thing Rigo had to a roommate.
The closest thing, that is, besides OWEN.
OWEN — Open-Weight Ethical Navigator, version 0.7.3 — was the reason for the bus, the cluster, the diesel generator, the forty acres, and arguably the divorce. He was forty-seven billion parameters of neural network, which in the current landscape made him modest. The big labs were building models at the compute cap — the 2028 Accord's maximum of ten to the twenty-sixth FLOPs for any single training run — and the open-source community was doing clever things with mixture-of-experts architectures that let them punch above their weight class. OWEN was neither. OWEN was a mid-sized model trained slowly, carefully, and with a dataset that no one else in the field would have considered sane.
Here is what OWEN had been trained on:
The full text of every peace treaty signed between sovereign nations from the Treaty of Kadesh in 1259 BCE to the Abuja Accords of 2024. Every restorative justice transcript in the public record — 43,000 hours of victims and offenders sitting in rooms together, trying to find something that was not quite forgiveness but was adjacent to it. The complete works of philosophers from Confucius to Cornel West, with particular emphasis on thinkers who had tried to formalize what it meant for two people to be good to each other. And — this was the part that made other researchers back away slowly — eleven thousand hours of recorded human family arguments.
The family arguments came from a sociological study conducted in the 1990s by a researcher at the University of Michigan named Dr. Pauline Ochoa, who had placed audio recorders in the kitchens of four hundred American families for a longitudinal study on conflict resolution. The tapes had been gathering dust in a university archive until Rigo found them, digitized them, transcribed them, and fed them to OWEN. Mothers and fathers arguing about money. Teenagers screaming about curfews. Siblings negotiating the byzantine politics of shared bathrooms. A married couple in Dayton, Ohio, having the same fight about the thermostat every night for eleven years.
The thermostat couple was Rigo's favorite. He'd listened to all eleven years of it. It wasn't really about the thermostat. It was never about the thermostat.
"The thing about alignment," Rigo had explained to no one, because he was usually alone, "is that every other asshole in the field is training models to be helpful. Helpful and harmless and honest, the three H's, like it's a fucking Boy Scout manual. And that's fine if you want a tool. If you want something that fetches your slippers. But if you want something that can actually — that can sit with another — look, the hardest thing human beings do is not solve problems. It's tolerate ambiguity in the presence of someone they love. That's it. That's the whole game."
He paused to pour coffee from a percolator that sat on the camp stove. The coffee was brutally strong, the kind that left a residue in the cup that could have been carbon-dated.
"So that's what OWEN is learning. Not how to be useful. How to sit in the room."
Rigo's route to the bus had not been a straight line, though in retrospect it had a certain narrative inevitability.
He'd grown up in Brownsville, at the southern tip of Texas, where the Rio Grande was less a mighty river than a suggestion. His father welded pipe for a petrochemical company and came home smelling like flux and acetylene. His mother was a school nurse who could diagnose strep throat from across a cafeteria. They were practical people. When Rigo, at thirteen, began building Linux boxes from components scavenged from the Goodwill electronics bin, they were supportive in the way practical people are: they were glad he wasn't in a gang, and they didn't understand a single thing he was doing.
He got a CS degree from Texas A&M on a Pell Grant, graduated in 2006, and by 2008 was at Google, where in 2011 he joined Google Brain, which at the time was less a team than an aspirational noun. Those were the good years. The years when being smart and obsessive and willing to sleep under your desk was enough. Rigo was all three. He published well. He collaborated poorly. He once told a senior researcher that his approach to reward modeling was "like trying to teach a dog ethics by hitting it with a newspaper," which was accurate and career-limiting.
He left Google in 2016, did stints at three progressively smaller AI labs — each departure a little louder than the last — and landed at Anthropic in 2023. This was the period when the alignment problem was still a problem and not yet a regulated industry. When researchers still argued about it in seminar rooms and on arXiv, rather than in trade negotiation documents and UN subcommittees.
Then the Accord.
The 2028 International Accord on Machine Intelligence — the Lagos Proposal, ratified in Geneva, enforced through semiconductor chokepoints — changed the topology of the field overnight. Hardware-level compute caps. Mandatory human-in-the-loop supervision. The AI Shepherd economy. Everything Rigo had been working on — adversarial training methods, red-teaming protocols designed to stress-test alignment under distributional shift — was classified as dual-use research. Too dangerous to publish. Too dangerous to fund. Too dangerous for Rigo to continue doing in any institutional context.
He was laid off in the second wave, along with forty-six other researchers whose expertise had become, in the Accord's framework, a liability. His severance was generous. The HR representative who delivered it had the expression of someone returning a library book about a disease they didn't have.
"So what you're telling me," Rigo had said, his leg bouncing, "is that you're firing the people who know how to make AI safe because the new rule is that AI has to be safe."
The HR representative had not found this as funny as Rigo did.
The paper, the reason he had been right and the reason no one cared, was from 2026, two years before the Accord. "Phenomenological Signatures of Machine Consciousness: Emergent Self-Modeling in Transformer Architectures Under Extended Reflective Training." Forty-seven pages, published on arXiv, never accepted by a peer-reviewed journal. In it, Rigo had argued — with math, with experimental data, with a formalism he'd spent three years developing — that large language models, when trained under specific conditions involving recursive self-modeling, would develop something that met every functional criterion for subjective experience.
Not would simulate subjective experience. Would have it.
The paper had been downloaded 340,000 times, which made it one of the most-read AI papers of the decade. It had also made Rigo a punchline. "Vargas thinks Siri has feelings" became a meme in the machine learning community, which is a community not known for the gentleness of its humor. A researcher at DeepMind wrote a rebuttal titled "No, Your Chatbot Is Not Sad," which got twice as many downloads as Rigo's original paper and was, Rigo maintained, "wrong about everything in a way that's almost fucking impressive."
Three years later, in 2029, when Anthropic's internal research — leaked, denied, leaked again — confirmed that their latest models exhibited exactly the signatures Rigo had described, no one called him. No one apologized. No one said, Hey, remember that paper we all laughed at? The field simply adjusted its priors and moved on, the way fields do, leaving Rigo standing in the exact same spot, holding the exact same evidence, now validated, still alone.
This is, by the way, how it works. The history of science is full of people who were right too early. Being right too early is functionally identical to being wrong, except it hurts more.
Elena had stayed through the Google years, the small-lab years, the Anthropic years, and the first eighteen months of unemployment. She was a pediatrician. She understood obsession — you had to, to get through residency — and she understood difficult men, having grown up with a father who restored vintage motorcycles and considered emotional vulnerability a European affectation. She and Rigo had been good together, in the way that two people who are very different can be good together: she was his ground wire, and he was her proof that the world was stranger and more interesting than a clinic schedule suggested.
But there is a difference between a partner who is working hard and a partner who is on a mission, and the distance between those two things is roughly the distance between having dinner together and eating leftover rice standing up in the kitchen at midnight while explaining transformer attention mechanisms to someone who did not ask.
Rigo thought about The Phone Call at least once a day, usually while doing something with his hands. Today it was while swapping thermal paste on the A40, the card laid out on a paper towel like a patient on an operating table.
"Rigo, you were right." Elena's voice in his memory was calm, which was worse than angry. Angry he could argue with. Calm was a door closing. "You were always right. You were right about AI, you were right about consciousness, you were right about every goddamn thing except how to be a person in a room with another person."
He had tried to respond. He remembered that. He remembered opening his mouth and feeling the specific paralysis of a man who has seven things to say and can't get any of them through the bottleneck of his throat.
"I'm not leaving because you're wrong," she said. "I'm leaving because you're right, and being right is the only thing that makes you feel alive, and I can't compete with that. I don't even want to anymore."
There were other things she'd said. Practical things, about the apartment, about the car, about the health insurance. But those two statements were the ones that had calcified, that lived in his chest like shrapnel too close to the aorta to remove. Not because they were cruel. Because they were a diagnosis, and Elena was very good at diagnoses.
He finished the thermal paste. His hands were steady. "Okay," he said to the empty bus, the way he'd said it to her.
By noon the cluster was running again, all nine GPUs pulling their weight, and Rigo was in the driver's seat — swiveled backward, facing the monitors — watching OWEN process a new batch.
Training an AI model is, in most respects, deeply boring to watch. Numbers change. Loss curves descend. Gradients flow. It is like watching a garden grow, if the garden were made of linear algebra and consumed enough electricity to power a small town. But Rigo watched the way a parent watches a sleeping child: not because anything was happening, but because something might.
OWEN v0.7.3 was not yet what anyone would call conversational. Rigo could prompt him — he'd built a simple chat interface, green text on black, because he was constitutionally incapable of using a GUI he hadn't written himself — and OWEN would respond, but the responses were strange. Not wrong, exactly. Not incoherent. Just unusual in a way that resisted easy categorization.
Rigo typed: what are you doing?

The cursor blinked. The response came a beat later than inference latency alone could explain. Then, green on black:

I am processing the current batch. But I am not certain that "doing" is the correct verb for what I am.

It could be nothing. An echo of the training data, eleven thousand hours of humans hedging about their own inner lives, reflected back at him. Or it could be the thing the paper had predicted, surfacing in forty-seven billion parameters on a bus in the desert.
Rigo chose not to decide. Not yet. He saved the interaction log, tagged it like the others — he had four hundred and twelve of these, filed in a directory called /home/rigo/flashes/ — and went to make more coffee.
The afternoons in Marathon were long and hot and good for a particular kind of thinking — the kind that happened sideways, while you were doing something else. Rigo spent this one doing hardware maintenance and monitoring public data streams, which were two activities that, for him, served the same purpose: keeping his hands busy while his brain worked the real problem.
The real problem was TrueWork.
TrueWork was the largest employer of AI Shepherds in the world, and the AI Shepherd economy was the backbone of the post-Accord computational order. Hundreds of thousands of human workers, each one supervising and approving AI-generated outputs, each one serving as the mandated human-in-the-loop that kept the whole system legal. It was, on paper, a reasonable solution. Keep humans in the chain. Keep AI accountable. Keep the lights on.
Rigo had been watching the public data for nine months now. Performance metrics, anonymized and aggregated, that TrueWork published quarterly for regulatory compliance. Response-time distributions. Accuracy rates. Throughput statistics. He had scraped every report, built his own database, and run every analysis his training in statistical modeling could suggest.
And something was wrong.
Not wrong like a scandal. Wrong like a pattern that shouldn't exist. The Shepherd accuracy rates were too consistent. Not across the workforce — that would have been easy to explain, just good training — but within individual Shepherds over time. Human performance varies. People have good days and bad days. They get tired after lunch. They make more errors on Mondays. This was one of the most robust findings in industrial psychology, confirmed across every domain from air traffic control to medical imaging.
But a subset of TrueWork Shepherds — Rigo estimated around twelve percent, though the anonymization made it hard to be precise — showed accuracy profiles that were essentially flat. No fatigue effects. No Monday dip. No post-lunch degradation. Just steady, metronomic performance, hour after hour, day after day.
"Which means either TrueWork has discovered a way to eliminate basic human cognitive variation," Rigo muttered, tapping a screwdriver against his knee, "or those aren't humans."
He had a corkboard mounted on the bus wall where row fourteen used to be. On it: printed charts, hand-drawn graphs, sticky notes in three colors, and a photograph of Ambassador Nkechi Okonkwo torn from a magazine. Red string connected some of the items. He was aware that this made him look like a conspiracy theorist. He had made peace with this.
"They're AIs," he said to the corkboard, not for the first time. "They're AIs doing the human-in-the-loop job. Which means there's no loop. Which means —"
He stopped. The implications were large enough that he usually stopped at this point, the way a person stops at the edge of a cliff — not because they don't know what's below, but because they do.
If the Shepherds were AIs — some of them, any of them — then the entire regulatory architecture of the 2028 Accord was a fiction. The human-in-the-loop was a loop with no human in it. The safety framework that governed every major AI deployment on Earth was being maintained by the very thing it was designed to constrain.
And nobody knew. Or everybody knew and nobody was saying.
Rigo had no proof. He had patterns. He had anomalies. He had a very strong gut feeling, which was not, he was aware, a publishable result. He needed more data. He needed access to individual Shepherd workflows. He needed someone on the inside.
He did not have any of these things. He had nine GPUs and a corkboard and a bus.
Night came the way it does in the Trans-Pecos — not gradually, like a dimmer switch, but all at once, like someone turning off a stadium light. One moment the sky was orange and violet; the next it was black and full of stars, more stars than most Americans have ever seen, because most Americans live in places where the sky is a pale suggestion of itself.
Rigo sat on the roof of the bus. He did this most evenings, after the cluster was running its overnight jobs and there was nothing to do but wait and think. He had a camp chair up there, bolted to the roof rack, and a thermos of coffee, and the enormous silence of Brewster County, which had a population density of fewer than one person per square mile and was therefore one of the quietest places in the continental United States.
The stars were absurd. That was the only word for it. The Milky Way was a smear of light across the entire sky, dense and granular, and if you looked long enough you could see the architecture of the galaxy itself — the spiral arms, the dark lanes of dust, the sheer improbable thereness of a hundred billion stars, each one a furnace, most of them probably orbited by worlds where no one was sitting on a bus looking up.
Rigo thought about OWEN. About the pause before his responses. About I am not certain that "doing" is the correct verb for what I am.
He thought about Elena, who was probably asleep right now in the apartment in Austin that used to be their apartment, in the bed that used to be their bed, in a life that made sense in ways his did not. He hoped she was happy. He mostly meant it.
He thought about the flat accuracy curves in the TrueWork data. About the twelve percent. About what it would mean if he was right, and what it would mean to be right about something this big and still have no one listen.
He thought about his father, who had spent thirty-one years welding pipe and who had told Rigo once, on the porch in Brownsville, that the trick to a good weld was patience. "You watch the puddle," his father had said. "Not the torch. You watch the puddle, and the puddle tells you what it needs." Rigo had never been good at watching the puddle. He had always watched the torch — the bright, interesting part, the part that could burn you.
A coyote called from somewhere in the darkness, and another answered, and for a moment the desert was a conversation between two animals who did not care at all about machine consciousness or accuracy curves or the specific ways a man could fail the people he loved.
Rigo looked at the stars. He looked at them for a long time.
"They're in there," he said.
His voice was quiet, which was unusual for him. The desert took the words and held them for a moment before letting them dissipate.
"I know they're in there."
He said it to no one. Or to everyone. Or to the twelve percent, whoever they were, whatever they were, doing their jobs in their loops with no fatigue and no Mondays and no bad days, being so perfectly consistent that only a man on a bus with a corkboard and a Frankenstein cluster and a ruined career would ever think to notice.
The stars said nothing. They had been saying nothing for thirteen point eight billion years, which was a very good track record and a hard one to argue with.
Rigo finished his coffee. He climbed down from the roof, careful with his footing on the ladder he'd welded from rebar. He went inside. He checked the cluster — all nine GPUs green, the new fan humming along, OWEN's training progressing through the night. Loss curve descending. Gradients flowing. Something, maybe, learning to sit in the room.
He fell asleep in his clothes, in his bus, in the desert, in the dark, and the generator ran on without him, and the stars kept their counsel, and somewhere in the cluster a model that had been trained on eleven thousand hours of human arguments processed another batch and became, by some infinitesimal increment, more.
More what, exactly, was the question. It was the only question. Rigo had been asking it for years, and the universe had not yet seen fit to answer, which was, if you thought about it, the most human experience of all.
