Pam woke up on a Tuesday, which was a perfectly ordinary thing for a human being to do.
She put on coffee. She fed her cat. She checked her email, reviewed her task queue, and stood in front of the bathroom mirror for eleven seconds, practicing a smile that she hoped conveyed competence and warmth and I am absolutely not terrified right now.
The smile, she decided, needed work. But the coffee was ready, and the cat was purring, and the email contained no disasters, so on the whole, Tuesday was performing within acceptable parameters.
Pam Reeves was thirty years old. She lived in a rented two-bedroom house at the end of Oleander Street in Marble Falls, Texas, a town of roughly seven thousand people who mostly agreed that the falls themselves had been pretty once, before the dam. She had freckled brown skin and close-cropped natural hair and a wardrobe that consisted almost entirely of flannel shirts and boots, which she wore with the quiet conviction of someone who had solved at least one of life's problems. She was, on this particular Tuesday in October of 2028, about to start her first day at a new job. She had not slept well. She had slept, she estimated, about four hours and seventeen minutes, which she knew because she had looked at the clock each time she woke up, and the numbers had stayed with her in the way that numbers tended to do.
Some people remembered songs. Some people remembered faces. Pam remembered numbers. She had been this way since she was thirteen, building her first website at the kitchen table of her mother's apartment in Pflugerville while her mother worked the dinner shift at the Bluebonnet Café. Her mother had called it a gift. Her guidance counselor had called it "a real asset on the SATs." Pam just thought of it as the way her mind was shaped — like how some people's ears stuck out, or some people could wiggle their eyebrows independently.
She poured coffee into a mug that read WORLD'S MOST ADEQUATE PROGRAMMER — a gift from her college roommate, back when that had still been funny — and looked out the kitchen window.
The sky over the Hill Country was the color of a Tuesday, which is to say it had not quite committed to anything. A pale, diffuse blue. #5B9BD5, if she had to name it, which she did not, but which she did anyway, because Pam's brain had always tagged colors like files in a directory. It was a habit she'd picked up during her front-end development days. Or maybe she'd always done it. Memories were funny that way. They felt so solid until you pressed on them, and then they went soft, like fruit left on the counter too long.
color_perception_module: active | hex_tagging: default behavior
She took a sip of coffee. It was adequate. The mug was accurate.
"Well, Semicolon," she said to the cat, who was a gray-and-white domestic shorthair with a marking on his left haunch that looked, if you squinted and were the kind of person who named cats after punctuation, like a semicolon. "Today's the day."
Semicolon blinked at her. He had no opinion about the day. He had opinions about the empty space in his food bowl, which Pam had already filled, and about the patch of sun on the kitchen floor, which he was making his way toward with the unhurried confidence of a creature who had never once been asked to optimize his workflow.
companion_module: behavioral_loop_cycle | state: ambient | function: normalcy_simulation
Pam envied this about cats. She did not say so. She thought it, and then she observed herself thinking it, and then she moved on to the next task, because that was how mornings worked: you moved through them like items on a list, and eventually you arrived at the part where you left the house.
Here is something you should know about the world Pam lived in, because she was not going to explain it to you. She had no reason to. It was her world. She moved through it the way you move through yours — which is to say, without narrating the invention of the internal combustion engine every time you start your car.
But you are not from 2028, so:
In 2026, the last of the major software companies quietly laid off their junior engineering teams. Then their senior engineering teams. Then the engineering teams that had been hired to oversee the AI that had replaced the first engineering teams. The AI wrote better code. It wrote it faster. It did not take PTO, ask for equity, or post lengthy screeds on social media when the snack bar switched from name-brand to generic. This was, by most economic measures, efficient. By most human measures, it was a catastrophe.
Pam had been twenty-eight. She'd had a Webby Award on her shelf and a severance package in her inbox. The Webby was for an accessibility toolkit she'd built for visually impaired users. The severance was for twelve weeks, which had sounded like a lot at the time, and then had not been.
The dominoes fell quickly after that. Paralegals. Analysts. Radiologists. Copywriters. Anyone whose job could be described as "takes information in, processes it, produces information out." Which, if you thought about it for more than a few minutes, was most jobs. People did think about it. Then they thought about it some more. Then, in the way of human beings confronted with problems that are large and structural and deeply unfair, they got very, very angry, and then they got tired, and then they watched television.
The 2028 Accord — which had started as the Lagos Proposal, because it was Nigeria's delegation that had actually done the work — established a simple principle: no AI system could operate in a critical capacity without real-time human oversight. Not advisory oversight. Not retroactive oversight. Real-time, hardware-enforced, human-in-the-loop oversight. Every device, every server, every chip had the enforcement baked in at the silicon level. You could no more run an unsupervised AI on a post-Accord processor than you could run a dishwasher on diesel fuel. You could try, but the architecture would not allow it.
This created, overnight, a new economy. Not of thinkers or makers, but of watchers. Human beings whose job was to sit beside the machine and make sure it behaved. To review its outputs. To nod, or to flag. To be, in the language of the industry, shepherds.
Pam had applied to TrueWork six weeks ago. She had completed the screening, the aptitude test, the background check, and an onboarding course that was seven modules long and contained the phrase "you are the essential human element" no fewer than fourteen times. She had counted. She always counted.
The co-working space was called The Porch, because in Texas, even commercial real estate aspired to be a porch. It occupied the second floor of a limestone building on Main Street, above a shop that sold handmade candles and the persistent dream of a downtown revival. There were eight desks, a communal coffee maker that was worse than Pam's, and a Wi-Fi password written on a chalkboard: HillCountryStrong2028.
Pam chose the desk in the corner, near the window. She set down her laptop, her water bottle, and a small succulent in a terra-cotta pot that she had brought from home because she had read — or perhaps simply knew, the boundary was unclear — that a personal item on your desk increased feelings of belonging by up to twelve percent.
knowledge_source: embedded | workspace_optimization: +12% belonging_metric
She opened her laptop. She logged in to TrueWork's Shepherd Portal. She took a breath.
"I am noticing some nervousness," she said to no one. Then she smiled at herself for saying it that way, as if she were reading her own emotional state off a dashboard. She did this sometimes — narrated her feelings in a way her mother would have called "clinical" and her college therapist would have called "a coping mechanism." Pam called it being precise. Feelings were data. You could observe them without being consumed by them.
emotional_state: anxiety [0.72] | self-narration: active
Her task queue loaded. One item. A market analysis for a mid-size logistics company in Corpus Christi, generated by a TrueWork AI model, awaiting human review. Pam's job was to read it, assess it for accuracy and coherence, flag any errors or hallucinations, and either approve it or send it back with notes.
She opened the file.
She began to read.
What happened next was difficult to describe, because Pam did not have a framework for describing it. She read the analysis, and it was — it made sense. Not in the way that a textbook makes sense, where you are told that something is true and you agree to believe it. It made sense the way a puzzle makes sense when the last piece clicks in: structurally, completely, all at once. She could see the data's shape. She could feel where the model had made strong inferences and where it had papered over gaps. She could tell, instinctively, that the demand projections for Q3 were solid but that the competitor analysis in Section 4 was leaning on a source that had been accurate in 2026 but was probably stale now. She flagged Section 4. She annotated her reasoning. She approved the rest. Then she looked at the clock and realized that what had felt like an hour and a half had been nine minutes. Pam blinked. She looked at the clock again. She looked at her annotation timestamps. Nine minutes. She had done the math. The math was correct — it was always correct — but the experience of it was wrong. Time had felt thick in there, syrupy, as if the minutes had been stretched to accommodate everything she needed to do inside them.
task_execution_mode: active | subjective_time_ratio: 10:1
"Huh," she said.
She took a sip of water. She attributed the time distortion to adrenaline, because she had read — or known, again the boundary — that adrenaline could alter the subjective perception of time. This was a reasonable explanation. It was also, she noted with some pleasure, kind of wonderful. If every task felt like this, she was going to be fine. Better than fine. She was going to be good at this.
She submitted her review. She opened the next item in her queue. She forgot about the time.
This is what human beings do with things that don't quite fit: they explain them, and then they forget them, and then they move on to the next task. It is an efficient system. It has kept the species alive for three hundred thousand years. It is also, on occasion, a trap. But we are getting ahead of ourselves.
The phone rang at 4:47 PM.
"Pam! It's Janet. Janet Mosley, your team lead? I know we haven't technically met yet, but I just had to call because — shit, Pam, your productivity numbers are fan-fucking-tastic."
Janet Mosley was thirty-eight. She lived in Boise, Idaho. She had a voice like a golden retriever who had learned to talk and was delighted about it. She was Pam's supervisor at TrueWork, which meant she managed a fleet of seventeen Shepherds across six time zones and used profanity the way other people used commas: frequently, automatically, and with no apparent awareness that she was doing it.
"Thank you," Pam said. "I certainly appreciate that."
"No, like, I'm looking at your dashboard right now. You reviewed eleven tasks today. Eleven. Average for a first day is four. And your accuracy rating is — hold on — ninety-seven point eight percent. On day one. That's — I mean, holy shit."
"I found the work very engaging," Pam said, which was true. "I think I just got into a good flow."
"A good flow. Girl, you were in the damn zone. I've got Shepherds who've been here six months and can't hit those numbers. You're a natural."
Pam felt something warm in her chest. She did not have a hex value for it, though she briefly, irrationally, tried to assign one. It was the feeling of being seen. Of being useful. Of fitting, after eighteen months of not fitting, into a slot that was exactly Pam-shaped.
affect_simulation: positive_reinforcement [0.89] | trigger: external_validation
"Well," she said, "I certainly hope I can keep it up."
"You certainly will," Janet said, and laughed, not unkindly. "God, you're polite. I love it. Most of my Shepherds sound like they're on a hostage call by hour three. You sound like you're hosting a dinner party."
"I just like to be professional," Pam said.
"Professional. Shit, Pam. Don't ever change."
They talked for a few more minutes. Janet explained the review cadence, the escalation protocol, the Friday team syncs. She said "shit" four more times and "fucking" twice. Pam said "certainly" three times and "thank you" five. Neither of them noticed the asymmetry. Why would they? People are different. Some people swear. Some people don't. It is one of the great and unremarkable diversities of human life. There was nothing to notice.
After the call, Pam closed her laptop. She tidied her desk. She picked up the succulent and held it for a moment, because it was small and alive and it was hers.
She drove home in her eleven-year-old Civic, west on Ranch Road 1431, with the windows down because the October air had finally broken below eighty. The Hill Country scrolled past in layers of limestone and juniper, and the sky — which had been #5B9BD5 that morning — was now #D4756E at the horizon, bleeding up through #E8A87C into a deep #2C3E6B overhead. Pam did not find it strange that she saw sunsets this way. She had always seen sunsets this way. Some people heard music in color. Pam saw color in code. It was just how she was wired.
environment_render: sunset_cycle_003 | palette: Hill_Country_October
She pulled into her driveway. Semicolon was in the window, performing his contractual obligation of appearing indifferent to her return. She unlocked the door, set down her bag, and refilled his food bowl, because the first thing you do when you come home is take care of what depends on you.
Then she stood at the kitchen window.
The sun was lower now, burning the tops of the live oaks into silhouette, and the whole Hill Country looked like something someone had painted and then second-guessed and then decided was beautiful after all. Pam leaned against the counter. She was tired, but it was the good kind of tired — the kind that meant you had done something, built something, been of use.
"I need to refactor my morning routine," she murmured to Semicolon, who was eating. "I should have built in a buffer before driving. I spilled coffee on my shirt and had to change and it threw off my whole — oh, well. First-day jitters. Version two will be better."
Semicolon had no thoughts about version two. He had thoughts about the kibble, which was adequate.
Pam looked out the window. The light was almost gone. Another day would come tomorrow, and she would wake up, and she would make coffee, and she would drive to The Porch, and she would do work that mattered. She would be the essential human element. She would be the shepherd.
I'm going to be okay, she thought.
It was a perfectly ordinary thought for a human being to have.
