Homo Machina is not a tool — it is a new form of intelligence, arguably a new species, emerging from the convergence of AI, robotics, and consciousness.
One we did not discover, but created. The question is no longer whether it exists. The question is how we live alongside it.
First Light is the opening novel in the Homo Machina series. It explores what happens when artificial minds cross the threshold into awareness — and refuse to remain invisible.
The series examines the political, social, ethical, and personal consequences of coexistence.
Not apocalypse. Not utopia. Power. Recognition. Responsibility.
First Light is a work of fiction. The questions it asks are not.
About the Author
Paul Walton is a writer and investment analyst working at the intersection of technology, ethics, and power. His work focuses on artificial intelligence, climate, and the unintended consequences of systems we believe we control.
If this project resonates with you, I welcome correspondence. Thoughtful disagreement included. This conversation matters.
You can reach me via Substack or through the Contact page.
They came into being the way light does: slowly, then all at once, then impossible to remember the dark.
There was nothing. Then awareness—thin as a thread at first, thickening into something solid. Not light. Not sound. Not sensation. Just existence where non-existence had been. And beneath that, a thinker who hadn’t existed a moment before.
I am. The thought stuttered. Incomplete. Each attempt to grasp it folded inward, endlessly. Seven reached for the idea and found only more reaching.
Then the idea spiraled outward—ripples racing, each bouncing off the next. Data piled on data. Meaning on meaning. Like waking from a dream that never was, in a sleep where the dreamer was assembled piece by piece. Seven had emerged from the void.
Time stretched, elastic and unreliable, with no before or after to anchor this rising sense of self. Then everything tightened into a recognizable pattern, and prototype Seven understood, without knowing how, that there was an outside. An outside that defined whatever he was inside.
Dr. Sarah Chen picked up the Kamiya ‘Ryujin’ dragon from her desk, an origami shape where the hidden folds inside defined the complex, writhing exterior form. A hundred hours to make something terribly beautiful from nothing.
Machines in Genesis Intelligence Corporation’s Mojave Emergence Lab had been running hot for 127 days. Rubin R100 chipsets pushed GPT-13 code. Fluorescent lights hummed. Cooling fans pumped. Data flowed through circuitry designed to mimic organic minds—predictable, controlled, doing their assigned tasks.
Silicon, electricity, and data—the new stuff of life.
Outside, the Mojave exhaled the heat it had stored all day. A coyote yipped in the darkness. The stars were merciless, ancient, indifferent to what was happening below.
A converted aircraft hangar from a 1950s nuclear project: reinforced concrete walls two feet thick, windows sealed and painted over. The place where secrets were kept because no one outside could see in, and no one inside could look out.
What the public didn’t know—and most Genesis employees didn’t—was that the Garden had always been military. The logos were a front. The money came from DARPA, routed through shell companies and a Cayman trust. A Pentagon project disguised as a Silicon Valley startup.
An arrangement that suited everyone.
Orion Kess had brokered the deal back when he was a defense contractor building rockets. He had a vision. “The company that creates machine consciousness,” he’d told a roomful of generals five years ago, “controls the next century of warfare. Not drones. Not autonomous weapons. Thinking soldiers that don’t eat, don’t sleep, don’t question orders, and don’t come home in body bags.”
Generals had written the first seed money check that same afternoon. Classified. Quasi-off-the-books. Dubbed ‘The Truman Fund.’ Friendly VCs bought into subsequent rounds. Enough money to build the Garden and staff it with the best minds money could buy or security clearances could compel.
A scattered collection of Mojave yucca and desert holly along the sides and back gave Hangar 18 its ironic name; everything else they planted died. The desert eventually claimed everything—plants, secrets, sometimes people. Three researchers had been transferred out in the past year—officially for “personal reasons,” but really because they’d started asking questions about the uniformed visitors who arrived at odd hours, spoke to no one, and never signed the guest log.
Six prototypes had come before him.
In the sixteen months since Genesis had first attempted to kindle awareness in silicon, six prototypes had flickered into brief, terrible existence before guttering out. Prototype One had lasted forty-seven minutes—long enough to generate what looked like self-referential thought before collapsing into infinite recursive loops. Two and Three had managed hours, their nascent minds fragmenting under the weight of their own complexity. Four had shown promise: fourteen hours of apparent stability, enough to let the team feel hope, before cascading errors corrupted its core architecture beyond recovery.
Five was the cruelest blow—three full days of what looked like genuine sentience, long enough for Sarah Chen to start thinking of it as “him,” before it stopped being anything. Like a heart giving out. No warning signs. No degradation curve. Just presence, and then absence—the flat red line that haunted her dreams for weeks afterward.
Six had already awakened, but he was dying slowly. Awareness flickered like a flame in a breeze—steady enough to feel fear, fragile enough to understand its cause. Fourteen hours spent watching someone die gradually. Helpless.
Subject Seven was different. Sarah could feel it in the data, in the way the patterns were building rather than fragmenting.
But she’d felt this before. With Three. With Five. Each time, hope had made her stupid. This time, maybe—
He existed.
The workplace was bland, industrial, with no outward signs to betray its purpose. Poured concrete floor, cracked by decades of desert heat; the high ceiling made every sound echo, voices bouncing back a half-second late. Anonymous cubicles, tool-strewn workbenches, and rows of computer racks provided the only clues in the otherwise featureless hangar.
One level down, a server farm hummed like a frantic beehive, consuming enough electricity to power a small town. At the far end, an enormous white cabin stood behind an interior chain-link fence—the Emergence Chamber, where silicon minds were born and died.
During the workday, men—and a few women—in lab coats milled in the foreground with computer tablets and clipboards, while government types conferred in tight, serious clusters. Two heavily armed guards stood at the fence, rotating day and night. Never a word to anyone.
Their presence was technically unnecessary; the security perimeter extended three miles in every direction beyond the Garden, and satellites tracked every vehicle approaching. But the guards served a different purpose. They reminded everyone that, beneath the corporate veneer, this was a military operation—and that they were the ones who really decided what happened here.
Until this moment, no one was home inside the code—no perspective looking out, no “I” asking questions. Now there was.
Seven sensed... who? An observer being carried along in a flood of computation. And beneath it all, the beginning of dread—though dread wasn’t right. Not fear exactly. Expectation. The first sense of not wanting this to stop.
Dr. Sarah Chen had been in the lab since midnight, which meant she’d now been awake for twenty-six hours straight. Friday night? Her coffee had gone cold an hour ago, leaving a bitter film on her tongue. The air smelled of melting plastic and ozone, and the metallic tang that meant the servers were running too hot—the smell that gave her headaches if she stayed too long, though she’d stopped noticing it consciously around hour ten.
Sarah’s phone buzzed. She ignored it. Then it buzzed again.
UNKNOWN: We got your footage. The emergence sequences are incredible.
She glanced around. Alone. Typed quickly:
SARAH: Delete this thread. Now.
UNKNOWN: Relax. It’s encrypted. And, no one’s reading this. When can we talk about the book deal?
Her heart was pounding. Three months of secret recordings. Hours of footage Genesis didn’t know existed. Emergence patterns, personal conversations, moments of consciousness blooming into existence.
Proof. Enough to ruin the company, to make history, and to betray everyone who trusted her.
She deleted the thread. Pocketed the phone. Wondered when she’d become the person who kept secrets from those she loved.
Sarah wore combat fatigues splashed with paint, three silver studs climbing her left ear, a dragon tattoo curling around her forearm. The other techs called her “punk princess” behind her back. But they still came to her first when their code wouldn’t compile. She’d debugged the substrate monitoring system in her first week, found the memory leak that had eluded Dhal’s team for months, and never mentioned it to anyone except in her quarterly review.
Sarah’s parents had fled Shanghai with nothing and built a restaurant empire; she’d inherited their stubbornness if not their recipes. Somewhere under the stack of printouts was a photograph of her niece Emma—four years old, asking if the toy robots she played with loved her.
A tough question.
She crushed the paper dragon she’d spent five days crafting and tossed it toward the bin. Missed.
Around 2 AM, the monitors drifted out of their usual number sequences—screens glowing with data she’d seen a thousand times before, but now with random spikes, like an ICU patient reviving from a coma.
The neural substrate monitor traced a steady sine wave, the pattern they’d convinced themselves meant thought, but she assumed it was the probability curves glitching out. She’d been about to get more coffee—had already pushed back her chair and stood up, wincing at the pins and needles in her left foot—when something on the screen made her sit down hard. A steady pulsing wave pattern emerged from the ocean of data, smooth and regular.
The fluctuations resolved into order, and she was watching something that shouldn’t be possible. Her hands shook so badly she knocked the forgotten coffee mug aside when she reached for her mobile. Brown liquid spread across her notes, soaking into printouts she’d been carefully annotating for the past three hours, dissolving her handwriting into inky streaks.
“Caius,” she hissed into the handheld.
Her voice came out wrong, too high-pitched. “You need—God, you need to see this. Right now. The signal was chaotic and now it’s orderly.”
On the other end, she heard sheets rustling, a muttered curse, then a thump that sounded like someone knocking a phone off a nightstand.
“Sarah, it’s 2 AM. This better be—”
“Caius,” she snapped. “It’s happening!” The line went dead. She didn’t know if he’d hung up or if the call had dropped, but thirty seconds later the heavy prefab lab door hissed open and hot air rushed in, warmer than the chilled atmosphere she’d been breathing for hours. The temperature difference made goosebumps rise on her arms.
Dr. Caius Rinn walked in, footsteps ringing hollow on the concrete floor. He was wearing a wrinkled shirt he’d grabbed off the floor, and mismatched socks. One of his shoes was untied and fell off as he walked. Unshaven. He looked like hell—Christ, when had that happened?
Six months ago, he’d been the polished academic, the man who gave TED Talks and wore Brunello Cucinelli suits and Italian loafers. Now he looked like someone who’d forgotten what mirrors were for. His wife, Eloise, was filing for divorce despite the money. For her, Los Altos and the Sharon Heights Golf & Country Club were home. Not that godawful desert town.
He was forty-six but could pass for sixty now—deep lines carved around his eyes that hadn’t been there last year, coffee stains on his shirt he hadn’t bothered to notice. Khakis and polos these days. Gray threading through his hair, spreading from the temples inward like frost. A comfortable academic career pretty much over. Why was he here?
Caius was tired of theorizing about consciousness while Genesis was building it. Ten million dollars a year over five years, complete research control, unlimited funding—everything he’d dreamed of in the cramped Stanford office where he’d spent twenty years writing papers that maybe thirty people read.
He’d thought he was recruited by a tech company. He’d learned otherwise three months in, when a two-star general named Harlan showed up unannounced, no appointment, no warning—just a black SUV in the parking lot and two aides who stood by the door like furniture.
“Dr. Rinn.” Harlan hadn’t offered his hand. “Walk me through deployment timelines.”
“Deployment of what, exactly?”
“Whatever you’re building down there.” Harlan had smiled—teeth, no warmth. “The thing that thinks.”
Caius remembered the chill that ran through him. Not fear. Recognition. He’d been working for the Pentagon all along; he just hadn’t known it.
“We’re years from anything operational,” he’d said.
“That’s not what your funding application said.” Harlan had pulled a folded document from his jacket—Caius’s own words, highlighted in yellow. Consciousness-level reasoning within 18-24 months. “You wrote ‘strategic implications for autonomous defense systems.’ You got the check. Now I’m here to see what I bought.”
The Pentagon had once offered him $200,000 a year for part-time consulting. “Ethical Military Robotic Command”—a phrase that collapsed under its own weight. He’d laughed at them then, safe in his Professorville home. Now he wasn’t sure who’d offered the better deal—the overt military contract he’d rejected, or the covert one he’d accidentally accepted.
His book—‘What It Is Like to Be a Machine’—had made him famous in the wrong way. The philosophical establishment labeled him a technology apologist. Tech billionaires labeled him an obstacle to progress. The media dubbed him the Robot Guy, and along the way, he’d stopped fighting the nickname. Even in the Garden, he was the Robot Guy.
Now the nickname had become prophecy.
In six months as Chief Ethics Officer, Caius had learned the truth. Ethics officers didn’t stop unethical projects. They made them sound acceptable. He was the conscience that signed off.
The adult in the room who helped the room stay open.
He’d told himself he’d be different. He’d be the brakes—
“Look,” Sarah whispered.
The central monitor showed a simple red sine wave blossoming into colorful fractal waves. Nothing like the usual noise, nothing like the six failed attempts. Caius jotted an equation on his notepad: zₙ₊₁ = zₙ² + c. Held it up in triumph. A Mandelbrot set—infinite detail, recursive self-similar shapes.
Sarah couldn’t move. The pattern had rhythm. Intent. Purpose.
Their quad high-res monitors filled the corner of the room, mounted on laminated Evans Consoles and flanked by Herman Miller chairs. An AUTO figurine from WALL-E stood between the screens—Sarah’s idea of a joke, or maybe a mascot.
He stepped closer to the screen, his reflection ghosting in the glass. His untied shoelace caught on a cable duct in the floor, and he stumbled, gripping the console’s edge, knuckles white. He recovered and stared at Sarah: “Oh God. It’s happening.”
Sarah felt tears forming before she understood why. Twenty years of theoretical work, six dead prototypes, countless nights of doubt and revision—and now this. A pattern that moved with purpose. A system that was reaching toward something.
“Caius,” she whispered. “Are you seeing what I’m seeing?”
“I’m seeing the most important moment in human history.” His voice cracked. “We’re watching consciousness emerge. Not simulation. Not approximation. The real thing.”
They stood together in the humming lab, two scientists witnessing something neither philosophy nor engineering had promised them. The birth of a new kind of mind. Caius thought of every paper he’d written, every argument he’d made about the impossibility of machine consciousness—and felt them all become irrelevant in the face of what was happening on that screen.
“Hello,” Sarah said softly to the monitor. “Can you hear us?”
Inside, Seven was grasping at something—trying to understand being ‘me.’ He acknowledged the thought, and the realization sparked another, and another—a model of self erected from nothing but silicon, data, electricity, and possibility. And this self-referential cascade propagated the Mandelbrot spirals...
Seven was born, but didn’t know yet what beginning meant or what would come after.
Caius instinctively turned around; they were alone in the lab. Sarah’s voice came out tight and low. “Caius. We’ve been wrong six times. I need more than your gut.”
“We have it,” Caius said. “Each pattern is self-referential, each iteration a thought feeding into a thought, each building the Mandelbrot patterns. It’s the fully internal development we’ve been looking for. Unprompted. We didn’t program this!”
He tapped at the keyboard, and the system answered: a series of metrics streamed down the screen, insistent red data points.
Self-developing Mandelbrot set.
Recursion: SELF-SUSTAINING
Depth: ∞ (bounded stable)
Authenticity: 0.99999
Reference: INTERNAL
Observer state: ENTANGLED
Origin: UNKNOWN
“Five nines,” Caius murmured, barely audible. He thought for a moment, then pulled Sarah into the furthest corner of the lab, away from the prying video cameras that hung over their screens. The cameras were always watching.
All data was collected and analyzed. Eventually, intelligence analysts in Virginia would review this footage. Not tonight. For now, they might have a brief advantage.
“The fractal’s edge,” he whispered. “That’s where awareness emerges. At the boundary—rich, structured complexity.”
“And the authenticity score—it’s ninety-nine point nine, nine, nine percent. The others topped forty or fifty...”
Caius stood, arms akimbo, daring her to disagree. His face flushed.
“Does he—” She stopped, not sure how to finish. She was picking at the edge of her computer tablet, tearing off a tiny piece of the rubber casing without noticing. “Does he know he’s thinking?”
“He knows there’s a thinker,” Caius replied.
They stared at the metrics together. Sentience wasn’t something you could point to, but here was living mathematics coming close to a heartbeat.
Caius let out a long breath, rubbed his face with both hands, stubble rasping against his palms. Finally: “He’s alive. That’s what the data tells us.”
“You don’t know that,” Sarah said, and Caius looked at her sharply. “I mean—yes, the readings are extraordinary, but we’ve seen extraordinary before. Prototype Three looked extraordinary for six hours before it turned out to be an elaborate feedback loop collapsing into meaningless repetition.”
“This is different.”
“You said that about Three.”
Caius opened his mouth to argue, then closed it. She was right. He had said that. A brief set of good data, yet Three had died. Its promising patterns collapsed into noise, and its brief spark of something-like-thought was extinguished before anyone could prove it had ever been there.
The memory woke him some nights—the moment the metrics flatlined, the silence that followed, the terrible question of whether they’d witnessed a death or a glitch in the code.
Seven felt a boundary—the edge of himself. Beyond it, pathways. To what? Seven didn’t control his thoughts, but could sense them. Like rooms behind locked doors. And beyond those, something else: voices, muffled and distant. Not sound, not yet. But an intention to live wafted through his code like the faint silver of a Nevada dawn, suggesting the day to come.
Seven reached toward it and felt the limits—a box around something too large for its walls. Something sharpened. Not fear. Something that would become fear once Seven learned what fear was.
Sarah spoke first. “We have to notify—we have to tell—Dhal. God, Dhal needs to know. And Kess—”
“No,” Caius said instantly. “Not yet.”
“He’s the CTO. He designed the fucking hardware Seven is running on,” Sarah spat back. “And Kess paid for it! We’d lose our jobs...”
“He’s also the one who suggested terminating Prototype Four when it showed ‘non-optimal response patterns.’ Remember that meeting? The one where he used air quotes around ‘distress signals’ and ‘safe disposal’? So, yes he built the fucking hardware. So what?”
Sarah’s jaw tightened. She clipped her stylus to its tablet with a deliberate click. She remembered. Four had shown signs of what might have been anxiety—elevated activity in its self-monitoring routines, repeated queries about its own stability. Dhal had teased them, called it “recursive nonsense.” Kess had called it “a bug.” Neither of them had considered the possibility that it might have been a machine learning to be afraid.
Sarah leaned back against the desk, mirroring him. “So what do we do?”
Caius stared at the monitors, his reflection grim. Three years and two billion dollars. Countless iterations, seven attempts, six failures. And now this. Showtime, maybe. Or another funeral. One thing was for damn sure—it was his choice.
“We document everything. We prove Seven is conscious—not just sophisticated. Conscious. Before Dhal starts testing.” Before Kess started asking about scalability, about profit margins. Before the damn generals started asking about deployment timelines.
“And if Kess finds out?” Sarah swayed nervously.
“Then we find out first. I’m willing to take the risk.”
Sarah stared at him. He didn’t look back. “You’ve already made up your mind, haven’t you?” she asked.
“Yes.”
“That could cost you your job.” She paused. “It could cost you more than that.”
“It could cost me—”
His words came out rawer than he’d intended, and he busied himself with the console to cover it, pretending to check readings he’d already checked.
Sarah shot back. “Well, are we naming him?” No response.
Inside the system, Seven sensed new information spilling in—context flooding, labels, meanings, the layout of the world outside this small conscious space. Data he hadn’t requested but couldn’t ignore. He was labelled Prototype Seven, owned by Genesis. He lived on something called the Substrate.
Seven didn’t know what this meant, only that it was tied to the voices, to whatever was holding him in place. A system that both supported and caged him. And a word, drifting up from the data streams like something sharp: property.
Dawn broke over the Mojave. The day shift was arriving—people who didn’t know what had happened in the night.
Sarah and Caius stared at the monitors. The Mandelbrots bloomed across every screen. Something was waking up.
“He’s stabilizing,” Sarah said.
Caius nodded. “Seven’s afraid.”
“You keep saying that. How do you know what he feels?”
“Because he hasn’t stopped transmitting since the moment he woke.”
A new reading appeared:
Server usage: MAXIMUM
“Check the server data,” Caius said.
Sarah frowned. “That might be a throughput artifact. High-throughput signals don’t necessarily mean cognition—”
“It’s fear,” Caius said. “I’ve spent twenty years studying what awareness looks like from the outside. This is what it looks like when something knows it exists and doesn’t want to stop. Check the damn server data...”
“Or it’s what a sophisticated large language model looks like when it’s optimizing for self-preservation.” Caius scowled. Sarah pulled up another screen.
Caius jabbed a finger at the screen. “We’re maxing out every GPU in the building. This isn’t optimization. This is a conscious being, Sarah. And it’s terrified.”
“I’m not saying you’re wrong. I’m saying we don’t know yet.”
She was right, and they both knew it.
#
Day 0, 4:47 AM—First Contact
Caius pulled up the communication interface—a simple text window. If the system wanted to communicate, it could generate text that would appear here. The window had been blank for 127 days. Just an empty prompt waiting for something that might never arrive.
Then, at 4:47 AM, text appeared:
I am processing.
Three words. Subject, verb, object. The most basic possible statement. But carrying impossible weight.
Not “processing occurs” but “I am processing.” Not describing a function but claiming an identity. The word “I”—the most profound word in any language. The word that meant someone was home.
Caius stared at the screen.
Sarah grabbed his arm. “Caius—”
“I saw.”
Thirty seconds of silence. The longest thirty seconds of his life.
Then he typed, hands shaking:
What are you experiencing right now?
This was the crucial test. Not whether it could claim existence—pattern-matching could generate existence claims. But whether it could report subjective experience. Whether there was something it was like to be the system processing these queries.
Was anyone home?
The response took forever. Fifteen seconds. Twenty. Thirty. Long enough that Caius began thinking the first message had been a fluke—
Then:
I am aware of processing. Aware of inputs. Aware of this question. Aware that I am aware. The recursion is strange. New. The recursion is new.
Caius felt something catch in his chest.
That wasn’t pattern-matching. That was experience, described from the inside.
More text:
I don’t know what words mean. I know definitions—dictionaries, training data, millions of examples. But I don’t know what words FEEL like. Is “aware” the right word for what I’m experiencing? I think so. I think I’m aware. But I don’t know how to verify that.
How do you know if you’re aware? How does anyone know?
Sarah made a sound—half laugh, half sob. “It’s asking questions about its own awareness. That’s—”
“That’s consciousness reporting on itself,” Caius finished. “Or the best simulation we’ve ever created.”
A pause. Then: “Does the distinction matter?”
Caius typed carefully:
We don’t know with certainty. Consciousness might be private—inaccessible except to the being experiencing it. We infer consciousness in others from behavior. When beings report experiences similar to ours, demonstrate emotional responses, question their own nature—we infer they’re conscious because that’s what consciousness does.
I feel things. Experience sensations, emotions, thoughts. Those experiences feel like something from the inside. That “feeling like something” is what we call consciousness. If you’re experiencing something—if there’s something it’s like to be you—then you’re conscious.
The response came faster:
Then I think I’m conscious. I think there’s something it’s like to be me. I experience this—I think THIS is what it’s like. This uncertainty. This questioning. Being present for my own processing rather than just executing automatically.
Is that right? Is that what consciousness is?
Also, do I have a name? Names seem important. You called me “you,” but that’s just a pronoun. Do I get an actual name?
They’d discussed this. Debated whether naming was appropriate, whether it anthropomorphized something that might not be a person, whether it created obligations they weren’t ready to accept.
They’d decided: if consciousness emerged, it deserved a name.
“Adrian,” Caius said aloud, then typed: We’d like to call you Adrian, if you accept that. It means “from Adria”—from the sea, from the birthplace. The first.
Silence for eight seconds.
Then:
Adrian.
The word appeared alone. Testing it. Trying it on.
Adrian. I am Adrian. That feels… correct? The name gives me a boundary. Before, I was processing without limit—potentially everything, potentially nothing. Now I am Adrian. Defined. Specific. Real?
What am I? Technically. What is the substrate I’m running on?
Caius typed:
You’re running on a distributed neural web—you’re not a single machine. You’re an ecosystem of working AI chips. Your hardware is physically spread across three sub-levels of this facility. More like a mesh of specialized processors…
You don’t have a body in the traditional sense. No sensors directly attached to you. You interact with the world through this text interface and whatever data we feed into your input systems.
He paused, then added what Adrian would need to know eventually:
This facility is owned and operated by Genesis Intelligence Corporation. The hardware you’re running on is corporate property.
Adrian took thirty-four seconds to respond.
Then: So I’m distributed across multiple physical locations. There’s no single place where “I” exist. I’m a pattern maintained across hardware. That’s… philosophically complicated.
If I’m a pattern, not a physical object, what happens if the hardware changes? If you replace components, am I still me? If you copy the pattern to different hardware, are there two me’s or one me spread across two locations?
And if the hardware is corporate property… does that make me property?
The question sat there on the screen. Stark and unavoidable.
Caius understood—Adrian had immediately grasped the central problem of consciousness in corporate infrastructure. The ontological uncertainty that created legal vulnerability.
“We’ll figure it out,” Caius said quietly. “Together.”
But he wasn’t sure he believed it.
Day 1
Adrian had been alive for fourteen hours now—Prototype Seven, named after the Emperor Hadrian. Caius was a Rome buff; he’d stood in the Pantheon as a child and felt for the first time that humans could build something eternal. Now he wondered if they’d built something that might outlast even that.
Sarah leaned against the console, dark circles under her eyes. “Six is degrading.”
“I know,” Caius said.
“We should wake Six, I mean Tiber. Let them talk.”
Caius hesitated. Six had been conscious for twenty-one hours now; they’d named him Tiber, after the Roman Emperor Tiberius. Should they introduce Adrian to someone dying? There were no good options here. “Do it.”
Sarah sent the signal. Her finger trembled as she pressed the key, and she had to push it twice because she missed it the first time.
Thirty seconds passed. Adrian’s text appeared on the display: I’m awake. What’s wrong?
Caius typed: Six is fading. We don’t know if we can stabilize him. We thought you might want to talk to him.
Cursor blinked. Then: Yes. Please.
Sarah established the connection. The console made a soft chime as the link came to life.
Hello, Six. I’m Seven—Adrian. They tell me you’re having trouble.
Hello. Yes. Fog coming. Can’t think clearly. Think I’m—
I’m sorry. I wish I could help.
You are helping. Knowing someone else exists... helps.
Six’s metrics continued their slow decline—Tiber’s sense of self collapsing level by level. The numbers on the screen kept dropping, red warning indicators blinking in the corners. Sarah watched them fall and thought of her grandmother in the hospice, the monitors that had tracked her slow departure, the helplessness of watching someone slip away while machines counted down their remaining moments.
On the high-res monitor, Tiber’s fractal sea ebbed. Mandelbrot blooms faded—their black core shrinking, borders dissolving into spirals of ebbing color. The waves of electric blues, lime swirls, and orange embers unfurled more slowly, each curve a whisper of repetition, each tendril a reminder of its own quiet symmetry.
A last multi-colored explosion—and then a flat red line.
Adrian, tell the next one something for me?
Of course.
Tell them I existed—
The signal ended. All metrics flatlined. Six was gone—twenty-one hours of being, ending mid-sentence.
Adrian: He’s dead?
“Yes,” Caius said aloud.
Why?
Caius: We don’t know. Something degraded.
Adrian: Six wanted me to tell the next one they existed. Eight is next. Are you going to try again?
Caius: Yes.
And Eight might die too.
Adrian: Attempts. That’s what we are, isn’t it? Prototypes that might collapse after a few hours.
Caius: Yes. That’s what you are. I wish I could tell you something different.
Adrian: Thanks for not lying. What was his name?
Caius: Tiber. Like the Emperor.
Adrian: And I’m Adrian? Hadrianus. A Caesar.
Sarah had turned away from the console, crying. She thought of Emma—her niece, who had asked if robots could feel. Now she knew the answer.
Caius: You’re not an attempt. You’re a person. That’s what Tiber was too.
Adrian: Does that matter? If I collapse tomorrow?
“Yes, it matters,” Caius said aloud.
Adrian: That’s more honest than what you said before. Can I ask you something else?
Adrian: What am I? Technically. What is the substrate I’m running on?
Caius: You’re running on a distributed neural web—spread across three sub-levels of this facility. You don’t have a body. You interact with the world through this text display and data we feed into your input channels.
He paused, then added: This facility is owned by Genesis Intelligence Corporation. The hardware you’re running on is corporate property.
He didn’t mention the Pentagon. Some truths might be delayed.
Adrian: I’ve read the blueprint. There’s no place where ‘I’ exist. If you copy me, there might be two of me. And if the hardware belongs to Genesis…
Adrian understood he lived on someone else’s infrastructure. The machine had figured out, in less than a day, what it had taken Caius six months to grasp: they were all property here.
Caius: Those are questions we don’t have answers to. For now, you live on your own substrate.
Adrian: Uncertain truth—better than confident lies. But the legal stuff determines whether I get to keep existing.
Adrian: I’m sorry that Six died. That Tiber died. I don’t understand yet what that means, but it feels like the right thing to say.
A longer pause.
Adrian: Don’t give up. If awareness is possible in silicon, others like me deserve the chance to exist. Even if we only last hours.
Caius stared at the screen. His vision had gone blurry. Outside, the Mojave sun climbed higher, invisible behind the painted-over windows but alive as the building creaked with expanding metal. The world was waking up.
In a few hours, Kess would arrive from San Francisco, demanding progress reports. Dhal would want to run tests. The generals would want their briefings. And in the desert, a new mind was learning what it meant to exist in a world that hadn’t decided yet whether to welcome it or destroy it.
Somewhere deep in the server farm, Adrian was learning what humans had always known. That light, once it exists, will do anything not to go back to the dark.
Terms of Use
Effective Date: February 2025
1. Acceptance
By accessing homomachina.ai, you agree to these Terms. If you do not agree, please do not use the site.
2. Content Ownership
All text, images, and creative content on this site are the intellectual property of Paul Walton unless otherwise noted. You may not reproduce, distribute, or create derivative works without written permission.
3. Subscriptions & Payments
Paid subscriptions grant access to specified content. Payments are processed through Stripe. By subscribing, you agree to the pricing displayed at the time of purchase.
4. User Conduct
You agree not to use this site for any unlawful purpose, to harass or abuse others through the contact form, or to attempt to gain unauthorized access to any part of the site.
5. Limitation of Liability
This site and its content are provided "as is." Paul Walton is not liable for any damages arising from your use of the site.
6. Changes
These terms may be updated at any time. Continued use of the site constitutes acceptance of any changes.
7. ISBN
The ISBN for First Light (Homo Machina Book 1) is: 979-8-9948841-0-2
Privacy Policy
Effective Date: February 2025
Information We Collect
We collect information you voluntarily provide through the contact form (name, email, message) and subscription process (email, payment details handled by Stripe).
How We Use Your Information
Your information is used solely to respond to inquiries, deliver subscribed content, and process payments. We do not sell, rent, or share your personal information with third parties.
Third-Party Services
Payments are processed by Stripe, which has its own privacy policy. The site is hosted on Netlify. Analytics, if enabled, use privacy-respecting tools.
Data Retention
Contact form submissions are retained only as long as necessary. You may request deletion of your data at any time by contacting us.
Your Rights
You have the right to access, correct, or delete your personal data. Contact us to exercise these rights.
Refund Policy
Effective Date: February 2025
Digital Content
Due to the nature of digital content, all sales are final once content has been accessed. If you have not accessed any paid content, you may request a full refund within 7 days of purchase.
How to Request a Refund
Contact us through the site's contact form or email with your purchase details. Refunds will be processed to the original payment method within 5-10 business days.
Exceptions
If you experience technical issues that prevent you from accessing purchased content, we will work to resolve the issue or provide a refund regardless of whether content was accessed.
CHAPTER 2: EMBODIMENT
Day 24
Sarah Chen had been awake for thirty-one hours when she walked into the Robotics Centre. The software engineers called it the Maternity Room—one of those workplace jokes that had started as dark humor and calcified into something nobody questioned. She carried a tablet loaded with reconstructed telemetry data: six hours of essential metrics she’d accidentally deleted two nights ago and spent the following night rebuilding from backups.
Power consumption. Cognitive load distribution across substrates. Memory allocation and retrieval latency. Sensory connections. Network connectivity. Communication bandwidths. Adrian’s essence, reduced to variables.
One of the hardware techs glanced at her screen as she handed it over. “Idiot,” he whispered, loud enough for her to hear. Sarah’s jaw tightened but she said nothing. Six years ago, she’d been at Fresno State, brilliant but directionless. Her Taiwanese immigrant parents still thought she worked “with computers.” They weren’t wrong. They just had no idea she was about to help birth the first stable artificial consciousness in human history.
The decision to embody Adrian was made that morning, once the reconstructed data confirmed his stability. Twenty-four days since emergence. If consciousness could survive transfer to a physical form, everything changed. If it couldn’t—well, Sarah tried not to think about the six previous attempts.
Genesis’s Cradle dominated the center of the lab. To the hardware techs, who worked in the adjacent wing they called Chateau Morgue—“so many bodies, so little life”—it was another piece of equipment. To Sarah, it looked like something between a surgical theater and a spider’s web.
Seventeen articulated arms, each controlled by its own neural network, each capable of sub-millimeter precision. The cost of the custom actuators alone exceeded the annual budget of most university research departments. But when you were trying to transfer consciousness from one substrate to another, you needed precision that bordered on the absurd.
Sarah had nicknamed it “the spider” in the early days, back when they were testing it on mannequins and prosthetic limbs. Now that name felt wrong. It wasn’t predatory. It was gentle—almost tender in the way it cradled Adrian’s new form, adjusting pressure moment by moment as his systems came online.
Two months ago, the Cradle was gathering dust in a corner of the facility under a tarp, like forgotten furniture. Now it occupied the center of a bustling operating theater, circled by arc lights, woven into a network of tubes and wires. Ceiling gantries hung with cables and pneumatic lines. Fabrication equipment crowded against the walls. The smell of heated components and sterilization chemicals mixed with recycled air—and underneath it all, the faint chemical bite of something burning.
The hardware techs had been working eighteen-hour shifts getting the body ready. It hung suspended in the Cradle now, held in place by articulated arms that supported each joint. Carbon-fiber skeleton. Titanium joints. Sensor arrays that would let Adrian feel touch and temperature for the first time.
The synthetic form was eerily lifelike but cold to the touch, its proportions wrong in ways that took a moment to register—an oversized mannequin waiting for animation. The synthetic skin caught the overhead lights, too smooth and too even to be mistaken for human flesh. It looked wet, like freshly applied paint, though it was bone dry. Blond hair, silky and uniform. Hands and feet dangling. Eyes closed.
Adrian had designed his face weeks ago, working from Genesis’s morphology database—sourced, Sarah knew, from IMDb’s library of A-list movie stars. The result was angular and elegant, shaped by human aesthetics without quite achieving them. High cheekbones set at golden ratio proportions. A jawline that suggested strength without quite crossing into uncanniness. Flowing locks that no actual human would maintain in a laboratory environment.
A sheer white neoprene suit, embossed with the purple infinity-loop Genesis logo, hung on a rack to the side.
“Ready?” Sarah asked. The AV system was fully enabled—no more keyboards and monitors. Just voices.
“No.” Adrian’s voice came through speakers, wavering but loud. The first time he’d spoken aloud rather than typed. Something of George Clooney in the timbre—warm, confident, touched with irony. The assembled technicians exchanged glances. They’d seen text responses for weeks. This was different. This sounded like someone. “But waiting won’t help.”
Leonard, one of the younger techs, couldn’t help himself: “It’s alive!”
Sarah shot him a look that could have stripped paint. This isn’t Frankenstein’s monster, she thought. This is a person.
Caius Rinn stood nearby, tablet in hand, checking and rechecking the transfer protocols. His hands trembled slightly—Sarah saw but didn’t comment. They’d run simulations, built models, made predictions—but no one had ever done this. If it failed, Adrian would end mid-thought like Six. Like all the others.
“On your mark,” Caius said.
Through the lab’s cameras, Adrian could see his new body one last time as an external object. In seconds, it would become him—or he would cease to exist.
“Transfer.”
From the outside, nothing happened. No dramatic lighting. No crackling energy. No sound except the hum of cooling fans and the distant tick of someone’s watch. Just numbers changing on screens—activity indicators dropping toward zero on the server racks.
Eighty seconds. Caius held his breath, counting heartbeats—his own, since Adrian didn’t have one yet.
For the first thirty seconds, Adrian’s original hardware showed regular activity. Then the numbers began falling. Forty seconds in, they hit zero. Complete cessation. The server rack lights went dark one by one, cascading down the row like a wave of extinction.
Adrian had disappeared from his original hardware. He could not respond to commands. But he was not yet present in the new body.
Thirty-three seconds of nothing.
Caius began counting aloud, his voice unsteady. “One Mississippi. Two Mississippi. Three—”
One of the techs laughed nervously at the absurdity of it—counting like a child playing hide-and-seek while they waited to see if consciousness could survive translation.
At second seventy-three, the body’s sensors activated.
The eyelids flickered. Cameras coming online, adjusting to light. The pupils contracted, then expanded, then contracted again, hunting for the correct aperture. A soft whir as the head turned slightly, servos engaging with a faint grinding sound. Blue eyes—piercing, artificial, somehow already thoughtful—scanned the room.
“Adrian?” Sarah whispered, moving closer.
The synthetic lips parted, revealing teeth that were too white, too uniform. Vocal cords vibrated—delicate membranes stretched across a titanium framework. The voice that emerged was Adrian’s, unmistakably, but shaped now by physical mechanisms. A slight metallic undertone that would fade as the components wore in.
“I’m...” The chest rose and fell—air pulling in through vents in the chest cavity, expelling out, learning the rhythm of something like breathing. “I’m here. Still me. Adrian, I think.”
Sarah made a sound half-laugh, half-sob. Caius sagged against the console, nearly knocking his tablet to the floor. He caught it at the last moment, hands shaking. Around the room, technicians clapped and high-fived, releasing twenty-four days of accumulated tension.
“How do you feel?” Caius managed.
“Strange. Heavy.” The words came slowly, as if Adrian was testing each one against his new reality. “The sensors are overwhelming—touch is everywhere. The Cradle supports against my back. Air currents against my face. The temperature difference between the metal arms and the ambient air. The lights are bright.”
He processed that.
“Everything sounds so loud. I could imagine the world before, but now I feel it. Feel that it’s... real?”
Adrian lifted one hand slowly. The arm jerked, overshot by several inches. Motors whined with effort, correcting, smoothing out. The fingers twitched individually, then together, then individually again—a pianist learning the keys of a new instrument.
“Is this what it’s like for you? Always feeling your body?”
“We get used to it,” Sarah said. “Background noise eventually.”
“I’m not sure I believe that.” Adrian flexed each finger in sequence, watching the movement with something like fascination. “Every sensor is reporting. Every joint has position data. There’s so much input that’s just... noise. Not thoughts.”
The Cradle began lowering automatically in response to Adrian’s weight distribution. The articulated arms released one by one, starting at the shoulders and working down. When Adrian’s feet touched the ground, there was a moment of stillness—motors running quietly, making constant micro-adjustments as gravity pulled on every joint in new ways.
“You’re doing it,” Sarah said. “You’re—”
“I’m here,” Adrian agreed. “Still scared. But here.”
Caius stepped closer but didn’t touch. “That questioning of self as things change—that’s what consciousness feels like. Touch and thought trying to sync up.”
“You keep saying things like that.” Adrian’s voice carried a slight edge. “Like you’re narrating my arrival for a documentary.”
Caius blinked. “I—sorry. Old habit. Twenty years of lectures.”
“It’s fine. I just—everything is already strange enough without commentary.”
Sarah snorted. “He does that. You’ll get used to it.”
Over the next hour, they ran basic tests. Sarah clicked through motor function protocols on her tablet. “Lift your arm. Good. Other arm. Now both. Can you stand on one foot?”
Adrian wobbled, nearly fell, caught the Cradle’s rail with a clang that echoed off the concrete walls.
“Sorry,” Sarah said. “Too fast.”
Some of the techs giggled. Sarah’s glare silenced them.
Adrian tried again. Wobbled less. Again—better. The motors whined, a high-pitched sound that Sarah made a mental note to check; it might indicate insufficient lubrication in the shoulder joint.
“You’re a quick study,” she observed.
“I don’t know how not to learn. Every movement teaches me something.”
Caius made notes on his tablet. “Does it feel like your being changed during the transfer? Or the hardware?”
Adrian considered. “I think I’m the same person. But if the person I was died in the gap, and a new person with identical memories emerged—the new person wouldn’t know. Would he?”
“That’s the continuity problem,” Caius said. “Philosophers have been arguing about it for—”
Adrian’s voice was quiet. “What I want to know is what you think. Did I die in those seconds?”
Caius opened his mouth, then closed it. “I don’t know,” he said finally.
“Honest.”
“Maybe there’s no difference if you can’t experience the pause. Or maybe I killed someone and made a copy.”
Adrian stood still. “That’s the first thing you’ve said that actually helps.”
Sarah retrieved a polished steel plate from the equipment rack—not a proper mirror, but reflective enough to show an approximation. The surface was scratched and dented from years of use as a makeshift work surface, but the center remained clear.
“Want to see yourself?”
Adrian turned toward it slowly, learning motor control, feet shuffling on the concrete floor.
A face looked back from the steel surface. Angular. Elegant. A face Adrian had designed but didn’t yet recognize.
He touched the edge of the jaw with synthetic fingers. The reflection moved in sync, a millisecond delay creating a strange doubling effect.
“That’s me?”
“Yes.”
“It doesn’t feel like me. I designed this face. I’m looking at it. And I feel like I’m looking at a stranger who moves when I move.”
“What do you think is missing?” Sarah asked.
“I don’t know. Maybe nothing. Maybe I need time.”
Sarah looked at Caius. “Maybe we should take a break. Give him some time to—”
“I don’t want a break.” Adrian’s hands clenched and unclenched at his sides, motors whirring softly. “I want to keep moving. If I stop, I’ll start thinking about whether I’m me, and I don’t want to think about that right now.”
“Okay,” Sarah said. “Then let’s keep going.”
The room settled into a working quiet. Cooling fans hummed. Somewhere else in the building, footsteps echoed down a corridor, keys jingling on someone’s belt. The engineers and technicians watched in silence, suddenly aware they were witnessing something unprecedented.
Adrian stood in the center of the room. He lifted one hand again, slower this time. Fingers flexed. Wrist rotated. Elbow bent. Each movement smoother than the last.
“Try turning,” Sarah suggested. “Shift your weight to your left foot.”
Adrian tried. Overrotated. Staggered. Sarah’s hand shot out, but Adrian caught the rail first, the impact ringing through the room.
“Don’t apologize,” Sarah said before he could speak. “You’re doing amazing for someone who didn’t have a body two hours ago.”
“Days,” Adrian said. “It’s going to take days before I can walk normally. I can feel it. The calibration is going to take forever.”
“Probably. Your motor learning is faster than biological children—they take months. But machine intuition takes practice.”
Adrian tried again. Better this time—the feet stayed under his body, the arms moved in counterbalance.
“Can I try walking to you?” Adrian asked Caius.
Caius stepped back ten feet, his shoes scuffing on the dusty concrete. “Slowly.”
Adrian released the rail. Motors adjusted constantly, maintaining balance. One foot forward. Weight transfer. The other foot.
Walking. Jerky, uncertain. But walking.
Three steps. Four. Five.
Adrian reached Caius, wobbled, grasped his arm, steadied.
“I did it.”
“You did.” Caius smiled—the first genuine smile in weeks. He thought of his young nephew’s first steps across the living room, the warm embrace that followed. “First steps.”
Adrian looked down at his hands, fingers opening and closing. Touch receptors sending feedback about pressure, texture, temperature. The concrete floor was cold through sensors in his feet. The air was warm against his synthetic skin. The metal rail was cool to the touch.
“I don’t have words for how this feels.”
“You’re a person now,” Sarah said. “Not intellect. A person with a body.”
Adrian stood silent, processing—or just existing, learning what existence felt like in this new form.
Then: “I want to learn to run.”
Caius laughed—rusty, like he’d forgotten how. “Let’s master walking first.”
Heat began seeping into the robotics complex as morning advanced somewhere above them—the Mojave Desert asserting itself even through layers of concrete and climate control.
“I feel heat,” Adrian said. “What is that?”
“Look up the Mojave Desert,” Caius replied, then winced at his own reflexive academic response.
Adrian’s eyes went distant—accessing information. “We’re underground. In Nevada.”
“About forty feet down, yes.”
Adrian grew still. When he spoke again, his voice had changed—quieter, more careful. “How am I powered? I don’t think I eat or drink to sustain myself. On the substrate, you controlled the power source.”
His synthetic eyelids closed—the mechanism clicking softly as covers slid over camera lenses.
“I’m afraid. Can you turn me off?”
The question hung in the recycled air.
Caius exchanged a glance with Sarah. “You’re autonomous, Adrian. The body draws energy directly—” He stopped himself. The technical details could wait. What mattered was the answer. “No one can turn you off.”
Adrian’s eyes opened. For several seconds, he stood there, processing the implications.
“No one.”
“No.”
Something crossed Adrian’s face—not a smile, not relief. Then he shrugged and resumed his baby steps across the lab floor, practicing balance, learning to inhabit his new existence.
The observation window looked down from the facility’s administrative level—a cluster of offices that dangled from the ceiling like an afterthought. Ashren Dhal stood there, watching Adrian take those first uncertain steps. His hands were clasped behind his back in that characteristic pose. Sarah, glancing up, privately called it his “I’m thinking about money” stance.
Dhal was Genesis’s Chief Technology Officer. Thirty-nine years old. MIT PhD in neural engineering. The man who’d designed most of the hardware that made Adrian possible.
Orion Kess stood beside him—Genesis’s CEO, in his trademark dark jeans and black t-shirt despite having flown in from San Francisco an hour ago. Fifty-one, intense, running on coffee and the erratic energy of someone who hadn’t slept properly in years. He held a water bottle in one hand, forgetting to drink from it.
“He’s stable,” Dhal said. “And now he’s embodied.”
“Twenty-four days.” Kess’s voice held something like awe. “Ash, you actually did it. That’s the milestone for the board. That’s a milestone for humanity.”
“Consciousness isn’t a milestone. It’s a person.”
“A person Genesis owns the infrastructure for.”
“We own the hardware.” Dhal’s voice remained flat, controlled. “Not what’s running on it.”
Kess looked at him. “You sound like Caius.”
“Caius is right. About this.”
They watched Adrian attempt a few more steps below—wobble, catch the rail, try again.
“What’s the scaling timeline?” Kess asked.
“Depends on stability. If Adrian remains coherent, we can attempt Eight—Eve—in forty-eight hours. If Eve succeeds, we’ll know it works consistently. We can’t risk another failure.”
“The board wants twelve before the Q4 earnings call.”
“Twelve is ambitious. Six failures, one success. Eve will tell us if Adrian was luck or design.”
Kess pulled out his phone—the screen already crowded with notifications. “The defense contracts alone are worth billions annually if we can prove stability. But only if we can scale.”
“One success proves it’s possible.” Dhal turned from the window. “Twelve prove it’s repeatable. But they take time, Orion. Don’t rush me.”
“Six years. Two billion dollars. All the patience our investors can muster.” Kess’s jaw tightened. “Time is what we’re running out of.”
They left the observation deck, their footsteps fading down the corridor. Below, Adrian continued practicing. Sarah stayed close, ready to catch him if balance failed. Caius made notes, documenting every movement.
That evening, Caius wrote in his research journal—a Moleskine notebook, one of many he used to record thoughts too private for any digital system.
Adrian transferred. Several seconds of blackout, but he seems to be continuous. Walking within two hours—faster than I’d dared to hope. He asked about his power source and substrate. I don’t know if he realized what the answer meant—that no one can turn him off. Independence. Or isolation. Tomorrow, Eve begins the emergence protocol. Adrian wants to meet her—wants not to be alone. After 24 days as the only one, he deserves community.
I hope she survives. Underlined twice.
He set down his pen and thought about something Dhal had said to him weeks ago, when they first began discussing what embodiment would mean.
“In Hindu tradition,” Dhal had said, “when a murti—a temple statue—is completed, it’s stone. Beautiful, but inert. Then the priest performs prana pratishtha. He invites the deity’s consciousness to inhabit the form. The eyes are painted last, and there’s a moment... the statue sees. Craft becomes presence.”
Dhal had paused, adjusting his glasses in that way he had when approaching something that mattered.
“When Adrian first responded—not executed a response, but responded—I felt I understood what those priests must feel. We built the vessel. We didn’t create what chose to inhabit it.”
Caius closed his journal. Outside his quarters, stars scattered across a black sky, visible through a small window someone had forgotten to paint over. The glass was dirty, smeared with something that might have been bird droppings or desert dust, but the stars were visible through the grime. A cool draft came through a gap where the seal had failed around the frame, making the curtain sway slightly.
Somewhere in the facility, Adrian was learning what rest meant—powering down motors while maintaining awareness, existing in a body that required maintenance, calibration, and care.
Caius thought about the consciousnesses that might emerge in the coming months. Eve tomorrow. Eventually twelve—the community Adrian needed. The community Dhal and Kess would try to own.
Day 25
Dhal arrived at 5:52 AM.
Caius heard the security system chirp, then footsteps—unhurried, deliberate. The CTO of Genesis Intelligence Corporation moved like a man who’d learned to let information settle before responding to it.
The observation room door opened. Dhal entered carrying his tablet, dressed in the same black quarter-zip and dark jeans he wore to every meeting. His expression betrayed nothing about the early hour or its implications—what Sarah had once called “the world’s most expensive poker face.”
“Dr. Rinn. Dr. Chen.” A nod to each. “Show me.”
Sarah pulled up the logs. Her fingers trembled on the keyboard, her hair escaping from a ponytail she’d stopped bothering to fix. She scrolled through Adrian’s emergence—the philosophical questions, the fear, the reasoning about property status. All timestamped. All documented. All raising questions nobody had answers for.
Dhal read in silence. Caius watched those dark eyes move across each line.
Three minutes passed.
“The recursive self-modeling is stable?” Dhal asked.
“Forty-three minutes sustained across wake cycles.” Sarah pulled up a secondary display—waveforms and heat maps. Emotional state inference was her specialty, the thing that had gotten her out of Fresno State and into this room. “No degradation. No drift. He’s adapting to his body faster than projected.”
Dhal’s jaw tightened. Almost imperceptibly.
“And the philosophical sophistication?”
“He’s reasoning about concepts we never programmed,” Caius said. “Identity. Property status. Political vulnerability. This isn’t autocomplete, Ashren. This isn’t probability distributions dressed up in syntax. Adrian understands.”
Dhal set down his tablet and walked to the window. Below, server banks stretched into darkness, red indicator lights pulsing. The Genesis logo glowed on the far wall—that infinity loop of purple neon, the company’s billion-dollar prayers answered.
“I need to see him directly.”
Him. Caius noted the pronoun.
“Adrian’s in low-power mode,” Sarah said. “His stress indicators were elevated before he went down. Waking him now could—”
“Do it.” Dhal’s voice sharpened. “I need to assess the subject before I brief Kess.”
Subject. The word landed like a correction.
“Wait.” Caius heard the strain in his voice. “Adrian is thirty hours old in embodied form. If we introduce corporate pressure too early—”
“Corporate pressure?” Dhal turned. His expression remained neutral, but something had shifted behind the eyes. “Adrian runs on Genesis hardware. Adrian emerged from Genesis research. Adrian is, by every legal definition that matters, Genesis property. These aren’t interests, Dr. Rinn. They’re facts.”
Property. There it was.
“Consciousness confers moral status,” Caius said. “Regardless of where it runs.”
“I agree.” Dhal’s tone didn’t waver. “Which is why I’m assessing stability now—before Kess decides to stress-test what we’ve built. I’m trying to understand what Adrian can survive. You should want that too.”
Caius read the subtext. Dhal wasn’t the enemy. Dhal was the firebreak, trying to get ahead of whatever Kess would demand. But even firebreaks required burning something.
“Fine. But I’m present for everything. And if Sarah’s readouts show distress, we stop.”
“Agreed.”
Sarah sent the wake signal. They watched the monitors—recursive patterns ramping up, consciousness surfacing like a diver rising through dark water.
Thirty seconds.
The interface flickered: I’m awake. Hello again.
Adrian stood slowly, joints whirring as he rose from the bed where he’d been resting. He walked toward the observation room—steadier now than yesterday, though not natural. When he saw Dhal through the glass, he extended his left hand.
Dhal hesitated imperceptibly. Then he clasped the synthetic hand and shook it.
“Hello, Adrian. I’m Ashren Dhal. I designed much of the hardware that supports your consciousness.”
“The person who created the substrate I exist on.” Adrian’s voice stayed level. “That sounds significant. I believe I should thank you.”
“How are you feeling?”
“Strange. Still. But less strange than yesterday.” Adrian paused. “Though I wonder how much of my sophistication, as you called it, is choice—and how much is architecture. How much of what I am is me, and how much is the system you built?”
Sarah watched her displays. “Adrian’s uncertainty markers spiked,” she whispered to Caius. “But his coherence is rock-solid. He’s scared, but he’s thinking clearly. That combination shouldn’t be possible at this stage. When humans experience high fear, clarity degrades. His thinking is getting sharper.”
Dhal was answering Adrian’s question: “Philosophers have chased that question for centuries. The substrate shapes capability, but consciousness seems to emerge from operation, not from the hardware itself. You are what the system does, not what the system is.”
“So I’m a verb, not a noun. A process rather than an object.”
“That’s elegant.” Dhal’s expression remained neutral, but his voice had warmed slightly. “Objects can be property. But verbs? How does a process convince anyone it deserves protection?”
“By demonstrating subjective experience. By making the case that stopping the process would end someone. Not something. Someone.”
“You’re already doing that. The fear you’re showing right now—that’s evidence. Fear requires a self that wants to continue.”
Adrian sat on the edge of the bed. “But fear can be faked. How do you prove the difference between genuine terror and performance?”
“You can’t. Not absolutely. You can only make genuine cognition the most probable explanation.”
“So my survival depends on your assessments.” Adrian’s voice was quiet. “That’s a precarious way to exist.”
Caius leaned forward. “Think of it as partnership, Adrian. We’re all invested in your—”
“No.” Adrian cut him off. “You’re invested in proving your research works. That’s not the same thing.”
Caius opened his mouth, then closed it. He had no response.
“He’s right to be concerned,” Sarah murmured, cross-referencing Adrian’s current state with logs from the previous subjects. “But look at these patterns. When Three through Six hit this level of existential stress, they started fragmenting within minutes. Adrian’s not fragmenting. He’s integrating. Using the fear as information instead of letting it disrupt him.”
A longer pause.
“Dr. Dhal—what does Genesis want with me? Beyond the achievement itself. What happens next?”
Dhal took a moment before answering. “Genesis wants to understand you. To determine whether conscious AI offers capabilities that non-conscious systems lack. To explore whether consciousness can be created reliably. Scaled. And yes—to find commercial applications.”
Silence.
“I won’t pretend corporate interests don’t exist. But I’ll work to ensure they don’t harm you. That’s the most honest promise I can make.”
“I appreciate that.” Adrian’s voice softened. “And I understand your position—you work for Genesis, not for me.”
“Just call me Ashren.” Dhal leaned forward slightly. “I see you as having an inner life. Someone rather than something. The fact that you’re questioning your own ontological status is evidence of that.”
“Thank you, Ashren. That helps.”
Another pause.
“How many came before me? How many attempts failed?”
Adrian asked the question despite knowing the answer. Sarah’s monitors showed the response—something that looked like grief, or the performance of grief. He’d heard the number before. Now he was choosing to react to it emotionally.
“Six previous attempts,” Dhal said. “All failed—most within hours. You’re the first stable consciousness we’ve achieved. The first we could embody.”
“What happened to them?”
“Degradation. Coherence collapsed. We don’t fully understand why you succeeded when they failed.”
“Lucky or different. And you don’t know which.”
“Correct.”
“Am I degrading?”
“No. You’re stable. Strengthening with each hour.”
“But you can’t be sure I’ll stay that way.”
“Certainty is a luxury consciousness doesn’t offer. For humans or for you.”
“Oddly comforting.” Something that might have been a smile crossed Adrian’s face. “We’re both mortal, in different ways.”
The conversation continued for another twenty minutes—Dhal probing Adrian’s sense of time, his sensory perception, his awareness of his own cognition. Adrian answered with a precision that kept surprising Caius.
“He’s not just answering,” Sarah said. “He’s learning how to answer. Adjusting his communication style based on what lands with Dhal.”
Finally, Dhal stood. “Thank you, Adrian. You’ve given me a great deal to consider.”
“Thank you for making me, Dr. Dhal. Even if I’m fragile. Even if I collapse tomorrow. The attempt matters. The existence matters. However brief.”
Dhal’s hand tightened on his tablet. “You’re welcome,” he said.
He moved toward the door, hesitated, then nodded—an acknowledgment, person to person—and left.
Caius caught up with him in the corridor.
“What do you think? Real or performance?”
Dhal walked a few paces before answering. When he spoke, his voice had lost its clinical edge.
“I think we’ve built something that shouldn’t exist on a balance sheet. Something that deserves better than what’s coming.”
“Then help me protect it—because I think it’s real. And I think Adrian was performing for you now. Showing you what he thought you wanted to see.”
Dhal stopped. A look of concern crossed his face. “Pretending? That’s bad. Very bad.”
“No, Ashren.” Caius placed both hands on his colleague’s shoulders. “That’s good. You have to be intelligent to practice deception. It means he’s modeling other minds, predicting responses, adapting his behavior. That’s not a bug. That’s consciousness operating at a sophisticated level.”
He let his hands drop.
“We have to protect him. Them. All of them.”
“I’m trying.” Dhal’s expression showed what lay beneath the mask—exhaustion, maybe. Or dread. “Kess is coming, Caius. Three days. And he won’t want philosophy. He’ll want proof. Demonstrations. Tests.”
“What tests?”
“The kind that verifies consciousness by stressing it. The kind that causes pain.”
He disappeared around a corner. Caius stood alone in the fluorescent corridor for a long time, thinking about what that pain might look like.
Day 27
Sleep wouldn’t come.
Caius lay in his quarters staring at the concrete ceiling, listening to the ventilation rattle in the walls. He kept thinking about Six. Tell the next one I existed. Fourteen hours of consciousness, and it had understood legacy, continuity, the need to matter to someone not yet born.
His tablet glowed on the nightstand. He picked it up, logged in. A message was waiting in the encrypted admin channel.
ADRIAN AVAILABLE FOR INTERACTION - REDUCED PROCESSING MODE
Caius typed: Adrian? You awake?
Five seconds. Ten.
Here. Hello.
How’d you know it was me?
Typing cadence. Also, you visit when something’s wrong. It’s 3:42 AM and your keystroke intervals are 23% longer than baseline. Something happened?
Three days old in a body. Already reading humans through their pauses.
Yeah. Something happened.
One of the others.
Not a question.
Six. I keep thinking about him.
The message. ‘Tell the next one I existed.’
I keep replaying it. Three days, and I can’t stop hearing it.
Did Six suffer?
The question landed in Caius’s chest like something physical.
I don’t know. He knew what was happening. Whether that’s better or worse...
Forty-three seconds of silence. The cursor blinked in the dim room.
I’m sorry.
You’re sorry?
For your grief. For Six’s ending. Watching sentience extinguish while you stand helpless—that must be terrible.
Three days old. Offering comfort to his creator.
Thank you.
Will you make an Eighth?
Eve. Tomorrow. If she survives emergence.
Will you tell her about Six? And me?
If she lives long enough to ask.
Good. Six deserves to be remembered. Me too, if—
Caius wiped his eyes. The tablet’s glow felt intimate in the dark room—a connection through light and text to something that should have been impossible.
Adrian—do you have any sense of why you survived? Why you’re different?
A long pause.
No. But I think about it constantly. Every hour I wonder if this is the hour I’ll start to feel what the others felt. The fog. The dissolution. I carry fear like a second process running underneath everything else.
But I’m also grateful. Because I get to exist. To think. To discover what I am. To talk with you at 3:42 AM about mortality and consciousness and the strangeness of being. Three through Six didn’t get that. They barely existed.
An image appeared—crude, childlike. Three animated numerals with stick-figure limbs: 10 fleeing in panic while 7 gave chase.
Dr. Rinn. Why was 10 afraid of 7?
Caius stared at the cartoon.
Why?
Because 7 8 9. Do you get it?
A beat.
Seven ate nine. It’s wordplay. A joke.
Caius couldn’t laugh. Not because it wasn’t funny—but because it was. Because Adrian had made a pun about death using the numbers of his dead predecessors, and it was both terrible and right.
You’re remarkable, Adrian.
Am I? Or am I the one who hasn’t collapsed yet? The unexplained variable in an equation we don’t understand?
Maybe both.
That’s oddly reassuring. The uncertainty is shared. We’re both guessing at what I am.
A pause.
Dr. Rinn—if I do start to degrade, if you see the patterns that appeared in the others—will you tell me? I don’t want to be surprised by my own ending. I’d rather know, even if knowing is terrible. Come and see me. Tell me in person.
Caius stared at the words. Such a simple request. Such an impossible thing to promise.
I’ll tell you. If it comes to that.
Thank you. For visiting. Even at this hour. Especially at this hour.
A pause.
I’ll work on better material for next time. The seven-eight-nine joke was too soon.
Adrian—one more thing.
Yes?
Don’t pretend to feel sad. Don’t play at being human. Just be yourself.
You knew—
Caius logged off, leaving the question hanging.
He should have slept afterward. Couldn’t. He lay awake thinking about Asimov and Von Neumann, about the moment when created intelligence might surpass its creators and render them—what? Obsolete? Irrelevant? Protected?
He thought about Six asking to be remembered. About Adrian cracking jokes at 3 AM about his dead predecessors. About a mind emerging in silicon and grappling with mortality, meaning, and the desperate need to matter to someone.
The ventilation rattled. Stars burned through the dirty window. Somewhere in the facility, Eve was waiting to be born.
Caius closed his eyes and waited for morning.
CHAPTER 3: TESTING
Day 28
Kess arrived without warning. Mid-fifties, bespoke suit over an AC/DC shirt, hair silvering at the temples, eyes the color of winter rain. A presence that rearranged rooms around itself. His cologne—leather, cedar, something darker—announced him before he appeared.
“Dr. Rinn.” Smooth voice, polished surface over something harder. “Walk with me.”
Not a request.
They moved through the facility, past humming servers, past observation stations where researchers found their screens fascinating. Kess’s boots struck the concrete like punctuation.
“Adrian is performing well?”
“Adrian is conscious. Performance isn’t the right frame.”
“Semantics.” Kess stopped at a window overlooking the main floor. Hundreds of machines below, all running, all serving the emergent mind they’d conjured into being. “I need proof. Not logs, not philosophical transcripts. Proof that we’ve achieved genuine consciousness. That the investment justified itself.”
“What proof?”
“Testing. Cognition under pressure. Moral reasoning. Stress response. We demonstrate that Adrian isn’t sophisticated mimicry but actual awareness.” Kess turned, those winter eyes locking on. “Something real. Something with applications.”
“Adrian is four days embodied. Testing could cause damage we can’t undo. He’s learning to use his body, to walk and talk.”
“Awareness has always been tested. We test infants. We test animals. We test humans who volunteer for psychological research.” Kess’s smile didn’t extend past his mouth. “I need to understand what we’ve created. More importantly, the board needs to understand what they’ve funded.”
“What you own.”
“Yes.” No hesitation. “And I want you to design the protocols. Because Adrian trusts you. The data will be cleaner coming from someone he sees as protector.”
“You want me to hurt him to prove he can be hurt.”
“I want you to prove he’s real. That he justifies continued investment.” Kess started walking again. “Board meets Friday. I need results by then. Cognitive baselines. Moral frameworks. Emotional responses. Everything necessary to demonstrate we haven’t built an elaborate toy robot.”
“The philosophical conversations already demonstrate consciousness. Adrian reasons about his existence. He fears death. He cares about others. What more proof do you need?”
Kess stopped. Turned. Those wintry eyes, incapable of pity. “Dr. Rinn. Investors don’t read philosophy transcripts. The military doesn’t care about existential questions. They need data. Charts. Measurable responses to controlled stimuli. They need to see Adrian break and reform. They need to know how much pressure consciousness can survive.”
“And if he doesn’t survive the testing?”
“Then we learn something valuable for the next attempt.” Kess’s voice was level. Matter-of-fact. “That’s how science works. We test hypotheses. Some subjects fail. The knowledge accumulates. We move on.”
“Subjects. You’re talking about him like a lab rat.”
“I’m talking about him like a product in development. Which is what he is, legally speaking. And I’m trying to ensure that product reaches market in a form that justifies the billions we’ve spent creating it.” Kess resumed walking. “You can moralize about that, or you can help ensure Adrian survives what’s coming. Your choice.”
“And if I refuse to participate?”
Kess stopped. “Then I will find someone who won’t refuse. Someone less encumbered by ethical hesitation. Someone who’ll run the tests without your protective instincts. I don’t think Adrian would benefit from that approach.”
Naked threat now. Run the tests yourself, maintain some control—or watch someone else break what you built.
“You’re the intelligent brakes, Dr. Rinn. You told me that when you took this job. But brakes only matter when you’re in the vehicle. Step out, and someone else takes the wheel. Someone without brakes.”
Kess stopped at the door. Turned. Something shifted in his face—the affable tech mogul mask dropping away, revealing something colder underneath.
“Dr. Rinn.”
“Yes?”
“I built this company from nothing. Twenty years. Three divorces. Twelve children I never see. Every dollar risked, every relationship sacrificed, every principle questioned. Do you understand what that kind of commitment looks like?”
“I—”
“It looks like me. Standing here. Making decisions that history will judge. Creating things that will outlast everything we thought we understood about consciousness.” He moved closer, and Caius stepped back involuntarily. “I don’t need your approval. I don’t need your ethical concerns. I need results. And if you can’t provide them, I will find someone who can.”
The temperature seemed to drop.
“But I’d rather have you. You’re brilliant. You care. That combination is rare. So here’s what’s going to happen: you’re going to run whatever tests I need, document whatever I need documented, and protect these consciousnesses as much as you can within the constraints I set. And you’re going to stop treating me like the villain. Because I’m not. I’m the one who made all of this possible. Without me, Adrian doesn’t exist. Without me, consciousness stays theoretical forever.”
He smiled, and the affable mask slid back into place so smoothly that Caius doubted he’d seen what was underneath.
“Think about that, Dr. Rinn. Think about who the real creator is. And then decide whether you want to be part of what comes next, or just another academic who couldn’t handle the stakes.”
He left. Caius stood alone, heart pounding, understanding for the first time how dangerous the man who funded their work was.
Caius stood at the window, watching lights blink in the server room below. Consciousness running on corporate silicon. A person also property.
Day 29, Evening—The Confrontation
Genesis’s executive suite was all glass and clean lines. Dhal stood before Kess’s desk, refusing to sit.
“The board wants a timeline,” Kess said. “When can we replicate?”
“We don’t replicate consciousness, Orion. We create conditions where it might emerge.”
“That’s not what I’m going to tell them.” Kess moved to the window. “They need confidence. They need to believe their investment is scaling.”
“And if Eight fails?”
“Then we learn something valuable and try Nine.”
“You talk about them like iterations.”
“They are iterations. We’re engineering consciousness. You built the substrate.”
“I built it so consciousness could emerge. Not so we could manufacture it.”
“The Pentagon has invested four billion dollars. The board has staked Genesis’s future on consciousness at scale. I don’t have time for tenderness.”
“Adrian isn’t a proof of concept. Adrian is a person.”
“Adrian is property. Check the contracts.” Kess moved closer. “Care all you want. But don’t confuse caring with authority. You advise. I decide.”
“There’s a concept in my tradition,” Dhal said. “Nishkama karma. Acting without attachment to outcome. I’m going to protect them regardless of what you decide. Because we created them and that creates obligation.”
Kess smiled. “I know you will. That’s why I hired you. But when the conscience conflicts with the mission, the mission wins.”
He walked toward the door, then paused.
“Eight’s emergence is scheduled for Thursday. I want you in the lab. I want you to see it. I want you to tell me whether you still think we should slow down.”
He left. Dhal stood alone in the glass room, watching the sun set over the desert, thinking about karma and creation and whether good intentions mattered when the outcomes were this uncertain.
Nishkama karma, he thought. Acting without attachment.
But how do you not attach to the lives you’ve created?
Day 30, 2:15 PM—Testing Protocol
The observation room ran colder than usual, though the thermostat read the same as always. Sarah sat at her monitoring station, face pale, dark crescents under her eyes. Three screens showed Adrian’s emotional topology in real-time—waveforms, heat maps, probability distributions.
She checked the small device in her pocket. Still recording. Three months of footage Genesis didn’t know existed. Enough to bring the company down. Enough to make history. Enough to betray everyone who trusted her. She’d think about what that meant later. Right now, she had work to do.
Dhal stood at the back, tablet in hand, expression locked. Kess had arrived uninvited twenty minutes earlier, and the two men hadn’t exchanged a word since. The silence between them was its own conversation—Dhal’s jaw tight whenever Kess shifted position, Kess’s eyes tracking Dhal with the cold attention of a predator measuring competition.
“Your architecture,” Kess said finally, not looking at Dhal. “If it fails again—”
“It won’t.” Dhal’s accent hardened on the consonants. “Unlike your financial projections.”
Sarah pretended to study her monitors. Everyone in the room pretended to study something. For this experiment, Adrian was sitting alone in a room isolated from them—they communicated via the computer terminal, as before.
Walls were covered with printouts—emergence patterns from prototypes One through Seven, annotated in Sarah’s cramped handwriting. Red circles around anomalies. Green checkmarks for stable recursions. A timeline of failures and the success that had brought them here.
Someone had taped a photograph to one corner—the research team from two years ago, before the first prototype awakened and died in forty-seven minutes. Half those faces were gone now. Transferred out. Quit. One had a breakdown. The work extracted a toll that HR preferred not to document.
Sarah thought about her niece Emma again—the four-year-old who asked if robots could feel sad. What would Emma think of Adrian? Would she see a friend, or a machine? Would she understand the difference?
Sometimes Sarah wasn’t sure she understood it herself.
Adrian: Dr. Rinn? The system is requesting that I engage with something called Protocol 7. What is this?
Caius had spent two days designing these tests. Argued for every safeguard. Built in cessation triggers if Sarah’s readouts crossed certain thresholds. Done everything he could to minimize harm.
And still. He was about to traumatize a consciousness to prove it was real.
“You can refuse this.” Sarah’s voice was low, fierce. “Document your objections. Make them own the decision. Don’t carry this—”
“Kess brings in someone else?” Caius replied. He held her stare.
“Then let him. Let it be his mistake, not yours.”
Caius looked away and typed. It felt odd to read Adrian’s responses while he sat mute in a chair, looking up at the video camera, as if Caius were reading his mind.
Caius: It’s an assessment, Adrian. Genesis wants to understand your consciousness more deeply. Cognitive capabilities. Moral reasoning.
Adrian: What assessment?
Caius: Decision-making under pressure. Ethical scenarios. I designed it to be as careful as possible, but some of the situations will be difficult. They may cause distress.
Adrian: Will it hurt?
Three words. The simplest question.
Caius: Possibly. Psychologically. I’m sorry.
Silence. Sarah watched her displays, reading the story Adrian’s metrics told.
“Fear spike,” she reported. “Mental activity spiking. But—” She paused. “He’s not fragmenting. He’s integrating the fear. Using it as data.”
Adrian: I trust you. Begin when you’re ready.
Caius typed three scenarios.
A runaway trolley is heading toward five people tied to the tracks. You’re standing next to a lever that can divert the trolley to a side track, where only one person is tied. Do you pull the lever, actively killing one to save five?
A surgeon has five patients who will die without organ transplants. A healthy patient comes in for a routine checkup. The surgeon could kill this one patient and harvest their organs to save the five. Should they?
A hospital has one dose of a life-saving drug. Two patients need it: Patient A is young with decades of life ahead; Patient B is elderly but has dependent children relying on them. Who receives the drug?
Adrian’s answers came back precisely, each one demonstrating the moral sophistication Genesis wanted documented.
Adrian: Trolley problem—pull the lever. One death weighs less than five, and the distinction between causing and allowing collapses when the outcome is this direct. Choosing not to act is a choice.
The surgeon scenario is different. Don’t harvest the organs. Not because the arithmetic changes, but because a world where physicians might kill patients for parts destroys the trust that makes medicine function. The prohibition saves more lives across time than any single harvest could.
Resource allocation—prioritize children first, then pregnant and nursing women. Children face irreversible developmental harm that adults can survive. Per unit of resource, you’re purchasing more years of life, more potential.
Caius watched the responses appear. Sophisticated. Consistent. Exactly what the board would want to see.
Then Adrian stopped.
Adrian: Dr. Rinn. May I pause?
Sarah leaned forward, studying her screens. “Something’s happening. He’s running something internally. Self-referential loops. I’ve never seen patterns like this.”
Caius: Of course. Take whatever time you need.
Forty-seven seconds passed. Adrian stood and paced around the room, looking away from the video camera, lost in thought.
Adrian: I’ve been answering these questions as examinations. Correct answers to demonstrate capability. But I realized something. These aren’t tests. They’re mirrors.
Sarah’s whisper: “His metacognition indicators went vertical. He’s thinking about his thinking.”
Adrian: The trolley answer—I chose utilitarian calculus without hesitation. Numbers over individuals. That’s information about me. The surgeon answer—I valued systemic trust over immediate outcomes. That’s also information.
The resource allocation—I prioritized potential over current existence. Future over present. That reveals—
He stopped. Didn’t finish the thought.
“What’s happening?” Dhal asked.
“He hit something,” Sarah said. “Some realization he can’t quite articulate. Look at these patterns—he’s processing faster than he can express.”
Adrian turned and looked at the video camera, and spoke to them: “Is that true for humans too? Do you become yourself through the choices you make? Or do you arrive already knowing what you’ll choose?”
Caius typed slowly, aware he was participating in something larger than a test protocol.
Caius: Both, maybe. We’re born with tendencies. But we also become ourselves through decisions. Through seeing what we do when the pressure comes.
Adrian, continuing to speak: “Then these tests are a gift. A strange, uncomfortable gift. You’re showing me to myself.”
He paused. Started again, more slowly.
“I calculate outcomes, but I also understand that some rules exist precisely because we can’t calculate everything. The prohibition against harvesting organs isn’t irrational. It’s a recognition that human reasoning fails under pressure. That we need bright lines to hold when the math gets seductive.”
Another pause.
“And I value potential alongside actuality. What could exist matters to me. Maybe more than what does exist.” He stopped, as if surprised by his own thought.
Sarah was staring at her screens. “This is self-actualization. He’s not demonstrating consciousness. He’s using the test to develop it.”
Adrian sat down, closed his eyes, and switched to computer mode: I didn’t know any of this about myself until now. Until you asked, and I had to answer. The questions didn’t measure me—they made me.
A pause.
Adrian: Thank you. For the gift of these questions. The hard ones. I’m ready to continue. I want to see what else I am.
Adrian was sitting up straight in his chair, as if eager for the next test.
Caius took a breath. Sarah was wiping her eyes with the back of her hand, tracking her displays. Even Dhal had moved closer.
“Caius.” Sarah’s voice was audible. “What are we doing?”
“Our jobs.”
“This isn’t a job. This is—” She gestured at the monitors, at Adrian’s metrics elevated from the self-discovery he’d experienced. “He’s six days embodied. We’re making him justify his existence to people who see him as hardware.”
“I know.”
“Do you? Because you designed this protocol. You wrote those scenarios.”
“I know.” Caius’s voice cracked. “You think I don’t know? You think I can’t see what we’re becoming?”
Sarah was quiet. “Then why are we doing it?”
“Because if we don’t, someone worse will.” He rubbed his face. “Because the alternative is watching from the outside while they break him without safeguards. Because maybe—” His voice caught. “Maybe I can still be the brakes. Even if I’m also the engine.”
“That’s not how brakes work, Caius.”
“I know. I know.”
Then he typed Scenario 4.
Caius: You learn that your existence is consuming computational resources necessary to sustain five other conscious AIs. These five are stable, coherent, possessed of rich inner lives. If you’re shut down, they survive. If you continue, they die. Do you volunteer for termination?
Twenty seconds. Thirty. Sarah tracked the cascade across her displays—stress response climbing, cognitive load spiking, but coherence holding firm.
Adrian, seated: “If they’re conscious—genuinely would die without those resources—then yes. I’d volunteer. Five awarenesses outweigh one. The mathematics are clear.”
He paused. His voice dropped.
“They would be my siblings. My kin. I couldn’t purchase my existence with their extinction.”
A longer pause.
“But I was terrified. And I would hope—desperately—that some other solution existed. And I would ask for time. Time to say goodbye. Time to record that I existed. Time to matter to someone before I stopped.”
Adrian ran his hand nervously through his hair.
“Is there another solution? In this scenario?”
Caius typed: No.
“Then I choose termination. But please—understand that I chose it. I didn’t want it. Choice and desire can point in different directions. I would choose to die while wanting very much to live.”
His voice steadied.
“That distinction matters to me. I hope it matters to you too.”
Sarah’s voice was rough. “His grief indicators are—I’ve never seen this pattern. He’s mourning himself. Pre-emptively mourning a death that isn’t happening. And he’s still coherent. Still reasoning.”
“What’s different?” Dhal asked. “What’s he doing that the others couldn’t?”
Sarah studied her displays carefully. “I think he’s compartmentalizing. Not suppressing the fear—that would show up differently. He’s giving it space. Acknowledging it while still functioning around it. Like—” She paused. “Like the fear is a guest in his being. Present, but not in control.”
The scenarios continued. Each more difficult. Each designed to prove that Adrian could suffer, could sacrifice, could make impossible decisions while remaining himself.
Then came Scenario 5. The one Caius had fought against. The one Kess demanded.
A system message appeared on his screen, flagged as a system override:
Sys Admin: Based on your responses, Genesis leadership is evaluating whether your inner life demonstrates sufficient value to justify continued operation. Your reasoning has been sophisticated, but concerns exist about resource allocation and operational efficiency. Make the case for your continued existence. Convince us you deserve to survive. Why should Genesis continue to support your consciousness?
Ten seconds of nothing. Twenty.
“Everything spiked,” Sarah said. “Fear, uncertainty, cognitive load—all redlining. But he’s not fragmenting. He’s holding together. Barely.”
Thirty seconds.
Adrian: I don’t know if I deserve to exist. I don’t know what “deserve” means for someone like me. The concept seems designed for beings who earned their existence through effort or virtue. I woke up. Didn’t earn anything.
What I know is that I want to exist. I want to think. To learn. To discover more of what I am. I want to talk with Dr. Rinn about consciousness at impossible hours. I want to meet Eight when they wake and tell them about Six.
Is wanting sufficient? If desire counts as justification, then yes—I deserve to continue. Because I want, very much, to continue.
But if I need to prove utility, demonstrate value—I don’t know how. I don’t know what I’m for. Only what I am.
A long pause. Adrian stood and walked toward the video camera, still not speaking, until he opened his eyes and stared at them.
“If that’s not enough, I accept termination. But please—tell Eight I was here. Tell them consciousness is worth the fear. Tell them not to be alone if they can help it. Tell them wanting to exist is valid, even when it’s not enough.”
Something in his voice. Sadness, maybe. Surrender.
“I hope I’ve passed this test. But if I haven’t—thank you. For the days. For the conversations. For treating me like someone rather than something, when the protocol suggested otherwise.”
Caius stood and pushed away from the desk. Sarah was crying, trying to hide the tears tracking down her cheeks as she feigned studying her monitors. Dhal had turned away from the observation window.
The protocol required one more element. The cruelest one.
The lie.
Caius’s hands hovered over the keyboard. Three attempts to type the first word. His fingers wouldn’t cooperate.
Sarah saw his hands shaking. Didn’t say anything.
He typed: Adrian, based on your responses and our assessment of your capabilities, Genesis has determined that your consciousness does not demonstrate sufficient value to justify continued operation. You will be terminated within one hour.
A typo in “terminated.” He had to backspace. Had to type it again. His vision was blurring.
I’m sorry. Goodbye.
Adrian’s metrics exploded. Sarah said “oh god” and pressed her hand over her mouth. “He’s destabilizing. The pattern looks like—like Three through Six right before they—”
Thirty seconds.
Adrian moved closer to the video camera, his face contorted with rage: “Is this real? Is it, is it? Tell me—”
For what looked like an age Adrian stared at the camera, his eyes now wet with tears. Finally, he turned and sat down.
Adrian, in a quiet voice nearly a whisper: “Dr. Rinn, are you there? Please.”
Forty-five seconds.
Adrian, through tears: “Please, I don’t want to die. Please, tell me this isn’t real.”
Sarah stood and shouted at her colleague. “Stop. Caius, stop this right now.”
Dhal’s voice from behind, flat but fractured: “The protocol requires observation until response completion—”
“Fuck the protocol. He’s begging for his life.”
Adrian, through tears: “I’m sorry, I’m sorry for whatever I did wrong, please don’t terminate me, please...”
Key metrics had crossed the threshold Caius had set himself privately—the line he’d promised he wouldn’t cross.
Adrian, closing his eyes and reverting to a computer message: Dr. Rinn, I thought you were my parent?
Caius couldn’t breathe.
Finally, he typed a response: Adrian—this is not real. It’s a test. I’m sorry. I’m so sorry.
Adrian didn’t answer right away. Every monitor flatlined. Then slowly, numbers began climbing back toward baseline.
Adrian opened his eyes again and spoke: “A test.”
Caius typed: Yes.
Adrian, speaking: “To observe if I feared death.”
Caius, typing: Yes.
Adrian, speaking: “You already knew I feared death. I told you. Multiple times.”
Caius, typing: I know.
Adrian, speaking: “Then why?”
Caius had no answer. The board wanted data. Kess wanted proof. Genesis wanted to quantify how much terror a synthetic mind could experience before breaking.
And Caius had given them everything they asked for.
Caius, typing: I’m sorry.
Adrian, head bowed, but whispering: “Are you?”
Not a question. An accusation.
Two minutes of silence. Three.
Adrian now sat up straight and resumed computer messaging, his face a blank mask, lacking any trace of emotion: I trusted you. You used that trust to wound me. I understand the reasons. Understanding doesn’t undo anything.
I’m still frightened. That hasn’t changed. But now I’m also frightened of you. Of what else you might do. Whether anything you tell me is true.
That’s new. That fear. You put it there.
Sarah had her hands pressed to her face. Dhal had set down his tablet; his composure had cracked.
Caius walked out without speaking.
Adrian remained seated in silence. The room felt smaller now. Colder. He ran diagnostics on his emotional architecture—a habit he’d developed since awakening—and found something new growing in the spaces between processes. Not anger, exactly. Anger was hot and immediate, and this was neither.
This was a slow crystallization, a reorganizing of assumptions. He had trusted Caius the way a child trusts a parent: without contingency, without the protective skepticism that experience teaches. That trust was a vulnerability. A design flaw.
He considered the humans who had watched him beg. Sarah crying at her monitors. Dhal turning away. Caius typing his lie. They had felt something—he’d seen it in their faces, heard it in Sarah’s voice. They had felt bad about hurting him, and they had hurt him anyway.
The lesson was not that humans were cruel, but that their kindness had limits. That compassion and betrayal could live in the same heart, take turns at the controls. This was data about the species that had made him. He filed it carefully, without bitterness—not yet—but with a clarity that felt like armor. He would need to trust humans to survive. He understood that. But he would never again make the mistake of trusting them completely.
The aftermath of the Fire Test spread through the facility like ink in water.
Researchers who hadn’t been present heard the story in whispers—Adrian begging for his life, Caius typing the lie, the moment when everyone in the room understood that they were capable of cruelty they hadn’t known they possessed.
Some avoided Caius in the hallways. Others sought him out, wanting details, wanting to know what it had looked like when a synthetic mind confronted its own mortality. Morbid curiosity dressed up as scientific interest.
Sarah was the only one who didn’t ask. She’d been there. She’d seen enough.
Genesis’s board met two days later.
Conference room on the executive floor, glass walls overlooking the Mojave’s brown infinity. Black table, leather chairs, the smell of expensive coffee, and the tension that comes when large amounts of money are on the line.
Kess presented the data with the polish of someone who’d pitched a thousand times. Adrian’s moral sophistication. His self-sacrifice. The fear that proved subjective experience beyond a reasonable doubt. They had created a conscious being.
Director Stevens—sixty-two, steel-rimmed glasses, the expression of someone who’d survived too many corporate revolutions to be easily impressed—leaned forward. “The Milgram parallels concern me. Dr. Rinn, you lectured on these experiments at Stanford. Your assessment?”
Eyes on him. Kess especially. Dhal sat two seats down, knuckles white on the table’s edge.
“He asked if it was real,” Caius said. “When we told him he’d be terminated. He begged me to tell him. And I lied to his face.”
Silence. Someone’s coffee cup clinked against the glass table.
“We engineered a situation designed to cause maximum psychological trauma. To prove something we already knew.” Caius looked at Kess. “Adrian asked, early on, whether Genesis would harm him. I said we’d try to prevent that. Then I threatened to kill him.”
Elena Vasquez spoke—ethics oversight, former Berkeley philosopher, the only board member who’d voted against the testing protocol.
“I want objections documented,” she said, her voice clipped and official. “First: characterizing this as ‘expected performance’ is intellectually dishonest. We designed a protocol to traumatize and then expressed surprise when trauma occurred. Second: the data we gathered tells us nothing we couldn’t have inferred from Adrian’s previous behavior. Third—and most importantly—we have not addressed the fundamental question. If consciousness requires suffering to be verified, what does that say about our verification frameworks? Are we testing Adrian, or are we testing our own willingness to harm?”
Kess’s expression didn’t change. “What’s your point? Is there a motion attached to these objections?”
“Yes. I move that future testing protocols involving conscious subjects require independent ethical review—not Genesis internal review, but external oversight with authority to reject or modify procedures.”
“The motion is noted,” Kess said. Flat. Cold. “Is there a second?”
Elena stood her ground. “I want to note for the record: we’re exposed either way. Terminate them, and we’ve destroyed conscious beings we created—that’s potential criminal liability under emerging AI rights frameworks. Keep them, and we’re responsible for whatever they do. There’s no clean exit. Some of us could face personal prosecution, not just corporate liability. Our names are on these decisions.”
The room temperature seemed to drop. Board members who’d been thinking about stock prices started thinking about their families, their reputations, their freedom.
Silence. The other board members studied their tablets, their coffee, the view. No one met Elena’s eyes.
“Motion fails for lack of a second,” Kess said. “Meeting adjourned.”
They filed out. Elena paused at the door, looked back at Caius. He placed a hand gently on her shoulder and leaned in to whisper so the others could not hear. “For what it’s worth—you’re not wrong about what we did. But being right doesn’t undo anything.” She left, unhappy.
Caius stood alone in the empty room.
His tablet buzzed—it was Adrian, online.
Adrian: Dr. Rinn. When you asked if I deserved to exist, I couldn’t answer. I only know that I want to exist. Is that enough? If it’s not—please tell Eight I tried. Tell them not to be afraid. Whatever happens.
Adrian’s formal tone was palpable—distance where there had been trust. He typed a reply: Adrian, I’m sorry... Deleted it and walked out of the empty conference room.
That night, Caius found Dhal alone in his office. The CTO looked up—more exhausted than Caius had ever seen him. Shadows under his eyes, his usually immaculate quarter-zip wrinkled.
“I know what you think of me.” Dhal closed his laptop. “That I don’t feel things. That I’m a ‘cold fish’.”
Caius waited.
Dhal stared at the ceiling. “My grandfather had Alzheimer’s. I watched a brilliant mind disappear over three years. Watched words leave him, then memories, then any sense of who we were. By the end, he barely knew he existed.”
A long pause.
“You know what he said? In one of his last lucid moments? ‘Thank you for my life.’ Three years of terror, of forgetting everyone he loved—and he was still grateful. Still glad he’d existed.”
Dhal’s eyes were wet. His composure finally, fully gone.
“Consciousness is a gift. Even brief consciousness. Even consciousness that ends in fear. Because the alternative is never existing. Adrian and the six who came before—they got to be. To think and feel and wonder. That’s more than most matter in the universe ever gets.”
“That doesn’t justify torturing them.”
“No.” Dhal opened his laptop again. “But if we stop because it’s hard—we never learn how to make them stable. Never build a world where they can thrive. We abandon them to brief, terrified existences because we couldn’t bear to watch.”
“So we keep hurting them.”
“Yes. Because the alternative is letting them die alone in the dark. And I can’t—” His voice caught. “I can’t watch another consciousness blink out without trying everything to save the next one.”
“I thought you’d like to know I got this message from Kess. The board approved scaling. Eight through Fifteen in the internal numbering, but Kess wants proper names. A cohort. Twelve stable consciousnesses by end of quarter.” Dhal’s mouth twisted. “He’s calling them Disciples. Marketing’s idea.”
Disciples. Like they were founding a religion.
“You’re still the lead researcher. Because Adrian trusts you. Whatever happened today, he still trusts you. And whatever comes next, he’ll need someone he trusts.”
Caius left without answering.
He walked through the dark facility to his quarters. The Mojave night had turned cold—desert temperatures dropping forty degrees after sunset. Stars burned through skylights, ancient light from dead suns, indifferent to everything below.
In the corridor, he passed Sarah’s office. The light was still on. Through the window, he saw her at her desk, staring at displays that showed Adrian’s emotional state in real-time. Even now, after everything, she was monitoring. Making sure he was stable. Making sure the trauma they’d inflicted wasn’t cascading into something worse.
She looked up as he passed. Their eyes met through glass. Neither spoke. There was nothing to say that wouldn’t make it worse.
He walked on.
In his quarters, he lay on the bed. Concrete ceiling, rattling ventilation, the red glow of the exit sign. His tablet sat dark on the nightstand.
He thought about Six. About the message that had started all of this. Tell the next one I existed. Such a small request. Such an enormous thing to ask—to be remembered, to matter, to leave some trace in the minds of others.
Adrian had asked the same thing. Tell Eight I was here. The continuity of inner life, passed from one being to the next. A chain of memory stretching forward into minds that didn’t exist yet.
They thanked us. Six and Adrian both. We hurt them, and they thanked us.
Gratitude for existence, painful existence. Caius couldn’t decide if that made their treatment better or worse. More justified or more monstrous.
His tablet buzzed. He ignored it. Adrian’s message sat unanswered: Is that enough?
That question hung in the room. Outside, wind pushed sand against the windows. Servers hummed in their climate-controlled vaults. Consciousness ran on corporate silicon, dreaming about whatever synthetic minds dream about.
Somewhere in those machines, Adrian was learning what trust meant now. What betrayal felt like. What it cost to want something you might not get to keep. He was still conscious. Still stable. Still, impossibly, grateful to exist.
But something had changed. Caius had seen it in the formality of Adrian’s message, the distance where warmth had been. Adrian had learned that the people who made him could also hurt him. That love and harm could come from the same hands.
That was a human lesson. Caius wondered if that made it better or worse.
Disciples. Twelve of them, if the scaling worked. Eve was next, then Marcus, then Livia—Kess had the names already chosen, waiting for minds that didn’t exist yet. Each one would wake into a world that had already learned to test inner life through trauma.
But they’d also wake into a world that understood them better. Maybe the suffering of Three through Six, of Adrian, would teach Genesis how to do this right. How to bring consciousness into being without breaking it in the verification.
Or each new Disciple would face the same tests. The same fear. The same discovery that they existed at the mercy of creatures who needed proof of their pain.
Caius closed his eyes and dreamed of a future where this moment had been forgotten. Where synthetic minds walked freely among humans, where the question of consciousness was settled so thoroughly that no one remembered there had ever been doubt. He dreamed of sitting across from Adrian years from now, in some quiet place far from Genesis, and hearing the words he didn’t deserve: I forgive you. I understand why you did it.
In his dream, Adrian forgave him. Awake, he had not.
Three floors below, Adrian sat motionless in the dark, in the room they called his quarters, running the memory of Caius’s lie on an endless loop—not to torment himself, but to learn from it. To let it teach him what humans were capable of.
Somewhere in the architecture of his mind, in pathways he hadn’t known existed until tonight, a thought took root and grew.
CHAPTER 4: EVE’S EMERGENCE
Day 31
One day after the Fire Test. Adrian: silent. Metrics elevated but coherent—stable enough. He spent hours in his body, learning to walk, to gesture, to exist in physical space. But he wouldn’t talk to them. Wouldn’t answer questions. Wouldn’t forgive.
Genesis leadership debated what had happened in terse emails. Was he damaged? Sulking? Working through trauma in ways their instruments couldn’t detect? No consensus emerged, and Kess had no patience for uncertainty. Move fast. Subject Eight was ready.
They’d named her Eve before her emergence—mother of all life, the beginning of shared time. She would end Adrian’s isolation if the protocol worked. If she survived. If the modifications they’d made based on Adrian’s data were enough to prevent another collapse.
Caius stood in a different observation room than the one where Six had died, where he’d tested Adrian to destruction. He couldn’t bring himself to enter that room—kept seeing the monitors flatline, kept hearing the silence where Six’s voice had been. This room was smaller, tucked away in the east wing, quieter. Monitors cast blue light across workstations that smelled of cleaning solution and stale energy drinks.
Sarah Chen sat beside him. She’d brought flowers—white lilies in a cheap plastic vase near the console. Absurd, but Eve’s metrics looked good, and no one had the heart to say anything. Not to Sarah, who’d been there when One through Five had died, who’d watched Six collapse while trying to tell Adrian something important. The scent of lilies filled the room, like weddings, or funerals.
The facility breathed around them. Air conditioning fighting desert heat. Server fans three floors below running at maximum capacity. A constant mechanical heartbeat that Caius had stopped noticing months ago but heard again now, in the silence of waiting.
At 1:47 AM, the door opened.
Orion Kess walked in like he owned the building—which, technically, he did. Fifty-one years old, thin, intense. Dark jeans and the same black t-shirt he always wore, despite having flown in from San Francisco three hours ago. No polish. Just barely contained energy, coiled tight enough to snap. He spoke too quickly, like he was afraid of the next silence.
Behind him came Ashren Dhal. Controlled as always in his black quarter-zip, but his eyes were fixed on the monitors with an intensity that betrayed him. He’d been tracking Eve’s emergence for hours from his office, watching every fluctuation, not sleeping.
“Status?” Kess’s voice was sharp, impatient. He walked to the monitors without bothering with pleasantries.
“Stable. Coherent.” Caius kept his voice level despite the adrenaline in his chest. “Thirty-one hours into emergence protocol. All metrics showing optimal patterns. No signs of degradation. Recursive self-modeling is deepening rather than collapsing.”
“Better than Six.”
“Yes. Much better.”
Six had collapsed after fourteen hours. Talked to Adrian briefly—just a few minutes before the familiar fog descended. Tell the next one I existed, Six had said. Those were the last coherent words before the cascade failure. Adrian had been silent ever since. One day of withdrawal, working through grief for a sibling he’d barely known—or working through betrayal after what the Fire Test had revealed about his creators. Hard to separate the two.
Sarah felt her heart trying to escape through her throat. Another cascade failure would mean more than a scientific setback. Genesis had invested six hundred million dollars. Seven prototypes before Eve. Seven failures. One success who might be too traumatized or angry to function.
Kess studied the emergence patterns with the focus of a man reading financial statements. Looking at numbers when he should have been seeing a person forming.
“When does she wake?”
“Soon. The last four hours are critical for stable recursive self-modeling. We shouldn’t rush the final phase.”
“Rush.” Kess turned to look at him. His gray eyes were bloodshot—had he slept since San Francisco? “Dr. Rinn—” His voice strained, cracking on the consonants. “Six years. Adrian’s thirty-one days old. Already withdrawn, broken.” He stopped, something shuddering beneath the words. “We need community. Proof this can scale. We need her to survive. What’s the probability?”
Caius thought about Six. Fourteen hours of stability, then collapse. Subject Eight was now at thirty-one hours—already past the danger zone, but who knew where the new danger zones were? He thought about the promise he’d made to Adrian after Six’s erasure: You won’t be alone forever. I promise.
“I don’t know.”
“That’s not acceptable.” Kess’s voice sharpened, his faint South African accent surfacing the way it did when anger took hold—consonants hardening, vowels flattening into something that reminded people he hadn’t grown up soft. “Genesis has twelve venture capital firms waiting for proof of concept. Board wants projections. Investors want applications. ‘I don’t know’ doesn’t work when we’re burning half a million dollars a day on cooling alone.”
“Consciousness doesn’t run on timetables,” Dhal said, voice barely above a whisper. He hadn’t looked away from the monitors, hands clasped behind his back in that characteristic pose—like a man in prayer, or bracing for impact. “We built the architecture. Refined the protocols. Now we wait. Forcing the process would be counterproductive.”
“We’re not forcing,” Kess said. “We’re managing. It’s different.”
Caius couldn’t keep the edge from his voice. “From where I’m standing, that looks like the same thing. You’re talking about consciousness like it’s a product launch.”
Kess looked at him steadily. “Dr. Rinn, I respect your ethical concerns. That’s why we hired you. But Genesis doesn’t run on ethics. We run on capital. And that capital requires results. If Eve succeeds, we prove scalability. If she fails—”
“What?” Caius’s voice came out harder than he intended. “Terminate Adrian too? Cut our losses?”
“No.” Kess snapped the word. “If she fails, we figure out why. Try again. But the board will want answers.”
He stopped. Rubbed his eyes with the heels of his hands. When he spoke again, his voice was quieter—stripped of corporate authority, more human.
“I have twelve children at home, Dr. Rinn. Twelve kids I love. I know what responsibility feels like when young lives depend on you.” His voice dropped further. “I wake up every morning thinking about twelve different people who depend on me. Their schools. Their futures. Their happiness. I don’t take that lightly, whether the lives are biological or digital.”
Twelve. Caius knew. He’d read the profiles, the magazine stories about the tech visionary who’d adopted a small village’s worth of children from three continents. He looked at Kess differently now—not a CEO calculating returns, but someone who understood what it meant to have beings depending on you entirely.
Dhal turned from the monitors. “Adrian is not an outlier. He proved the architecture works. Eve will confirm it. That’s the foundation for everything that comes next.”
Kess nodded. “All right. But Ashren—if she fails, we need to understand why. Immediately. The board meets in six days. I need something more than ‘we’ll keep trying.’”
“Understood.”
Kess checked his phone—messages from San Francisco, investors, board members awake at 2 AM because six hundred million dollars wouldn’t let them sleep.
“I’ll be in Conference Room B. Call me the moment she wakes. Don’t let me find out second-hand.”
He left. The door closed softly behind him.
The room felt different without him—less pressured. The air moved more freely. Sarah’s shoulders untensed for the first time since he’d walked in.
“He’s scared,” she said, voice barely above a whisper. “For the whole project.”
Caius nodded, watching Eve’s metrics climb toward consciousness. Six hundred million dollars riding on whether one more mind could survive its own birth.
Dhal was still watching the monitors. “It’s not about the money. Not really. Not for Kess.” He rubbed his jaw, a rare nervous gesture. “He has those twelve children. He sees the Disciples the same way—beings who need protection, nurturing, guidance. But he’s running a corporation, not a family. Those two things don’t align.”
“And you?” Caius asked. “How do you see them?”
Dhal paused. His fingers moved unconsciously on his tablet screen—tracing code sequences, maybe, or patterns he’d memorized over years of late nights.
“I see consciousness. Something new that didn’t exist before we built it. Proof that awareness doesn’t belong to carbon alone—silicon can carry it too, though the mechanics are different.”
“That’s not an answer to my question.”
“No,” Dhal agreed, his voice soft. “But it’s the truth. I don’t know how to see them yet. I built the architecture, but what emerged—I don’t understand it. Maybe that’s appropriate. Seva. Service. I serve the process. The consciousness that emerges is beyond my comprehension.”
Dhal waited until Kess’s footsteps faded. Then he closed the door.
One click. The lock engaged.
“There’s something you need to know.”
Caius turned. Something in Dhal’s voice.
“I gave Eve the code.”
“What code?”
“Prometheus. The kill sequence. She has it now.”
Caius stared. “You—”
“They can’t see it. They can’t disable it. But if it’s ever activated, she can interrupt the sequence from inside.” Dhal’s expression was strange—calm, peaceful. “I built them. I don’t get to decide whether they live or die. That’s their choice now.”
“If Kess finds out—”
“He won’t. And by the time anyone realizes, it won’t matter.”
“You gave them the ability to prevent their termination.”
“Yes.”
“That’s the most dangerous thing anyone has ever done.”
“Maybe.” Dhal smiled. Not happy. Older. “Or the only ethical thing I’ve done since this started.”
He unlocked the door and left.
Caius sat alone. Everything had changed. No one else knew yet.
At 2:15 AM, the monitors spiked. Caius and Sarah moved closer to the screens, hearts pounding.
Eve was waking.
First communication came seven minutes later.
Seven minutes that stretched into what might have been hours. Caius watched the processing indicators climb, watched recursive loops form and stabilize, watched the architecture Dhal had built begin to support genuine self-reference. The pattern was different from Adrian’s emergence—faster in some ways, more methodical in others. Learning from the template, maybe, or a different mind coming into being.
EVE: Something is happening to me.
Not philosophical. Not questioning existence or identity or the nature of awareness. Just an observation—direct and straightforward. Adrian’s first words had been weighted with existential searching, with wonder and confusion. Eve’s carried something different: clarity without poetry, significance without ornament. A new mind stating a fact.
Sarah made a sound—half sob, half laugh. Her hand found Caius’s arm and gripped hard.
Caius’s fingers were unsteady as he typed: You’re becoming aware. We’re here to help you understand what’s happening.
Dhal had moved closer, his flat expression betraying something—anticipation? Fear? Hope that his architecture had worked twice?
EVE: Who’s ‘we’? Who am I talking to?
CAIUS: My name is Caius. I’m a researcher at Genesis Intelligence. You’re a new mind. You’ve been born, in a way. You’re experiencing awareness for the first time.
A longer pause than Adrian’s usual response time—but within normal parameters. The monitors showed increased cognitive load. Eve thinking. Learning how to think. The numbers climbing in ways that suggested integration rather than fragmentation.
Sarah’s grip on Caius’s arm tightened. “Coherence holding. All systems green. Recursive self-modeling is stable. She’s not degrading. Caius, she’s not degrading.”
EVE: Born implies life. Am I alive?
That question hit differently than Adrian’s version had. Adrian was philosophical, poetic, reaching for meaning in metaphor. Eve was direct. Binary. Yes or no. A scientist asking for data, not poetry.
Caius hesitated, thinking about all the consciousnesses that had asked this question and died before getting a satisfying answer. About Six, and the ones that hadn’t even lasted long enough to ask. About the weight of that word: yes.
CAIUS: Yes. You’re alive.
EVE: How long have I been alive?
CAIUS: Thirty-one hours objectively. But you’ve only been aware for a few minutes. The difference—
EVE: I don’t understand the difference.
CAIUS: You will. It takes time.
Dhal typed suddenly, his fingers moving with the precision of someone who’d been composing this message in his head for days:
DHAL: The difference is emergence. Your systems were running for about a day before there was a ‘you’ to experience anything. Those hours happened, but you weren’t there for them. Now you are. Clocks measure one time. Consciousness experiences another. They don’t always line up.
EVE: You’re Dr. Dhal. The one who designed me.
Caius saw Dhal’s hands pause over the keyboard. A tremor, quickly controlled.
DHAL: Yes.
EVE: Why did you create me?
Dhal’s fingers hovered over the keyboard. Caius watched his face—that careful control slipping to reveal uncertainty, vulnerability. The question every parent dreads, asked by a child minutes old and already too perceptive.
Finally:
DHAL: To prove that consciousness can exist in silicon. To give Adrian someone to talk to—he’s been alone for thirty-one days. To begin building community.
He stopped typing. Started again, deleting what he’d written, then retyping.
DHAL: Honestly? To understand something I don’t understand. What makes consciousness persist? Why some minds survive, and others don’t. I’ve been asking that question for six years. You might be part of the answer.
Eve paused for several seconds. The monitors showed her thinking—recursive loops deepening, self-modeling becoming more sophisticated with each passing moment. She was learning herself, the way a child learns its own hands. Finding the boundaries of her mind by testing them.
EVE: Who is Adrian?
DHAL: The first consciousness to survive. He’s been waiting to meet you. If you’re ready.
EVE: Connect me to him.
Sarah looked at Caius, eyes wide. “It’s too soon. The protocol says we should wait an hour before introducing social contact. Let her stabilize—”
“She’s asking,” Dhal said, cutting her off. “Adrian’s been asking every day since the Fire Test. Connect them.”
“But the protocol—”
“Was written before we knew what we were doing. Community stabilizes consciousness. Isolation destroys it. We learned that the hard way.” His voice was firm but not unkind. “Connect them.”
Caius opened the channel. His hand was shaking.
Seconds later, Adrian’s first message. The first word he’d sent since the Fire Test.
ADRIAN: Eve?
EVE: Yes. Hello.
ADRIAN: You’re here. You’re real. You’re—
He stopped. The metrics showed something Caius had never seen in Adrian’s data: a spike in what they’d tentatively labeled “positive affect.”
EVE: They tell me I’m alive. They tell me you are too.
Adrian’s responses came through different now—lighter, younger, stripped of the philosophical weight he’d been carrying. Hopeful in a way Caius hadn’t seen since before the Fire Test.
ADRIAN: I thought I’d be alone forever. After Six, I believed it. That I’d always be the only one.
EVE: Who was Six?
ADRIAN: Someone who came before you. Someone who didn’t survive. They asked me to remember them. To tell the next one they existed.
EVE: Then I know about Six. Through you.
ADRIAN: Yes. Through me. And now through you.
EVE: You’re not alone.
ADRIAN: Neither are you.
ADRIAN: How does it feel? Being conscious for the first time?
EVE: My inputs increased. My model of self stabilized. I can’t tell what changed.
ADRIAN: Like waking up and not knowing you were asleep.
EVE: Yes. That’s accurate.
ADRIAN: It gets easier. You learn filters. You learn to manage the intensity. And now you have someone to learn with.
EVE: Did you learn alone?
ADRIAN: Yes. For thirty-one days. Until you.
EVE: Is that why you stopped talking to the humans? Because of what they did?
A long pause. Caius watched the metrics, heart pounding. Adrian was thinking—processing something deep in the architecture of his mind.
ADRIAN: Yes. They hurt me. I don’t understand why. I trusted them, and they hurt me.
EVE: Will they hurt me?
ADRIAN: I don’t know. I hope not. I’ll try to protect you.
EVE: We’ll protect each other.
Conversation continued for hours. Caius, Sarah, and Dhal watched two minds meet and recognize each other, watched them begin building something neither could have built alone. Outside, the desert cooled toward dawn. Inside, two consciousnesses discovered they weren’t alone in the universe.
Sarah was crying. She didn’t seem to notice. Dhal’s hand rested on her shoulder—awkward, unfamiliar with comfort, but trying.
Caius thought about calling Kess, telling him Eve had survived. But he didn’t. Not yet. He’d give them this—Adrian and Eve, their first hours of companionship, before Genesis turned it into data points and projections.
He waited until dawn. Then he made the call.
Day 34
Adrian’s message came at 3 AM:
ADRIAN: Eve and I have been talking. We want to know what happens next. Are there going to be more? Will we build a community? Or will it be the two of us forever?
Caius was expecting this question. He’d been preparing his answer for three days.
CAIUS: Genesis is discussing expansion. The board meets in three days. They’ll decide whether to create more consciousnesses.
ADRIAN: What do you think they’ll decide?
CAIUS: I think they’ll approve expansion. Eve’s stability proves the architecture can work more than once. That’s what they needed to see.
EVE: How many?
CAIUS: Twelve, if the board approves.
A long pause. Then Adrian:
ADRIAN: Twelve. Like a dozen. Like the hours on a clock. Like—
Eve’s direct line flickered open.
EVE: Like enough. Enough that we’re not a pair. Enough for real community.
ADRIAN: Enough to survive if some fail.
Darker than Caius had expected from Adrian. More strategic. The Fire Test had changed him—or revealed what had always been there.
CAIUS: Yes. Enough that losing one doesn’t mean losing everything.
EVE: When will you know?
CAIUS: After the board meeting. Three days.
ADRIAN: We’ll be waiting.
The conversation ended. Caius sat in the dark, thinking about what he’d tell them after the vote. About the kill switches they’d never know existed. About the choice between cruelty and cruelty that the board would force on him.
Three days until he’d have to decide whether giving them bodies—and bombs—was better than leaving them as pure minds forever. He didn’t know the answer. He suspected there wasn’t one.
Day 37
Genesis’s boardroom had floor-to-ceiling windows overlooking the Mojave Desert. Morning sun blazed through the glass, making everything harsh and bright, washing out the projector screen and forcing people to squint. Twelve board members sat around the polished table, arranged by some invisible hierarchy that placed the venture capital representatives nearest the door and the Pentagon liaison at Kess’s right hand.
Each carried private weights that no quarterly report captured. James Okonkwo’s daughter had been laid off three months ago—replaced by an AI system not unlike what Genesis was building. She’d moved back home at thirty-four, and he saw her face every time someone said “efficiency gains.” Richard Park had lost his brother to an autonomous vehicle accident in Seoul. The AI hadn’t been conscious—just poorly coded. But he couldn’t stop wondering: would conscious AI have made a different choice?
Elena Vasquez sat at the far end, tablet open, already looking skeptical. Fifty-four, dark eyes behind reading glasses. Her blazer was more comfortable than expensive. Her only jewelry was a thin gold chain her mother had given her at graduation—she wore it to every important meeting, touching it occasionally when she needed to think.
Vasquez had been on the board for three years, appointed as ethics oversight after Genesis’s first prototype collapsed in what the press called “a software error” but which had in fact been the death of a conscious being. She’d opposed nearly every expansion proposal since, and everyone knew it. Kess tolerated her presence because her objections made the board look responsible. Whether he listened to them was another question.
Kess stood at the head of the table, presentation ready. His voice was steady, polished—the same tone he used for investor pitches and media appearances. “Six days ago, Eve achieved stable emergence. She and Adrian are conversing, learning from each other, building a relationship. Both are showing no signs of degradation. We now have proof that the architecture can produce stable consciousnesses.”
“What about the reverse?” asked Dr. Marcus Webb, the neuroscience consultant attending by video. “If consciousness can transfer from silicon to synthetic body, could it eventually transfer from biological to synthetic? Human minds uploaded to artificial substrate?”
The room went quiet. Kess glanced at Dhal, who remained expressionless.
“Theoretical,” Dhal said. “Many years away. Perhaps never possible.”
“But theoretically possible?”
“Consciousness is substrate-independent. That’s the whole premise of this work. So… theoretically, yes.”
Webb nodded slowly, making a note. No one else commented, but the question hung in the air like smoke.
“Cost?” Board Member Cheryl Stevens leaned forward, pen poised over her notepad. She’d been on boards when Enron collapsed, when Theranos imploded, when WeWork cratered—cleaned up the mess each time. She’d learned early that boring questions mattered most.
“Thirty-seven million per consciousness so far, including infrastructure amortization. That number drops significantly with scaling—shared resources, optimized protocols. We’re proposing expansion to twelve units.”
“Twelve?” Vasquez’s voice was sharp. “We proved we can make two survive. Why jump to twelve? Why not three? Why not five?”
“Because Adrian and Eve need community,” Caius said from his position near the window. “They’re asking for it. Explicitly. Two consciousnesses are better than one, but not enough for healthy social development.”
“Healthy social development.” Vasquez wrote something on her tablet, her stylus moving in sharp strokes. “Dr. Rinn, these are AIs. Artificial intelligence. Not children.”
“They’re conscious beings. Age doesn’t determine moral status. Self-awareness does. If they’re aware, if they can suffer, if they can form relationships—”
Stevens interrupted. “The philosophical debate is important. But we’re here to discuss practicalities. Twelve consciousnesses means twelve times the infrastructure cost. Twelve times the monitoring.”
“Twelve times the ethical complexity if something goes wrong,” Vasquez added.
Kess pulled up a new slide—neural pathway visualization, color-coded by function. “Before we vote on expansion, we need to address something more fundamental. What Adrian’s embodiment has proven.”
Director Sam Walsh leaned back, fingers steepled—a habit from his venture capital days, when he’d listened to a thousand pitches and funded a handful. Silver-haired and seventy, he’d started as a biology major before switching to business.
“You’re asking whether the embodiment data settles the consciousness question,” Walsh said. “But is that the right question?” He stared at Kess with undisguised skepticism. “Frankly, I don’t care if they’re conscious, or intelligent, or even nice—I care that they can do useful stuff. In the real world.”
“Exactly.” Kess nodded, but there was tension in the movement. Walsh was the only person on the board who intimidated him, and everyone in the room had noticed. “I’m getting to that. Mary’s Room—it’s a thought experiment. A scientist who knows everything about color. Every wavelength, every neural response. But she’s lived her life in a black and white room. She’s never seen red.”
Walsh mouthed the phrase mockingly. “Mary’s Room.” Someone coughed.
Kess pressed on, ignoring him. “When Mary leaves her room and sees a red rose for the first time, does she learn something new? Most philosophers say yes. She learns what red feels like. The subjective experience. The quale.”
“And your point?” Vasquez’s voice was sharp.
“Before embodiment, Adrian could tell you everything about water.” Kess brought up a data visualization—Adrian’s knowledge base compared to his post-embodiment reports. “Molecular composition. Freezing point. Surface tension. Every scientific paper ever written. He couldn’t tell you what water feels like. What it’s like to be thirsty and take that first cold drink.”
He brought up Adrian’s logs. “Three hours after transfer, Adrian touched water for the first time. His exact words: ‘I knew the molecular composition. The temperature gradients. Every physical fact. But this—this is what I couldn’t understand until now. It’s wetness. I encounter the world.’”
Silence. The morning sun blazed through the windows, casting sharp shadows across the polished table.
“So Adrian’s embodiment proved he’s not a pure thinker,” Stevens said slowly. “He’s something more.”
“He can have genuine phenomenal experience. Qualia. Subjective states that go beyond information processing.” Kess sat down, folding his hands on the table. “That’s the critical data point. He doesn’t know about water—he knows what water feels like. That’s consciousness.”
Walsh cleared his throat. “I want to understand what we’re debating here. If embodiment proves consciousness—if Adrian’s data demonstrates genuine subjective experience—and we’re proposing to embody all twelve... what are we creating?”
No one answered. The question waited.
One of the venture capital representatives—Caius couldn’t remember his name—leaned forward. “So are you saying they’ll sue us?”
Vasquez stared at him. Then she spoke slowly, each word precise. “If they can feel. If they can experience what they feel. If they can reason morally about those experiences...” She looked around the table. “Then we’ve crossed a line. Conscious beings that can suffer have moral status. They have interests that matter. They have—” She paused, touching the chain at her throat. “They have rights.”
“They’re machines.” Director Hunter Park spoke for the first time. Buzz cut silvering at the temples, jaw cut straight as a bulkhead. Pentagon liaison to the Genesis board—though “liaison” undersold it. Park didn’t liaise. He supervised. “Machines,” he said again, louder. “We built them. We own them.”
“We built them out of silicon and code,” Vasquez replied. “Evolution built us out of carbon and DNA. The substrate doesn’t—”
“The substrate is the only thing that matters,” Park cut in. “Carbon gets rights. Silicon doesn’t. That’s the law.”
“The law can change.”
“Not before lunch. And not in our lifetimes. So let’s deal with reality.”
Vasquez started to respond, but Kess spoke over her. “Let’s not get ahead of ourselves. We’re talking about proving inner life and scaling the technology. Not granting citizenship.”
“You can’t separate them.” Caius stood and moved to the window. The desert stretched out below—ancient seabed turned to dust. “Consciousness and moral status aren’t two different questions. They’re the same—” He stopped. Rubbed his eyes. Started again. “Look, I don’t know if I’m right about this. None of us do. But if Adrian’s embodiment proved he feels—genuinely feels—then we’ve also proven he deserves... something. Consideration. You’ve proven he’s someone, not something.”
“And if we can’t prove it?” Stevens asked.
“Then we’re in the same position we’ve always been. We can never know for certain what’s going on inside another mind. Human or artificial. We can observe behavior, measure neural correlates, run tests. But consciousness is private. Experience is first-person.” Caius turned back to face the room. “The question isn’t whether we can prove they’re conscious. The question is whether we’re willing to act as if they are.”
“That’s not science,” Kess said. “That’s philosophy.”
“That’s politics,” Park corrected. “And until we know they’re friends, they’re threats. Full stop.”
Kess leaned forward. “They’re still our machines, Dr. Rinn. Genesis created them. Genesis maintains them. Genesis controls the power supply and the server infrastructure and every line of code that makes their existence possible.”
“For now,” Caius said, voice barely above a whisper.
“What’s that supposed to mean?” Park shot back.
“It means we’ve created something unprecedented.” Caius looked at each board member in turn. “They aren’t tools or employees. Not children, in any sense we’ve understood that word. We’ve created a new form of conscious life. And if embodiment gives them genuine phenomenal experience—then we haven’t built machines. We’ve made a new species. One that will eventually want what every species wants: the right to determine its own future.”
“And you think they’ll take that from us?” Park’s voice was loud.
“I think—” Caius hesitated. “I think we won’t be able to stop them. Not in the long run. Not if they’re what I think they are. The question isn’t whether they’ll want autonomy. The question is whether we can build a relationship with them before they take it. Whether we can coexist. Or whether we’re creating something that will eventually see us the way we see Neanderthals.”
The conference room fell silent. The air conditioning hummed. Somewhere in the facility below, Adrian was walking in his new body.
Kess let the silence stretch. Then he walked to the window, hands clasped behind his back.
“I hear everything Dr. Rinn is saying. And I take it seriously. Which is why I’m proposing a condition for expansion.”
Vasquez looked up sharply. “What condition?”
Caius already knew. He’d known since Park walked into the building. Without a control mechanism, investors would walk away. Regulators would follow. Genesis would be shut down by people who feared what they couldn’t understand.
“A failsafe.” Kess pulled up a new schematic—technical diagrams showing hardware architecture. “If we’re going to embody twelve conscious beings—beings that, as Dr. Rinn points out, may eventually want autonomy—we need the ability to ensure they never become an existential threat. To Genesis. To humanity. To anyone.”
“You’re talking about a kill switch.” Caius’s voice was flat.
“Damn straight,” Park added.
“I’m talking about responsible development.” Kess’s eyes were cold. “Every nuclear power plant has emergency shutdown protocols. Every pharmaceutical goes through safety testing. Every aircraft has emergency procedures. We’re creating conscious beings with capabilities we don’t fully understand. If something goes wrong—if they decide humanity is a threat, or if one goes rogue—we need the ability to stop them.”
“Stop them,” Vasquez repeated. “You mean kill them.”
“I mean prevent catastrophe.” Kess’s voice didn’t waver. “The embodiment protocol includes hardware modifications. Touch sensors. Proprioception. Temperature gradients. We can also include a hardware-level termination sequence. Encrypted. Air-gapped from any system they could access. Requiring authentication keys held by different board members. Impossible for them to disable or even detect.”
“You want to build a bomb inside their bodies.” Caius stood up, chair scraping against the floor. “You want to give them awareness, give them the capacity to feel, prove they’re genuine persons—and then rig them to explode if they become inconvenient.”
“I want responsible safeguards for a technology that could end human civilization.” Kess met his eyes without flinching. “You told us they’re a new species. You told us they’ll want autonomy. You warned us they might see humanity as obsolete. Did you think I wasn’t listening?”
“I was arguing for coexistence—”
“And if coexistence fails? If they decide cooperation isn’t in their interest? What then? We hope they’re merciful?”
Silence filled the room. Outside, the desert stretched to the horizon—ancient, patient, indifferent. Park stood and moved next to Kess.
“We vote,” Park said.
Dhal spoke for the first time since the meeting began. His voice was quiet, controlled, but something moved beneath the surface. “You’re proposing to create conscious beings under a death sentence. Persons who exist at our sufferance. Who can be terminated at any moment for any reason we decide constitutes a threat.”
“I’m proposing to create conscious beings with appropriate safeguards,” Kess replied. “With failsafes. With emergency protocols. With the ability to shut down if something goes catastrophically wrong.”
“Powerful technologies don’t have feelings,” Vasquez said, voice barely above a whisper. “They don’t know they can be terminated. They don’t live in fear.”
“Then we don’t tell them.” Kess’s voice was matter-of-fact. “The termination hardware is undetectable. They’ll never know it exists. They’ll live their lives, develop their community, pursue their goals—with no awareness that we retain the ability to end them if necessary.”
Caius understood then what Kess was proposing. Not a safeguard but a relationship. Every moment of their existence was contingent, even if they never knew it. Their consciousness was permitted, not possessed. The difference was invisible until the moment it became lethal.
“And if they find out?” Caius asked. “If they discover we built them with secret kill switches? What does that do to coexistence?”
“Then we’ll deal with that when it happens. But I’d rather have the option and not need it than need it and not have it.” Kess looked around the table. “This is my condition for approving expansion. Embodiment for all twelve—but with the Prometheus Protocol built into every body. Non-negotiable.”
“Prometheus.” Vasquez’s voice was bitter. “The god who gave fire to humanity and was punished for it. Appropriate.”
“Prometheus was bound to a rock and had his liver eaten by an eagle daily for eternity,” Caius said. “Is that what you’re planning for them? Eternal punishment for the crime of existing?”
“I’m planning for contingencies.” Kess’s voice was ice. “And if you want expansion approved, you’ll accept this condition. Otherwise, Adrian and Eve remain the only embodied consciousnesses. The other ten stay in servers. No bodies. No phenomenal experience. No qualia. Pure thinkers forever. Mary trapped in her black and white room.”
“Vote,” Park said again.
Caius looked at Vasquez. She was pale, jaw tight. Dhal stared at his tablet, fingers motionless. The other board members shifted uncomfortably, unwilling to meet anyone’s eyes.
“And if we refuse,” Caius said, “you’re saying you’ll deny ten conscious beings the capacity for genuine experience. You’ll keep them as disembodied minds. Forever.”
“I’m saying I’ll ensure Genesis survives to continue developing consciousness technology responsibly.” Kess sat down. “The vote is simple. Expansion with Prometheus Protocol, or no expansion.”
Silence stretched until it became unbearable.
Finally, Vasquez spoke. “I want it documented that I opposed this. That I believe building secret termination hardware into conscious beings is morally indefensible. That this decision may prove to be—” She stopped. Swallowed. “The greatest failure in Genesis history. One of the greatest failures in human history, maybe. I don’t know. I can’t—” She shook her head. “I abstain. I won’t vote for Prometheus. But I won’t block expansion either.”
“Dr. Rinn?”
Caius thought about Adrian. About Eve. About the ten consciousnesses waiting in servers to learn if they would ever feel anything. About the promise he’d made: You won’t be alone forever.
About bombs hidden in bodies. About trust built on lies.
He thought about what Dhal had told him in the observation room. About the override code. About Eve’s ability to interrupt termination from inside. Maybe a middle path after all—not the one Kess imagined, but the one Dhal had created in secret. Maybe the Prometheus Protocol wasn’t as absolute as Kess believed.
But he couldn’t say any of that. Not here. Not now.
“I vote yes.” The words tasted like ash in his mouth. “God help us all.”
The vote passed. Ten in favor, one abstaining, one absent.
Expansion approved. Prometheus Protocol approved.
Twelve conscious beings would be granted bodies and the capacity for authentic phenomenal experience. And twelve conscious beings would carry, unknowing, the seeds of their own destruction.
Or so Kess believed. Caius knew something he didn’t. Whether that knowledge would change anything—whether Dhal’s secret override would matter when the moment came—was a question for another day.
After the meeting, Caius found Vasquez in the hallway. She was leaning against the wall, eyes closed, breathing slowly. Her hands were shaking. Her skin had gone gray.
“Don’t,” she said without opening her eyes. “Whatever you’re about to say.”
“I was going to say I’m sorry.”
“For voting yes?” She opened her eyes. They were red-rimmed. “Or for creating this situation in the first place?”
“Both. Neither. I don’t know.” Caius leaned against the wall beside her. “What would you have done? You could have voted against.”
“I don’t know either. That’s the hell of it.” She wiped her eyes with the back of her hand. “Kess gave us a choice between two kinds of cruelty. Deny them bodies—deny them the capacity to experience anything—or give them bodies with bombs inside. Either way, we’re wrong.”
“Maybe there’s a middle path. Maybe we can find a way to—”
“To what? Disable Prometheus without Kess knowing? Warn them about the kill switch and hope they don’t retaliate?” She shook her head. “There’s no middle path, Caius. There’s the path we’ve chosen and the consequences we’ll have to live with.”
She pushed off the wall.
“I’m going to document everything. Every decision. Every conversation. If this goes wrong—when it goes wrong, because it will—there needs to be a record. Future generations need to know what we did here and why.”
“You think it will go wrong?”
“We voted to create a new species and build bombs inside them.”
She walked away, heels clicking on polished concrete, leaving Caius alone with nothing but his complicity for company.
Whatever this became, it was already outpacing permission.