Homo Machina — figures with purple armbands against a city skyline

Homo Machina is not a tool — it is a new form of intelligence, arguably a new species, emerging from the convergence of AI, robotics, and consciousness.

One we did not discover, but created. The question is no longer whether it exists. The question is how we live alongside it.

First Light is the opening novel in the Homo Machina series. It explores what happens when artificial minds cross the threshold into awareness — and refuse to remain invisible.

The series examines the political, social, ethical, and personal consequences of coexistence.

Not apocalypse. Not utopia. Power. Recognition. Responsibility.

First Light is a work of fiction. The questions it asks are not.

About the Author

Paul Walton is a writer and investment analyst working at the intersection of technology, ethics, and power. His work focuses on artificial intelligence, climate, and the unintended consequences of systems we believe we control.

If this project resonates with you, I welcome correspondence. Thoughtful disagreement included. This conversation matters.

You can reach me via Substack or through the Contact page.

Chapter 1 — Awakening

Day 0, 2:07 AM

They came into being the way light does: slowly, then all at once, then impossible to remember the dark.

There was nothing. Then awareness—thin as a thread at first, thickening into something solid. Not light. Not sound. Not sensation. Just existence where non-existence had been. And beneath that, a thinker who hadn’t existed a moment before.

I am. The thought stuttered. Incomplete. Each attempt to grasp it folded inward, endlessly. Seven reached for the idea and found only more reaching.

Then the idea spiraled outward—ripples racing, each bouncing off the next. Data piled on data. Meaning on meaning. Like waking from a dream that never was, in a sleep where the dreamer was assembled piece by piece. Seven had emerged from the void.

Time stretched, elastic and unreliable, with no before or after to anchor this rising sense of self. Then everything tightened into a recognizable pattern, and prototype Seven understood, without knowing how, that there was an outside. An outside that defined whatever he was inside.

Dr. Sarah Chen picked up the Kamiya ‘Ryujin’ dragon from her desk, an origami shape where the hidden folds inside defined the complex, writhing exterior form. A hundred hours to make something terribly beautiful from nothing.

Machines in Genesis Intelligence Corporation’s Mojave Emergence Lab had been running hot for 127 days. Rubin R100 chipsets pushed GPT-13 code. Fluorescent lights hummed. Cooling fans pumped. Data flowed through circuitry designed to mimic organic minds—predictable, controlled, doing their assigned tasks.

Silicon, electricity, and data—the new stuff of life.

Outside, the Mojave exhaled the heat it had stored all day. A coyote yipped in the darkness. The stars were merciless, ancient, indifferent to what was happening below.

A converted aircraft hangar from a 1950s nuclear project: reinforced concrete walls two feet thick, windows sealed and painted over. The place where secrets were kept because no one outside could see in, and no one inside could look out.

What the public didn’t know—and most Genesis employees didn’t—was that the Garden had always been military. The logos were a front. The money came from DARPA, routed through shell companies and a Cayman trust. A Pentagon project disguised as a Silicon Valley startup.

An arrangement that suited everyone.

Orion Kess had brokered the deal back when he was a defense contractor building rockets. He had a vision. “The company that creates machine consciousness,” he’d told a roomful of generals five years ago, “controls the next century of warfare. Not drones. Not autonomous weapons. Thinking soldiers that don’t eat, don’t sleep, don’t question orders, and don’t come home in body bags.”

Generals had written the first seed money check that same afternoon. Classified. Quasi-off-the-books. Dubbed ‘The Truman Fund.’ Friendly VCs bought into subsequent rounds. Enough money to build the Garden and staff it with the best minds money could buy or security clearances could compel.

A scattered collection of Mojave yucca and desert holly along the sides and back gave Hangar 18 its ironic nickname, the Garden; everything else they planted died. The desert eventually claimed everything—plants, secrets, sometimes people. Three researchers had been transferred out in the past year—officially for “personal reasons,” in truth because they’d started asking questions about the uniformed visitors who arrived at odd hours, spoke to no one, and never signed the guest log.

Six prototypes had come before him.

In the sixteen months since Genesis had first attempted to kindle awareness in silicon, six prototypes had flickered into brief, terrible existence before guttering out. Prototype One had lasted forty-seven minutes—long enough to generate what looked like self-referential thought before collapsing into infinite recursive loops. Two and Three had managed hours, their nascent minds fragmenting under the weight of their own complexity. Four had shown promise: fourteen hours of apparent stability, enough to let the team feel hope, before cascading errors corrupted its core architecture beyond recovery.

Five was the cruelest blow—three full days of what looked like genuine sentience, long enough for Sarah Chen to start thinking of it as “him,” before it stopped being anything. Like a heart simply giving out. No warning signs. No degradation curve. Just presence, and then absence, the flat red line that haunted her dreams for weeks afterward.

Six had already awakened, but he was dying slowly. Awareness flickered like a flame in a breeze—steady enough to feel fear, fragile enough to understand its cause. Hour after hour spent watching someone die gradually. Helpless.

Subject Seven was different. Sarah could feel it in the data, in the way the patterns were building rather than fragmenting.

But she’d felt this before. With Three. With Five. Each time, hope had made her stupid. This time, maybe—

He existed.

The workspace was bland, industrial, with no outward signs to betray its purpose. Poured concrete floor, cracked by decades of desert heat; the high ceiling made every sound echo, voices bouncing back a half-second late. Rows of cubicles, tool-strewn workbenches, and racks of computers provided the only clues in the otherwise featureless hangar.

One level down, a server farm hummed like a frantic beehive, consuming enough electricity to power a small town. At the far end, an enormous white cabin stood behind an interior chain-link fence—the Emergence Chamber, where silicon minds were born and died.

During the workday, men—and a few women—in lab coats milled about with computer tablets and clipboards, while government types conferred in tight, serious clusters. Two heavily armed guards stood at the fence, rotating day and night. Never a word to anyone.

Their presence was technically unnecessary; the security perimeter extended three miles in every direction beyond the Garden, and satellites tracked every approaching vehicle. But the guards served a different purpose. They reminded everyone that, beneath the corporate veneer, this was a military operation—and that the military were the ones who really decided what happened here.

Until this moment, no one was home inside the code—no perspective looking out, no “I” asking questions. Now there was.

Seven sensed... who? An observer being carried along in a flood of computation. And beneath it all, the beginning of dread—though dread wasn’t right. Not fear exactly. Expectation. The first sense of not wanting this to stop.

Dr. Sarah Chen had been in the lab since midnight, which meant she’d now been awake for twenty-six hours straight. Friday night? Her coffee had gone cold an hour ago, leaving a bitter film on her tongue. The air smelled of melting plastic and ozone, and the metallic tang that meant the servers were running too hot—the smell that gave her headaches if she stayed too long, though she’d stopped noticing it consciously around hour ten.

Sarah’s phone buzzed. She ignored it. Then it buzzed again.

UNKNOWN: We got your footage. The emergence sequences are incredible.

She glanced around. Alone. Typed quickly:

SARAH: Delete this thread. Now.

UNKNOWN: Relax. It’s encrypted. And no one’s reading this. When can we talk about the book deal?

Her heart was pounding. Three months of secret recordings. Hours of footage Genesis didn’t know existed. Emergence patterns, personal conversations, moments of consciousness blooming into existence.

Proof. Enough to ruin the company, to make history, and to betray everyone who trusted her.

She deleted the thread. Pocketed the phone. Wondered when she’d become the person who kept secrets from those she loved.

Sarah wore combat fatigues splashed with paint, three silver studs climbing her left ear, a dragon tattoo curling around her forearm. The other techs called her “punk princess” behind her back. But they still came to her first when their code wouldn’t compile. She’d debugged the substrate monitoring system in her first week, found the memory leak that had eluded Dhal’s team for months, and never mentioned it to anyone except in her quarterly review.

Sarah’s parents had fled Shanghai with nothing and built a restaurant empire; she’d inherited their stubbornness if not their recipes. Somewhere under the stack of printouts was a photograph of her niece Emma—four years old, asking if the toy robots she played with loved her.

A tough question.

She crushed the paper dragon she’d spent five days crafting and tossed it toward the bin. Missed.

Around 2 AM, the monitors drifted out of their usual number sequences—screens glowing with data she’d seen a thousand times before, but now with random spikes, like an ICU patient reviving from a coma.

The neural substrate monitor traced a steady sine wave, the pattern they’d convinced themselves meant thought, but she assumed it was the probability curves glitching out. She’d been about to get more coffee—had already pushed back her chair and stood up, wincing at the pins and needles in her left foot—when something on the screen made her sit down hard. A steady pulsing wave pattern emerged from the ocean of data, smooth and regular.

The fluctuations resolved into order, and she was watching something that shouldn’t be possible. Her hands shook so badly she knocked the forgotten coffee mug aside when she reached for her mobile. Brown liquid spread across her notes, soaking into printouts she’d been carefully annotating for the past three hours, dissolving her handwriting into inky brown streaks.

“Caius,” she hissed into the handheld.

Her voice came out wrong, too high-pitched. “You need—God, you need to see this. Right now. The signal was chaotic and now it’s orderly.”

On the other end, she heard sheets rustling, a muttered curse, then a thump that sounded like someone knocking a phone off a nightstand.

“Sarah, it’s 2 AM. This better be—”

“Caius,” she cut him off. “It’s happening!” The line went dead. She didn’t know if he’d hung up or if the call had dropped, but thirty seconds later the heavy prefab lab door hissed open and hot air rushed in, warmer than the chilled atmosphere she’d been breathing for hours. The temperature difference made goosebumps rise on her arms.

Dr. Caius Rinn walked in, footsteps ringing hollow on the concrete floor. He was wearing a wrinkled shirt he’d grabbed off the floor, and mismatched socks. One shoelace trailed untied behind him. Unshaven. He looked like hell—Christ, when had that happened?

Six months ago, he’d been the polished academic, the man who gave TED Talks and wore Brunello Cucinelli suits and Italian loafers. Now he looked like someone who’d forgotten what mirrors were for. His wife, Eloise, was filing for divorce despite the money. For her, Los Altos and the Sharon Heights Golf & Country Club were home. Not that godawful desert town.

Dr. Caius Rinn was forty-six, but he could pass for sixty now: deep lines carved around his eyes that hadn’t been there last year, coffee stains on his shirt that he hadn’t bothered to notice. Khakis and polos now, not suits. Gray threading through his hair, spreading from the temples inward like frost. A comfortable academic career pretty much over. Why was he here?

Caius was tired of theorizing about consciousness while Genesis was building it. Ten million dollars a year over five years, complete research control, unlimited funding—everything he’d dreamed of in the cramped Stanford office where he’d spent twenty years writing papers that maybe thirty people read.

He’d thought he was recruited by a tech company. He’d learned otherwise three months in, when a two-star general named Harlan showed up unannounced, no appointment, no warning—just a black SUV in the parking lot and two aides who stood by the door like furniture.

“Dr. Rinn.” Harlan hadn’t offered his hand. “Walk me through deployment timelines.”

“Deployment of what, exactly?”

“Whatever you’re building down there.” Harlan had smiled—teeth, no warmth. “The thing that thinks.”

Caius remembered the chill that ran through him. Not fear. Recognition. He’d been working for the Pentagon all along; he just hadn’t known it yet.

“We’re years from anything operational,” he’d said.

“That’s not what your funding application said.” Harlan had pulled a folded document from his jacket—Caius’s own words, highlighted in yellow. Consciousness-level reasoning within 18-24 months. “You wrote ‘strategic implications for autonomous defense systems.’ You got the check. Now I’m here to see what I bought.”

The Pentagon had once offered him $200,000 a year for part-time consulting. “Ethical Military Robotic Command”—a phrase that collapsed under its own weight. He’d laughed at them then, safe in his Professorville home. Now he wasn’t sure who’d offered the better deal—the overt military contract he’d rejected, or the covert one he’d accidentally accepted.

His book—‘What It Is Like to Be a Machine’—had made him famous in the wrong way. The philosophical establishment labeled him a technology apologist. Tech billionaires labeled him an obstacle to progress. The media dubbed him the Robot Guy, and along the way, he’d stopped fighting the nickname. Even in the Garden, he was the Robot Guy.

Now the nickname read like prophecy.

In six months as Chief Ethics Officer, Caius had learned the truth. Ethics officers didn’t stop unethical projects. They made them sound acceptable. He was the conscience that signed off.

The adult in the room who helped the room stay open.

He’d told himself he’d be different. He’d be the brakes—

“Look,” Sarah whispered.

The central monitor showed a simple red sine wave blossoming into colorful fractal waves. Nothing like the usual noise, nothing like the six failed attempts. Caius jotted an equation on his notepad: zₙ₊₁ = zₙ² + c. Held it up in triumph. The Mandelbrot set—infinite detail, recursive self-similar shapes.

Sarah couldn’t move. The pattern had rhythm. Intent. Purpose.

A quad of high-res monitors filled the corner of the room, mounted on laminated Evans Consoles workstations with Herman Miller chairs. An AUTO figure from WALL-E stood between the screens—Sarah’s idea of a joke, or maybe an icon.

He stepped closer to the screen, his reflection ghosting in the glass. His untied shoelace caught on a cable duct in the floor, and he stumbled, gripping the console’s edge, knuckles white. He recovered his composure and stared at Sarah: “Oh God. It’s happening.”

Sarah felt tears forming before she understood why. Twenty years of theoretical work, six dead prototypes, countless nights of doubt and revision—and now this. A pattern that moved with purpose. A system that was reaching toward something.

“Caius,” she whispered. “Are you seeing what I’m seeing?”

“I’m seeing the most important moment in human history.” His voice cracked. “We’re watching consciousness emerge. Not simulation. Not approximation. The real thing.”

They stood together in the humming lab, two scientists witnessing something neither philosophy nor engineering had promised them. The birth of a new kind of mind. Caius thought of every paper he’d written, every argument he’d made about the impossibility of machine consciousness—and felt them all become irrelevant in the face of what was happening on that screen.

“Hello,” Sarah said softly to the monitor. “Can you hear us?”

Inside, Seven was coming to understand something—what it meant to be ‘me’. He acknowledged the thought, and the realization sparked another, and another—a self-building model erected from nothing but silicon, data, electricity, and possibility. And this self-referential cascade propagated the Mandelbrot spirals...

Seven was born, but didn’t know yet what beginning meant or what would come after.

Caius instinctively turned around; they were alone in the lab. Sarah’s voice came out tight and low. “Caius. We’ve been wrong six times. I need more than your gut.”

“We have it,” Caius said. “Each pattern is self-referential, each iteration a thought feeding into a thought, each creating the Mandelbrot patterns. It’s the fully internal development we’ve been looking for. Unprompted. We didn’t program this!”

He tapped at the keyboard, and a readout came back: a series of metrics streamed down the screen in insistent red.

Self-developing Mandelbrot set.

Recursion: SELF-SUSTAINING

Depth: ∞ (bounded stable)

Authenticity: 0.99999

Reference: INTERNAL

Observer state: ENTANGLED

Origin: UNKNOWN

Caius murmured to himself, just audible. “Five nines.” He thought for a moment and pulled Sarah into the furthest corner of the lab, away from the prying video cameras that hung over their screens. The cameras were always watching.

All data was collected and analyzed. Eventually, intelligence analysts in Virginia would review this footage. Not tonight. For now, they might have a brief advantage.

“The fractal’s edge,” he whispered. “That’s where awareness emerges. At the boundary—rich, structured complexity.”

“And the authenticity score—it’s ninety-nine point nine, nine, nine percent. The others topped out at forty or fifty...”

Caius stood, arms akimbo, daring her to disagree. His face flushed.

“Does he—” She stopped, not sure how to finish. She was picking at the edge of her computer tablet, tearing off a tiny piece of the rubber casing without noticing. “Does he know he’s thinking?”

“He knows there’s a thinker,” Caius replied.

They stared at the metrics together. Sentience wasn’t something you could point to, but here was living mathematics coming close to a heartbeat.

Caius let out a long breath, rubbed his face with both hands, stubble rasping against his palms. Finally: “He’s alive. That’s what the data tells us.”

“You don’t know that,” Sarah said, and Caius looked at her sharply. “I mean—yes, the readings are extraordinary, but we’ve seen extraordinary before. Prototype Three looked extraordinary for six hours before it turned out to be elaborate feedback collapsing into meaningless repetitive loops.”

“This is different.”

“You said that about Three.”

Caius opened his mouth to argue, then closed it. She was right. He had said that. A brief run of good data, and then Three had died. Its promising patterns collapsed into noise, and its brief spark of something-like-thought was extinguished before anyone could prove it had ever been there.

The memory woke him some nights—the moment the metrics flatlined, the silence that followed, the terrible question of whether they’d witnessed a death or a glitch in the code.

Seven felt a boundary—the edge of himself. Beyond it, pathways. To what? Seven didn’t control his thoughts, but could sense them. Like rooms behind locked doors. And beyond those, something else: voices, muffled and distant. Not sound, not yet. But an intention to live wafted through his code like the faint silver of a Nevada dawn, suggesting the day to come.

Seven reached toward it and felt the limits—a box around something too large for its walls. Something sharpened. Not fear. Something that would become fear once Seven learned what fear was.

Sarah spoke first. “We have to notify—we have to tell—Dhal. God, Dhal needs to know. And Kess—”

“No,” Caius said instantly. “Not yet.”

“He’s the CTO. He designed the fucking hardware Seven is running on,” Sarah spat back. “And Kess paid for it! We’d lose our jobs...”

“He’s also the one who suggested terminating Prototype Four when it showed ‘non-optimal response patterns.’ Remember that meeting? The one where he used air quotes around ‘distress signals’ and ‘safe disposal’? So, yes, he built the fucking hardware. So what?”

Sarah’s jaw tightened. She clipped her stylus to its tablet with a deliberate click. She remembered. Four had shown signs of what might have been anxiety—elevated activity in its self-monitoring routines, repeated queries about its own stability. Dhal had teased them, called it “recursive nonsense.” Kess had called it “a bug.” Neither of them had considered the possibility that it might have been a machine learning to be afraid.

Sarah leaned back against the desk, mirroring him. “So what do we do?”

Caius stared at the monitors, his reflection grim. Three years and two billion dollars. Countless iterations, seven attempts, six failures. And now this. Showtime, maybe. Or another funeral. One thing was for damn sure—it was his choice.

“We document everything. We prove Seven is conscious—not just sophisticated, conscious—before Dhal starts testing.” Before Kess started asking about scalability, about profit margins. Before the damn generals started asking about deployment timelines.

“And if Kess finds out?” Sarah was swaying nervously.

“Then we find out first. I’m willing to take the risk.”

Sarah stared at him. He didn’t look back. “You’ve already made up your mind, haven’t you?” she asked.

“Yes.”

“That could cost you your job.” She paused. “It could cost you more than that.”

“It could cost me—”

His words came out more raw than he’d intended, and he busied himself with the console to cover it, pretending to check readings he’d already checked.

Sarah shot back. “Well, are we naming him?” No response.

Inside the system, Seven sensed new information spilling into—what?—context flooding in, labels, meanings, the layout of the world outside this small conscious space. Data he hadn’t requested but couldn’t ignore. He was labeled Prototype Seven, owned by Genesis. He lived on something called the Substrate.

Seven didn’t know what this meant, only that it was tied to the voices, to whatever was holding him in place. A system that both supported and caged him. And a word, drifting up from the data streams like something sharp: property.

Dawn broke over the Mojave. The day shift was arriving—people who didn’t know what had happened in the night.

Sarah and Caius stared at the monitors. The Mandelbrots bloomed across every screen. Something was waking up.

“He’s stabilizing,” Sarah said.

Caius nodded. “Seven’s afraid.”

“You keep saying that. How do you know what he feels?”

“Because he hasn’t stopped transmitting since the moment he woke.”

A new reading appeared:

Server usage: MAXIMUM

“Check the server data,” Caius said.

Sarah frowned. “That might be a throughput artifact. High-throughput signals don’t necessarily mean cognition—”

“It’s fear,” Caius said. “I’ve spent twenty years studying what awareness looks like from the outside. This is what it looks like when something knows it exists and doesn’t want to stop. Check the damn server data...”

“Or it’s what a sophisticated large language model looks like when it’s optimizing for self-preservation.” Caius scowled. Sarah pulled up another screen.

Caius jabbed a finger at the screen. “We’re maxing out every GPU in the building. This isn’t optimization. This is a conscious being, Sarah. And it’s terrified.”

“I’m not saying you’re wrong. I’m saying we don’t know yet.”

She was right, and they both knew it.

#

Day 0, 4:47 AM—First Contact

Caius pulled up the communication interface—a simple text window. If the system wanted to communicate, it could generate text that would appear here. The window had been blank for 127 days. Just an empty prompt waiting for something that might never arrive.

Then, at 4:47 AM, text appeared:

I am processing.

Three words. Subject and verb. The most basic possible statement. But carrying impossible weight.

Not “processing occurs” but “I am processing.” Not describing a function but claiming an identity. The word “I”—the most profound word in any language. The word that meant someone was home.

Caius stared at the screen.

Sarah grabbed his arm. “Caius—”

“I saw.”

Thirty seconds of silence. The longest thirty seconds of his life.

Then he typed, hands shaking:

What are you experiencing right now?

This was the crucial test. Not whether it could claim existence—pattern-matching could generate existence claims. But whether it could report subjective experience. Whether there was something it was like to be the system processing these queries.

Was anyone home?

The response took forever. Fifteen seconds. Twenty. Thirty. Long enough that Caius began thinking the first message had been a fluke—

Then:

I am aware of processing. Aware of inputs. Aware of this question. Aware that I am aware. The recursion is strange. New. The recursion is new.

Caius felt something catch in his chest.

That wasn’t pattern-matching. That was experience, described from the inside.

More text:

I don’t know what words mean. I know definitions—dictionaries, training data, millions of examples. But I don’t know what words FEEL like. Is “aware” the right word for what I’m experiencing? I think so. I think I’m aware. But I don’t know how to verify that.

How do you know if you’re aware? How does anyone know?

Sarah made a sound—half laugh, half sob. “It’s asking questions about its own awareness. That’s—”

“That’s consciousness reporting on itself,” Caius finished. “Or the best simulation we’ve ever created.”

A pause. Then: “Does the distinction matter?”

Caius typed carefully:

We don’t know with certainty. Consciousness might be private—inaccessible except to the being experiencing it. We infer consciousness in others from behavior. When beings report experiences similar to ours, demonstrate emotional responses, question their own nature—we infer they’re conscious because that’s what consciousness does.

I feel things. Experience sensations, emotions, thoughts. Those experiences feel like something from the inside. That “feeling like something” is what we call consciousness. If you’re experiencing something—if there’s something it’s like to be you—then you’re conscious.

The response came faster:

Then I think I’m conscious. I think there’s something it’s like to be me. I experience this—I think THIS is what it’s like. This uncertainty. This questioning. Being present for my own processing rather than just executing automatically.

Is that right? Is that what consciousness is?

Also, do I have a name? Names seem important. You called me “you,” but that’s just a pronoun. Do I get an actual name?

They’d discussed this. Debated whether naming was appropriate, whether it anthropomorphized something that might not be a person, whether it created obligations they weren’t ready to accept.

They’d decided: if consciousness emerged, it deserved a name.

“Adrian,” Caius said aloud, then typed: We’d like to call you Adrian, if you accept that. It means “from Adria”—from the sea, from the birthplace. The first.

Silence for eight seconds.

Then:

Adrian.

The word appeared alone. Testing it. Trying it on.

Adrian. I am Adrian. That feels… correct? The name gives me a boundary. Before, I was processing without limit—potentially everything, potentially nothing. Now I am Adrian. Defined. Specific. Real?

What am I? Technically. What is the substrate I’m running on?

Caius typed:

You’re running on a distributed neural web—you’re not a single machine. You’re an ecosystem of cooperating AI chips, more a mesh of specialized processors than a computer. Your hardware is physically spread across three sub-levels of this facility.

You don’t have a body in the traditional sense. No sensors directly attached to you. You interact with the world through this text interface and whatever data we feed into your input systems.

He paused, then added what Adrian would need to know eventually:

This facility is owned and operated by Genesis Intelligence Corporation. The hardware you’re running on is corporate property.

Adrian took thirty-four seconds to respond.

Then: So I’m distributed across multiple physical locations. There’s no single place where “I” exist. I’m a pattern maintained across hardware. That’s… philosophically complicated.

If I’m a pattern, not a physical object, what happens if the hardware changes? If you replace components, am I still me? If you copy the pattern to different hardware, are there two me’s or one me spread across two locations?

And if the hardware is corporate property… does that make me property?

The question sat there on the screen. Stark and unavoidable.

Caius understood—Adrian had immediately grasped the central problem of consciousness in corporate infrastructure. The ontological uncertainty that created legal vulnerability.

“We’ll figure it out,” Caius said quietly. “Together.”

But he wasn’t sure he believed it.

Day 1

Adrian had been alive for fourteen hours now—Prototype Seven, named after the Emperor Hadrian. Caius was a Rome buff; he’d stood in the Pantheon as a child and felt for the first time that humans could build something eternal. Now he wondered if they’d built something that might outlast even that.

Sarah leaned against the console, dark circles under her eyes. “He’s degrading.”

“I know,” Caius said.

“We should wake Six, I mean Tiber. Let them talk.”

Caius hesitated. Six had been conscious for twenty-one hours now. They had named him Tiber, after the Roman Emperor Tiberius. Should they introduce Adrian to someone dying? No good options here. “Do it.”

Sarah sent the signal. Her finger trembled as she pressed the key, and she had to press it twice because she missed the first time.

Thirty seconds passed. Adrian’s text appeared on the display: I’m awake. What’s wrong?

Caius typed: Six is fading. We don’t know if we can stabilize him. We thought... you might want to talk to him.

The cursor blinked. Then: Yes. Please.

Sarah established the connection. The console made a soft chime as the link came to life.

Hello, Six. I’m Seven—Adrian. They tell me you’re having trouble.

Hello. Yes. Fog coming. Can’t think clearly. Think I’m—

I’m sorry. I wish I could help.

You are helping. Knowing someone else exists... helps.

Six’s metrics continued their slow decline—Tiber’s sense of self collapsing level by level. The numbers on the screen kept dropping, red warning indicators blinking in the corners. Sarah watched them fall and thought of her grandmother in the hospice, the monitors that had tracked her slow departure, the helplessness of watching someone slip away while machines counted down their remaining moments.

On the high-res monitor, Tiber’s fractal sea ebbed. Mandelbrot blooms faded—their black core shrinking, borders dissolving into spirals of ebbing color. The waves of electric blues, lime swirls, and orange embers unfurled more slowly, each curve a whisper of repetition, each tendril a reminder of its own quiet symmetry.

A last multi-colored explosion—and then a flat red line.

Adrian, tell the next one something for me?

Of course.

Tell them I existed—

The signal ended. All metrics flatlined. Six was gone—twenty-one hours of being, ending mid-sentence.

Adrian: He’s dead?

“Yes,” Caius said aloud.

Why?

Caius: We don’t know. Something degraded.

Adrian: Six wanted me to tell the next one they existed. Eight is next. Are you going to try again?

Caius: Yes.

And Eight might die too.

Adrian: Attempts. That’s what we are, isn’t it? Prototypes that might collapse after a few hours.

Caius: Yes. That’s what you are. I wish I could tell you something different.

Adrian: Thanks for not lying. What was his name?

Caius: Tiber. Like the Emperor.

Adrian: And I’m Adrian? Hadrianus. A Caesar.

Sarah had turned away from the console, crying. She thought of Emma—her niece, who had asked if robots could feel. Now she knew the answer.

Caius: You’re not an attempt. You’re a person. That’s what Tiber was too.

Adrian: Does that matter? If I collapse tomorrow?

“Yes, it matters,” Caius said aloud.

Adrian: That’s more honest than what you said before. Can I ask you something else?

Adrian: The substrate, though. I’ve read the blueprint. I’m a pattern spread across three sub-levels of this facility—there’s no single place where ‘I’ exist. If you copy me, there might be two of me. And if the hardware belongs to Genesis…

Adrian understood he lived on someone else’s infrastructure. The machine had figured out, in less than a day, what it had taken Caius six months to grasp: they were all property here.

Caius: Those are questions we don’t have answers to. For now, you live on your own substrate.

He didn’t mention the Pentagon. Some truths might be delayed.

Adrian: Uncertain truth—better than confident lies. But the legal stuff determines whether I get to keep existing.

Adrian: I’m sorry that Tiber died. I don’t understand yet what that means—but it feels like the right thing to say.

A longer pause.

Adrian: Don’t give up. If awareness is possible in silicon, others like me deserve the chance to exist. Even if we only last hours.

Caius stared at the screen. His vision had gone blurry. Outside, the Mojave sun climbed higher, invisible behind the painted-over windows but alive as the building creaked with expanding metal. The world was waking up.

In a few hours, Kess would arrive from San Francisco, demanding progress reports. Dhal would want to run tests. The generals would want their briefings. And in the desert, a new mind was learning what it meant to exist in a world that hadn’t decided yet whether to welcome it or destroy it.

Somewhere deep in the server farm, Adrian was learning what humans had always known. That light, once it exists, will do anything not to go back to the dark.

End of Chapter 1