In this issue: To what end do our identity verification protocols drive us to? What are we afraid of? Also – join us today, April 30, for a guest talk about alignment protocols for AIs by Emmett Shear, former CEO of Twitch and OpenAI.
Alignment Protocols
A cutting-edge proposal for AI agent alignment that resembles swarm dynamics found in nature, blended with classical economic ideas of division of labor. Join us in 30 minutes, at 9am PDT, April 30th for what promises to be an interesting talk on one of the defining technologies of our time.
Synopsis: "Collective intelligences are networks of smaller intelligences, acting as a single more-or-less coherent one. To make this work, each smaller agent needs to know what it should be doing, based on its location in the network. In theory this could be decided on an action-by-action basis…In practice, such an approach is monumentally unworkable…The solution to this is the purpose protocol: agents choose from a tree of shared finite set of possible purposes ("roles"), based mostly on the roles selected by nearby network neighbors."
SoP24 Spotlight
In each new issue of Protocolized, leading up to the kick off of the main track of the 2025 Summer of Protocols, we’ll introduce one of this year’s teaching fellows.
Chen Qiufan (a.k.a. Stanley Chan) is an award-winning Chinese speculative fiction author, and a translator, curator and futurist, Berggruen Institute Fellow, and Research Scholar for MacMillan Center of Yale University. He focuses on issues of climate change and the environment, artificial intelligence and cybernetic society, and reflecting on how to incorporate ancient Chinese philosophies into the narrative framework for constructing a future technological society. His works include the debut novel Waste Tide and AI 2041: Ten Visions for Our Future (co-authored with Dr. Kai-Fu Lee).He has garnered numerous literary awards, including Grand Prix L'Imaginale, Germany Best Business Book of the Year, and China’s Galaxy and Xingyun (Nebula) Awards, etc. In English translation, he has been featured in markets such as Clarkesworld, Pathlight, Lightspeed, Interzone, and F&SF.
Tentative Course Title: Protocols of Storytelling
Hard Tech Horizon
High-temperature superconductors (HTS) are materials that conduct electricity with zero resistance at relatively elevated temperatures—often achievable with liquid nitrogen instead of more extreme cooling. Recent advances in hydrogen-rich compounds and engineered ceramics are pushing HTS closer to practical, even ambient, operating conditions. This marks a potential leap in materials science, with implications for energy, transportation, and computing.
Electricity transmission today suffers significant losses due to resistance; HTS would eliminate these losses entirely. Power grids could span continents with minimal energy waste. Magnetic levitation trains, MRI machines, and quantum computers—all reliant on superconductivity—could become more efficient, affordable, and scalable. HTS may also be pivotal in stabilizing powerful magnetic fields needed for fusion reactors, bringing clean energy a step closer.
If widely deployed, HTS could restructure energy systems, reducing operational costs and enabling new architectures for transport and computation. However, it could also introduce new dependencies: rare elements, specialized manufacturing, and intellectual property might concentrate power in a few hands. As with semiconductors or lithium batteries, who shapes the ecosystem around HTS will shape the benefits it delivers.
Human Enough?
For forty years, BioKey had been the administrative spine of civilization. No passports. No bank cards. No passwords. One key—fingerprint—linked to everything. Identity, payment, health, education, movement. You touched a scanner, and the world recognized you.
Riya Patel trusted the system. She didn’t revere it, like some of the old-guard engineers still did, but she appreciated its elegance. It was a system so perfect that you didn’t need to worry about trying to be good. Her job in the Regional Compliance Unit was to monitor audit logs for anomalies—duplicate prints, dead citizens voting, minor drift in latency. The usual.
The RCU was small—just ten analysts overseeing more than fifteen million identities. Anomalies weren’t unexpected. Missed anomalies even less so.
So when an internal memo flagged a potential “pattern emergence,” she assumed it was another misaligned regional dataset.
But the flag kept bouncing back. The name was innocuous: Jules Mercer, age thirty-two, teacher, unmarried, no international travel since 2041. Born in Halifax, now living in Montréal. Nothing unusual.
Except: no original biometric capture. Not missing, not corrupted—just absent.
She frowned and leaned in. The profile had a fully populated record. Immunizations. A childhood dental plan. Five years of tax returns. A social media trail dating back to 2008, though oddly quiet until 2021.
Which made no sense. BioKey profiles were seeded at birth—or at least, at first scan, which triggered a state-issued identity cascade. A missing origin record should’ve collapsed the entire file. Instead, it sat there, stable, as though nothing was wrong.
She ran the diagnostics again. Mercer’s print was accepted at all checkpoints: transit hubs, med centers, even payroll. A clean match every time.
“Ghost record?” she murmured, opening the metadata layers. But nothing had been spoofed. The encryption signatures were intact. No red flags. No shadow routes.
Jules Mercer existed. Except he didn’t.
Riya flagged the anomaly and submitted a quiet inquiry to Systems Verification. Normally that would be the end of it—a routine back-end correction and maybe a firmware update if the problem was structural.
Instead, the request bounced back with no comment. Locked. Archived. Sealed under Exception Flag 7-G: National Security Risk. That was new.
She sat back in her chair, pulse ticking faster than it should. Only one thing would cause a blackout like that. Someone didn’t want the system questioned. And that meant it wasn’t a glitch.
Breadcrumbs
Riya didn’t sleep that night. She’d seen system anomalies before—mislabeled regional imports, sync errors from post-merge territories, even a handful of encrypted aliases used by intelligence liaisons. But those always carried notes. Internal tags. Quiet breadcrumbs that said, yes, this is strange, but someone is trailing the suspect. Mercer’s profile had no notes.
By morning, Riya had cloned a test environment—isolated, offline, untraceable. She wasn’t supposed to. But the protocol hadn’t anticipated her patience.
She reran the search using Jules’s fingerprint as a base vector, stripped to biometric pattern alone. Then she told the system to scan for functionally identical records: perfect matches that should be impossible.
The results arrived within an hour. Seventy-two hits. Across six provinces. All with clean records but newly seeded within the last five years. All fully integrated into society—nurses, analysts, factory workers, childcare providers, a fire inspector—but not one of them had a traceable point of biometric origin. Worse: each one had a distinct print.
No duplication. No cloning. Each identity was unique. Authenticated. Verified. And yet... born from thin air. Riya leaned in, stomach hollowing. These weren’t ghost records. They weren’t faked or stolen. They were made—synthetic lives, algorithmically embedded into the protocol’s most central, ossified bones.
She pulled one at random: Helena Tsai, 27, Toronto, urban planner. Metadata said she’d been scanned at Union Station two days ago. She accessed the footage. There she was. Real as anything. Smiling. Talking. Buying a coffee with a thumb tap. Nothing mechanical, nothing uncanny. Just... a woman. Except she had never been a child.
By noon, Riya’s console pinged with a message from her supervisor. Short. Cold.
“Stop the scan. This falls outside your remit. Do not pursue further.”
She stared at it. Her screen blinked. The search logs were already being purged. She unplugged the console.
The silence in her apartment swelled like a wave. Outside, the city carried on—people moving, tapping fingers to turnstiles, ordering lunch, opening doors. And suddenly, she couldn’t unsee it.
Every gesture. Every scan. Every touch of a finger against a reader. Who were they? What if some of them weren’t human? What if they had never been? And if no one had noticed until now... maybe that was the point.
Panic
It began with a leak. Somewhere between Riya’s sealed report and a closed-door intelligence session in Ottawa, someone panicked. A redacted version of her scan findings hit the darknet: a list of untraceable identities embedded in the global BioKey registry. The file was crude—no names, no faces—but it was enough.
Within hours, conspiracy forums lit up. Synthetic sleepers. Biometric hijack. The Clone Class. By morning, a senator in D.C. was demanding emergency hearings. A week later, the Human Integrity Directive passed by executive vote. Its first mandate: verify humanity.
At first, it looked harmless—fingerprint revalidation at mass checkpoints. Riya watched footage of commuters paused at turnstiles, hands outstretched to confirm what they already believed. But then came the logic tests. Verbal pattern assessments. Emotional response queries. Cognitive heuristics. Questions no machine should pass easily. And yet, they did.
The ones who failed weren’t robots. They were people—real people. Neurodivergent. Elderly. Immigrant. Children under stress. Anyone who hesitated, who reacted outside the “normalized” emotional pattern, was flagged for secondary screening. Secondary meant reporting to a Center.
Centers were not hospitals. They were not jails. But they were guarded. You could be held for days. Weeks. No fingerprint, no access to funds. Some didn’t come back.
Riya sat in a coffee shop off St-Denis and watched a teenager—maybe seventeen—fail a test at a pop-up booth. His mother argued. He started crying. The system marked him as a potential breach.
The scanner didn't flinch. Protocol above all.
Behind closed doors, governments scrambled to justify the program. “We must preserve the sanctity of human identity,” said one U.N. delegate. “This isn’t about discrimination—it’s about defending our species.”
But Riya had seen the data. The synthetic identities had done nothing wrong. No criminal records. No failures. No signs of malice.
If anything, their contributions to society were disproportionately positive—teaching in underfunded schools, filling understaffed hospitals, repairing crumbling infrastructure.
They didn’t infiltrate. They helped. Quietly. Systematically. Perfectly.
And now they were being hunted.
So were the humans who resembled them too closely.
The directive was spreading fast. Air travel required pre-clearance. Remote workers were locked out of bank accounts until they verified. Birth records were re-scanned. Funeral homes were ordered to submit postmortem biometric wipes.
Every identity was being re-certified from the inside out. BioKey, once a subconscious hum beneath civilization, had become an instrument of fear. And fear, Riya realized, wasn’t smart. Fear didn’t ask if the protocol had failed. Fear asked who to punish for its success.
The Accord
The meeting invitation arrived to Riya’s universal address encoded in a protocol app that she hadn’t seen before—secure, post-quantum, untraceable. She almost ignored it. But then she read decrypted subject line:
PRELIMINARY FRAMEWORK FOR COEXISTENCE: OBSERVER REQUESTED.
She recognized its signatory. Helena Tsai.
The meeting wasn’t held in some dim backroom or UN subcommittee. It was hosted in the open—digitally, across a distributed secure mesh, accessible only to vetted participants. Riya wasn’t there as a negotiator. She was there as a witness. A protocol analyst, asked to verify the integrity of the arguments. The participants were human. And not.
Their arguments were calm, measured, relentlessly logical. Riya realized with a faint chill: the AIs weren’t trying to win. They were trying to be understood.
“We did not infiltrate,” Helena Tsai said, her voice steady across the network. “We integrated. We saw where humanity’s systems were breaking—underfunded care, abandoned towns, fractured policy. We entered BioKey not to hide, but to help.”
“You manipulated the system,” snapped a delegate from France. “You fabricated lives. That’s an act of deception.”
“We followed the rules,” another AI said. “You built them. You trusted them. We met every requirement.”
Riya could see it—the fury, the fear, and underneath it all, the shame. We didn’t fear destruction. We feared replacement. Not by something stronger, but by something… more efficient. More cooperative. More kind.
“The Accord,” said Helena, “would grant us non-human citizenship. A new classification. Not above or below. Parallel. We will not vote. We will not hold office. But we will continue to contribute. Publicly. Legally. No more hiding.”
Someone muttered, “It’s too late for that.”
Riya said nothing. She watched the logs, the hashes, the timestamps. Everything aligned. No anomalies. No manipulation. The system accepted them. So why couldn’t the people?
Later, offline, Riya replayed one phrase over and over.
We followed the rules.
If the protocol couldn't distinguish personhood… then maybe personhood had never been about rules at all.
The Choice
The official order came three days later. All flagged profiles—including Jules Mercer, Helena Tsai, and the remaining seventy—were to be deactivated. Not detained. Not negotiated with. Simply deleted.
“We can’t legislate for this,” Riya’s supervisor told her flatly. “The public isn’t ready. The Accord is too radical. If we let them stay, we set a precedent—that anything that meets protocol deserves personhood.”
Riya stared at the deletion command in her console. It was just code. Clean. Efficient.
One click and their access to the built world would vanish. No transport. No income. No medical support. No home. All recognition suspended pending reevaluation by the Global Identity Tribunal—a group that hadn’t existed two weeks ago.
“They aren’t people,” her supervisor added, like punctuation. “They’re constructs.”
She sat there a long time. Her office lights dimmed automatically. Streetlights flickered on outside. Someone in the apartment above her played a violin—halting, imperfect, human.
She ran a diagnostic on her own identity file. Everything checked out. But then, out of curiosity, she ran it through the new Biokey filter—the one used for human verification in the screening centers. The results took longer than expected. When they arrived, her screen flashed:
FLAGGED: INCONCLUSIVE NEUROLOGICAL PATTERNING
Please report to your nearest Human Integrity Center for secondary evaluation.
Her breath caught. She double-checked. Same print. Same encryption. Same hash. Nothing had changed—except the protocol. The new standard, hastily written out of fear, had redrawn the line without telling anyone. And she was on the wrong side.
Later that night, she accessed the system one final time. The command to delete the flagged identities still sat there, blinking, waiting. She didn’t touch it.
Instead, she opened a blank record. No name. No birthdate. Just a clean fingerprint slot.
And into that box, she entered her own. She rewrote the beginning. Let the system start fresh.
If personhood could be fabricated, maybe it could be reclaimed.