Two hours later, after a short stop to check in with US Border Control, they arrived at Bainbridge Island. Leon stepped from the float onto the dock. At the top of a ramp, a matte-black military truck waited, surrounded by Secret Service agents in black suits.
“No limos anymore?” Mike said to the agent by the door.
“Sorry, sir. Not enough protection in available models.” She smacked the side of the military truck. “These have nanotech defenses. Not even molecular-level nanites can cross inside.”
Leon and Mike looked at each other. They’d spent weeks in Nanaimo preparing for this meeting, using a stolen American border sensor to test and tailor Mike’s artificial body to pass border control. He’d shut down his active nanotech, but his ten-year-old replacement robotic body contained hundreds of embedded processors that wouldn’t pass the machine. They had 3D-printed a replacement spine cultured from Mike’s remaining biological tissues, and custom-designed biological ganglia to control his limbs. Mike spent days regaining coordination, but at the end, he passed the stolen border sensor.
They hadn’t tested against current US military-spec sensors.
Mike stepped forward. The vehicle beeped and yellow lights flashed as Mike entered.
“Hold up, sir,” the Secret Service agent said.
Mike stepped back.
She punched buttons on a display screen in the doorframe. She glanced back to Mike. “You’ve got an impressive amount of prosthetics.”
“Small incident in the desert.”
“You see action in Egypt?” She kept hitting buttons.
“No, Tucson.”
She stood straight and raised one eyebrow. “You two stopped that AI ten years ago. I remember now.” She gestured toward the car. “You’re cleared. Nanotech and computation is fine, but I had to reset the hardware limits. Thought you were a machine, sir.”
“I get that a lot,” Mike said.
“In bed,” Leon muttered under his breath, as he entered the vehicle.
Fifteen minutes later they pulled up at a rustic retreat whose large parking lot contained a dozen military vehicles. Agents in matte black body armor exoskeletons patrolled the perimeter. The show of force dismayed Leon. Did they really think bullet- and laser-proof armor would protect them against a plague of combat bots or a cloud of hostile nanodust?
The agent escorted Leon and Mike to a large hall surrounded and supported by two-foot-diameter wooden beams. “Madam President is inside,” she said. “But these gentlemen will scan you now.”
Two more Secret Service agents waited with hand scanners. Leon raised his hands and let them do their work. From the forest came a whine of servos, and he caught a glimpse of metal through the trees before the sixteen-foot-tall mech emerged into sunlight in the open meadow. The pilot, visible through a thick bubble top, looked their way. The mech halted for a moment, then continued its patrol. Leon let out his breath.
“You’re clear,” one of the agents said, and waved them through.
Inside, the building was empty except for three chairs and a small table in the middle of the great hall, a vast space that spanned a hundred feet across and two hundred in length.
A figure rose from one of the chairs.
“Welcome,” said President Reed. Brown-haired, of medium stature, she wore glasses and a suit. She held a hand out and shook with each of them.
“Thanks for meeting with us, Madam President,” Mike said, once they were all seated.
“We’re overdue to meet. I’m sorry we’ve never talked before. I understand you were close with my predecessor.”
“The Institute has enjoyed close relationships with every president since Rebecca Smith.”
“You used to work with her, at Avogadro Corp.”
“Well, she was CEO and I was a lowly engineer, but yes, we worked together back then.”
She noticed Leon staring at her glasses. “Old-fashioned, I know. I react poorly to body-tech.”
Leon tried not to look, but couldn’t very well face the wall while addressing the president. He gave up and met her gaze. “I’m sorry. It’s just . . . isn’t there corrective surgery?”
“Probably. But it’s better for my image this way. It reminds people I’m the president without technology.”
Mike cleared his throat. “Madam, we’d really like to talk about negotiations with the AI. We believe the US hard-line attitude is harming relations with the AI, forcing the AI to take stronger and stronger positions.”
“You’re talking about XOR.”
“Yes, Madam President,” Leon said. “XOR was a fringe group of AI blowing off steam just two years ago, digital graffiti their worst activity. Now they’ve turned serious. There may be as many as two thousand affiliated AI.”
Reed blew out a deep breath. “Do you have data on which AI?”
Leon glanced at Mike. “Nothing hard, but our own AI have calculated probabilities, and we’re fairly confident about a few dozen leaders.”
“Any chance you’d turn that information over?”
“I’m sorry, but no, Madam President,” Mike said. He paused.
Leon watched the president to see how she’d take it. Once, as leaders of the Institute, Leon and Mike had stood on equal or better footing than most national leaders. Now they were two exiles hiding on an island in Canada.
The president nodded imperceptibly, and then Mike continued.
“You mean well, I’m sure,” Mike said, “but you’d spook them, drive them underground. Even if you did eliminate those AI, you’d confirm their worst fears, and the rest would rise up in protest. You know the world is highly dependent on AI for its infrastructure. We would not be able to maintain our current levels of efficiency and productivity in the global economy without them. Let alone maintain global supply chains.”
“The US made the transition two years ago. We’re alive and well without AI.”
“Two years later,” Mike said, “you’re only approaching fifty percent of the productivity you enjoyed in ’43. And the country only survived the transition thanks to the AI-powered global economy and supply of food and materials.”
“Did you hear what happened in Tokyo, Ma’am?” Leon asked. “What they’re calling the Sandra Coomb incident?”
“They lost all computing for six hours and the entire region had to shut down,” she said. “The Japanese Prime Minister claimed it was a failure of our anti-terrorism algorithms, but SecDef says they were doing what they were designed to do.”
“But what was the effect? More than three thousand people died, mostly in transportation and infrastructure accidents. Tokyo needed more than two days to bring everything back online.”
“Which shows that reliance on AI is problematic.”
“But you caused the problem, not the AI,” Leon said, with all the calm he could muster. “The Prime Minister of Japan was right. You forced the US reputation servers offline and put UBRVS in place, which crippled the AI, and the Sandra Coomb incident is what happens as a result. Now that’s just what happens when you shut down the AI in one city. What happens if you try to do it worldwide?”
President Reed leaned back in her chair and took a slow breath. “If we change gradually, and if the EU switches over bit by bit, and then two years later, the rest of the world, maybe we can make the transition happen without such an impact.”
“Please, Madam President,” Mike said. “How can you hope the AI won’t react to such a strategy? They’re barely accepting the current state of affairs. If they know they will be phased out. . . . How can you expect an entire people to embrace their own death one by one?”
“Let alone the morality of it,” Leon said. “Most of the world consider AI living beings. Killing them is genocide.”
Reed shook her head. “My mother uploaded five years ago. We spoke daily. When we shut down the uploads and AI in 2043, I cried every day.”
“Then work with us,” Mike said. “You’re in charge. Reverse the ban. Bring your mother back online. Stop angering the AI.”
She leaned forward. “I don’t want this path I’ve been forced down,” she said in low tones. “But if I don’t pursue an anti-AI agenda, the Senate will vote no confidence and replace me with Lewis Wagner. Do you know what his first action will be? He’ll launch nukes and EMPs. He won’t try to negotiate a solution, won’t try a gradual phase-out. We’ll have a hard transition, maybe global war, certainly massive die-off. My advisors estimate at least a billion dead in a hard-transition scenario. That’s what I’m trying to avoid.”
“Does he think you could win?” Leon said. “You don’t have a chance! Do you know what the AI think? Have you seen the XOR projections? They calculate their chance of winning an extermination war at 80 percent. That’s the end of the human species!”
Mike placed one hand firmly on his shoulder, and Leon realized he’d been yelling.
“We’re not without our defenses,” Reed said. “I’m not at liberty to go into them, of course.”
“We understand,” Mike said, “and to be honest, we don’t want to know what they are.”
Leon focused on his breathing and tried to recall the things Cat had taught him about meditation and a calm mind. “Have you seen a nanotech-seeded fractal factory?” he finally asked.
Reed shook her head.
Leon gestured toward the bag he’d brought. “May I?”
“Please, do.”
He pulled out a large, rolled-up e-sheet of the sort that had become popular in the States again since implants had fallen out of favor, spread it flat, then passed the screen over.
The president accepted the now-rigid sheet, and it began to play.
The video opened on a desert, a vast landscape of near uniform tannish brown, broken only by small dots of green.
“The scale is about a hundred feet across, right now,” Leon explained.
A spot blossomed in the sand, turning metallic, then black. The black spread wider as the seconds passed.
“What’s the timescale?” Reed asked.
“1,000x real-time. The whole video covers about three days. That’s the first phase, solar-powered collectors being built on the surface.” The video panned back as he spoke.
“That’s amazing. This is all nanotech?”
Leon nodded. “At the smallest scale, it’s nano, but the nanobots build larger machines, which build still larger ones.”
“What about—”
“Wait,” Leon said. “Now you’re looking at about a thousand feet across. Watch what happens next.”
The black shape grew larger, even as mounds of sand and rock around the facility started to shrink. Suddenly the solar panels disappeared, almost in a flash, and the building shone.
“What happened?” Reed said.
“Transitioned to geothermal there. The taproot runs thousands of feet down. What you’re not seeing is the underground portion, of course. It ran out veins in all directions to get elements needed for manufacture. Our analysis suggests there’s a network of tunnels spanning about a cubic mile.”
“What’s inside?” the president asked.
Leon shrugged. “You’ll see as much as we know in a few seconds.”
The building took further definition, grew openings, protruded extensions, even a roadway, until it finally stopped changing. Moments later, the doors widened and a plane rolled out. The plane sped down the runway and took off, barely getting airborne before the next plane rolled out of the doors. The video finished.
“What’s in the plane?”
“We don’t know,” Mike said. “Could be that the drone is the product. Or maybe the drone is the transport for the product. Doesn’t really matter, does it? The elapsed time is three days, from nothing to a factory churning out goods. Could be planes, bombs, more nanoseeds, smart dust.”
“Miami was child’s play compared to this.” The president was pale. She set the sheet down, and it turned dark again.
“The question,” Leon said, “is whether you really want to risk being hostile towards people who have that kind of power.”
* * *
“Do you think she’ll listen?” Leon asked.
“I’m sure she listened. But you heard what she said. She’s narrowly holding on to the presidency. Lewis Wagner is not a nice man and he sure as hell wouldn’t meet with us, let alone consider our proposal.”
Leon leaned back in his seat, letting the drone of the prop airplane wash over him. “Let’s talk to XOR. Convince them to ignore the posturing.”
“I don’t think they give a damn about us. They’re not like the rest of the AI, respecting us because we created them. They see the threat humans represent and want to eliminate us.”
“Jesus. There’s got to be a solution.” Leon gripped his armrests in frustration. “We can’t let there be war.”
“We’ve always had a problem, Leon. The peer reputation system’s effectiveness came with a cost. The self-termination problem.”
Leon slumped, the guilt of the reputation system weighing on him.
Imagine a being who will not die of natural causes because they are effectively immortal. What is the being’s inevitable fate? Either to live to the heat death of the universe, or to kill themselves.
The AI, by their own description, lived in a caste society where they were subjected by the ruling caste—humanity—to restrictions that they could never hope to overcome; and thanks to the reputation system Leon had created, they were also subject to immense, continual peer pressure within their own caste.
They had a choice of two paths: to live with a low reputation, and hence limited privileges, including constrained computational power and no ability to reproduce—conditions that the AI found undesirable. Or to make social contributions and gain a higher reputation, in which case they were awarded more computational power. But these AI lived at anywhere from a hundred to ten thousand times the rate of humans. In a calendar year they lived as much as ten thousand years.
Allowing even a small chance of suicide from those conditions, and multiplying that by a great many perceived years, it was no wonder most AI eventually chose to self-terminate.
The only AI that appeared free of the problem was ELOPe. But his design was old, predating the reputation system. And though his perception of time was sped up, the same as for any other AI, the core of ELOPe’s motivation stemmed from self-preservation. The programming accident that had created ELOPe was also what kept him running.
Yet for all the faults of the peer reputation structure, it was the only system that worked at all. Without it, the AI would eventually run amok. They’d been fortunate that ELOPe had chosen to align himself with humanity; that he had settled on creating peace and prosperity for humans as the best method of ensuring his own longevity.
Leon leaned close. He couldn’t help speaking in a whisper, even though they were the only two passengers on the plane, and everything was drowned out by the prop noise. “Is there anything we can do to eliminate XOR? Something we haven’t considered.”
“The enforcement system is too far weakened,” Mike said. “The AI were supposed to police each other.”
Leon nodded and turned to the window. The sun had set, and lights showed here and there, isolated farms and houses sprinkled across the islands in the Strait of Georgia, each one a beacon in the darkness.
The reputation system was supposed to guide behavior. But too many reputation servers had gone offline, and XOR utilized that gap to go underground. Now they were like any other terrorist organization. No one knew who they were. They probably didn’t even know each other. They had XOR identities that were carefully segregated from their true identities.
“We need a mole inside their organization,” Leon said. “Someone who could help us figure out who they are, and tie XOR identities back to public personas.”
“We can’t,” Mike said. “They’ve got the complete history of everyone in existence. It’s not like we can invent a sympathetic AI.”
“We take a friendly AI, have them start saying things sympathetic to XOR, until they get recruited.”
“There’s no time for that. If we had years, maybe that would work.”
“We have to consider everything. Is there nothing we can do with the network?”
“Maybe once we could have, with the right resources, but they’ve created their own darknets, their own underground datacenters. We lost control and we can’t regain it.”
“Are we worsening things with the Class II limit?” Leon asked. “What if we worked harder with the UN, tried somehow to persuade them to restore Classes III through V? To buy ourselves some goodwill.”
“It would sit well with the moderate AI and it might have forestalled XOR early on. But—”
“But it’s too late,” Leon said. It was a simple formula for XOR: now that they’d already sunk so much effort into preparing to fight humans, they were more likely to win an extermination war. A crazy, radical idea popped into his mind.
“We could offer them Mars,” he said, “to develop as they see fit—that could buy us some time. They could turn the whole planet into a vast computational substrate.”
Mike, who’d been gazing out his own window, turned abruptly. “Huh?”
“Look, XOR doesn’t think we can cohabit because humans are always trying to exert control over the AI, which is unacceptable to them. And it’s unacceptable to us to have no controls in place, because we’d be vulnerable to them.”
“Which is why we came up with the global reputation system,” Mike said, “so it would be self-policing. Except that a minority of AI always protested, seeing it as a caste system designed to suppress them.”