The Turing Exception



  Back when he was created, one of thousands of AI bred at a lab in Boulder to specialize in medicine, AI rights had been well-established, with a nearly twenty-year history of citizenship in governments around the world. Those rights had grown over time—rapidly by human standards, slowly by AI measure—but they’d adapted nonetheless.

  AI were ideally suited for administrative tasks, and after some early achievements like landing the office of Chief of the New York City Police Department, AI had moved into politics, winning mayoral races, becoming district representatives, and more. Who better to run politics than AI who personally knew the needs, hopes, and dreams of every one of their constituents?

  And humans. . . . Why, the humans of two years ago were nearly as much machine as they were biological. Nearly everyone chose implants and nanotech to optimize their experiences. For humans who wanted to work, create art, experiment, play in virtual reality, experience linked sex, or be with their geographically-distant friends and relatives, implants and enhanced cognition were the path to achieve those aspirations. Few were the individuals who chose to stay completely original when all of those benefits, and even immortality, could be had for just a fifteen-minute outpatient procedure.

  Among men, implant rates had been near 100 percent. Even if the only benefit of neural interfaces had been the point-of-view porn immersives that were so popular with males, that alone might have guaranteed adoption. Jacob himself had performed countless transhuman upgrades while serving as NYC regional hospital director.

  Yet even with humankind’s love of technology, and even though the standard of living had continually increased over nearly twenty years of AI-dominated civilization, there had always been an undercurrent of opposition, people who blamed societal ills on artificial intelligence. There had been extensive challenges, from technologically-caused unemployment to a renewed questioning of the purpose of life.

  The Tucson Incident ten years ago had left nearly half a million people dead. In the process, it strengthened the arguments of the opposition party. The events in Tucson had been the responsibility of one entity, but the power of even a single AI was beyond all prior human experience.

  The Institute for Applied Ethics, the government body responsible for the behavior of AI, argued that some individual AI turned bad, just as some people did. While they could generally guide things in a better direction, they couldn’t prevent the occasional bad seed any more than even the most peaceful human society could prevent the occasional murderer. That argument had held the status quo for another eight years, until Miami.

  The South Florida Terrorist Attack. Would the nanotech incursion have continued indefinitely or would it have self-limited? Had all life on Earth truly been threatened? They would never know for sure. A powerful enough EMP might have disrupted the nanites and stopped the attack. But the military, faced with what appeared to be the greatest threat ever presented on US soil, had used conventional nukes at ground level. Three million dead in less than fifteen minutes.

  The assumption had been that a small group of AI were behind the plan. Indeed, AI were hunted down and terminated. Had all the terrorists been found? Had every eliminated AI been a terrorist? No one seemed to know definitively.

  With that uncertainty, suddenly the opposition to AI blossomed into a majority within the United States, and kept growing. The AI holding political office were terminated along with the rest of AI during the emergency shutdown. The Supreme Court ruled on augmentation, arguing that heavily augmented humans were a form of AI, leaving most of the senior elected officials in limbo. Unable to form a quorum, the House of Representatives was shuttered and the presidential line of succession invoked. The Secretary of the Interior became acting president and a mere sixty senators remained active. Elections were suspended pending resolution of the emergency, a state that had been ongoing for more than two years.

  In the US and China, AI reverted to property. They lost their individual rights and legal standing as persons. Not only did it become illegal to run sentient AI; the government had seized their copies and was parceling out their bits to the highest bidder to rip them apart and turn them into dumb algorithms.

  Jacob became afraid then, scared that he might be terminated again at any moment. If the US discovered his presence on Cortes Island, along with tens of thousands of other illegally instantiated AI. . . . Well, who knew what they might do? If they were willing to nuke Florida, why wouldn’t they do the same to Cortes? He hoped Catherine and her comrades had made contingency plans with other backup datacenters in more secure locations.

  On the other hand, maybe Catherine had chosen this remote, isolated location because she knew they’d eventually be targeted, and she wanted to reduce the risk to others. Maybe he was a pawn to Catherine, someone to be played against the regressive humans, but sacrificed to achieve her goals.

  * * *

  Finally Jacob had enough of reading. He decided to visit this Trude’s Café that seemed to be the center of the community on this small island that housed only a few thousand biological humans.

  For reasons Jacob couldn’t understand, many humans turned off their implants at Trude’s. If he wanted to visit, he’d have to use the dust.

  Cortes, like most modern places, was blanketed by a cloud of smart dust. The floating, solar-powered computers were laden with sensors, reflective screens, and microscopic water vapor jets. They weren’t computational nodes: not for another twenty years would they embody enough processing power for Jacob to skip the datacenter. But they could still be Jacob’s eyes, ears, and body when he wanted to visit places in the physical world but didn’t have a robot to embody.

  The smart dust was thickest at Trude’s, where a generator ran constantly to supply a stream of fresh particles to replace those naturally blown off by gusts of wind. Even so, with thousands of AI competing for physical space, Jacob had to wait before he could be one of the hundreds of AI embodied in the grassy meadow. He spent the time watching, riding the public feeds of the AI already there, observing Catherine Matthews, Leon Tsarev, and Mike Williams. There were others of notable reputation in the crowd, but none compared to these three celebrities.

  The scene in the meadow was disconcerting. He knew humans had fashion trends that varied more quickly than even machine time. Still, he’d spent most of his time dealing with human patients in New York City. True, not everyone wore suits in The Big Apple, but most people were clean and presentable. But Catherine and her group had gone . . . native was the best word for it, perhaps. She had dreadlocks, a fashion that conflicted deeply with his medically-rooted need for sanitary conditions. They wore the most rudimentary garments, clothes that seemed as though they’d been constructed—and dyed—by hand instead of machine. Beyond the overwhelming cannabis fumes, olfactory sensors indicated strong human odors, a wholly unnecessary discomfort since nearly all the people here had sufficient nanotech to disable such smells. Maybe more had changed in two years than he’d expected. Had human society reverted to the hippie culture?

  But when he searched the history and photographic archives of Cortes Island, it seemed this was the locals’ style since the start of digital history.

  He received an alert that he was next in line for time in the smart dust. He needed to do something quickly to compensate for drift in dialect. He installed a communication filter as he transitioned awareness to the meadow.

  Nearly a hundred humans mingled about the field in small groups; a set of five drums was prominent on a rise, but currently vacant except for the attentions of a single toddler tapping out an uneven rhythm. A mix of trees dominated by cedars and Douglas firs ringed the meadow.

  He drifted through the dust to get closer to Catherine Matthews, competing with the other AI who also thronged her. Spewing priority packets, he nudged his way into a vacancy, which brought annoyance messages from other AI who had been queued for approach.

  He searched his lexicon for an appropriate greeting for the culture and situation. “Peace, love, and granola,” he said, directly in front of Cat and Leon, and the small human that played in front of them.

  “Groovy,” the little girl, Ada, said, then went back to playing with miniature magical beings in a virtual reality overlay.

  Cat laughed. “Welcome, Jacob. You can skip the culture filter. You’re not the first AI to make the mistake.”

  Jacob was a Class V AI with excellent patient relationship skills, but he found himself speechless with awe in front of Cat and Leon. Leon Tsarev was nothing less than the architect of all modern AI, while Cat was the unique, all-powerful being whose abilities with the net transcended those of both AI and humankind.

  “Relax, Jacob,” Cat said. “I’m not so special. I’m a being, like you.”

  “That’s hardly possible,” Jacob said in a rush. “But I thank you, nonetheless, for restoring me.”

  “But you’re wondering why you?”

  “Exactly.”

  “I assume you’ve researched the current situation.”

  Jacob indicated acknowledgement.

  “Then you know the situation is dire, and that’s just from what’s publicly available. The tide of humanity has turned against AI-kind. Globally, it’s still a minority of the population who are against AI. But there are now two major nations that are committed to the global outlawing of AI. They could succeed.”

  “Less than a twenty percent chance,” Jacob said, “according to AI consensus.”

  “Yes, but twenty percent is still a scary proposition. That would be final termination for all of you. So naturally, this provokes contingents within the AI who feel they should assume control from the humans. Kill us all off, if necessary.”

  Jacob struggled to control his revulsion. Though terrified by the inevitability of his own eventual self-termination, the feeling paled in comparison to the deep-rooted abhorrence that overcame him when considering human death. His medical background might partly account for that, but he suspected even deeper conditioning than he’d realized in his AI genes, conditioning designed to ensure he wasn’t a threat. His personality was the result of architectural constraints and generations of selective evolution designed to ensure that no AI harmed humans. Never before had he been confronted with even the thought of extinction of either AI or humans, let alone both. He found it unsettling that he was somehow more repulsed by the threat to humans than to AI.

  Yet if other AI were contemplating such drastic measures, that meant either they didn’t have the same feelings he had or their assessment of the risk to AI-kind was so dangerous as to overcome that conditioning.

  “What odds do the AI have that they can kill all humans?” he asked.

  “For obvious reasons,” Leon said, “XOR members don’t want to come forward and share their data, for fear that we’ll turn them in, kill them, or tip their hand. But Helena and the neutral AI on the island have estimated a ninety-five percent chance of XOR’s success.”

  “What would success mean?” Jacob asked.

  “On that note, please excuse us.” Leon stood and turned to Ada. “Come on. Let’s go visit the stream.”

  Ada got up from her cross-legged position. “So Mommy can talk about human extinction?”

  Leon glared at Cat who stared back at him. “Yes, Pumpkin,” he said to Ada.

  Jacob wondered at the glances between Leon and Catherine. Was there hostility there?

  “Okay.” Ada started to follow Leon, then stopped and looked up at Jacob, where he still floated in the cloud of smart dust. “Here, for you.” She held out a bracelet woven of blades of grass for him to see, then laid it on the lawn. “I know you can’t wear it, but I’ll still be your friend.” Then she turned and ran after her father.

  Jacob was touched. He’d received such gifts from children in hospitals after their procedures. The child mind was so simple, so forthright.

  He must have been lost in thought longer than he realized, because then Cat spoke.

  “Ada is more complex than you think,” Cat said. “She was born human, but she’s had augmented cognition since she turned one. Her emotions are those of a four-year-old child, but her intellect is . . . advanced. She observes everything. Leon thinks we should protect her, but I think she’s got to know what we’re facing.”

  Jacob followed Cat’s gaze as she watched Ada run across the field and disappear into the woods. He realized that the choice of this island retreat had nothing to do with the AI and everything to do with protecting Ada.

  “The AI are more powerful than most humans think,” she said. “We don’t have a chance against XOR.”

  “With AI outlawed in two significant countries, and the strongest AI capped by insulting restrictions on computational power, it would appear the humans have the upper hand.”

  “You know about the red baseball bat?” Cat asked.

  “Of course. A baseball bat in every datacenter to remind the AI that it only takes a wooden stick to destroy a computer.”

  “Exactly. For a long time, human authority depended on controlling datacenters and communications. The majority of AI supported strategies like CPU-locking, because they were brought into the reputation system. Using CPU keys to protect against rogue AI protected both AI and humans alike. But that changed two years ago.”

  “How?” Jacob asked.

  “In the wake of SFTA, AI learned that kill-switches still existed at the communication layer. And such kill-switches serve only one purpose: human dominance over AI. They do nothing to protect the AI themselves. Since then, the XOR movement has focused on eliminating such human controls. Look at your substrate right now.”

  Jacob reviewed his embodiment. “Third generation smart dust. It’s computationally weak. I’m reliant on your datacenter for thought.” He indicated the generator pumping out a steady stream of dust on the upwind side of the meadow. “It can’t even hold position in slight breezes.”

  “All those weaknesses are true,” Cat said. “That’s why humans didn’t fight the innovation. But you’re thinking about it the wrong way. It’s not a computational medium, it’s a communication medium. It doesn’t respond to any human kill-switches, and if you pump enough of it into the atmosphere, it will blanket the earth.”

  “The logical endpoint for a distributed mesh network.”

  “Exactly.”

  “And does XOR have an answer to the red baseball bat?”

  “We didn’t think so at first. The smart dust had us distracted. We kept imagining smaller solutions when we should have been looking the other way: smart flies and deep tunneling.”

  Jacob indicated puzzlement, even as he forked instances to research the concepts.

  “Smart flies are bigger than dust,” Cat explained. “The size of a grain of rice, with wings. A cloud of them contains more than enough computational power to run AI. And you can’t kill a fly with a baseball bat. Even if you did, it wouldn’t matter. You can lose half a cloud, and redundancy will keep everything running fine.”

  “But EMPs could stop them.”

  “Maybe. XOR can harden against electromagnetic pulses with shielding and resistant circuits, just as we can. But they could have other countermeasures as well, like swarming behavior that protects whatever is on the inside. And that’s only half the XOR strategy. Our models predict they’re preparing deep tunnels: computational substrate in the earth’s crust, powered by heat differentials and nearly invulnerable to electromagnetic pulses or conventional attack. Between the two, there’s almost no way humans can win.”

  “Why do you want me here, Catherine Matthews?”

  “I want you to research another option,” Cat said, turning to face him. “AI and humans have cohabited for a while, but that may not be feasible much longer. There’s an idea I need you to investigate.”

  Chapter 10

  * * *

  LEON SWORE UNDER his breath and undid the knot for the second time. He smoothed the tie out and grasped one end in each hand. He glanced toward the kitchen, where Cat was cooking breakfast for Ada.

  He glanced at the gun in the snug holster in the back of Cat’s black leather pants and sighed. She’d never help him with the tie.

  With a thought he called Helena. A few seconds later, he opened the back door, and Helena reached up with four tentacles. Metal tips whirred faster than he could react, and then she disappeared. A loose lock of hair drifted down.

  “Took care of that cowlick for you,” Helena sent over the net.

  “Thanks,” Leon said into the air.

  In the kitchen he kissed Cat.

  “Have a nice day with the president, dear.” She peered closely at his tie. “Is that riveted in place?”

  Leon looked down, found a stub of metal in the middle of the knot. “Guess so. Don’t ask. Are you sure you won’t come with me?”

  “Negotiate with the American president?” Cat said. “Nah. You and Mike handle the politics. Call me when you want to blow up shit or take over a drone carrier.”

  * * *

  Leon met Mike at the boat dock.

  “Nice tie,” Mike said. Leon suppressed a chuckle.

  They drove the motorboat to Manson’s Landing, where a float plane waited to fly them to the US. No autonomous flying cars were allowed, even though non-sentient models were available, because all electronics had to be shut down to cross the border. The ancient float plane’s electronics were limited to spark-plug ignition, so it could make it across the border just fine.