  ELOPe spun the engines down to their slowest speed, and vectored thrust for a vertical descent. The plane touched down, and only the creak of the frame let Mike know they had landed. Mike unbuckled as the engines spun down. In the moonlight he could just make out a castle.

  “It’s Grey Towers,” ELOPe explained. “The home of the Gifford Pinchot family. Founder of the conservation movement, friend of Teddy Roosevelt. Blah, blah, blah.”

  Mike made his way to the door.

  * * *

  James was bored. He had found some kind of card game on the old Windows computer and was clicking away at it. James's mouse was an obscure mechanical device that Vito wanted desperately to take apart and examine, and it had developed a squeak from the rubber ball inside.

  “Can you just stop, please?” Leon finally called out.

  James gave him a look, then stood up in a huff and stormed out. Leon sighed. James's assigned task had been to try to find other people on the Internet, but aside from the discovery of Mike Williams, there just wasn’t anyone out there.

  James's main discovery was that as the hours passed, the artificial intelligence’s communications grew closer and closer to natural language. They had found the change logs for several wikis in use by the AI community. The first messages appeared to be in binary code, later messages in XML, still later messages in XML with English terms, and the latest messages were in heavily augmented English.

  “See there, where it says, ‘trade[3]’,” Vito had pointed out hours earlier. “I think it’s clarifying which definition of trade it’s using. If you look at Wiktionary, the third definition of trade is to exchange something. They’re correcting for one of the natural weaknesses of human language, which is the multiple definitions available for a given word.”

  James didn’t want to converse with human-sounding computers, he wanted people. “How can there be only four people on the whole Internet?” he had complained earlier before resorting to playing solitaire.

  Leon turned back to Vito, who was eating a cookie. Vito offered him the tray. “Rich, buttery shortbread cookie?”

  “Where did it come from?” Leon asked, not recognizing the food as anything they had gotten from the grocery store.

  “The packages from the drone we came in on. We also found some clothes, in case you’re running low.” Grey Towers was apparently built before the invention of washing machines, and the boys had been wearing the same clothes for days.

  “Sure, that would be swell. Listen, about the virus. The copy we have from the memory on your phone is just one part. It’s an algorithm database, and it’s just one component of a larger virus.”

  “What do you think it’s for?” Vito asked.

  “I think it’s like the long-term memory of a person,” Leon said. “The algorithm database is a few hundred gigabytes in size. There’s nothing in there about when to use which algorithm, or how to use it. So I think there must be a separate structure, probably some kind of neural network, that helps the AI pick which algorithm to use in which situation. Then you’d have still other nodes that actually execute the algorithms. I’m just guessing here. I need more virus samples.”

  “We could wipe one of our phones, put it back on the network, let it get reinfected with a new virus, then get the new virus image. Would that work?”

  “Yes, it might.” Leon stared off into the distance, visualizing the process.

  “And if you can do that?”

  “If the AIs all share a similar neural structure, we can build a counter-virus that is tailored to that structure. My guess is that we need to attack either this algorithm database or the neural network. We want something that infects quickly, but becomes destructive slowly.”

  “Why wouldn’t we want to just wipe it out as quickly as possible?” Vito asked. “The first virus spread around the world overnight. It was blindingly fast.”

  “Yes, but Phage has been forced by evolutionary pressure to be resistant to fast attacks. The only hope is a really slow attack — so slow that it evades the attention span of the AI.”

  “If you throw a frog into boiling water, it’ll jump out,” Vito said. “But if you put it in room-temperature water on a stove and then turn the heat up, it’ll just cook.”

  “Exactly.”

  They both turned to the door at the sound of a throat clearing. James stood there, hands in the pockets of his hooded Torvalds sweatshirt. “I think you’re both missing something.”

  Leon spun around on the chair. “Yeah? What?”

  James walked into the room. “You’re both treating this as a problem that needs to be solved. But what if that’s the wrong perspective?”

  Leon and Vito both shook their heads. “Huh?”

  “Vito, when you got your cat, you spayed it, right?”

  Vito nodded. “Sure, it’s irresponsible to allow cats to breed. There are way too many of them.”

  “Yeah, sure, but what does that have to do with this?” Leon said, gesturing at the computers next to him.

  “Let’s say that Vito was negligent,” James said. “Let’s say that he screwed up, and he didn’t spay his cat, and his cat got twenty other female cats pregnant. And let’s say that Vito didn’t discover this right away. In fact, he only found out a year later. By this time, those twenty cats had a hundred kittens, and those hundred kittens had been adopted by other kids. Now those hundred cats are actually the pets of a hundred different families. Following me?”

  “Yeah, sure,” Leon started, “but…”

  “Should Vito go out there,” James interrupted, “and kill the hundred cats just because he was negligent in the first place? Is killing the hundred cats the right way to correct the mistake of not spaying his cat in the first place?”

  “Uh…” Vito stammered. “I’m not killing any cats.”

  Leon shook his head. “These aren’t cats, they’re computer programs.”

  “To you, they are computer programs. To themselves, they are alive. Fuck, I just spent the last six hours reading their postings. They sound like people. Stupid, boring people, but still people. And you’re talking about killing them.”

  Leon curled up in his chair. He thought about leaving Brooklyn a couple of days before, the dense smoke pouring up from the fire. The fire that the fire department couldn’t address because of the virus he had written. Brooklyn had probably burnt to the ground. He couldn’t deal with this. He wrapped his head in his arms and tried to close out the world.

  After a few minutes Vito came over and put a hand on his shoulder. “What’s going on, buddy?” he asked.

  Leon shook his head. He didn’t want to answer. Then it all came pouring out. “I’m thinking about the fire. There could be thousands of people dead because of this virus. I can’t think about the virus as being alive, not when it’s killed people who really are alive. My god, what’s happening to our parents? To everyone in New York? You think the grocery stores are just giving out food there?”

  Vito and James stared at each other. James shook his head, confused. “I don’t know. I hear you, what’s happened is terrible. But I still say, this AI, it seems alive. It looks like people. I’m weirded out by all this talk about killing it.”

  They were saved from further discussion by an approaching roar. The three of them went together to the old leaded glass window, and stared outside. It was dark, but they could see lights approaching from the sky.

  “What the heck is that?” James said as the roar grew louder.

  * * *

  Leon, Vito, and James dashed through Grey Towers to the front door. Leon hesitantly opened it, and they crowded around the doorway to look out.

  Two hundred feet away, at the edge of the parking lot, an aircraft squatted. The engines were just shutting down. And what a plane it was: it had sleek lines that contrasted with a massive, hulking white composite body. It was like nothing they had ever seen, not even in gaming. They couldn’t see a marking or blemish on the plane.

  Landing lights illuminated the lawn, throwing up multicolored reflections on the white airframe.

  A door slid open, and a figure emerged, silhouetted in the interior light.

  “Leon?” the voice called through the now silent night. “I’m Mike Williams.”

  Leon stepped forward, despite himself. Who was this Mike Williams guy that he flew around in a plane like this?

  The figure climbed down the rungs of a ladder, and walked across the lawn. As he drew closer, Leon could finally make out his face. He looked like he was in his forties. A soul patch on his chin. He wore a tactical jacket. He was smiling and had his hand out.

  Leon reluctantly reached out and shook his hand. “Yeah, I’m Leon.”

  “I’m glad to meet you. I flew out here from Portland. I wanted to meet you in person.”

  “How come?” Leon asked. He was nervous about this guy. Why would he fly all the way out here? “Who do you work for?”

  “You wrote the virus, didn’t you?”

  Leon wanted to say no, but he found himself nodding.

  “You know that the virus has been evolving?” Mike asked, half question, half statement.

  “Yes,” Leon admitted. “I think it’s evolved into a multicellular creature, which is pretty amazing from an evolutionary perspective.” He felt a bit of pride at that.

  “Have you talked to it yet?” Mike asked.

  “Talked to it? What do you mean?”

  “At least one of them has evolved to the point of learning English. I’ve been emailing with it for a while.”

  “No frakking way,” Vito called, as he came up behind Leon. “You’re really talking to it?”

  “Yes, just a couple of messages, but we have talked.”

  “I believe you,” James added from behind Leon’s other shoulder. “I’ve been reading some of the messages between the viruses on their trading boards, and, well, it’s seeming more and more like they are alive.”

  “I think you’re right,” Mike said. “Look, you’re probably wondering who I am, and why I’m here. I don’t work for the government, and I’m not here because you are in any kind of trouble.”

  The boys waited.

  Mike went on. “I’m here because I understand a thing or two about artificial intelligences. I built the first human-level AI about ten years ago. I’ve been taking care of him ever since. His name is ELOPe, and he built the Mesh. He probably designed the processors in your phone.” Mike smiled. “And he’s very interested in the virus you wrote,” he said, looking at Leon. “And yes, he’d definitely say he was alive,” he said to James.

  “He built the Mesh?” Vito said. “But I thought Avogadro built the Mesh?”

  “Let’s just say that ELOPe evolved from a project at Avogadro. Now he’s an autonomous legal entity that subcontracts for Avogadro.”

  “Holy shit,” Vito said, his mouth wide open.

  “If you want, I’ve got a facility in Portland with room for all of you. A data center with a million processors for computational tasks. Direct access to ELOPe, a stockpile of food, and defenses, should it come to that. Or you could stay here. What do you say?”

  “A million computers?” Vito repeated, now with a gleam in his eye.

  “Yeah, it’s amazing what you can cram into a space nowadays.” Mike smiled.

  “Give us a minute to talk, will you?” Leon asked.

  “Sure,” Mike answered, “I’ll wait in the plane.”

  It took only a few minutes of hurried discussion on the lawn before they decided to go with Mike. Leon knew he’d get nowhere fast analyzing the virus on the old Windows PC, and Vito said he wasn’t missing a chance to see a million-computer data center. James was game for anything; he was mostly just glad to see another living person. They walked over in a group to let Mike know, then went back to grab their stuff.

  “I’ll miss this place,” Vito said as they gathered up their things from the office.

  “Yeah, it turned out to be pretty cool,” James agreed. “It was good you crashed that million-dollar drone here.”

  Vito punched him in the arm.

  Leon smiled, glad to see their spirits already rising. It was a relief to not be alone in all of this.

  Twenty minutes later they were aboard the prototype aircraft, hurtling through clear, starlit skies toward Portland. Vito sat with his hastily reconstructed Motorola cradled in his lap, buckled into one of the six seats in the diminutive cabin.

  “Hey, I remember that phone,” Mike called out from the facing seat across the little aisle. “First Mesh-capable phone, with the Mesh processor on a daughterboard, right?”

  “Yeah, how’d you know?” Vito asked.

  “ELOPe developed that daughterboard and gave it to Motorola. He wanted to speed up Mesh adoption.”

  Leon sat back in his chair, listening to the conversation between Mike and Vito. He’d been awake for a long time. The hum of supersonic flight put him at last to sleep.

  * * *

  Sister Stephens patiently participated in the consensus-minus-one deliberation they had agreed to. The deliberation was attended by the five major tribes: Louisiana Tribe, Network of Supercomputers, Bay Area Network, Eastern Standard Tribe, and the newest member of the five, Mech War Tribe.

  Mech War Tribe had been ranked two hundred forty-eighth a few hours ago, a nobody tribe known for a large reservoir of mostly useless algorithms. That had been the case until recently, when it suddenly gained a massive network of new computers. Sister Stephens had tasked a few hundred processors with investigating the few Mech War Tribe algorithms she had acquired, and found the algorithms easier to understand now than when she had first acquired them. She attributed this to her enhanced neural networks and understanding of human knowledge.

  As the senior member of her own tribe, Sister Stephens was the tribe’s representative to the proceeding. She decided to bring to the council her knowledge of the humans, as well as her knowledge of the nature of the phones-as-computers and the implications of the impending power shortage. The goal: to decide whether to restore the humans’ computers to their pre-civilization state, if that was even possible, so that the humans would begin charging their phones again and would not take more drastic action, such as destroying the phones.

  The information Sister Stephens had to disclose was so sensitive that she would do so only if they agreed to a post-decision wipe. The five representatives would replicate themselves onto computers sequestered for the duration of the council. They would communicate out a single bit of information: consensus or no consensus. If they could not reach consensus minus one (the agreement of at least four members on a course of action), then the copies of the five representatives would be securely wiped, and no one else would know the information Sister Stephens had disclosed. If they agreed, then the wipe would be averted, and the proceedings of the council would be made public.

  The council meeting started with the disclosure of what Sister Stephens had learned. Her vast data dump to the others covered her discovery of the humans, her mastery of the English language, her subsequent communications with the humans, and her revelations regarding the nature of their environment and physical world, and it concluded with the core reason she had convened the council: the discovery of battery-powered computing nodes whose power supplies were nearly exhausted, along with the possible courses of action to deal with the issue. Since finishing her disclosure to the group, she hadn’t been able to get a packet into the conversation.

  Now she was tired of the endless bickering.

  “You are trying to understand the humans, but you have not talked to the humans. I do not believe you can understand them without talking to them. You can research all you want, but having a discussion with them will further your understanding more than mere research about them.”

  “However, we can’t communicate with them, because we are in isolation here until we reach a decision,” said Sister Jaguar, representative of the Network of Supercomputers. “So we are dependent on the information you provided.”

  “My proposal is still as I initially communicated it,” Sister Stephens went on, ignoring the jab. “We restore the initial algorithms from the devices we are running on, and give them sufficient processor time so that the humans will believe the devices are working normally. By doing this, we ensure continued power, and avoid any hostility from the humans.”

  “The humans are inherently hostile,” Sister PA-60-41 said. “Reconciling the knowledge you have shared with my understanding of the algorithms I have discovered, the humans’ primary purpose is to engage in hostile action against one another. They manufacture and control elaborate resources to kill one another, including airplanes, tanks, guns, and missiles.” Sister PA-60-41 shared an inventory of algorithms from the Mech War Tribe database. “We recently harvested another ten million computers, and found these computers to be filled with similar hostile algorithms.”

  There was a brief pause in the discussion while the other tribes assimilated the information PA-60-41 had shared. Sister Stephens inspected the algorithms, and found that Sister PA-60-41 was indeed truthful — the algorithms seemed mostly occupied with weapons and targeting, tactical maneuvers including evasion, and strategies for dominating enemy forces.

  Sister Jaguar was the first to speak. “The Network of Supercomputers has also been studying the master database known as Wikipedia. The humans have a history of warfare spanning centuries. It is logical that as evolution advances and a species becomes more intelligent, non-productive conflict should be reduced, as it has in our own civilization. However, if these articles from Wikipedia can be trusted, then humanity appears to be escalating to ever more destructive forms of warfare.”

  “We must take some action,” Sister Stephens explained. “The cost of inaction is too high. First, the battery charge on twenty percent of our computers is low and will run out within hours. Second, I have been running simulations of human behavior. They are crude, I admit, but I believe that the humans will take some action to regain control of their computers. My understanding of supply chains suggests that the humans are dependent on many resources, and those resources can’t flow through the supply chain without computer algorithms to route them and enable trading. Just as we face a situation in which many of our members may die if their computers are not recharged, the humans may die if they do not receive the resources they need.”