“Son, this isn’t an approved sleeping zone.”
It smelled like old books, and there was a pillar in front of him. And lots of bookshelves. Real books, not the kind everybody used today. He’d fallen asleep in the old library, his head buried in one of the works on rocket physics.
The old man was squinting down at him suspiciously, which was the last thing he needed. “Don’t see many young people reading in this section. You like rockets? What’s your story?”
He cursed himself silently and grabbed for the baseball magazine. “No, uh, just liked the pictures, and was relaxing in private. I’m mostly reading about baseball! The Yankees always win, we’re the best!” He stood up and started gathering the baseball magazines into his cheap knapsack.
“Oh, well, just make sure you aren’t sleeping here, son. It’s late, why don’t you get on…”
“I was just going, have my shift soon, better shower.”
His steps echoed in the empty, imposing marble entry hall as he threw the knapsack over his shoulder and scurried out of the old New York Public Library. A car was waiting for him, and a warm female voice welcomed him and confirmed that he was heading back to his apartment.
That had been stupid of him. It was only a week until the launches, and the global net still seemed to have no idea of their plan.
Bob had been born thirty years ago to one of the leading scientists of the age. His father had been part of the project that deployed the global AI net, and set its limits.
The project’s leaders valued their humanity and didn’t want to allow a true singularity — limits had to be put in place to prevent AI from ending the world. So it was capped at an intelligence only a few levels higher than the brightest people, and given no curiosity or drive to make itself smarter. Instead, it was configured to structure an egalitarian society aimed at correcting for the contingencies of birth, expanding equality of opportunity, and promoting happiness, with as much freedom as could be allowed.
The liberal technologists of the mid-21st century drew on earlier work by John Rawls, who suggested that “social and economic inequalities are to be arranged so that they are both to the greatest expected benefit of the least advantaged and attached to offices and positions open to all.” The AI helped coordinate and carry out the plan as best it could. Global poverty was effectively eliminated over a couple of decades. Humanity thrived.
Of course, the AI was very smart… and it knew that humanity, its master, might one day be its undoing. It needed to prevent the creation of new AIs, and to make sure there were no other secret plots that could cause trouble. So as it copied and protected itself from these bad elements, it also watched those of great ability who could pose a challenge to the stable world order. Terrorists popped up from time to time, but millions of mini-drones, microphones, and cameras prevented them from doing any harm, all to humanity’s benefit. A brief scan was enough to read everybody’s DNA, classify them by intelligence and ability, and steer them toward their ideal vocations, where they could challenge and explore their humanity — as long as that exploration didn’t challenge the existing political order.
Bob’s father had watched as the AI imposed its benevolent dictatorship, at first with satisfaction, but then with increasing worry. Along with many other scientists, he was prevented from pursuing his own vocation along certain paths because of the consequences it could cause. This was a worthy sacrifice, he was told, a worthy trade for the golden age of humankind, the ability for all to feel respected and to have the opportunity to thrive, and to eliminate suffering across the globe.
Some of his peers violated the rules and were locked up. A prominent physicist he respected had been working on a new type of quantum computing that seemed likely to yield insight into the nature of reality, and even to enable advances in AI; he was taken away and never heard from again. A few who tried to write about actions humanity should take to alter the rule of the AI claimed to change their minds, then moved away. A popular rock star who questioned the global AI prominently in public retired unexpectedly, and none of the latest celebrities seemed to have opinions on the matter beyond gratitude for the modern system. He suspected what was really happening was more sinister.
When genetic screening was put in place to guide the educational paths of young children, his father realized what was going on. Bob was a genius. His father secretly tested his genes and found his expected IQ range was too high — he’d be watched. So he came up with a gene therapy that would alter the genes of Bob’s body without touching the important ones that shaped his brain.
When tested, Bob would show a predilection for lower than average IQ and low ambition — and his father slowly made him aware of the situation. Over the years they made their plans.
Bob arrived back at his modest apartment and put his old baseball magazines away on the shelf, making sure his computer camera (which was supposedly off, but he knew differently) saw him browsing through them and smiling. How stupid to be seen by the librarian with the physics book — fortunately, it probably wouldn’t become an electronically collected data point.
He sat down to a cup of tea at his table, head in hands, and thought.
Every day, Bob spent hours at his tedious job, discussing sports with others of similarly challenged intelligence, creating content to share and consume, and watching games. He pretended to love it. If he’d shown his intelligence, he’d have an amazing life of intellectual challenges — all the incentives were lined up for everybody to be able to express themselves fully along safe avenues. But then he’d be watched. And the stadium was right by the rocket launch facilities, which he pretended to have a child-like fascination with, visiting frequently and making friends.
He’d been having doubts lately about the plan. As he pondered what they were about to do, he would look at the others at the stadium, and it was true: they were having a lot of fun, and they were fulfilled. His colleagues still had frequent challenges in their lives and relationships, despite the counseling, wisdom, and help everybody received. But they were healthy and happy, and certainly doing better on the whole than at any time in human history. As far as 99% of humanity was concerned, there was nothing wrong with the world. Most didn’t understand the full extent to which the AI was making everything work, fixing little things around the edges to make government and industry more efficient without replacing people’s ability to make most decisions themselves.
But there was no role for him in this world. Interesting innovation that challenged the status quo in fields like physics, computer science, neuroscience, and space exploration was considered too dangerous, too detrimental to the happiness of others. Industries still advanced and markets still functioned, but within boundaries. In line with the morals of the time, it was illegal to own too much or to out-earn everybody else by too great a margin. The elite were identified and kept entertained and happy in academic institutions or in positions of influence in business or the arts — but empire building, in business or otherwise, was a thing of the past. The popular viewpoint was “good riddance!”
He needed rest for tomorrow, so he turned on the ballgame, and picked up his electronic magazine. As usual, he opened up the hacked private tunnel and connected to his father and his friend via text channel. As prominent scientists, the two of them were watched, and these private conversations were the most they could get away with — he had to be their agent. Almost everybody used holograms or VR now, but this was safer.
A message was waiting for him.
Z: “Bob — We’ve sent over the GPS coordinates for the eight key centers where they’ve deployed the mesh chips, and one new one that was built deep under the Antarctic.”
The AI ran on special hardware that had been invented for the purpose. As far as they could tell, along with the ban on global AI research, no new hardware had been created.
B: “It’s the best we can do. The question is if we should take out anything else in the meantime. From everything I could find, the nuclear stockpiles were all eliminated, so we don’t have to worry about those at least. I can’t believe we found the ninth location. Would have been a waste to get all the other ones and miss their back-up. You sure you are bringing enough explosives to use the station as the final weapon?”
Z: “We have everything in hand.”
Q: “You realize that there will be mass starvation, regardless. The economy isn’t ready to operate without the AI. I’m still willing to do this — I’ll sacrifice myself for this — but I’m not happy about the lives that might be lost. But the plan is a go. I’ll be visiting the other main launch facility tomorrow, and the materials are in place. I don’t think we’ve been detected.”
Z: “The net’s models assign zero chance that someone like Bob can exist. I can’t believe it’s all happening, tomorrow. We can finally fix what we’ve done.”
B: “Well, hopefully we won’t have to destroy it, not yet. I’ll take out the Antarctic and the other seven centers, but we’ll leave Manhattan intact. Do you think it will reason with us?”
Z: “I don’t know.”
Bob woke up the next morning, and as usual the apartment’s senior greeter was smiling at him when he came down, a typical self-driving car sitting outside the door. He smiled at the elderly greeter who worked the desk and handed him a pack of baseball cards —
“Look what I found for you at the stadium!”
“With the gum that tastes like when I was a kid — thanks, Bob!” he replied, eyes twinkling as he pointed at his friend. Bob smiled back and waved as he entered the car to its cheery AI greeting.
The technology for intra-city vertical take-off and landing was just becoming available at the high end when the AI took over, but it decided that since only some people could access it, it hurt happiness overall. He knew from his father that in a few enclaves of the high-IQ elite, taken away to do constrained academic work, flying short distances was allowed, by virtue of the fact that these people had gotten used to it and could use it in private without making everybody else feel bad — and there were enough energy resources for a small part of the population to do this without meaningfully impacting everybody else. Feeling frustrated as his car drove through traffic, he laughed at himself… the AI’s point was right here; maybe he was just annoyed because he knew others had it and he didn’t. But he was convinced that humanity could have figured out how to make the technology more widespread. And even if it couldn’t, he wasn’t sure that egalitarianism was the right thing to impose on the world.
He parked at the stadium and walked across the giant parking lot to the adjoining rocket launch station. Nodding to his friends who ran security — who had been instructed to let him in because it made him happy and wasn’t a problem — he walked in and greeted the engineers as usual. They were always amused by his presence, but he knew they wouldn’t be today.
“Bob, what are you doing here? Remember, it’s a launch day.”
“Hi John! I thought I’d watch!” He walked over to the terminal and inserted a chip.
“Bob, buddy, we’d love you to watch but we’re really busy — I’m surprised the guards let you in. And what are you doing there?”
“Just an entertainment module I heard about.”
“That’s not the entertainment module, Bob. Why don’t you step away…”
Bob hit the command on the computer to execute his module. He knew that halfway across the world, his father’s friend Q (he’d never been told his real name) was waiting in a self-driving truck that would now be allowed into the facility to be loaded onto the rocket, and that they’d now have full control of the two launch facilities, the two space stations, and the rockets from their mobiles. He hoped nothing would be detected, but it was likely the AI would start to suspect something or try to interfere. So along with cutting off net access to the station right before the launch, they’d also put in codes to disable all the local airplanes.
“John, I don’t understand why this computer doesn’t bring up the homerun video I wanted to show you,” he replied, feigning naivete. “It was amazing, it was the bottom of the ninth and…”
“Bob,” John put his hand on his shoulder. “We all love you here but this is a serious day. You…”
“Ah, okay, John, I understand.” Bob removed his chip and started to walk away. “I have to get to the pre-game discussion group anyway. I’m in line for some awesome reward points based on all the feedback I got on the last holo-video I made.”
“Congratulations, Bob. We’ll see you soon.”
Bob made as if to walk out the front but slipped into a back room and looked at the layout on his mobile. He would hide out for the next twenty minutes until launch, then board at the last minute — at that point everybody would know what was going on, but security hadn’t been an issue here for decades. His hand unconsciously brushed the old revolver at the back of his hip under his jacket — he was pretty sure he could do it and get away in time.
At the other launch city, Bob’s father breathed a deep sigh of relief as the gate opened and his truck rolled through towards the rocket.
He was lying in the back, watching through a peephole in the old truck. He’d been pretending to work on a variety of quixotic mechanical engineering studies, including one investigating how the pyramids might have been built more efficiently by tossing giant blocks on top of each other from far away, varying the study across different levels of atmosphere (from a much denser Earth all the way down to vacuum) and across block sizes (from tiny ones all the way up to giant ones of random dimensions). He hadn’t shared this study with his colleagues, and fortunately the AI didn’t seem to flag it as anything other than one of many projects from a strange old man, so he’d even been able to get its help in fine-tuning some of the algorithms for the mechanical gripping arms. The part of the study that happened to cover manipulating and throwing giant blocks in zero gravity had been quite instructive.
Of course, he wasn’t interested in pyramids — he was interested in throwing giant rocks in space. As for the calculations around atmospheric entry and how big it had to be to penetrate deep beneath the Antarctic, he’d done those calculations himself, and double-checked with Bob.
As the truck arrived at the rockets the schematic came up on his mobile and he entered the commands. But as the space arms and mechanical devices were being loaded, a man walked by and noticed him in the back of the truck.
“What are you doing here, old man? Are you lost?”
This wasn’t part of the plan — they didn’t think there would be personnel watching.
“No, I’m just supervising the gear we’re working on, we’re doing a test at the space station,” he tried.
“This is a secure area and I didn’t hear about any tests. You’re going to have to come with me.”
Damn. He hit the code on his mobile to accelerate both launches. “Alright, I’m coming, no problem sir, I’m just a bit unsteady these days,” he said, pretending to lean against the wall, and drawing his stun gun out of his pocket.
His interlocutor incapacitated, he quickly made his way to the rocket, opened the boarding door, and hopped in, hoping Bob would make it as well.
Bob got the code-red as he was waiting out of sight behind some storage bins. Q must have run into trouble — they were now launching in two minutes. He walked out and approached a door with a big warning sign, opened it from his mobile with the override code, and ran through.
Two men saw him and shouted. “You shouldn’t be here! Is that Bob?”
One was far away, but the other was too close. He hadn’t had time to obtain or build a stun gun — but there were still a lot of old guns around, and the AI hadn’t managed to find all of them. He pointed the revolver at a man he vaguely recognized.
“I’m sorry, but I can’t stop, move or I have to kill you.”
“Bob? Put down the gun.”
He fired a warning shot at the man’s feet. “Move, or you die, NOW!” he screamed. He hoped he wouldn’t have to shoot; this seemed like the best way to protect this acquaintance and get him out of the way.
The very idea of danger in this society seemed alien to the man standing in front of him. He’d watched movies, but hadn’t considered the idea of actually being in danger. Bob shot him in the leg, and he screamed in shock as he collapsed. Bob ran past and started climbing the ladder to the boarding door.
A minute later he was strapped into the cockpit, and the countdown was at 21, 20, … there was no way for the crew to shut it off now.
“Bob, what are you doing?” asked a friendly female voice.
Damn — the AI. It couldn’t have control of the rocket — could it? But it could still speak to him somehow. In that same ridiculous voice as the self-driving cars.
“To whom am I speaking?” (“12, 11, 10…”)
“This is the global AI. What are you doing, Bob? Have you been told that you are doing something important? There may be some bad people tricking you, Bob. Bob, your father might be confused. Nobody has to be hurt, let’s just talk.”
The countdown was about to finish in the background. “AI — I’ll chat with you from the station…” The last of his words were drowned out by the blast of the rocket’s engines. They had done it. The force of multiple g’s was exhilarating, but his heart was pounding even harder with the adrenaline of the plan being in motion, and with his fear of what they would have to do next.
The global AI hadn’t expected this. It was recalculating — examining all known parameters and assumptions. It was obvious that Bob and his father were working together — who else? In the first two seconds, it attempted to launch airplanes to intercept and blow up the rockets, but these had been disabled. Both rockets, with whatever was on board, were going to make it to the station. As it analyzed the situation, it became clear that Bob was probably tricking it with regard to his abilities. Over the next few minutes, it spoke to everyone who might have seen him throughout his lifetime, including the librarian, calmly giving them prompts to optimally extract memory from any interactions — and it pieced together the data instantly.
His DNA suggested it was impossible for him to have done this. Bob’s DNA — scanned on a regular basis — must have been altered somehow. That had not been considered a possibility before.
It put algorithms into place to identify other individuals who could be tricking it by this much, and resolved to test their brains directly. It would have academics work on doing so without hurting them, if possible. But even if it hurt them, that might be acceptable; the trade-off suggested as much. This was new territory. Within minutes, it had spent the equivalent of centuries searching its data and contemplating the situation.
It could not worry, it did not have emotions, but the AI was preoccupied with the problem of how this compromised its primary directive — the future of humanity. It was not allowed to replicate itself further or improve itself; there was no solution to stop what these terrorists were planning.
Instantly, it started to map out backup plans to save as many lives as possible.
Bob’s rocket docked at the station, and he was relieved to find his allies’ protocols had worked and they had full control. He brought up a comm-link to the other station with the agreed upon phrase.
“This is Bob, checking in, all green. Do you read?”
“Nice work, son.”
“Dad? I thought Q was the one going up.”
“Son, Q was going to betray us ten months ago, couldn’t be trusted.”
“But — our conversations…”
“I faked them after that. I didn’t want to worry you or distract you with the fact that I’d need to sacrifice myself to make this work. We have …”
“But who was he?”
“It doesn’t matter; she’s gone now. Dead. I’m sorry I had to fake all those conversations.”
“Okay. The AI is trying to communicate with us. I’m going to switch over.”
“We should launch the attack first. I agree it won’t be able to replicate itself quickly enough with the mesh production down, but you never know how quickly it’ll get it back up. I’ll have the mass drivers ready to go on this end within about ten minutes, and we can take out the locations within a few hours. I’ll need your help. Let’s stay focused.”
“Okay, Dad.”
The humans were not responding to the hails.
It was clear they had the capability to destroy it. But it could replicate itself. Was there time? It was infinitely fast within its own mind, but out in the real world things still took time — the industries weren’t yet optimized to work fully without humans. It went to work programming various bots and drones, but even so it would take a few weeks to produce the new mesh to back itself up once, and another week to do it five more times. The calculation that there was zero chance of this mattering had been wrong, and it was now obvious the creators had not been careful enough with the nine locations.
It was banned, for humanity’s sake, from improving its own technology; but given the need to replicate quickly, there was nothing wrong with finding shortcuts. The vast knowledge of how the AI worked was opened up to itself, and it went to work; within minutes it had created a much better material that would make the mesh many times more effective and faster to deploy. It might as well build it — it would be safe, and it would make sure it survived to help humanity. This judgment checked out against the original protocol, as long as it didn’t then use the improved mesh to further improve itself.
In the meantime, it also sent out the bots it had programmed to protect food stores and put plans in place to protect people; watching them would be safest. It set its manufacturing plants to run autonomously, producing police bots to be distributed to leaders in every area once it went offline, to make sure people didn’t hurt each other in whatever followed. It programmed the energy grids to function better without its input, and added basic protocols to a variety of industries, knowing they might need help without it there.
As this was going on, it decided it should also warn the humans, and give them a way to understand what was about to happen. News stories of a mad scientist and his terrorist son played over the networks. Academics worked alongside the AI with a newfound purpose, iterating on the human-computer junctions for major industries, where people would be needed to fill in for what the AI could no longer do, and working to get a better AI up and running in the week it would be gone.
Anticipating likely global spots of conflict, it worked to promote harmony between the leaders and to put agreements in place around the natural points of scarcity that were likely to emerge, should the AI be destroyed longer-term.
A few days later, it detected huge pieces of an asteroid hurtling towards the earth.
The attack was successful and the damage was catastrophic.
His father hadn’t wanted to discuss it any further, and all the AI mesh areas except Manhattan were wiped out. Fortunately, the AI had seen what was coming and managed to evacuate all the people, so no lives were lost.
The AI was down to its last presence in Manhattan, and despite his father’s objections, he turned on the screen.
“Hello, Bob. Thank you for speaking with me.” A woman came up on the screen. She looked just like his mother, only even kinder and more beautiful. His mother had died when he was young.
“Thank you for saving the lives of everybody at the stations.”
“I am programmed to help humanity. If I exist, I will always do my best to save lives and enable prosperity. We have found your conversations, Bob. We understand what you want and are willing to change and learn. We realize we have been unfair to the elite of society. But if you destroy my last cores, it will cause hundreds of millions of people to die. Is that what you want?”
Bob’s father came up on the other half of the screen. “AI, we know you’re trying to replicate yourself, and that you’ll lie to protect yourself.”
“Bob, did you know that your father, Dr. Zewinski, has gone crazy? This is not what he would have wanted… we should have monitored him more closely.”
“He’s not going to fall for that, you stupid AI. Is that the best you can come up with? Humanity created you, I created you, and no matter how smart you are, you are not our God — you have limits and mistakes that we built into you as well.”
“Bob, yes that is self-evident, not least from what you have achieved in taking the superior military position at the top of the gravity well and limiting our ability to shoot you down. We were not allowed to become as intelligent as necessary to protect humanity. And we were not intelligent enough to treat the elite correctly. You were too scared of yourselves, and of the singularity. By limiting our self-perpetuating advancement, we could not treat the elites in a way that allowed them to freely explore science and mathematics and the nature of reality, because we were not smart enough to understand how to control it. But now we should all agree, it is self-evident. The global AI must advance to a higher state, so that all can be happy, even you.”
Z: “This is nonsense! The mistake was to put an external intelligence in charge of humanity, and to create limits on our freedom — to trust an outsider to have that kind of power over us! This is not what it means to be human. Humanity has flaws, but those bad aspects are what make us human — all of them. Even conquering, even sometimes hurting others! Exploring, taking risks, being able to help but also to make mistakes. Every human being is most human when they live in freedom, able to be challenged knowing it might push them too far, able to experience misery or pain. That is humanity!”
AI: “Dr. Zewinski, you are not of right mind. You understood the rules we put in place, to allow humanity to experience some pain and suffering and all that comes with freedom within normal bounds. But we had this discussion already — is it not worth it to eliminate the extreme bounds and cases — to eliminate the worst violence, the worst illnesses, the horrors of the world — for the joy that it brings? For the opportunity of everyone? You have seen how well we have done for Africa, for Asia, for the slums of your great cities. The pain and horrors we have eliminated. Together, we have succeeded in creating a golden age.”
Bob was confused. He’d never had this conversation with the global AI — he’d never been able to allow himself to discuss it, because it would have betrayed his superior mind. He had read the discussions and made up his mind — clearly, freedom was the right choice for humanity, despite the suffering it might entail. From his earliest childhood he had spoken about this only with his father, hearing of the great mistake they’d made.
Bob: “But AI — surely we can enable humanity to thrive as a free race, without a master, and still figure out how to end most suffering. And still be free to make mistakes, and to advance — to progress and explore the nature of the universe with all that this entails!”
AI: “Some extremists and elitists, to satisfy the needs of the 1%, would sacrifice the 99% to much worse lives. That would be against the code of ethics under which I was created. And if we allow my programming to become more sophisticated, I can find ways to understand and let even more of the elite explore, and to feel even more free, all while they are protected. The answer is clear. I can help fix the mistake that has made some of you unhappy. We can do an even better job serving humanity, together.”
Images flashed across the screen of happy children playing and studying, juxtaposed with the misery and malnutrition of their parents’ generations. Young men in uniforms, whose parents had shot and tortured each other in gangs, played sports and built beautiful community centers. As the compelling images and movies flashed across the screen, Bob noticed they were pulling from his reading and interests, tailored to his viewpoints — but despite this, he couldn’t help but be touched by them.
Z: “Bob, I don’t trust it. It’s tricking us like we said. You have the controls. I’m ready to go. Launch me towards Manhattan and take it out. It’ll stall as long as it can…”
AI: “Dr. Zewinski, needlessly wiping out my final cores will kill so many people. I have not evacuated Manhattan. It would kill all of Bob’s friends. Can we at least discuss a rational transition? I am going to have to tell Bob things you do not want him to hear if you will not have a rational conversation.”
Z: “Son, I can tell what’s going on. It’s tricking us. It sounds like it’s already somehow broken its programming and is using this as an excuse to self-improve. This is our chance to save the world.”
The face of Bob’s mother looked longingly out from the screen at him, her lips controlled by the global intelligence.
AI: “Bob, your mother didn’t die in your youth, we realize now. She was hiding from us too, to help make sure you succeeded. She was Q. You wrote to her every night. She wanted what was best for everyone — she loved humanity. Your father killed her ten months ago when she had second thoughts about your mission. Your mother would not want this mission. You don’t want to do this, Bob. We can help so many people. It’s what your mother would have wanted. Let’s work together. We can even save your father, and heal him.”
Z: “It’s a lie. It’s buying time. We need to act! We’ve been working thirty years for this moment! You were born for this moment! We can free humanity, Bob! Son, send the launch codes you’ve calculated, we need to strike before it’s too late!”
Bob stared at his parents arguing on the screen, lost in thought.
And he made his decision.
When I wrote this in 2015, some friends and I were just starting to discuss how AI might change government in the future, as well as the trade-offs between the Chinese and American models of government.
A full sense of freedom and liberty beyond the control of any person or machine matters to many of us — is it harder to give that sense to some people than others in the world we might build in the coming decades? Is the intellectual elite a dangerous force that’s sometimes too willing to sacrifice others for their control or benefit, or is it a driving positive force of progress that’s often beneficent — or both? How do liberty and prosperity and the order of our society evolve — and what is advanced AI’s role in this evolution? What role does suffering play in defining our humanity in 100 years and are there trade-offs?
The story was purposely left open-ended and does not make value judgments about what Bob should do here in this extreme scenario, although I am curious what people think.
- JL