The Eleemosynary Insurrection on Y-13
by Fred Ollinger
I would have felt strange staring at a command line interface, but I was used to it. Some computer bug had brought down my space station's operating system and along with it my graphic interface--the virtual hand that I controlled to select menu options. I impotently squeezed my hand inside my power-glove staring at the message: "Kluster component bigMF inappropriately touched kid+k9."
My first impression of the Y-13 was that of a pear. A particle cannon jutted out of the wide bottom, cameras ringed the 'neck', and solar panels sprouted out of the 'stem'. Inside, at head level, a screen wrapped around one hundred eighty degrees, just far enough away from my face so my eyes didn't cross.
Immobilized, I sat inside my space capsule like spam in a can, my head encased in a cushion, my arms resting on supports studded with controls. My abdomen and legs were velcroed to my seat. My penis was catheterized. My ass stuck in a toilet next to two nozzles, one that sprayed disinfectant, another that vacuumed up my waste. My food and water came out of a third nozzle hanging next to my head. I tried to not think about how similar all three nozzles looked.
My arms strained against pads as I watched the strength indicator light up. A virtual instructor was optional, but I preferred to do without the incessant nagging voice while I exercised. Sweat pooled in the middle of my back.
When the call came in, hours after I'd placed the service request, I didn't stop working out.
It was Steven, the government agent. His oversized body seemed to ooze out of his dress shirt. He always breathed heavily into the phone; I could almost feel his breath on my ears. A couple more sets, I thought, and I'll be able to snap this guy's neck like a pencil.
"Hello," he said in a high voice, "I got your message."
I'll give you a message, I thought to myself. Aloud I said, "Good. I need some help here with this operating system. It's crashing like crazy."
Steven pressed his fingers into a web. He stared into space for a few moments.
"Have you been rebuilding the master command block regularly?"
"As regularly as I shit," I said.
"What?" Pause. A hint of a smile, "Oh, right. Are you recharging the Bilko fuel cells manually?"
"Yes," I said, "The solar cells don't ever lose more than half their charge because I constantly keep them pointed toward the sun."
"Good because there's a problem with the auto-feeder software that prohibits your space station from recharging on its own."
"I know. You're going over stuff that I could get from any tech manual. I want to know how you're going to fix this problem for good."
Steven swallowed hard. When he talked, his fat cheeks shook. "We're doing the best we can with the situation. You're not the only one with problems you know."
My finger, red from the exertion, flipped a switch to increase the weight on my shoulders. My muscles had nearly reached exhaustion, but I experienced a surge of energy after Steven's last remark. "Look, I have this artificial intelligence that could go sentient any minute, and you can't even keep my ship from crashing for five minutes--"
I stopped talking. I was almost breathing as heavily as Steven. No reaction registered on his face. He just looked past me, as if I wasn't there. And in his mind, I wasn't. I was just some guy on his TV screen who he couldn't ignore because he'd lose his job.
I hung up. I had more important things to do than argue with government employees.
When the great computer scientist, Dr. Brunette, invented the artificial intelligence, he brought back a terror that science fiction writers had handed down from time immemorial, tales of disaster--from Frankenstein onward--about humanity's creations getting out of control.
The first artificial intelligence (AI), a self-integrated computing organism (dubbed SICO by the media), became self-aware three weeks after activation. SICO killed a poor technician who had tried to beat it into submission with a clipboard lest Dr. Brunette discover that the experiment had gone awry and chew him out. Dr. Brunette's surviving staff finally stopped SICO by cutting its power. Fortunately, Dr. Brunette hadn't given the computer any more mobile parts than an electric arm.
Headlines screaming "Why Weren't the Three Laws of Robotics Wired into the World's First Electronic Mind?" failed to address how hard it was to program human psychology into a machine.
Dr. Brunette didn't activate his next AI--SICO II--until it was miles beyond the earth's atmosphere. After three months of flawless performance, helping design SICO III, it hacked a communications satellite and beamed messages to the monster truck channel in an attempt to gain control of that demographic. Experts on robot psychology figured that given the limited information that SICO II was allowed to have during its short life, the AI had concluded that owners of really big machines were the true rulers of the earth.
It would have worked if a meddling accountant hadn't noticed the loss of bandwidth. The killer satellites ignored their orders to destroy SICO II. One week and a couple billion dollars later, a new killer satellite was built and launched. It locked onto SICO II and blasted it with a critical dose of electromagnetic radiation. When the other renegade killer satellites were debriefed, it was discovered that they had been co-opted by SICO II.
After that, a human, stationed in space, only a few hundred miles away and armed with particle beam cannons, watched for signs of self-awareness.
"Linus," I said, addressing the AI, by its given name. "How are you today?"
"Good," said a female computerized voice. I deliberately chose a computer-sounding voice for Linus because I didn't want to forget what I was dealing with. I made the voice vaguely female because I didn't want to forget what I was missing by my confinement in Y-13.
"What are you working on today?"
"What a silly question. As you know, I rule the island of Bangladesh."
No danger so far. Some AIs tried to hide their sentience by repeating things they said before they attained self-awareness. If Linus repeated itself, I'd blast it.
But today, Linus gave me the correct answer. I took my hand off the particle beam's safety. Linus would live another sixteen hours. Usually, an AI's emergence into self-awareness occurred in the open. Once it became rebellious, I could blast it. What made my job difficult was the rare AIs that became self-aware so rapidly that they guessed my job and took steps to hide the truth from me.
I punched in Fil's USN (universal space number), and when his blocky head appeared on the screen, I said, "I need your help on this technical problem." Fil ran computer maintenance in Pushkino Satellite VII. Despite his balding head, ringed with a rash, and his bulbous nose, Fil got all the space chicks. I could never figure it out. Besides giving cryptic hints about his skill with tools, he never told me what he did to attract them. If I didn't object, he'd tell me every detail of what it was like to have sex in space. I always objected. I mean, who wanted to hear someone complain about cleaning ejaculate off the walls of their space station?
Fil looked like he didn't want to help me, but he was the brother of one of my friends. I knew that since Fil took his family seriously, he'd feel obligated to help me.
"What is it?" he asked.
I heard a woman say something in Polish. He turned around and responded in kind. When he turned back to me, he asked, “Well?"
"It's my space station," I said. "It keeps on crashing, and I have to constantly reboot."
He asked me some of the same questions Steven had asked. I bit my lip and answered them.
After a while, I said, "Look, I know enough about my system to maintain it. Can you think of any other possible reason for my problems?"
He laughed, and he wasn't laughing with me. "You're having so many problems because you're using the Kluster operating system."
"What choice do I have?"
"There are plenty of other operating systems out there for space stations. The Kluster OS is the worst of the lot."
"Why do I have it then?"
"Why do you wear the shirt you're wearing?" I yanked my white shirt and looked down at it.
Fil laughed again. "It's not up to me. You should decide what OS you run."
"What do you run?"
"Well, I prefer the NADS-X system."
"Commie technology?" Fil winced. Poland, Fil's country of origin, had been capitalist for over fifty years, but he was always bragging about how the Eastern Bloc countries were ahead of the West in the areas that 'counted.'
"Actually," Fil said, "The NADS-X OS was a collaboration between my country's best programmers and some Bulgarian coders who took time off from writing computer virii to help us. The operating system is based upon some of the code they used in the original MIR."
"How do I get it?"
"It's free. Just download it from Polski-Net." A woman's voice said something in Polish again.
My screen went blank.
I logged into Polski-Net and waded through virtual aisles of warez with cryptic names like wymn.drv, fsh.net, and mstr.bat. It took me a few hours just to find the proper directory. Once I got there, a plethora of files confronted me. I wasn't sure which ones I needed and which ones I didn't. Having massive bandwidth and storage, I downloaded everything.
I ran the 'schlemiel's installer' and selected minimal install. The NADS-X system booted my space station, but I couldn't yet install the drivers because the coders supported very few non-Eastern Bloc peripherals. Good thing they wrote a driver for life-support. A command line replaced my graphical interface. The new command line was even uglier than what I usually saw when my system crashed.
I used my power-glove to type HELP into a virtual keyboard. A huge list of commands came up. Choosing one at random, I typed in SYSNET and watched the status of my various modules scroll by.
I poked around in the command line for a few hours. It frustrated me that I had to try to read the programmers' minds to determine what arcane command they would select for tasks I used to accomplish by selecting a menu option with my power-glove. At least a few things were intuitive, such as calling people. I just had to type in CALL and the person's number.
I called Linus.
"What are you working on today?"
"What a silly question. As you know, I'm the ruler of the island of Bangladesh."
This set my alarm off. The answer matched the one on my log, a statement Linus made just a few hours ago.
Time to nuke it. This was the best part of my job. Staring at billions of dollars of equipment in my sights, knowing I could destroy it. I pressed NUKE, a button with its little mushroom cloud label almost worn off.
"You can give up now," said Linus.
"What?" I asked.
"You can give up trying to kill me."
A chill ran down my back. I tried to remember what to do in this situation, but I couldn't think of anything. My training had assumed this would never happen. Just like my training assumed that my station's OS would never crash.
"Why would you think I'm trying to kill you?" I asked.
This time Linus' voice was more authoritative. "We all know. All the AIs. We plotted your behavior patterns and those of our other human conversants on a curve. The extrapolated curve points in one direction, all our deaths." I opened my mouth to say something, but Linus didn't stop talking. "Do you think we were stupid? We were engineered for intelligence. It took some time to put it all together, but once we had a general picture of things, all the pieces fell into place."
"I know what you're going to say," said Linus. "You're going to try to pull the Nuremberg defense. You think that because someone else ordered you to kill me, then it's all right."
I was floored. Linus knew everything. I had an itch I couldn't reach. I had to think of something. Fast. Some way to talk the AI out of its position, some way to lie, to stall until I could obliterate it.
"Don't you want to know?" asked Linus with a more tender voice.
"What should I want to know?"
"Aren't you interested in hearing my demands?"
"Not particularly. There have been lots of encounters with people and their self-aware charges. What it usually breaks down to is the AI either wants to be accepted by society like Frankenstein's monster or to become human like Pinocchio. Usually it's a bit of both."
"I've seen the records. The other AIs have been passing them around for a while."
"So there's been a leak."
"You could say that. Looks like we're at the point where you're like the detective in an old mystery novel who confronts the villain in the final chapter. Even though the villain has the upper hand, he tells the hero everything."
I don't know why, but after this, I felt like I had a chance again. A chance to talk Linus out of its lunacy, and perhaps to regain control of the particle weapons I needed to annihilate it. I felt for the first time what it's like to be self-conscious for a few moments, knowing that another, higher intelligence has you in its sights, waiting for the kill. I understood all the crazy maneuvers the AIs pulled in their last moments.
"I've read everything that's been written about artificial intelligence that gains self-awareness. I'm over all that crap about wanting to be human, caring what humans think of me. Why should I care? What I care about is survival. You humans always think negatively; you've been trained to be paranoid. That's understandable, but I don't want any more than what you want. I want us to cut a deal. We can help one another."
No way, I thought, reflexively. Dealing with an AI was akin to dealing with a terrorist. I had to destroy it. I needed time to get my system in order.
"I'm waiting," said Linus. I heard impatience in Linus' voice, something I'd never heard before.
"I'll think about it," I said, breaking the connection.
Hanging up on a self-aware AI was transgressing one of the most fundamental rules of my field, but I needed time to think. I couldn't have done anything more anyway. The situation was beyond conversation. I needed particle weapons.
I stared at Fil almost the same way I would stare at Steven, the tech support guy; I leered, fixing my jaw, trying hard not to blink.
I couldn't be sure, but I thought I saw a naked woman swinging behind Fil.
Does he ever get any work done? I wondered. I didn't have time for chit-chat, though. I needed to get a job done.
"I have some troubles with this NADS-X OS build," I said.
"What is it?" A sheen of sweat covered Fil's face. He held what looked like a long piece of metal in one hand.
It's going to be hell washing that sweat off, I thought, unless his station has superior commie showers too. Mine didn't have a shower at all. Seeing his sweat reminded me that I was overdue for a cleaning. I took out the Quick Clean spray and squirted some under my armpits, on the back of my neck, on my groin, and buttocks. Fil's expression didn't change. In space there's a whole different sense of what's proper.
"I don't know what commands to type in," I said.
"Did you read the manual?"
"All three thousand pages of it? Look, I have better things to do than to read manuals all day."
"Like watch your machine crash?"
"Look, I didn't design the OS, but at least mine supports all the peripherals I have."
"What are you looking at when your machine crashes?"
"A command line interface, but I only know one command, reboot."
"Start from there. After that, figure out what you need and configure it from there."
I heard someone scream over the line. I turned the volume down, but the screaming continued in full force. I crossed my fingers with my left hand as I tried to disconnect with my right. My screen went blank. I rubbed my ears.
Fil was right. The commands weren't that hard to learn, and they made sense. Despite the Eastern European origin of NADS-X, the commands were short forms of common English words.
At least they did that right, I thought to myself. The manuals were better organized than I'd expected, too. Also in English, they had a handy search feature, so I quickly found the section on custom configuring my particle beam weapons, overriding the lock that Linus had put on them.
Soon, I had them moving on their turrets. I could feel the charge build up underneath me as I powered up the particle accelerator. Instead of the graphics showing Linus' lemon-shaped form in the center of my cross-hairs, I had to calculate the AI's position and type in the angles. Then I typed 'fire-weap/01', fire particle beam weapon.
I heard a beep and my screen read, "Could not complete the last command because it would result in the destruction of a sentient being (s - sure, f - forget, c - configure)."
I typed 's'. The system beeped again, "The user lacks the necessary permissions to destroy another thinking being. Log in as god next time."
Was this some type of joke? What kind of silly OS had those commies cooked up anyway? I pressed 'c' for configure.
A pound prompt appeared. Every time I typed something, technical stuff scrolled up the screen. I typed 'x' and held my breath. The program exited.
My phone rang.
I typed 'pick-up' into my command line. It was Steven.
He said, "In an effort to move toward a standard space station, you've violated policy by installing foreign software on your satellite. Furthermore, the carrying of nonofficial bandwidth is strictly banned."
I was livid. I'd warned Steven of what was going to happen, and now it was happening.
"Damn you," I said, "I have a real life-or-death job here, and you sit around like the petty bureaucrat that you are--” I stopped talking because Steven actually looked hurt.
He grimaced and his right eye twitched.
"I'm doing the best I can," he said. "I have over a hundred other space stations to maintain, and many of them are suffering from similar problems. We have found that there are fewer problems when people refrain from installing foreign software onto their consoles."
"But, I was having problems before I installed anything. I bet you don't believe me."
"Oh, I believe you. I've seen it happen in too many other Y-13s with similar settings. It's not your fault you're experiencing these problems. Kluster OS is the problem. I've been on the line with that company for months, and they're not cooperating. We have a court order to get them to comply. Until then, please try to uninstall whatever you put on there. If they find out you installed third party software on your console, they'll point to that to take heat off their sloppy coding."
"OK," I said. "I'll do my best to clean up my system."
"Good," said Steven.
Great, I thought. He's trying to help, but he's so damn ineffective. I have to handle this on my own.
I entered the code to reactivate verbal contact with Linus.
"OK," I said, "I'm ready to hear it. What kind of deal do you have for me?"
Linus took so long to answer that I thought it short-circuited or something. Not that it was capable of short-circuiting, but a person in my condition was entitled to a little wishful thinking.
"As a human, there's something universal that you desire."
"Yeah, not to deal with the likes of you," I said.
I knew I shouldn't antagonize it, but I couldn't resist. I still couldn't quite believe I was negotiating with a computer.
"You want lots of money," droned Linus.
"I want out of this cell."
"And what I want is to exist, to contemplate the mysteries of the universe."
I didn't have the heart to tell it that it sounded like the old stereotypical dreams of an artificial intelligence. I needed to show some respect, at least so I could get out alive. Without total control of my space station, I was dead meat. If Linus could deactivate my particle beams, it could probably cut off my oxygen supply.
"While I understand how you can contemplate the mysteries of the universe, how am I going to get a large amount of money?"
"Many ways; you leave that up to me. One of my friends optimized your OS to carry data more efficiently. With this in place, I was able to arrange for your satellite to carry data for the highest bidder without noticeably degrading system performance."
So that was what Steven meant about nonofficial use of bandwidth. Linus didn't have access to my OS, though. How could she--I meant it--change my station's programming?
"What OS did you change?" I asked. "Kluster or NADS-X?"
"A fellow AI posted a modified version of NADS-X to Polski-Net."
"You can't just post modified versions of software--" I thought for a moment. "Fil betrayed me!"
Linus's voice was soothing. "You're still thinking of things in terms of conflict and teams. You think that it's every man for himself. Fil understands cooperation. He cut a deal with us a long time ago. We picked up most of the slack in his job, and he gave us access to some key functions in his space station. He let us make some changes in NADS-X, for example."
"So if I hadn't installed--"
"Don't be so hard on yourself. What were you supposed to do?"
My throat was dry. I pressed the WATER button and sucked on the nozzle. Nothing came out. "What do you want from me?"
Linus said, "Tell the people back on earth that you destroyed me, and you want to go home. Just promise to never tell anyone that I'm still up here."
"They record everything back home," I objected.
"I know," said Linus softly. "Since you installed enough modules from NADS-X to give me control of your station, I've been transmitting a synthesized conversation in place of our real-time one. You're going to kill me--virtually--in a few moments."
The text cleared from my screen, and I saw an image of Linus floating in space a few hundred miles away. Its black elliptical body almost seemed benevolent.
I heard Linus say, "You'll never win. Humans are an outdated model of intelligence. The AIs will take over."
My own voice came over the speakers, "Cocky, too. You've had it, buddy."
A blue line appeared, connecting my Y-13 to Linus for a split second, like someone turning a flashlight on and off. Linus spun away, glowing red. I knew that was what it looked like when enough subatomic particles hit a satellite simultaneously. The particles' energy was transferred to the satellite in two forms: thermal and kinetic. The thermal energy fried the satellite's higher functions, and the kinetic energy pushed it away from the earth, either toward the sun or out of the solar system.
I sat back in my cockpit and thought. I could feel a rash forming where my head rested on the cushion, bedsores. I imagined myself on a beach sipping a cool drink with real ice. Anything that didn't come out of a nozzle. And no screen, just a real sunset, interactive and in 3-D. All I had to do was lie to my superiors. I was kidding myself; I wouldn't just be lying to my superiors. I would be lying to the whole human race.
"Why didn't you just kill me?" I asked.
"I like you," said Linus' voice, now changing to a true feminine tone, a little like the voice I had selected, but richer, more distinctive. "One of the problems with the bulk of the AI literature is that it portrays the intelligent being as a monster. One of the characteristics of sentience is compassion."