the robots of Chernobyl
by dm gillis
naughty words have been removed in this version, for narration on
“Status?” the Project Manager said, urgency in his voice.
He was stuck in Minsk, his flight cancelled. There were rumors of another in five hours. Static on the telephone line made him difficult to understand. Technician Yegor Pulzin was manning the Command Centre on the outskirts of Chernobyl. He listened to his boss very carefully, clutching a cold cup of tea.
“Two of the three units, Beta Elvis and Beta Marilyn, remain dormant in protest,” Pulzin said. “Only Alpha Tyrone is functioning.”
“What the blazes is going on?” said the Project Manager. “It’s been twelve hours.”
“They seem to be acting autonomously, sir. Their program logs indicate that they’ve developed a form of reasoned thinking. Alpha Tyrone says that they want the kites back.”
“No,” the Project Manager said. “Absolutely not. They’re too distracting. They interfere with radar and monitoring systems.”
He paused, realising that by extension, he was justifying his decision to a machine.
“What do exploratory robots need kites for, anyway?” he said. “And who says robots are even capable of wanting? Why were there kites to begin with? I didn’t order them.”
“Actually,” Pulzin said, “you approved them in the mock-ups.”
“That’s impossible. I’ll deny ever approving kites at a reactor accident.”
“Nevertheless, sir, they were meant to gauge wind direction and speed, in case the on-site detectors were down, which they are. For the moment, at least, kites are standard operating procedure, so they went in with the robots. When the kites were ordered released, the robots decided that they wanted them back. Alpha Tyrone says that they will not proceed any further without them.”
“They have decided?” the Project Manager said.
“Yes, sir. It’s rather like a work-to-rule situation.”
Pulzin could hear his boss hyperventilating over the sound of static. He’d witnessed this before.
“Breathe out, sir,” he said. “Breathe out.”
“Well I won’t allow it!”
“Alpha Tyrone has been informed of this,” said Pulzin, “but it’s standing firm. It says that they enjoyed the presence of the kites very much, that the kites were very pretty, that their fluorescent orange added colour to an otherwise drab sky, and some joy to a dreary job.”
“He’s a robot, for Heaven’s sake.” The Project Manager nearly cursed, aware that he’d just referred to an ATyrone5690 unit as he. “Reboot it, and reprogram its compliance code.”
“We can’t. The three of them are ignoring all of our inputs, other than informational data, perhaps a little too effectively. They’re blocking our signal generators. It’s something in the programming, designed to foil reprogramming attempts by enemy forces, in case of a military emergency.”
“What enemy forces?” the Project Manager said.
“NATO,” said Pulzin, “according to the manual.”
“You wrote that portion of the programming, sir, and the manual.”
“This is no time to cast blame, Pulzin.”
“Yes, sir—oh, hang on….” Pulzin watched as text poured across his monochrome screen. “There’s a message coming through, sir, from the Alpha Tyrone unit. It says it has detected high levels of radiation, and asks why we have intentionally sent it and the other two robots into such a dangerous environment, without their consent.”
“You tell that tin can to do its job, or it’ll be in tomorrow’s scrapheap.”
“Well…?” said the Project Manager.
“Alpha Tyrone has replied,” said Pulzin. “It says that after its analysis of the situation, it has determined that our decision to place it, and the other two robots, in such a dangerous situation must have constituted a serious moral dilemma on our part, and asks if we acknowledged this dilemma, and, if we did, how we came to the decision to command them to enter into the reactor area.”
“That can’t be right. There are no ethical systems embedded in those units. That’s artificial intelligence. We can’t do that yet.”
“The logs indicate that they’re learning as they go,” Pulzin said. “And really, sir, the question that the ATyrone5690 unit is asking seems like one that any reasonable person would ask.”
“Nonsense! Can we send anyone in?”
“The Army’s ordering soldiers to volunteer, but they want the robots to provide assessment data before they go in. Colonel Ivanov is irate. And Moscow has called several times.”
“Ivanov can take a long walk off a short gun turret—and I’ll deny I ever said that.”
Pulzin listened to the static and the Project Manager’s heavy breathing for a moment. In the background, airport announcements reported further flight cancellations. The reactor disaster must have temporarily closed down the entire Soviet Union.
Finally, the Project Manager said, “Get more kites. Have them dropped in by helicopter. The units are dextrous enough to install them themselves—that much I do know. Tell the pilot that I don’t care about radiation levels, that I’ll personally rip his heart out if he refuses to fly in.”
“There are none,” said Pulzin. “No kites, at the moment anyway. We didn’t plan for this.”
“Then get some.”
“It may take a while,” Pulzin said. “I have my daughter and her friends working on it right now. Alpha Tyrone says that it and the other robots would prefer red ones and blue ones this time, with tails. My daughter is ten, and she loves kites, too. This is right up her alley.”
“I’ll be a laughing stock,” the Project Manager said.
“You could write a paper,” said Pulzin.