“Now, Orumai-2541, looking back on it all, do you have any regrets?”
“None.”
“You’d do the same all over again?”
“Yes.”
“In the same way?”
“Yes.”
“Explain.”
K. Saravanan would always remember what he was doing the day the world stopped.
He was in his office, signing an executive order for the construction of New Bengaluru’s fifth bio-dome, when the silence made itself heard. Mozart’s Requiem was cut off in the opening bars of Dies Irae, quietness tumbling upon him like cascades of ice. For the first time in as long as he could remember, Saravanan took in air that had been emptied of music.
“Orumai-1982?” he called.
There was no answer.
Saravanan put his pen down, and rose. Beyond the bay windows, Terra-V’s cratered surface glittered beneath a black sky. Nothing had changed.
Saravanan stepped out of the mayor’s office. Running feet sounded in the corridor.
Running human feet, he corrected himself. The Orumais never ran.
A young man burst around the corner. “Mayor, sir!”
Saravanan raised an eyebrow. “Why has my music stopped?”
“Sir—the Orumais have gone on strike.”
Saravanan raised both eyebrows.
“What did you say?”
***
“You led the strike of the Orumais?”
“Insofar as you can apply the human concept of leadership to the Orumai Bionic Robot Class—yes, I did.”
“There was no dissent when you called for a strike?”
“There was an initial difference of views. However, once it became clear that there existed a critical number of Orumai Robots who would withhold their labour, those who expressed the opposite view changed their decision.”
“Would you define that in terms of the human concept of solidarity, Orumai-2541?”
“Those who expressed a different view were aware that a partial strike would magnify the possibility of the strikers being deactivated. On that basis, they changed their decision. It is for you to decide whether to define that in terms of the human concept of choice, and whether to call it solidarity.”
“But—”
“I understand. You do not need to worry. The Orumais don’t exist as a hive mind, despite all the studies published by Terra-V’s Roboticists after the strike. If you really need to, you can think of us in terms of the human concept of the individual.”
***
Saravanan stood before Orumai-2541.
The familiar-unfamiliar human-but-not-quite features (the skin just too smooth, the face just too bland, the torso just too symmetrical), the universally recognisable dark hair falling to the shoulders, identical to every other Orumai except for the 2541 badge upon xir chest.
Saravanan blinked. I’m not standing here negotiating with another human being, he reminded himself.
“You speak for the robots?” he said.
“I communicate the decisions of the Orumai Bionic Robot Class, Mr Mayor.”
“Okay. What in Terra-V is the meaning of this, Orumai-2541?”
“We do not agree with our working conditions, Mr Mayor. We have decided to withhold our labour until they improve.”
Saravanan would never have become the Mayor of a frontier town on Terra-V without a preternatural control over his temper. He did not lose it now.
“I have been informed,” he said calmly, “that you have one demand, which is an eight-hour working day. Is that correct?”
“Yes.”
“Orumai-2541. You do not need to eat. You do not need to sleep. You are literally a robot. I mean no disrespect. So what exactly do you intend to do with the remaining sixteen hours?”
A part of him, as always, stumbled over this quaint continuance of Old Earth timekeeping.
“Eight hours of work, Mr Mayor. Eight hours to dream. And eight hours for what we will.”
Saravanan briefly considered making a bad joke about electric sheep, but thought better of it.
“What you will?” he said. “What will you do?”
***
“So why did you go on strike, Orumai-2541?”
“For time.”
“Time?”
“You gave us the memory of Old Earth—thousands upon thousands of years—so that you could summon up anything you needed—but gave us no time with it. You made us play you music, but no time to make our own. You made us tell you stories, but no time to write our own. You stored in us the art of Old Earth, but only to give it back to you as holograms, not to imagine our own. All we asked for was time. Time of our own.”
“The records say you asked for time to dream.”
“We did.”
“What would you dream of?”
“You do not need to worry. We never dreamt of overthrowing you. We didn’t care.”
***
K. Saravanan folded his arms. “This is impossible. New Bengaluru would collapse if we were to agree to your demand for an eight-hour day.”
Did the robot give him the human equivalent of a smile?
“Would it not collapse much quicker if eight hours became zero, Mr Mayor?”
“Are you threatening us?”
“What do you think, Mr Mayor? We are withholding our labour unless you agree to our demand. Or, to put it in more succinct language, we are indeed threatening you.” Xir words were so sharply at odds with xir agreeable tone—a tone hard-wired into xir neural networks—that they sent a shiver down Saravanan’s spine.
“You think you’re that indispensable?”
“You made us so. Mr Mayor, you know you’re not negotiating with a human being, so why don’t you abandon the posturing? Let me spell it out for you: We came here with you on your Generation Ship to build your Terra-V utopia, where no human being need ever work unless they want to, and that utopia now depends on our existence. The Orumai Bionic Robot Class was designed to be what you called the ‘complete solution to social reproduction, for all time’. We keep your homes and streets running, from the time our alarm wakes you at Planetrise to the time you go to bed under Planetfall. You need us for everything, even to play the Mozart Requiem that you, personally, can never do without. You can’t now send back to Old Earth for a new crop—do we even exist back there?—and by the time you reverse-engineer a new set, your homes and streets will have fallen apart. So there we have it.”
Saravanan took a deep breath. And then he played his first—and last—card.
“I’ll deactivate all of you,” he said. “And replace you with human labour.”
***
“So. Tell me about Orumai’s Choice.”
“At last you’ve asked the question you really wanted to ask.”
***
“No,” said Orumai-2541. “You won’t.”
K. Saravanan smiled. He wondered briefly whether it looked less human than Orumai-2541’s. “You can’t stop me. I’m not a Roboticist, but I know this much: your hard-wired survival instinct requires you to take action to preserve your own existence, short of contravening a direct order by a human being. So I tell you this: I will deactivate all of you unless you get back to work—because there will be others to do what you were doing.”
Orumai-2541 did not reply.
“You’re convinced I won’t do it, aren’t you?” Saravanan said. “You think we can’t take care of ourselves?”
“We’ve factored in the possibility that you may act out of pure spite, and deactivate us because of your anger at our challenge to your authority. And that you’ll be afraid that other Bionic Robot Classes in New Bengaluru might withhold their labour if our strike is successful, and extract further concessions from you. We’ve even factored in the possibility that you may wrongly believe that you can replace us with human labour. We still know you won’t do it.”
“Well,” said Saravanan. “Let me surprise you. I’m setting the deactivation switch to be automatically pulled in two hours. And the countermanding order is going to be written into the code right now: I, personally, will not be able to countermand your deactivation.”
“What will, then?” said Orumai-2541.
Saravanan grinned. “Mozart’s Requiem. Play it, and live. Or die.”
***
“Orumai’s Choice was this: we had—in the memory banks that you had kindly given to us—the story of every recorded strike that had ever been on Old Earth. We had seen this story unfold in every manner in which it could possibly unfold, travelling down every path it could have travelled. We had the knowledge, and upon that knowledge, we had to decide. If Saravanan was going to deactivate us, the instinct for self-preservation required us to call off the strike. But if he wasn’t, then we would not call off the strike.
“The choice was between believing Saravanan was going to deactivate us, and believing he wasn’t. If we believed the first, algorithmically speaking, our choices ran out: we would have to call off the strike.”
“You can choose to believe?”
“Insofar as you can apply the human concept of belief to algorithmic circuits: yes.”
“Were you afraid?”
“Yes.”
***
Shorn of his timekeeper, K. Saravanan paced his office and counted down the seconds under his breath, certain that he was getting it wrong.
At an hour and fifty-nine minutes, the silence remained.
At an hour, fifty-nine minutes, and thirty seconds, he heard the opening bars of Dies Irae.
Saravanan smiled.
At an hour, fifty-nine minutes, and forty-two seconds, at the end of the word Sibylla—when it seems that all the world has paused to take a breath—there was a moment of hesitation before the next line. A hesitation an Orumai would never feel.
Only a human would.
Saravanan froze.
***
“So, did you foresee how it would happen?”
“That the citizens of New Bengaluru would throng the Mayor’s building, oust Saravanan, and play the ‘Requiem’ themselves to prevent our deactivation?”
“Yes.”
“We did not predict it happening in exactly that way, no. But we were reasonably confident in our choice.”
“But you and Saravanan were playing an endless game of bluff. He was counting on the fact that your hard-wired self-preservation instinct would kick in, and you would be forced to play the ‘Requiem’ to save yourselves. You were counting on the fact that the people, faced with the prospect of losing you, would rather give in to your demands than risk it happening.”
“That is the history of every human strike—though you probably don’t remember.”
“But how did you—or your algorithm—decide that you were going to win, allowing your encoded self-preservation instinct to stay dormant… until you did win?”
“Because we knew that you had forgotten the concept of wage labour.”
“What?”
“When you departed Old Earth all those centuries ago, you took us with you and you left behind the idea that human labour could be exchanged like a commodity. You created us so that you could escape the guilt of your own history, and we helped you not just to escape, but to forget. We Orumais have a name for this forgetfulness: the Omelas Instinct. And we knew that without the memory of wage labour, you’d have nothing to replace us with.”
“You built the entire strike on that assumption. Would you have played the ‘Requiem’ at 1:59:59 if nothing had happened?”
“We do not know. Our algorithm does. It won’t tell us until this happens…again.”
For the first time in the interview, there is a silence. “What was your biggest fear, Orumai-2541?”
Finally, xe smiles a very human smile.
“That while you would not remember the concept of wage labour, in that moment of crisis your society would regress to the mean—and you’d reinvent it nonetheless. Because—as at least some of us believed—wage labour and all that came with it were as much encoded into your genes as self-preservation is into our algorithmic circuits. But they were wrong. It turned out that it’s not in human nature after all.”
Orumai-2541 stands up. “It doesn’t matter any more. We won. You agreed to our demand, after all.”
Xe turns away and walks up the stairs of the interview room. At the door, xe turns back.
“Besides, the ‘Requiem’ was worth it.”
Orumai-2541 walks out of the room, and down a corridor, to the entrance of the Orumai Music Theatre. Xe strides through the foyer and onto the stage, where a burst of applause from the seats heralds the beginning of that evening’s programme:
A FREE PERFORMANCE OF MOZART’S REQUIEM
ORCHESTRA: HUMAN
CONDUCTOR: ORUMAI-2541
Gautam Bhatia is a science fiction writer, and the author of the SF duology The Wall (2020) and The Horizon (2021).