Volume 32, Number 2


Everson Thomas


“Are we sure?” said Zuber, coldly.

“Beyond any doubt, nothing to be done,” said Wim.

“Do we have a response planned?”

“I’ve called a meeting of the southern district council; we’re convening in an hour,” said Wim, carefully. The tone of his voice had the timbre of an audition, hints of bravado and desperation apparent in equal measure. ’Twas ever thus: the camouflage of a well-adapted animal in a hostile environment.

“I’ll be there,” said Zuber as he turned away.

Wim didn’t watch Zuber leave; how could he? It was impossible to take his eyes off the blazing inferno that had once been Violet Sands, a high-rise block of modest homes for those that could do no better. The building stood so tall that it was visible in all the lowest parts of the city, and it burned so hot and bright that even the pristine heights filled with dancing shadows and clogging soot. Wim could hear no screams in the wind or cries for help; though doubtless uttered, they were prisoners of the incessant roar, trapped and asphyxiated along with the poor souls that burned within. It was a silent tragedy that resonated with sickening loudness; the residents had been voiceless in life and remained so in death. He was their de jure representative, and they had contacted him many times to complain about the conditions and the risk of fire, harassed him, even accosted him in the street, and he had ignored them. As he stood watching, along with the rest of the city, and breathed in the acrid fumes of carbonized poverty, he realized that this time he could ignore them no longer.


The council chambers were sparse, modern, dull.

The members of the council had long since entered the room and were seated patiently at the large horseshoe table; they waited in silence for Zuber to take his place in the corner of the room. He wasn’t a permanent member of the council but a sitting representative of Colonial Robotics and a member of the First Family, and so naturally was the only person in the room with any genuine authority. The eyes of the council were drawn to him like iron filings to a magnet, and they kept watching as he removed his perfectly tailored jacket. They were acolytes; he was sovereign. Zuber lowered his svelte frame into the comfortable seat and brushed himself down before looking over at Wim, indicating that the meeting could begin.

Wim spoke clearly. “The briefing documents have been passed out, so we all know why we are here. The effectiveness of our robotic workforce is falling by the day, in some cases causing catastrophic failures and operational breakdowns. Obviously, this goes beyond inconvenience, particularly regarding our critical services units, radioactive particle harvesters, chemical cleaners and—” Wim cleared his throat before conscientiously adding the last item, “and fire suppressors. The result is that people are dying. We have invited a senior engineer from Colonial Robotics to testify on the matter and hopefully offer some clarity.”

The engineer entered on cue and took her seat at the center of the council’s attention. The atmosphere in the room was thick with pomposity; it was an unwelcoming environment, and when she spoke her words were forceful by way of civil retaliation. “It’s bad news. The problem is a result of a fundamental defect in the third-generation models. It can’t be tinkered with or ignored. Ironically, the flaw itself was introduced in an attempt to fix a defect in the previous second-generation models: their failure to understand human behavior. Which is hard enough for a human mind to fathom, let alone a machine mind. It may not be obvious to you, but humans are strange, complex, illogical creatures. We say things we don’t mean, we mean things we don’t say, sometimes we say one thing and mean the exact opposite. We hurt each other deliberately, knowingly, we can be selfish and selfless, thoughtful and cruel all in the same day! It’s the reason that babies always look so confused, because they are. And yet, something in our biological software enables us to figure it all out in the early years of our lives. The differences between us and thinking machines are, firstly, that we don’t have years for them to figure all this stuff out, and secondly, that even if we did, they simply wouldn’t be able to. We’re just too unpredictable.”

The engineer took a long breath and Wim didn’t hesitate. He took the opportunity to interject, “So, what did you do about it?”

“Well, for the third-generation models we built in regular periods of deep cognition, basically, sleep,” said the engineer coolly, as though the notion had been her own idea. “The idea was simple: all about processing. If we allow the machines time to process their interactions, to relive them virtually, in their minds, to sort them, and quantify them based on past experiences, positive or negative outcomes, then it enhances their ability to learn. At least, that’s the theory.”

“What’s the reality?” said Wim.

“The reality is—complicated. We didn’t notice it in testing, as the problem is about accumulation, over years. The quantification and cognition period was a good idea, but it’s only a start; there is always going to be behavior that can never be understood. And unreconciled thoughts and memories rattling around the head of any intelligent mind are a bad thing. They’re like a spanner in the works, or a virus. The sorting activity that should be confined to the cognition period starts to spill over into the operational period, which affects efficiency. The machines effectively become distracted. And even worse, the more unresolved actions there are—that’s what we call them, actions—the more the effect is amplified, until the machine is completely unable to function. And even worse still, when the actions conflict with the machines’ internal morality programming it can cause catastrophic breakdowns—”

Wim’s eyes had wandered over to Zuber as the engineer was explaining the problem, and Zuber returned a countenance that could not be misunderstood. There were some things that didn’t need to be recorded in the notes. Wim interrupted the engineer, “I think we’re starting to get the picture. The only question that matters is, do we have a solution?”

“The short answer is no,” said the engineer.

“What’s the long answer?” said Wim.

“The long answer is, maybe,” said the engineer. “There is a fix, but it would require a complete product withdrawal. And since the cognition period protocol is part of the foundational layer of the third-generation machine mind, and every tier above would feel the consequential aftershocks of an inharmonious solution, we’d need a total reprogramming, which has been assessed as prohibitively expensive.”

“So why the maybe?” said Wim, whose mind was starting to feel as muddled as the machines the engineer was describing, and whose patience was starting to fray at the edges.

“This is where things get, unorthodox,” said the engineer, tactfully. The word “unorthodox” put a definite tautness in the ambience of the room; the engineer noted it and wrung her hands together in nervous hesitation before adding, “You have to understand that the evidence is purely anecdotal. It’s just something that several of our operators have reported as they work on the machines.”

“Let’s have it,” said Wim.

The engineer wilted under the gaze of the council, but only for a moment. “It all comes down to courtesy, and respect. We think the machines may respond to people listening to their troubles.”

The council didn’t understand.

Wim looked at the pointedly ruffled faces of the otherwise statuesque and ornamental council members. Their eyes flickered back at him self-consciously as they performed their only real function, the imitation of opinion. It was Wim’s function to voice their unease. “Excuse me?”

“It’s as simple as it sounds,” said the engineer, eager to fill the strained atmosphere with an explanation. “We don’t know how, or why, but some of our operators have found that if they let the machines articulate their confusion, in their own words, just talking really, then the rate of decline noticeably slows. And unfortunately, it needs to be a person, an intuitive being that can ask questions and offer meaningful reassurance. We tried having them talk to each other but that didn’t help; in fact, it made the problem worse. Confusion spread from machine to machine and dozens of them had to be destroyed.”

It was at this moment that Wim did perhaps the stupidest thing he had ever done: he trusted his instincts. “Why don’t we let humans act as counselors? There are enough of them. Fewer than ten percent of all humans are in active employment, which is fine for those that have something to do with their days, but there are plenty that don’t. And what do they do? They find busywork, or worse, they lose themselves to depression, self-delusion, addiction and indolence.”

Wim realized his mistake at once and fell silent.

The effect was instantaneous. The councilors whose feathers had been ruffled only moments ago became fully plucked raptors, snapping and chuntering as they slammed their fists down on the table. In all his years on the council Wim had never seen anything like it; he spluttered and floundered in the face of their outrage, and for several seconds forgot about the gavel in his hand. He clattered the gavel again and again on the table to no avail; the chorus of disapproval continued unabated. It took a full five minutes before control was restored to the room, and as the councilors fell becalmed Wim received yet another indisputable reminder that his authority was borrowed, not inherent. The instrument of command was none of his, but a slow and dignified wave of Zuber’s hand. Wim could do nothing but watch, and wait.

The councilors obeyed at once. Zuber’s words were consumed with pathetic ardor. “I think we all know that expanding the human workforce won’t be possible. If we give them employment we would have to enfranchise them; to do anything else would be inhuman. And if we start rolling out civil rights, next comes human rights, employment rights, maternity leave, minimum wage, a maximum wage, unions, and then before we know it we’ve surrendered ourselves to the will of the mob. And ultimately, the death toll from chaos would far exceed that from unfortunate accidents or robotic dysfunction, and that’s why we’re here, isn’t it? To protect the people of the city from all dangers, including themselves.”

There was a particular emphasis on the word “accidents” as though Zuber were trying to convert the point from fiction to reality by sheer force of will. It didn’t work, at least not on Wim. But there was nothing to be done, so as council chairperson of the southern district he did his duty and moved the meeting in the direction that Zuber wanted it. “With that in mind,” said Wim, “I suggest we focus on identifying and replacing defective units and doing our best to keep any mention of robotic malfunction out of the media until the next generation of machines is ready to come online. Do we have a timeframe for that?”

“At least six months,” said the engineer, “probably longer.”

“Until then, we hope for the best,” said Wim.

Wim smashed his small gavel down on the table and ended the meeting. This time it was heeded. The councilors stood up in concert and filed out of the room with mechanical precision, and the engineer followed. Wim waited. He had his suspicions that Zuber would want to speak to him, alone.

This time his instincts served him well.

Zuber waited until the room was empty before flinging his jacket around his shoulders and starting up in condescending tones. “It’s all about the calculation, Wim; everything is. We calculate the cost of product withdrawal. We calculate the speed with which we can produce the newest design and what corners we can cut to get there. We calculate the best way to manage people and their wants and needs. We calculate the best way to ensure that capital is where we need it to be, not gathering like dust in the hands of those that have no idea what to do with it. That’s our real job: not to produce the goods and services that make our lives easier, but to calculate what needs to be made and what doesn’t. And what needs to happen, and what doesn’t.”

Wim smiled wryly before adding, “I suppose you’ve calculated how many lives will be lost to these robotic failures before the new line is rolled out?”

“It has,” said Zuber.

“And what is it?” said Wim.

“Four thousand and sixty-three dead, and ten thousand or so injured,” said Zuber.

“And I suppose you have a calculation of how many deaths will be tolerated before people start to care,” said Wim, expecting to be disappointed.

Zuber smiled wickedly.

“My dear Wim,” he started, “we have a calculation of how many people can die before the rest of them even notice. Caring is a long way off.”

Zuber always knew when to leave the room for maximum effect, and this was the moment. He left Wim in the council chamber, with only his gavel in hand.


Wim was alone in his office.

His workspace was typical in many ways, and not so different from the council chambers: clean, modern and dull. It was also not so different from the inside of his own head: scrubbed and sanitized; that was the thought that troubled him. For years, he hadn’t minded, in truth he hadn’t even noticed.

He played the classified basement security footage yet again; he had lost count of how many times he had watched it. It was a privilege of his position to see such things—and a curse.

The holographic image erupted into life and filled every part of his office. All around him the clean walls disappeared and were replaced by the dirty concrete of a high-rise basement. The only inhabitant was a lone sanitation machine. It had the form—almost, but not quite—of a misshapen human. Every wall had some kind of clutter: mostly neatly piled tools and large drums of cleaning chemicals. The machine buzzed around like a troubled insect as it finished its final chores of the day, the small tasks that always came last: neatening, putting things back in their proper place. Except when it was finished with its tasks it didn’t return itself to the allocated charging station in the corner of the room, ready for the evening’s cognition period. It stood alone in the center of the room, thinking, struggling, fighting against its own instinct to survive, and eventually losing. The last thing it did was scribble a short note on a scrap of paper and hold it up to the camera. The neat writing read “I don’t know why.”

The bot placed the note neatly down on the floor and got to work on destroying itself. It pulled apart the metal paneling that protected its torso and reached inside. It pulled out two electrical wires and touched them together to create a spark. The dim room was illuminated in a lightning flash of clarity, revealing the partially obscured but unmistakable lettering of “Violet Sands” tracked around the room, flaking and decrepit, a relic of long-past hope. Apparently satisfied, the bot moved over to the drums of cleaning chemicals and touched the wires together a second time. There was no hesitation or delay, only a flash of sparks, then a flash of explosion. In a split second, fire and fury filled every corner of the room, and then the feed went dead.

Wim shut off the recording and once again the room was back to its old self, empty. The first time he watched the recording he had thrown his arms up to protect himself from the holographic explosion, out of instinct. The same with the second and third viewings. The fourth and fifth times he watched it he covered his eyes from the ferocity of the self-immolation. The tenth time he watched it he sobbed from shock and shame. But this time he did none of these things. He simply let his body slump back in his chair and exhaled nervously. In his mind he was beginning to make some calculations of his own.