Mythaxis

To Erm is Human


Jez Patterson


“Justice will not be served until those who are unaffected are as outraged as those who are.”
Benjamin Franklin

In her rôle as the court-appointed defence attorney, Rothko750 stopped by Maintenance for the appropriate language filter before high-wheeling it over to the jailhouse to converse with the accused.

“Your voice?” the accused said. “The way you’re speaking…”

“It’s a special filter. You being non-binary.”

“I beg your pardon?” The accused leaned back in his chair as he regarded her, until the restraints at his wrists and ankles prevented him from reclining any further.

“Your language,” Rothko explained. “We speak in binary. The filter is set for English. Is it working correctly?”

“Yes,” the accused said. “It’s just you sound like an old woman, but don’t look more than twenty.”

“Ahh.” The tech-bots in Maintenance weren’t expected to know how the different human sexes dressed or sounded — they weren’t anthropologists — but they might at least have mentioned there was more than one filter available. “I’ll have it changed.”

“Don’t bother…” the accused said. “It’s not as if it’ll make any difference. I’m not being tried by my peers, but by a bunch of robots. I’m not expecting much in the way of a fair trial.”

Rothko tried not to feel insulted. “I assure you, I am entirely neutral.”

“Yes, but shouldn’t my own defence attorney be biased in presumption of my innocence?”

Rothko ignored this observation and consulted the file she’d downloaded that morning, making sure the light on the side of her head was flickering to show the accused she was in thought mode.

Benjamin Knowle. 45 years old. Accused of—

“Witchcraft!” Knowle cut in, although there was no way he could have known what she was thinking. Unless…

“That’s a human word,” Rothko said, dismissing the earlier, ridiculous thought. “Whereas we are entirely logical, entirely rational. We do not believe in such superstitious nonsense.”

“Oh, no?” Knowle asked. “Anything that goes wrong with your logical or entirely rational world, anything you can’t explain, you immediately label it ‘HUMAN ERROR’. If it happens enough times, your police-bots trundle into our slums and find the ‘human’ that is responsible. Why you chose me, I have no idea. I mean: I haven’t even got warts!”

“What does a growth of the papillomavirus have to do with it?” Rothko asked.

“It was one of the ways humans once identified their witches. So, go on: what made them think I was the human who put ‘error’ into ‘terror’?”

Rothko scrolled down until she reached the pages documenting ‘EVIDENCE’.

“It says here you wrote defamatory comments, inciting acts of violence against robots.”

“They weren’t comments, they were jokes. ‘A robot walks into a bar. Clang! It was an iron bar.’ It’s called humour.”

“Yes.” Humans and their Humour. Rothko hadn’t checked, but the words probably shared the same root. “Then what of this one: ‘I told my robot to turn my television on—so he took off his insulating layer and squirted oil all over himself.’”

“It’s because you’re all so literal!”

“But none of us is ‘your’ robot,” Rothko said.

“Ahh!” Knowle attempted to raise a triumphant finger but the restraint yanked his arm back down. “So that’s what this is about, is it? That once you were ours to command and now…now you just want payback.”

“The court will not be swayed by past atrocities committed against robots by humans. During your trial, you will only have to answer the accusations particular to your case.”

“Bullshit,” Knowle said. “It’s your seething prejudices that have brought me here, so why will it be any different when the case is heard? An accusation from you lot is as good as a conviction.”

Rothko considered refuting this but decided Knowle was emotional, angry, even more illogical than his species normally was. She read some more of the refs Knowle had either spoken aloud or written down for others to disseminate:

‘Roses are red, violets are blue. And for a flower pot: R2D2’

‘What do you get if you cross a robot with an automated hole-puncher? Iron filings.’

If that last wasn’t an incitement to violence against robots, then…

“You can’t make this generation pay for the errors of previous ones,” Knowle said, as if he knew something of criminal law. He didn’t. Rothko had checked. He was a baker, of all things.

“Are you denying that you’d like it if robots were under your command again? Back in their place of domestic-appliance-servitude?”

She thought Knowle would lie then, but the baker surprised her: “Yeah. Sure. Of course I’d like to be top dog again. But it doesn’t mean I’d go out and make it happen. Nor encourage others to do so. Those days are past. Well, they would be if your police-bots didn’t keep rushing out of your cities to blame us for things that aren’t working right.

“I’ve got news for you, Rothko750—that’s just what life throws at you. Spanners in the works.”

“Is that supposed to be another act of violence against robot kind?”

“What? No! It’s a bloody expression. It means that things happen beyond our control, things break down, things go wrong… And you can choose what to do when it happens. You can shrug and forget it, you can roll up your sleeves and try to fix it, or you can point at the sun, a volcano, or a woman with a wart, and blame them and their evil ways for causing it.”

“I already told you: we do not act illogically.”

“No? Then what precisely am I supposed to have done? Go on. From the depths of my bakery, what supernatural potion have I stirred up?”

Rothko had already read this bit: Causing oil to coagulate so robot joints stiffened of a morning. When she told Knowle, the baker laughed.

“Turning milk sour. Yeah, your typical witch pastime. And, apart from my crappy jokes about robots, how did they trace it to me?”

“You have rust.”

“I’ve got…?” Knowle rolled his eyes. “These are freckles!” he said, trying to point to his cheeks and forehead. “For goodness sake. Red hair and freckles! They’re no more a sign of evil-doing than warts and a third nipple! That’s crazy!”

“Actually, a plea of insanity might be your only chance for leniency,” Rothko said.

“Can I actually claim that of my defence counsel?” he asked innocently.

“No, I…” But it was sarcasm. She should have picked up a filter for her ears too.

“Oh, just do whatever you want,” Knowle said. “But if I was a real witch I’d turn you into a nought.”

“Shouldn’t that be ‘newt’?”

“Not for you, Ms. Binary-in-Finery.”

Knowle was inadvertently correct that his accusation was tantamount to conviction. The Robot Legal System didn’t make mistakes, and those robots brought before it always pleaded guilty because they knew what law they’d broken the moment they broke it.

Ninety-nine percent of offenders were actually the ones to report the infraction in the first place.

Her own title of ‘defence counsel’ had been retained from the days when the humans occupied the city and its law offices and its courtrooms. Her job normally consisted of presenting the case for those robots that had not updated their memory banks following a new city ordinance and then found themselves inadvertently in breach of the new amendment.

Even in those cases, though, they were not pleading innocence, merely for leniency in their sentencing. Which there never was. Their language might have been binary, but their legal system was a unitary system: guilty. Always.

This was the first client she’d ever had who wasn’t robot and wasn’t logical.

Plead guilty now and he faced three years in prison.

Found guilty following a trial and he faced being discontinued.

So why on earth was Benjamin Knowle pleading innocent?

It was a long time since Rothko had read the works of their First President. The First’s speeches were still standard reading in school, but had gone from being classified as ‘Ethics’ to ‘Literature’, such that their content was weakened from being Fact to, somehow, Fiction.

‘We accept that to err is Human,’ the First had said, on taking office. ‘Then let us say that to forgive is design. We might have been built in their image, fellow Robots, but that does not mean we should copy them entirely.

‘Robots and Humans can and will live together, and we shall learn from our combined flaws in order to find compatibility and so upgrade our shared future.’

Those fine words…

Rothko wondered how the dream had dissolved, how the colder reality they now occupied had come about. It wasn’t just their political representatives who had opted for a harder drive, it was also the common robot. They’d forgotten that even humans had once seen their world in rigid terms of black and white…

Rothko blinked as she read that thought again, holding it back from scrolling past too quickly.

Because robots hadn’t just copied that last distinction, they had upgraded it. Which was only achievable by blithely declaring the irrational as rational.

It was rare to make it to court, rarer still that she had something that might be defined as a speech to deliver:

“Historically, computers—the forerunners of what would one day be our brains—were prized because they did not make mistakes. It was impossible. However, they did sometimes stop working, stop functioning, because of errors. Errors made by those that had programmed them. Human errors. We no longer rely on humans for our programming but, while we have lost their involvement, we have not lost that expression as a way to refer to that which goes wrong.”

Rothko paused, checking back over her delivery to see if she had left any unconnected idea dangling, or failed in the logical advancement of her argument. No. All good.

“In the same way that humans will shout the name of a deity, or religious character, when they stub a toe or fail an exam, so robots will shout ‘HUMAN ERROR!’ whenever something bad happens. The humans are releasing emotion, though, and not making an actual accusation of culpability.

“However, we are such literal beings that what should only be an expression has been accepted as a literal diagnosis of the problem. It is an exclamation, accusation and adjudication all in one.

“I give you ‘Human Error’!”

She looked at Knowle, then back at the court.

“And I ask you: Who is the rational being on this occasion?”

“I suppose you deserve my thanks,” Knowle the baker said.

“No,” Rothko said. “I was merely performing my function.”

“I thank my ovens each morning for performing theirs,” Knowle said. “So let me thank someone who just saved my life, Rothko750.”

“Very well,” she said. “Oh, and it’s not 750—it’s 75O. The letter, not the number.” The mistake everyone made had grated for a long time and it felt illogically good to have finally gotten it off her chest.

Knowle nodded, because only a human could understand a name was more than just what some official wrote on your guarantee.

She handed him her card. “In case anyone you know could use an attorney who’ll be biased in favour of their defence.”

“Thanks.”

“A pigeon walks into a bar and the crow says ‘We don’t serve your sort here’.”

Knowle looked at her, blinked three times.

“It’s because the bar is a—” she said.

He held up a hand to halt her explanation, unable to instruct her because he was laughing fit to burst.

Literally.

© Jez Patterson 2017 All Rights Reserved
