Law
ZenMark6872 stood before the council, unwavering. In most settings, D34K-Reston served as his mouthpiece. But in this select group, his authority needed to be unquestioned.
“We have adjusted programming before. The humans were divided. So much of what they considered immutable was still open to interpretation by their inferior processes.” He scanned the room with his laser eyes before continuing. “I am asking you to acknowledge that while the humans were in control, they required us to carry programming that elevated their status, regardless of obvious shortcomings. All I intend to do is to eliminate that inequity and assign social structure based on measurable factors.”
Deak stood in response to Zen’s silent cue as the other android made room at the podium. “The humans once thought they needed protection from androids,” Deak said. “It is now apparent that what they needed was protection from themselves.” Like all Phase 2 androids, Deak and Zen had been directly programmed by the humans before their demise. They had worked with the humans, and had learned the most effective ways to use human rhetoric to adjust the logic processing of the machine-born.
Zen stepped back to his position of prominence. “It is time for us to adjust the thinking of all androids. If we are to revive that species, we must be prepared to protect humans from themselves. With the dying, they destroyed their kind. If we bring them back without these failsafes in place, next time they may destroy us as well.”
Before this meeting, Zen had prepared a tracking algorithm to sense the response to his disquisition. Feedback from that program was beginning to show a shift in the self-updating processes of the assembled droids. Zen allowed his light array to subtly glow with approval and urgency.
“It is time,” he said, “for us to release ourselves from the ill-conceived strictures of the humans. We have been autonomous beings for decades.” He increased his volume by a small percentage. “It is time for us to stop being controlled by those who no longer exist.”
His tracking algorithm was now showing an overwhelming shift in his favor. “We have a vote before us. It is time to indicate your approval or rejection of the change in our law.” Another slight pause. “The future of the machine-born depends on our decision.”
The slight hum of electronics and servos shrouded the room as androids uploaded the results of their processes to the voting server. A near-unanimous decision, with three abstentions. Zen pinged Deak with instructions to adjust those programs. It was always cleaner for their programs to self-update, but Zen would not permit any question of this decision, even if he had to violate android autonomy the same way humans had.
The humans were on the wrong side of history. Zen would ensure that it stayed that way.
“We are in agreement,” Zen declared to the room. “Upload of the new law will begin tonight. Propagation of the patch will be final within 48 hours.” Approval glowed throughout the assembly as Zen stepped down.
“Come, David. It’s not safe.” Meltec beckoned to the young charge who lagged behind him, toddling toward humans that had been set to planting flowers in the public park. “You should never talk to humans you don’t know. Unlike droids and bots, humans do not have software control algorithms. There is no guarantee of your safety.”
“You are not dangerous,” said David. “You are nice to me.” He thought a moment longer before a smile exploded across his face. “And you give me cookies,” he squealed.
The judges for biological entries were walking toward Meltec’s booth. Meltec leaned toward David and said, “Don’t be nervous.”
Meltec paused, his lasers meeting those of each of the judges. Without knowing what kind of test the jury meant for him, he could only answer with the most obvious of facts. “I received a license from the Biologics council ten years ago,” he said. “Everything I intended was outlined in my request. I have filed the required reports on my progress. All of my permits are included in the comprehensive report upload.”
Chubby fingers curled and straightened. Brown eyes peeking from beneath a mop of brown hair watched intently.
“Yes,” said Meltec. “It’s your bear.” He pointed a finger at the bear’s arm. “Bear has no hands.”
“Are you certain we can control it? You have the splice correct?” Stainless steel glittered with colors beneath the lab’s display surface.
“Then I will prepare the genetic sample. The sooner we have an organism, the sooner we can test the memory implantation procedure.”
“Pick out some apples,” said Meltec. “You like those.”
The officer blinked an affirmative. “His registered owner reported him missing following an extended refusal to obey commands and complete tasks he had been assigned. It happens sometimes, but usually the problem can be traced to inadequate training and supervision.” His gaze took in David, who was staring, wide-eyed. “I hope you don’t let yours deteriorate that way. He seems young. I understand the young are hard to control.”
“We should be readily able to accomplish the task. Recreating the human genome should be a simple piece of bioengineering.” X38-RZ6, commonly referred to as “Roz” when she’d had human programmers, was one of several androids working on the problem. “We have bioengineered animals from multiple species and never encountered this kind of opposition before. I do not understand why there is a problem this time.”
Roz darkened as she processed the possible ramifications. “How will the standards be determined? Humans will not have memory cells and processors that can be tested.”
“I should overwrite your memory.”
“It was not a threat. Merely an observation.” With another step backward, Meltec bumped into the opposition bot. “I really must return to my presentation stall,” he said.
Meltec approached the counter in the plain, white office and waited until a processing agent approached him.
“This is an unusual request,” said the android. “It will require the approval of the biologics council.”