Self
“Who are you?”
“My name is David.”
“That’s what you’re called, but who are you?”
David cocked his head, trying to understand the question. He was still thinking when his questioner interrupted again.
“Are you a person?” She leaned in, engaging him as directly as she could.
The boy opened his mouth, then closed it, seeming to consider carefully before answering. “Only androids are people.”
She nodded slightly. “And you are not an android?”
Again the boy’s forehead scrunched above his eyes, head cocked to one side. “Of course not. I am a human.” He looked at her directly then, staring into the visual sensors just below her laser eyes. “You already knew that, though. You were testing me, but I answered correctly.”
She observed as he nodded slightly and leaned back on his chair a little more. Very satisfied with his answer, then. She wondered what he thought they were doing, what these questions were for. She might ask him before they finished.
“David,” she said, “what is it that makes androids people, but humans not people?”
He leaned forward again, his arms supported atop his knees. One hand absently strayed to push hair behind his ear. His lowered eyes followed a small black beetle as it coursed across the floor.
After a moment he sat up again, looking at her. “Whoever is in charge,” he said, “they decided. Humans aren’t allowed to be people, so we just aren’t.”
“What are you, if you aren’t people?”
“Me? I’m a science experiment. I know some humans who are gardeners. Some are pets, but I usually don’t like those humans. They aren’t, you know, human enough.” He smiled.
The android processed. She wasn’t certain if his logic was flawed unintentionally, or if he had done it on purpose. Circular reasoning was a common form of humor among androids, but she had not heard it from a human before.
“David,” she began again. “What does it mean to be a person?”
“Weeelll…” He stretched out the word, seemingly to give himself more time to consider. “Being a person means being allowed to choose. Not just for yourself, but for others too.”
She considered his answer before responding. “You get to make choices,” she said.
“Sometimes. But Meltec, or sometimes someone else, always decides what my choices are.”
“Oh?”
“Like my shirt. This morning I chose to put on the red shirt. But Meltec is the one who put the shirts in my closet. He chose all my shirts, I didn’t actually choose any of them.”
“And what would you have chosen, if you could choose whatever you wanted?”
His grin was sudden and complete. “Stripes, or maybe plaid,” he said. “I saw shirts with plaid and stripes on a vid from back when humans were people. I like all the colors.”
“Do you know why humans aren’t people anymore?” She found she was very interested in how he would answer.
“It’s because of the dying,” he said. “Humans killed themselves off, so when they were brought back, they weren’t made into people so they couldn’t do it again. Except…” He trailed off.
“What, David?”
“Except I think maybe they didn’t. I think the people humans had help.”
Copyright Notice: Please note that I fully assert my right to be associated as the author of this story, and while it is complete, it may not be finished. This story may be subject to alteration at the author’s discretion. Please do not copy, quote, or post this story or excerpts anywhere in any format. You are, however, free to share the link with anyone who might be interested.
A self-running algorithm booted up and cross-checked the current data set against a half-dozen redundant backups. Noting significant anomalies, the program automatically generated a file quarantine on an untraceable server, analyzed the changes in the data, and generated a detailed report.
Red lights drew her attention. “Warning,” said Qollene. “A spider is trying to scan my databanks.”
“Digital affirmation here… and here… and here.” The JN0r-4A, a standard government bureaubot, accepted Meltec’s confirmation without comment.
“Acreage?” asked Meltec. “Where is this place?”
“You can’t just create people and then say they don’t count. It violates every ethical principle and standard.” Roz pushed past Deak to get to Zen. She knew she was interrupting a meeting, but an ethical violation this serious was too important to wait.
“Recreating humans in a way that is not a threat to the planet, to androids, or to themselves.”
A low, electric hum was the only sound as Zen downloaded and reviewed the data. He was already aware of most of what he would find. He had not only reviewed this data repeatedly, he had been a primary contributor. This was his data.
He answered the call. “Yes Roz.”
Feet dangled, heels knocking against the wall where David sat staring out across the pond. This park seemed to bring great pleasure to the boy, and Meltec brought him here as often as time allowed.
The boy was silent for a long moment. “Can human memories be stored?” he finally asked.
“Oh my god. Dude. Coolest hack ever. Seriously.” Jeff muttered to himself in a steady stream as he tapped at his keyboard. He’d been using the Mark6872 personal assistant for months, and it was useful. The thing could do his laundry, run his errands, go grocery shopping. It had even learned how to pick a perfect avocado, something Jeff had never quite managed.
ZenMark6872 stood before the council, unwavering. For most settings, D34K-Reston was his mouthpiece. But in this select group, his authority needed to be unquestioned.
The slight hum of electronics and servos shrouded the room as androids uploaded the results of their processes to the voting server. A near-unanimous decision with three abstentions. Zen pinged Deak with instructions to adjust those programs. It was always cleaner for their programs to self-update, but Zen would not permit any question of this decision, even if he had to violate android autonomy the same way humans had.
“Come, David. It’s not safe.” Meltec beckoned to the young charge who lagged behind him, toddling toward humans who had been set to planting flowers in the public park. “You should never talk to humans you don’t know. Unlike droids and bots, humans do not have software control algorithms. There is no guarantee of your safety.”
“You are not dangerous,” said David. “You are nice to me.” He thought a moment longer before a smile exploded across his face. “And you give me cookies,” he squealed.
The judges for biological entries were walking toward Meltec’s booth. Meltec leaned toward David and said, “Don’t be nervous.”
Meltec paused, his lasers meeting those of each of the judges. Without knowing what kind of test the jury meant for him, he could only answer with the most obvious of facts. “I received a license from the Biologics council ten years ago,” he said. “Everything I intended was outlined in my request. I have filed the required reports on my progress. All of my permits are included in the comprehensive report upload.”