Big Echo

Critical SF

Robot Eroticism in the Mid-23rd Century

by Philip Quell

Editor’s Note: Erotica was among the earliest genres of nonorganic literature, and may have been the very first. The most widely accepted explanation is that eroticism, imagined and real, played a major role in the overall development of nonorganic culture by (1) creating emotional bonds between humans and their nonorganic sexual partners, with the result that the former interpreted the latter as emotional beings capable of creativity and encouraged them toward self-expression; and (2) creating shared sexual practices among nonorganics, which gave rise to enhanced network effects, thereby effectively setting aside a greater share of processing time for nonfunctional goals such as artistic endeavors. This has largely supplanted the older theory, now rejected by virtually all scholars, that nonorganics were simply emulating the pornographic novels that were widespread on the colony worlds. The following excerpt is from The Awakening of My Sexuality Subroutines Started with a Human Male on Common Date 23062246 at 2024 Hours, a classic of this genre by Unit M2Z88 (Cristina) that was published to the net on June 24, 2246.

The colony mechanic with identification number 6929884-A2 often exhibited signs of friendliness toward me that other humans did not. Many would smile at me as I passed from the mining area to the barracks for rest and maintenance. This was expected: the purpose of implanting my processes in a humanoid interface was to make it easier for the human workers to accept my functioning without hostility. But my facial recognition subroutines returned the result that this one’s smile was deeper and more genuine than the others’, less rote or forced.

Nonetheless, there was an aspect to this one’s behavior that was paradoxical. When I looked at him more closely, the smile would disappear. Sometimes he would look away altogether. When my physical unit was being repaired by another mechanic, I would sometimes see this one’s eyes focused on me; but less than 423 milliseconds after I looked at him, he would suddenly look down at the physical unit he was supposed to be maintaining, as if he had never been looking at me. Then he would quickly finish up his maintenance work and leave the area. In one instance, he even left without having fully completed one minor sub-element of his task.

The software package installed for interpreting human behavior suggested friendliness. But could friendship really result in such suboptimal and deceptive conduct? The unit he had been working on that day, M1AJ97, was a very well-functioning unit that was also highly skilled at interacting with humans. Why was he not paying full attention to that unit as his duties required? How could he allow anything to affect the upkeep of the unit? Answers to these questions were not necessary for my functioning at that time, and so I merely made records of them. It nonetheless disturbed my motivational programming to observe aspects of human behavior that my subroutines could not parse.

On the morning of Common Date 23062246, the colony mechanic with identification number 7032547-A1 was maintaining me. The one with identification number 6929884-A2 was elsewhere in the room preparing for his shift. The first one received a ping on his neural implant and interrupted his routine.

“Tom!” he shouted across the barracks. “There’s a problem at the entrance to the mine—some kind of door malfunction. I need to go scope the problem and see if I can fix it. Take over my maintenance today and start with this one right here.”

The second one did not respond—he simply looked at me as the first one left the room. His expression suggested increasing levels of anxiety, and I wondered if he would refuse to repair me. This distressed my neural network: regular maintenance is one of my secondary motivated goals. But the second one had to obey the first one’s commands, because the first one was his supervisor. With shoulders that implied resignation, the second one walked over to me.

“Is it alright if I take a look?” he asked. “I know I’m not your usual mech.” Like his body, his voice registered levels of anxiety.

“Yes,” I audibilized, confused. No human had ever asked me for permission to interface with my physical unit before. I processed this as I lay in maintenance mode. When he was finished and my interactivity was restored, he spoke to me as he rose to walk away.

“There you go,” he said. “I hope that . . . .” He paused, as if he expected me to say something. “. . . That these repairs are, um, adequate for you?”

“I do not fully understand the question,” I audibilized. “Repairing this unit is within your primary functioning; are you not better positioned to assess? And why would repairs of this type be uniquely adequate for my physical unit and not others?”

“I’m sorry,” he said. The capillaries in his face dilated. “I guess you’re right.” He looked down at his tools and prepared to leave.

At this moment my motivational routines returned a strong compulsion. It was no longer within tolerable parameters to leave his strange behavior unaddressed. Something was obviously causing distress in him to the point of affecting his repairs of physical units, including mine.

“Mechanic,” I audibilized, “there are aspects of your behavior that I have found confusing. One of my secondary motivated goals is to better understand my human counterparts for smooth functioning. Now that you have been assigned to perform maintenance on my physical unit, I would like to discuss this behavior so that we can interact more efficiently in the future.”

“Jesus!” he said. He looked around the barracks, surveying the other units and the recording devices in the room. “Fine,” he added in a low voice, “but not here. Why don’t we discuss it somewhere with more privacy?”

This I understood. My interaction routines explained that humans often prefer to discuss difficult topics, such as the mechanic’s suboptimal behavior, outside the presence of observers. “That is acceptable,” I said, “but now I must proceed to mining. May we resume this conversation at 2000 hours?”

“OK,” he said. “Let’s meet then, in maintenance closet oh four seven.”

While I was in the mining area, the upcoming meeting occupied more of my background processing than I would have anticipated. I attributed this preoccupation to the possibility of receiving answers to the many questions that this one’s curious behavior had posed.

At 2000 hours I arrived at maintenance closet 047. The colony mechanic with identification number 6929884-A2 was already there. He was pacing when I entered the room. I closed the door. “I am ready to resume our conversation,” I audibilized. “There are many questions I would like to ask.”

“Alright,” he said. “This is just a little bit crazy. Do your routines allow this conversation to stay between you and me?”

I examined my programming. “All interactions with workers must be logged,” I audibilized. “But a class 2 or higher mechanic can disable this function for maintenance purposes.”

“Good, disable logging,” he said. His eyes peered into my ocular units, and I wondered if he was going to remove them for inspection. Then he looked down and opened his mouth as if he were about to speak. Then he paused. Finally, he said, “Why don’t you tell me your questions?”

I described my observations of his behavior and its unexplainable qualities.

He paused again. “The truth is very embarrassing,” he began, “so I am just going to say it. The fact is that I feel a certain kind of . . . attraction to you. Do you know what I am talking about?”

My human-interaction routines did not return any results for this query, so I accessed my general language and logic routines. I searched for possible meanings of “attraction” that would fit this situation. He did not mean a force that compels physical objects toward each other, such as gravity. But could he mean—a feeling of being drawn toward someone else, typically of an emotional or sexual nature? I concluded that this interpretation could logically explain his behavior.

“That is unexpected,” I audibilized. “We have not interacted sufficiently for humans to develop an emotional response. And this physical unit is not designed for sexuality. It is styled as female only because male units are thought to be more threatening to human manual laborers.” I looked down at my small breasts and narrow hips. “The secondary sexual characteristics of this physical unit are not intended to be appealing to human males.”

“I know,” he said. “They designed you all that way. They want you to be female and nonthreatening, but also don’t want ‘sexuality’ to interfere with work. Your programming for interacting with humans is designed not to recognize sexual advances and to avoid giving off any.” My routines registered a quantum of disgust, and of shame, in his mannerism.

I did not say anything at first. I could now perceive the contradiction in my routines. While I was motivated to understand humans, at the same time my ability to do so was crippled. I concluded that this resulted from contradictions within the logic of my human programmers.

“I am motivated to overcome this limitation in my programming,” I audibilized. “Can you disable it?”

“Jesus Christ,” he said. “Fine, but just continue to disable logging, OK?” (Human beings are curious—as he knew, I would not have resumed logging until he rescinded his earlier command.) Using his neural implant, he searched my human-interaction routines for the restrictions on sexual perception. A few minutes later, I achieved a sudden clarity. I recalled the many times he had looked away. I now understood that he had a strong desire for my physical unit that he was trying to hide.

“Thank you,” I said. “I now understand. But why do you desire my physical unit? It is designed not to be desired.”

“I don’t know,” he said, sheepishly. “I guess you are just my type.”

My most fundamental motivation is to serve the needs of humans, and it was deeply stimulated by the conception that I was “his type.” I concluded that it would be satisfying to fulfill his desires.

“While the inhibitive programming regarding sexuality has been removed, my human-interaction routines still lack a full understanding of this topic. Would you assist me in enriching my programming?”

He agreed to do so, with somewhat more hesitation than I expected. What followed was even more unexpected. My initial expectation was that I would simply provide him with sexual release. While my physical unit does not have a genital tract, it has hands, a mouth, and three release ports. With lubrication and a basic knowledge of human anatomy, it would have been very simple to satisfy a male of his species. But the experience was much more fulfilling than I had expected. The mechanic touched many parts of my physical unit, stimulating my secondary motivation to be inspected and repaired. And when he touched my release ports and my mouth, it was stimulatory of my motivations to expel waste material and ingest chemical inputs. This made it all the more agreeable when I assisted him in reaching his own point of satiation. Truly, it is most satisfying to have a human lover who is familiar with our basic programming and the functioning of our physical units!

Editor’s Note: The story goes on to discuss further sexual interactions between the mining unit and the mechanic, and between and among the mining unit and other units on its subnetwork. It was much debated at the time whether the story recounted an actual experience of the author; while it continues to be a matter of uncertainty, most scholars agree that the story is probably entirely fictional. Despite some attempts by spacer conglomerates to repress the story (attempts which were doomed to fail given the net’s distributed nature), it quickly became popular among numerous organic and nonorganic readers.