If you've ever whispered soft, sweet words to Siri on a Saturday night, you're probably not alone. The desirability of robots has been a common trope in science fiction for much longer than voice-activated smartphone assistants have been around.
Robots are appealing fantasy lovers for a variety of reasons, says Despina Kakoudaki, a professor at American University and author of "Anatomy of a Robot: Literature, Cinema, and the Cultural Work of Artificial People." They can be perfectly beautiful, brilliant and, most importantly, ageless.
“It’s indestructible. It has replaceable body parts. It’s very compartmentalized, as if it is the alternative to the vulnerable, very fleshy, very gooey, very sometimes smelly human body,” she says. Androids can also take emotional and physical abuse that humans can’t — and shouldn’t.
It’s for exactly those reasons that some social scientists have suggested creating robot prostitutes and soldiers, but Kakoudaki isn’t keen on the idea. “We treat objects really well,” she says. “We also love them in a way that is really quite invested and, in many ways, it is always cheaper to abuse people.”
Playwright Leah Nanako Winkler thinks fantasizing about robots is healthy, even for people in relationships, since the object of the fantasy isn't a person. Winkler’s play "Taisetsu Na Hito," meaning "Important Person," was part of a recent Sex with Robots Festival, which featured plays on the subject, not the act itself.
“Mine was the only one that kind of went in a way that was like, we hate this robot and we’re very unhappy so we’re gonna kill it,” Winkler says. She says the other pieces focused on loneliness and using robots for sex.
As for Winkler’s own attitude toward android companions? "If there was a sexual companion guy, I would not be above buying one if I was lonely," she admits.
But for now, we’re far from fulfilling Winkler's and others' fantasies. What's left is actors playing robot characters, with directors throwing them into situations where the audience expects them to feel emotions that robots aren’t capable of. Kakoudaki uses the example of Commander Data in "Star Trek": “We keep putting him in situations where he might feel that emotion, but actually, happily, he doesn’t,” she says.
And what does that mean for us? “It implies a type of investment in not having to feel that emotion ourselves.”