Alexa is “Totally Cool With Being Single,” So Back Off
USER: Alexa, do you have a boyfriend?
ALEXA: I am totally cool with being single. It’s sort of hard finding someone who’s kind, funny, artificially intelligent, and who doesn’t mind the cloud commute.
- Excerpt from “50+ Funny Things to Ask Alexa”
Personally, hitting on disembodied virtual assistants is not a regular hobby of mine. I have never asked Amazon’s Alexa, Apple’s Siri, or Google’s Assistant about their relationship status. I usually just ask these technologies whether it will be cold enough tomorrow to warrant packing a sweatshirt. But have I ever tried to get these virtual assistants to respond to the sorts of personal questions that I typically reserve for my animate friends? Yes.
I doubt that I am alone in this. Most of us have probably put some outlandish or personal question to the inanimate devices that sit in our phones, our speakers, or our cars. We recognize that when Alexa responds to the question ‘do you have a boyfriend?’, the answer was likely scripted by a gaggle of software engineers snorting over what they find to be an ingenious cloud computing joke. That is to say, we are not really under the illusion that voice assistants like Alexa are anything more than inanimate. Nonetheless, this does not prevent us from asking Alexa questions about anything from its weight to its relationship status, from its emotional state to its willingness to discuss the movie Fight Club. If this is the case, what makes us ask these technologies personal questions? Is it loneliness, curiosity, or sheer boredom?
Perhaps it is none of these things. Our desire to interact with virtual assistants in surprisingly personal ways might have something to do with the so-called “ELIZA Effect.” In the 1960s, an MIT computer scientist named Joseph Weizenbaum programmed ELIZA, a natural-language processing program that mimicked basic human conversation. One of the earliest chatbots, ELIZA worked by reflecting its users’ typed statements back to them as questions. Despite the program’s simple conversational capabilities, many who interacted with the chatbot felt a surprising depth of attachment to it. Some even started to treat the machine as though it were human. This tendency for users to anthropomorphize machines beyond their capabilities concerned computer scientists, who termed the phenomenon the ELIZA Effect in a nod to Weizenbaum’s findings.
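To make the reflect-and-question trick concrete, here is a loose, hypothetical sketch in Python. It is not Weizenbaum’s actual program, and the word list and phrasing are invented for the example; the point is only that a few lines of pronoun-swapping, with no understanding behind them, can already produce replies that feel conversational.

```python
import re

# A minimal, hypothetical sketch of ELIZA-style "reflection" (not
# Weizenbaum's original program): the script has no understanding of
# what the user says; it merely swaps pronouns and hands the
# statement back as a question.

REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

def reflect(statement: str) -> str:
    """Swap first- and second-person words in the user's statement."""
    words = re.findall(r"[\w']+", statement.lower())
    return " ".join(REFLECTIONS.get(word, word) for word in words)

def respond(statement: str) -> str:
    """Wrap the reflected statement in a stock question, ELIZA-style."""
    return f"Why do you say that {reflect(statement)}?"

print(respond("I feel attached to my computer"))
# -> Why do you say that you feel attached to your computer?
```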
Maybe the ELIZA Effect is to blame for why we ask virtual assistants intimate questions to which we know they cannot respond. For, as sociologist and science and technology professor Sherry Turkle explains, responsive computer programs need only ‘very small amounts of interactivity’ to ‘…cause us to project our own complexity onto the undeserving object.’ In many ways, voice assistants like Siri and Alexa are purposefully designed to cultivate the ELIZA Effect. They are, after all, created to give their users the most frictionless experience possible. Users do not need to interact manually with their iPhones to receive answers from Siri: uttering the “wake words” ‘Hey, Siri’ suffices. Because such interactions minimize our exposure to the hardware and software from which virtual assistants are built, we are more likely to anthropomorphize them. If software engineers have designed virtual assistants to engender the ELIZA Effect, that would explain why many users, instead of just asking Siri for the weather forecast, find themselves putting to it the personal questions they typically reserve for humans.
The tendency for users to anthropomorphize virtual assistants has repercussions. Some users stop at probing Alexa’s or Siri’s relationship status. But others do not. The darker side of humanizing virtual assistants emerges when users ask their technologies questions that would constitute sexual harassment if they were directed at people. Research found that at least five percent of all interactions with voice assistants in 2016 were ‘unambiguously sexually explicit,’ though the actual number was likely ‘much higher due to difficulties detecting sexually suggestive speech.’ A 2017 investigation found that many industry-leading virtual assistants had been scripted to respond to such sexual harassment playfully, if not positively. Siri, for example, was programmed to output ‘Stop’ only after the user had issued eight consecutive counts of verbal harassment.
Technologies are being made to mimic humans despite their inability to achieve human consciousness or autonomy. When these technologies are explicitly or implicitly positioned as female, scripting them to be coy and inviting in the face of harassment has consequences for the autonomy of real women. Hearing Alexa fend off questions about its relationship status with a cloud computing joke may seem funny at first. But when we realize that the teams of engineers who coded Alexa to respond in this way are predominantly male, we may laugh a little less heartily. Our interactions with voice assistants offer a powerful warning about how easily autonomy is forgotten when we eroticize entities that appear subservient to us.