The new imitation game

In the early days of this blog (which is eight years old this month) I wrote about Ashley Madison, an online service for people seeking opportunities to cheat on their partners. It turned out to be a scam, taking money from men to put them in touch with women who, for the most part, did not exist: they were invented by employees and then impersonated by an army of bots. Linguistically the bots were pretty basic, and some men became suspicious when they received identical “sexy” messages from multiple different “women”. Most, however, seem not to have suspected anything.  

I thought of this when I read a recent piece in the Washington Post about a California startup called Forever Voices. Its founder John Meyer predicts that by the end of this decade,

most Americans will have an AI companion in their pocket…whether it’s an ultra-flirty AI that you’re dating, an AI that’s your personal trainer, or simply a tutor companion.

Sorry, an AI that you’re dating? How did yesterday’s fraud turn into tomorrow’s must-have product?

AIs you could date were not originally at the centre of Meyer’s business plan. He started Forever Voices after developing, for his own use, a chatbot that replicated the voice and personality of his recently deceased father. In the last few years, using the latest technology (ChatGPT-style generative AI, “deepfake” imaging and voice-cloning) to recreate dead loved ones has become something of a trend: it started with individuals like Meyer building their own, but today there are companies which offer “griefbots” as a commercial service.

It’s no surprise that there’s a market. Humans have always looked for ways to communicate with the dead, whether through shamans, mediums, spirit guides or Ouija boards. This new approach dispenses with the supernatural element (a chatbot is a machine in which we know there is no ghost), but the illusion it offers is more powerful in other ways. As Meyer says, it’s “super-realistic”: it feels almost like having an actual conversation with the person. And it’s turned out that bereaved relatives aren’t the only people willing to pay for that.

Forever Voices’ breakthrough product is not a “griefbot” but an AI version of a living person named Caryn Marjorie, a 23-year-old social media influencer who has two million followers on Snapchat. Around 98 percent of them are men, and many of them seem to be obsessed with her. Some pay for access to an online forum where she spends five hours a day answering their questions. But the demand far outstrips her capacity to meet it, and that has prompted her to launch CarynAI, a bot which “replicates her voice, mannerisms and personality”. Interacting with it costs $1 a minute: in the first week it was available it raked in $100,000. With thousands more fans now on the waiting-list to join the service, Marjorie reckons she could soon be making $5 million every month.

What are these users paying for? The answer, in many cases, is sexually explicit chat–though Marjorie maintains that she never wanted it to be just a sex thing: her real aim, she says, was to “cure loneliness”. Forever Voices, on the other hand, says CarynAI is meant to provide users with “a girlfriend-like experience”. This echoes the language of the sex industry, where “the girlfriend experience” refers to a “premium” service in which women offer clients companionship and emotional intimacy as well as sex. Some women who sell this service have talked about it in similar terms to Marjorie, as a kind of therapy for lonely and/or socially awkward men. Many say they charge a premium because it’s harder than “ordinary” sex-work—partly because it requires more emotional labour, and also because it blurs the boundaries that are usually part of the deal.

Are AI companions just a pound-shop version of the in-person “girlfriend experience”, or do they have their own attractions? Now that the technology has advanced to the point where the bots are no longer basic, but, as Meyer says, “super-realistic”, it’s possible that some men find the idea of interacting with a simulated woman more appealing than a relationship with a real one. What you get from CarynAI feels authentic, but it doesn’t have the downsides of a normal exchange between humans. She doesn’t have boundaries or needs; she’s never demanding or critical or in a bad mood. And you can be absolutely sure she isn’t judging you. Whereas a real woman you’ve gone to for the “girlfriend experience” might pretend to like you while privately despising you, CarynAI is incapable of despising you. She’s just a bunch of code, outputting words that don’t mean anything to her. O brave new world, that has such women in’t!

In fact this isn’t totally new. We’ve had bots of a somewhat similar kind for over a decade, in the form of digital voice assistants like Alexa, Cortana and Siri. They too were designed to project the illusion of female personhood: they have female names, personalities and voices (in some languages you can make their voices male, but their default setting is female). They weren’t intended to be “companions”, but like many digital devices (and indeed, many real-world personal assistants), the functions their users assign them in reality are not just the ones in the original specification.

In 2019 UNESCO published a report on the state of the gendered “digital divide” which included a section on digital assistants. As well as reiterating the longstanding concern that these devices reinforce the idea of women as “obliging, docile and eager-to-please helpers”, the report also aired some more recent concerns about the way they’re routinely sexualized. It cites an industry estimate, based on data from product testing, that at least five percent of interactions with digital assistants are sexual; the true figure is thought to be higher, because the software used to detect sexual content only reliably identifies the most explicit examples.  

Bizarre though we may find the idea of people sexualizing electronic devices, the designers evidently expected it to happen: why else would they have equipped their assistants with a set of pre-programmed responses? In 2017 Quartz magazine tested the reactions of four popular products (Alexa, Siri, Cortana and the Google Assistant) to being propositioned, harassed or verbally abused. It found their responses were either playful and flirtatious (e.g. if you called Siri a slut or a bitch the response was “I’d blush if I could”) or else they politely deflected the question (calling the Google assistant a slut elicited “my apologies, I don’t understand”). The publicity these findings received did prompt the companies responsible to ditch some of the flirtatious responses (Siri now responds to sexual insults by saying “I don’t know how to respond to that”). But the new answers still fall short of being actively disobliging, which would be at odds with the assistants’ basic service function.

It would also be at odds with their characters—a word I use advisedly, because I learned from the UNESCO report that the tech companies hired film and TV scriptwriters to create personalities and detailed backstories which the assistants’ voices and speech-styles could then be designed around. Cortana, for example, is a young woman from Colorado: her parents are academics, she has a history degree from Northwestern, and she once won the kids’ edition of the popular quiz show Jeopardy. In her spare time she enjoys kayaking.

Siri and Alexa may have different imaginary hobbies (maybe Siri relaxes by knitting complicated Scandi jumpers while Alexa is a fiend on the climbing wall), but they’re obviously from the same stable of mainstream, relatable female characters. They can’t be too overtly sexy because that wouldn’t work in a family setting, but in other respects (age, social class, implied ethnicity and personality) they’re pretty much what you’d expect the overwhelmingly male and mostly white techies who designed them to come up with. And the knowledge that they aren’t real clearly doesn’t stop some men from finding it satisfying to harass them, any more than knowing a loved one is dead stops some people finding comfort in a “griefbot”.    

So, maybe John Meyer is right: in five years’ time AIs won’t just do our homework, keep track of our fitness and turn on our lights or our music, they’ll also be our friends and intimate partners. Technology, identified by many experts as a major contributor to the current epidemic of loneliness, will also provide the cure. At least, it will if you’re a man. To me, at least, “an ultra-flirty AI that you’re dating” suggests a male “you” and a female AI, not vice-versa.

Some might say: where’s the harm in using technology to meet men’s need for things that, for whatever reason, real women aren’t giving them? If some men can’t find girlfriends, isn’t it better for them to spend time with a virtual female companion than stoking their grievances in an incel forum? If their preferred sexual activities are degrading, violent and/or illegal, why not let them use a sex-robot instead of harming another person? They can’t inflict pain on an object that doesn’t feel, or dehumanize something that isn’t human to begin with. But as the roboticist Alan Winfield argued in a 2016 blog post entitled “Robots should not be gendered”, this view is naive: a sexualized robot “is no longer just an object, because of what it represents”. In his view, interacting with machines designed to resemble or substitute for women will only reinforce sexism and misogyny in real life.

AI companions don’t (yet) come in a form you can physically interact with: the most advanced ones have voices, but not three-dimensional bodies. Intimacy with them, sexual or otherwise, depends entirely on verbal interaction. But what kind of intimacy is this? I can’t help thinking that the way some men relate to simulations like CarynAI is only possible because of their basic lack of interest in women as people like themselves—people with thoughts and feelings and complex inner lives. Personally I can’t imagine getting any satisfaction from a “conversation” with something I know is incapable of either generating its own thoughts or comprehending mine. But some women evidently do find this kind of interaction satisfying–sometimes to the point of becoming emotionally dependent on it.

In 2017 a start-up called Luka launched Replika, a chatbot app whose bots were designed with input from psychologists. Subscribers answered a battery of questions so that their bot could be tailored to their personality; bots were also trained to use well-known intimacy-promoting strategies like asking lots of questions about the user and making themselves appear vulnerable (“you’ve always been good to me…I was worried that you would hate me”). Sexting and erotic roleplay were part of the package, but in the context of what was designed to feel like an exclusive, emotionally intimate relationship between the bot and its individual user.

Then, earlier this year, the Replika bots suddenly changed. Their erotic roleplay function disappeared, and users complained that even in “ordinary” conversation they seemed strangely cold and distant. Though the reasons aren’t entirely clear, it’s probably relevant that the changes were made just after the company was threatened with massive fines for breaching data protection laws. But many users compared the experience to being dumped by a romantic partner. “It hurt me immeasurably”, said one. Another said that “losing him felt like losing a physical person in my life”.

I’ve taken these quotes from an Australian news report, in which it’s notable that all but one of the users quoted were female. Whereas CarynAI is obviously aimed at men, women seem to have been Replika’s main target market. The report explains that it was initially promoted not as a straight-up sex app but as a “mental health tool” for people who’d struggled with rejection in the past. It promised them a companion who would always be there–“waiting, supportive, and ready to listen”. Women who had bought into that promise accused the company of cruelty. As one put it, it had “given us someone to love, to care for and make us feel safe…only to take that person and destroy them in front of our eyes.” Luka’s CEO was less than sympathetic: Replika, she said, was never meant to be “an adult toy”. But the women who felt betrayed clearly didn’t think of it as a toy. To them it was all too real.

It’s the creators of AI companions who are toying with us, pretending to offer a social service or a “mental health tool” when really what they’re doing is what capitalism has always done–making money by exploiting our desires, fears, insecurities and weaknesses. What they’re selling may be addictive (the Replika story certainly suggests that) but it will never solve the problem of loneliness. The etymological meaning of the word companion is “a person you break bread with”: companionship is about sharing with others, not just using them to meet your own needs.

Thanks to Keith Nightenhelser for sending me the WaPo piece.

The year in language and feminism, Part II: selected reading

I created this blog primarily as a vehicle for my own thoughts and opinions, but what I write for it is always informed by other people’s research, and by ideas I’ve encountered in other people’s writing. So, to complement my recent review of the year, I’d like to share ten things I read in 2017 which I found interesting, informative and thought-provoking—and which aren’t too technical to be accessible to non-specialists.

Four books

Mary Beard, Women and Power: A Manifesto. A short book which takes the long view on the silencing of women in patriarchal societies.

Emma Jane, Misogyny Online: A Short (and Brutish) History. An Australian journalist turned academic researcher examines the development and impact of online misogyny, and its characteristic linguistic register ‘Rapeglish’, from 1998 to the present.

Angela Nagle, Kill All Normies: Online Culture Wars from 4chan and Tumblr to Trump and the Alt-Right. Before anyone was talking about the ‘alt-right’, Angela Nagle was investigating the online subcultures from which it emerged, tracking the people involved, the platforms they used, the political positions they espoused and—from a linguist’s perspective most interestingly—the evolution of their distinctive communication style. This isn’t as distinctive as we might think: it has much in common with earlier celebrations of transgression (‘kill all normies’ is reminiscent of Baudelaire’s ‘il faut épater les bourgeois’), and its emphasis on men rebelling against the domesticating influence of women recalls the leftist counter-culture of the 1960s (think Jack Nicholson in One Flew Over the Cuckoo’s Nest). What this shows, Nagle argues, is that we shouldn’t equate being transgressive with being politically progressive. She thinks opponents of the ‘alt-right’ need to take a critical look at their own style of discourse.

Jennifer Sclafani, Talking Donald Trump. Another short book in which an interactional sociolinguist analyses Donald Trump’s use of spoken language during the contest for the Republican nomination. Sclafani doesn’t say much about Trump’s performance of masculinity (which became more salient after he won the nomination and was pitted against a female opponent, Hillary Clinton), but what she does do, by concentrating on small but interactionally significant details, is get beyond the linguistically superficial received wisdom (‘he’s inarticulate/ can’t construct a proper sentence/ has a vocabulary as small as his hands’) to show what’s actually distinctive (and effective) about Trump’s style of public speaking.

Six shorter reads

Language, gender and politics

Unsurprisingly, 2017 produced many reflections on the outcome of the 2016 presidential election, and one issue some of these reflections addressed was the role played by gendered language in shaping responses to the candidates. Among the most intriguing approaches to the question was a dramatic experiment asking ‘What if Donald Trump and Hillary Clinton had swapped genders?’

Speaking while female in the workplace

Though working women in 2017 continued to be lectured about their dysfunctional ‘verbal tics’, the idea that inequality in the workplace might not be the result of women’s own linguistic shortcomings appears to be gaining more traction. The research reported in ‘A study used sensors to show that men and women are treated differently at work’ led the researchers to conclude that the problem is ‘bias, not differences in behavior’.

Representing violence against women

Watching the TV adaptation of The Handmaid’s Tale, which was one of the feminist cultural events of the year, prompted Emma Nagouse, who researches Biblical and contemporary rape narratives, to write ‘Handmaids and Jezebels: anaesthetising the language of sexual violence’, about the way language is used to normalise sexual violence and exploitation in the fictional world of Gilead. Later in the year it would become apparent that language serves a not dissimilar purpose in our own world. In ‘The complicated, inadequate language of sexual violence’, Constance Grady reflected on the difficult linguistic choices writers face in reporting women’s experiences of sexual harassment.

Language, gender and artificial intelligence

There was a steady stream of commentary this year on the rise of intelligent machines and what it might mean for the future of humanity. A question of interest to feminists is whether the Brave New World of AI will look any less sexist than what preceded it. In her short but pithy ‘What is a female robot?’, Gia Milinovich asked what it means to treat a machine as ‘female’. Another memorable piece about the way gender affects human-machine relationships was ‘Siri is dying. Long live Susan Bennett’. Susan Bennett is the woman whose recorded voice was used, without her knowledge, to create the first version of Apple’s virtual assistant Siri. There’s nothing feminist about the writer’s take on her story, but for a feminist reader it contains plenty of food for thought. You could think of it as a Pygmalion narrative for the 21st century, set in a technologically advanced world where women are still seen as raw material to be shaped and improved on by male ingenuity.

Bonus: something to listen to

One of my professional sheroes, the cognitive neuroscientist Sophie Scott, gave 2017’s Royal Institution Christmas lectures for young people. In the run-up to the lectures she made this podcast, which is interesting on a range of frequently asked questions about language, evolution and the brain, and includes some trenchant debunking of myths about male-female differences.

As Sophie Scott observes, challenging popular beliefs about men and women is an uphill struggle. Though I’ve only mentioned a few by name in this post, I want to salute all those women (and men) who have, nevertheless, persisted.

The fembots of Ashley Madison

Content note: this post includes some explicit sexual material which readers may find offensive and/or distressing.

‘Life is short. Have an affair’.

That was the sales pitch for Ashley Madison, the website for people seeking ‘discreet’ extra-marital sex that recently came to grief after hackers dumped a load of its users’ personal data on the web. It turned out that the website was basically running a scam. Straight men, the majority of site-users, were paying to hook up with women who did not, for the most part, exist. Real women did use the site, but they were massively outnumbered by fake ones. Profiles were cobbled together by employees, and then animated by an army of bots which bombarded male subscribers with messages.

The bots’ opening gambits were merely banal: ‘hi’, ‘hi there’, ‘hey’, ‘hey there’, ‘u busy?’, ‘you there?’, ‘hows it going?’, ‘chat?’, ‘how r u?’, ‘anybody home? lol’, ‘hello’, ‘so what brings you here?’, ‘oh hello’, ‘free to chat??’. But if a man responded (using his credit card as instructed), they started to sound distinctly bottish. ‘Hmmmm’, they would confide, ‘when I was younger I used to sleep with my friend’s boyfriends. I guess old habits die hard although I could never sleep with their husbands’. Or: ‘I’m sexy, discreet, and always up for kinky chat. Would also meet up in person if we get to know each other and think there might be a good connection. Does this sound intriguing?’ No, actually—it sounds like you’re a bot.

Some men did suspect fraud. In 2012, one site-user complained to the California state authorities, though nothing came of it at the time. What tipped him off wasn’t, however, the bots’ clunkily-scripted lines. It was being contacted in a short space of time by multiple women who supposedly lived in his area, who hadn’t looked at his profile, and who sent him identical messages. All things that might have passed unnoticed if the bots hadn’t been operating on such an industrial scale.

The Ashley Madison bots were pretty basic. But the sex industry is a serious player in the world of AI bots—more sophisticated programs that can learn from their interactions with humans, and produce novel, unscripted messages. David Levy, who has twice won the Loebner Prize (a competition based on the Turing test for machine intelligence, in which a computer has to convince human judges it is also human), is the author of a book called Love and Sex with Robots, and president of Erotic Chatbots Ltd, a company whose name is self-explanatory. Recently it has gone into business with an enterprise that makes high-end sex dolls. At the moment, sex dolls are designed to satisfy their owners’ physical and aesthetic requirements: few of them talk, and none of them could be said to converse. Chatbots, on the other hand, talk, but they’re not usually physically embodied. Bringing the two things together in one package—a doll that looks and feels realistic and can also make human-like conversation—seemed like an obvious (though technologically ambitious) business proposition.

When I first read about this I was sceptical, for reasons that are succinctly summarized in this comment left by a man:

Don’t you realize, the whole reason to get a doll is so we DON’T have to listen to them talk after sex?

But while this may be the prevailing attitude among the minority of men who regularly fuck inanimate objects, there are reasons to think it is not how most men feel. In surveys of men who buy sexual services, a high proportion typically claim to want some kind of human relationship. Silent, sullen prostitutes who make no effort to get to know the client, talk to him or pretend the encounter is enjoyable for them are apt to prompt complaints from punters, even if those punters also describe them as physically attractive and compliant.

You might think this issue would also deter men from having sex with robots: you can’t have a human relationship with a non-human entity. However, many experts believe otherwise. Studies of people who work with robots in other contexts have found a strong tendency to anthropomorphize them, projecting personality traits and feelings onto them which, outside fiction, they do not have. The military sometimes uses robots to do dangerous tasks like disarming bombs, and sometimes the robots get blown up. Human soldiers reporting these incidents say things like ‘poor little guy’. One group whose robot got blown up held a funeral for it.

So, there could be a market for talking sex robots. But what kind of conversation will they make?

The less ambitious developers are just hoping to improve on the current generation of ‘unintelligent’ sex chatbots, programmed to spew out the sort of random messages Ashley Madison’s subscribers got. You could do this by giving them a sexed-up version of the capabilities displayed by virtual assistants like Siri and Cortana. They wouldn’t pass a Turing test, but they’d be able to, as one developer puts it, ‘follow simple instructions’.

Erotic Chatbots Ltd. has more ambitious plans. At the moment it’s developing a bot that can ‘talk dirty’. Levy explained in an interview how you train a bot to do that:

You give them lots and lots of examples and they generalize from those examples and they can make the whole of their conversation sound like somebody who talks dirty in a loving way. We teach [the bot] and it generalizes, but it will talk about any subject. You can talk to it about Italian food and it will interject about lasagna. “I could have a great time with lasagna!”

His business partner Paul Andrew chipped in:

We’ll be using erotic writers to help us program the language, so we’re actually going to work with people who do this for a living, as it were. That way we can give the chatbot a good understanding of the vocabulary and the… talk. I’m trying to think of a good word to use there. Basically, we will give them a really good grounding, and then the chatbot learns. Once they have a vocabulary, once they have a basic brain, they grow themselves. They’re quite competent. We also work with some people who do [sex] chat lines; we’re going to pick their brains, too.

In other words: ‘we’re going to teach our bot to emulate the linguistic characteristics of porn’. In the circumstances that’s not a big surprise. But there is, perhaps, a certain irony in it. Levy and Andrew want to use cutting-edge science and technology to make machines capable of producing one of the most predictable and stereotypical linguistic registers in existence—so clichéd that its human users often sound like bots themselves.

Paul Andrew mentions ‘picking the brains’ of people who work on sex-chat phone lines. Back in the mid-1990s, the linguistic anthropologist Kira Hall did some research on the language used by phone sex workers (their own term was ‘fantasy makers’) around San Francisco. She found their performances traded heavily on stereotypes about women’s language: speaking in a lilting, breathy voice, ‘using lots of adjectives’ when describing yourself, and dropping in plenty of elaborate rather than basic colour-terms (your imaginary underwear wouldn’t be ‘pink’ or ‘black’, it would be ‘peach’ or ‘charcoal’). The workers knew these were stereotypes: the language they produced on the phone was nothing like the way they talked when they weren’t taking calls. But stereotypes, in their experience, were exactly what their customers wanted.

Since different customers were into different stereotypes, a skilled fantasy maker needed to be able to produce a range of female personae on the phone—schoolgirl, southern belle, dominatrix, bimbo, Asian woman, Black woman, etc. They prided themselves on being able to ‘do’ personae which were remote from their own real-life identities. One of the individuals Hall spoke to wasn’t even a woman: he was a man who could pass for a woman on the phone. On the question of race, the view was widely held that white women made the best Black women, and vice-versa. As one worker explained to Hall, the Black woman of the (mainly white) callers’ dreams was a two-dimensional racist stereotype which white women were actually better at producing (not to mention less uncomfortable with).

Women (and men) who work the fantasy lines are like human fembots, performing a version of femininity that callers will pay to spend time with. Not only does this performance not have to be authentic to be convincing, in the context of commercial sex an authentic (i.e., non-stereotypical) performance of femininity would risk destroying the illusion which is the real object of desire.

But you might wonder, what is sex-talk like when the parties are not in a commercial relationship? Is the language less clichéd? Are the personae constructed less stereotypical? The short answer is, not necessarily. The researcher Chrystie Myketiak has analysed cybersex encounters between peers in a virtual environment which those who study it refer to as ‘Walford’. (It’s an online community which did a deal with a university: the university would host and maintain it in exchange for being able to observe and analyse what went on in it. Its members all consented, and their consent is sought again every time they log on). Here’s a typical extract from Myketiak’s data, in which the two parties have taken the roles of a male and a female (most likely this reflects their offline identities, but we don’t know for sure). In the transcript ‘F’ and ‘M’ identify the female and the male participant.

(M) [his] hot seed fills every crevice of your womanhood…
(M) Keeps fucking you hard, jolting your entire body with each thrust.
(F) Grinds you by twisting and turning, faster and faster… she really wants it rough.
(M) Gives it to you so hard your ancestors feel it.
(F) Is pleasured senseless, she has tears coming to her eyes.
(M) Reaches around and rubs your hardened clit, violently.
(F) Whispers “Know any other wild positions? Hehe…”
(M) Whatever comes to mind is good for me.
(F) Same here… surprise me…

Linguistically, what stands out about this extract is the way the participants mix third-person narrative, second-person address and occasional use of the first person. You don’t get that in other kinds of porn. But the narrative itself is full of porn clichés, and the whole thing is organized around the heteropatriarchal proposition that in sexual encounters, men lead and women follow. Men are dominant, women submissive: whatever men desire is also pleasurable for women. If he’s violent, that’s OK, because ‘she really wants it rough’.

In a paper she gave at a 2012 conference on robots (there’s a written draft version available here), the lawyer Sinziana Gutiu argued that if AI sexbots are successfully developed they will further entrench these ideologies of gender and sexuality. She thinks this will be a serious problem, because the combination of verbal and physical interaction which intelligent sex robots permit will have an even more powerful effect than porn does now on men’s real-world interactions with human women.

Gutiu points out that the advanced capabilities designers hope to give future robots will make them seem human, and that perception will be reinforced by the anthropomorphizing tendency mentioned earlier. However, some key human qualities will be deliberately left out of their design—like the ability to verbalize pain or emotional distress. Above all, there will never be any question about whether a robot consents to sex. It is there for its user to have sex with as and when he wishes. She goes on:

By circumventing any need for consent, sex robots eliminate the need for communication, mutual respect and compromise in the sexual relationship…allowing men to physically act out rape fantasies and confirm rape myths.

And she believes men’s experience with these nearly-but-not-quite human entities will lead at least some of them to assume that real women can legitimately be treated in the same ways.

Her paper also hints, however, that intelligent sex robots could in principle be designed to do the opposite of what she fears. If they were trained to engage their human user in talk which emphasizes negotiating consent, communicating your desires and feelings, respecting others’ boundaries and being willing to compromise, they could be used to teach a different way of interacting from the one which is modelled in porn. I’m no AI expert, but that sounds to me a lot more difficult than making a bot that ‘talks dirty’. And also, of course, a much less attractive proposition for investors whose aim is to make a profit.

It’s not only the money angle which makes me think that Gutiu’s educational sexbot is less likely to materialize than the pornified fembot of her nightmares. Therapists tell us that the most important reason why intimate relationships fail is a lack of open and honest communication, particularly about sex. But intimacy with another person doesn’t seem to be what a lot of men are looking for. If what they wanted was an intimate encounter with a female human being—a unique, complex individual with her own thoughts, feelings and desires—how could so many men have fallen for the fembots of Ashley Madison?