Arwa Mahdawi 

What’s up with ChatGPT’s new sexy persona?

OpenAI’s updated chatbot GPT-4o is weirdly flirtatious and coquettish, and sounds like Scarlett Johansson in Her. Why?
  
  

‘There’s nothing wrong with giving your chatbot a voice like Johansson in Her.’ Photograph: Warner Bros./Sportsphoto/Allstar

“Any sufficiently advanced technology is indistinguishable from magic,” Arthur C Clarke famously said. And this could certainly be said of the impressive OpenAI update to ChatGPT, called GPT-4o, which was released on Monday. With the slight caveat that it felt a lot like the magician was a horny 12-year-old boy who had just watched the Spike Jonze movie Her.

If you aren’t up to speed on GPT-4o (the o stands for “omni”), it’s basically an all-singing, all-dancing, all-seeing version of the original chatbot. You can now interact with it the same way you’d interact with a human, rather than via text-based questions. It can give you advice, it can rate your jokes, it can describe your surroundings, it can banter with you. It sounds human. “It feels like AI from the movies,” OpenAI CEO Sam Altman said in a blog post on Monday. “Getting to human-level response times and expressiveness turns out to be a big change.”

It hasn’t gone unnoticed that when Altman says “AI from the movies”, he seems to be referring to the 2013 movie Her, which featured a lonely writer falling in love with an operating system (voiced by Scarlett Johansson) designed to meet his every need. In many of the demonstrations, the bot’s voice sounds a lot like Johansson. And after the OpenAI event on Monday, Altman tweeted a single word: “her”.

There’s nothing wrong with giving your chatbot a voice like Johansson in Her. What does feel a little weird, however, is making your chatbot flirtatious and coquettish. While the chatbot has a male voice in some of the demo videos (namely the one in which it helps a new dad practice dad jokes), a number of the demo videos, including one where she helps someone with interview prep, feature the Johansson-like voice sounding oddly seductive.

“Am I the only one that gets the ick from how flirty this is?” technologist Nick St Pierre asked on Twitter/X, linking to the interview prep video. Judging by the responses to his tweet, he wasn’t.

While GPT-4o’s flirtatiousness was glossed over by a lot of male-authored articles about the release, Parmy Olson addressed it head-on in a piece for Bloomberg headlined Making ChatGPT ‘Sexy’ Might Not End Well for Humans.

“What are the social and psychological consequences of regularly speaking to a flirty, fun and ultimately agreeable artificial voice on your phone, and then encountering a very different dynamic with men and women in real life?” Olson asks.

OpenAI didn’t get back to Olson when she posed them that question. But they didn’t need to. We don’t have to hypothesize about the consequences of giving GPT-4o the ability to sound like a submissive young woman who caters to your every need because there’s already a ton of data on this. We’ve been having conversations about the social impact of female-sounding voice assistants like Apple’s Siri and Amazon’s Alexa for a very long time now.

In 2018, for example, USC sociology professor Safiya Umoja Noble warned that the gender of assistants’ voices affects how we speak to them. Noble told New York Magazine that virtual assistants have produced a “rise of command-based speech at women’s voices. ‘Siri, find me [fill in the blank]’ is something that children, for example, may learn to do as they play with smart devices. This is a powerful socialization tool that teaches us about the role of women, girls, and people who are gendered female to respond on demand.”

A highly influential 2019 Unesco report echoed Noble’s warning. “Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’,” the report said.

What’s crucial to note here is that these voice assistants don’t just send a signal about gender norms, they send it at massive scale. The Unesco report explains, for example, that Apple’s Siri “made ‘her’ debut not as a genderless robot, but as a sassy young woman who deflected insults and liked to flirt and serve users with playful obedience … This technology was a flagship feature in the nearly 150m iPhones Apple sold from late 2011 and through 2012. This singular technology – developed behind closed doors by one company in one state in one country, with little input from women – shaped global expectations of what an AI assistant is and should be, in a mere 15 months.” And, indeed, you can draw a direct line from Siri’s personality to flirty GPT-4o.

That 2019 Unesco report came with a call to action. “The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them,” Saniye Gülser Corat, Unesco’s director for gender equality, said.

Big tech has paid some lip service to this and has started offering more masculine voice options for its assistants. Still, the new and improved ‘sexy’ version of ChatGPT makes it clear that nobody at OpenAI is particularly interested in paying attention to how AI technologies are gendered, or in the ramifications of that. This might be because it’s a male-dominated company that went without a single woman on its board for a few months. (It appointed three women to the board in March.) It might be because it has board members like Larry Summers: the economist and former Harvard president who, in 2005, famously downplayed discrimination and suggested that men outperform women in maths and sciences because of biological differences. (He later apologized.) Or it might just be that nobody there can be bothered to think about things like gender; it simply doesn’t interest them.

For all the ooh-ing and ahh-ing about how innovative ChatGPT is and how it’s going to change the world, it feels like nothing has changed when it comes to misogyny in tech. It’s frustrating that we are still having the same conversations about bias in digital technology that we’ve been having for more than a decade. It’s infuriating that some of the highest-paid people in the world, people who are regularly lauded as the greatest minds of their generation, are still seemingly oblivious to gender norms and their own role in perpetuating them. We’re constantly being told that OpenAI is building the future, but I don’t really know whether a sexy ChatGPT is progress.

 
