Microsoft’s Cortana gets sexually harassed, but she fights back

You don’t have to show any skin to be sexually harassed — and you don’t even have to be human to be the subject of offensive, suggestive commentary, it seems.

Deborah Harrison, an editorial writer in the Cortana division of Microsoft, told CNN that even female virtual assistants can’t escape the dirty and sometimes disrespectful minds of their human users. And as AI systems become increasingly humanized, this virtual harassment is a disturbing trend.

A large part of the issue stems from the fact that the vast majority of AI assistants feature female voices. Beyond Microsoft’s personal assistant, Cortana, there are Apple’s Siri and Amazon’s Alexa, and even Hollywood portrayed the operating system of the future with a sultry Scarlett Johansson. And beyond the issue of consistently conforming to gender stereotypes by using female personas in subordinate roles, it now appears that people are getting so comfortable with their AIs that these machines are being bombarded with questionable questions.

According to Harrison, when Cortana was first launched in 2014, “a good chunk of early queries were about her sex life.” Now the team behind the AI is fighting back; Cortana is a true woman of the 21st century, you see, and she doesn’t take any crap.

“If you say things that are particularly a**holeish to Cortana, she will get mad,” said Harrison during a talk at the Re•Work Virtual Assistant Summit in San Francisco. “That’s not the kind of interaction we want to encourage.”

To combat this sort of behavior, Harrison and seven other Microsoft writers tasked with the fascinating job of determining how Cortana responds to inquiries have decided to be very careful with the way in which they structure this virtual woman.

While she is very clearly female — she’s represented by a female avatar, and the flesh-and-blood human woman Jen Taylor supplies her actual voice — Cortana doesn’t succumb to many stereotypical female pitfalls. She doesn’t find herself constantly apologizing, nor does she seem particularly, well, subordinate. And according to Harrison, that’s all a conscious decision made by the Microsoft team.

“We wanted to be very careful that she didn’t feel subservient in any way … or that we would set up a dynamic we didn’t want to perpetuate socially,” she told CNN.

A big part of creating a believable persona for a virtual assistant, Microsoft says, is to talk to human beings who have that actual job. Not only does this give them better material, but it also helps the team address harassment issues from the people who have to deal with it firsthand.

So don’t mouth off to Cortana. You might not like what she says in response.

Lulu Chang
Former Digital Trends Contributor