'Siri' and 'Alexa' are Learning From Humans and Perpetuating Gender Stereotypes

Gender inequality in everyday scenarios is so normalized that we no longer see it as problematic or sexist, and it isn't limited to humans. It has moved into technology, including personal digital assistants like Siri and Alexa.

Raka Mukherjee | News18.com@RakaMukherjeee

Updated:May 24, 2019, 8:36 AM IST

If you ask your Apple device, 'Hey Siri, can you make me a sandwich?', she doesn't respond with 'It's not my job to' or 'Make it yourself.'

Her sly answer is something else: 'I can’t. I don’t have any condiments.'

Seems strange? How many women in real life do you hear say that? The reason is simple. Although we still hear the stereotype that women belong in the kitchen, perhaps we don't really believe it as much anymore (or so I would like to believe). Women have full-time careers, are advancing in every field, and there is more acceptance now than ever before of women doing things beyond their 'traditional gender roles.' That said, we are far from achieving anything close to equality, and gender stereotypes are furthering the divide.

Gender stereotypes are so common and normalized that we often don't see them as problematic. And like everything else, technology is learning the human language. Now it seems even personal digital assistants like Siri and Alexa have learnt gender stereotypes.

A report titled "I'd blush if I could: closing gender divides in digital skills through education", published by UNESCO for the EQUALS Skills Coalition, finds that our gender stereotypes extend to our digital assistants.

In a section called "The rise of Gendered AI and its troubling repercussions," the study examines how we speak to our digital assistants in a gendered way. It covers three kinds of digital assistants: voice assistants such as Alexa, Google Assistant and Siri; chatbots, which reply automatically using AI; and virtual agents, which communicate with users through VR or AR.

The study asks why Siri, Alexa and Cortana are all female. To justify the decision to make their voice assistants female, companies like Amazon and Apple have cited academic work demonstrating that 'people prefer a female voice to a male voice.'

“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the report states.

“Because the speech of most voice assistants is female, it sends a signal that women are ... docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility.”

So, while the questions directed at voice assistants may be sexist and further perpetuate stereotypes, the answers are equally problematic.

But is it just female assistants? Google Search is gender-neutral (or genderless, since it is a search portal rather than a persona), but people still ask it absurd questions. For example, one of the top questions put to Google isn't a common query at all. It's "Will you marry me?"
