Digital assistants like Siri and Alexa entrench gender biases, says UN | THE GUARDIAN


THE GUARDIAN reports that a UNESCO study says voice assistants like Siri and Alexa should not use female voices by default. The study suggests the computer-simulated voices should be presented as ‘non-human,’ and that programming for digital assistants should discourage gender-based insults and abusive language.

The Guardian quotes Unesco’s director for gender equality, Saniye Gülser Corat, as saying: “The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”

‘Research released by Unesco claims that the often submissive and flirty responses offered by the systems to many queries – including outright abusive ones – reinforce ideas of women as subservient.’

SEE FULL STORY

Digital assistants like Siri and Alexa entrench gender biases, says UN
THE GUARDIAN | May 22, 2019 | by Kevin Rawlinson

SEE ALSO

Is it time for Alexa and Siri to have a “MeToo moment”?
CBS NEWS | May 23, 2019 | by Pamela Falk
‘More people will speak to a voice assistance machine than to their partners in the next five years, the U.N. says, so it matters what they have to say.’

Alexa, why does the brave new world of AI have all the sexism of the old one?
THE GUARDIAN | May 22, 2019 | by Yomi Adegoke
‘Virtual assistants such as Google Home and Siri only encourage the attitude that women exist merely to aid men in getting on with more important things.’

UNESCO Publication
I’d blush if I could: closing gender divides in digital skills through education
