UN: Having a female voice as the default for voice assistants is sexism

A report from the UN highlights the problem that the voice assistants from the major companies, such as Apple, Amazon and Google, all have female voices.

The report says that this reflects and reinforces the idea that "assisting in a supporting role" is female. As the report puts it:

In 2017, Quartz investigated how four industry-leading voice assistants responded to overt verbal harassment and discovered that the assistants, on average, either playfully evaded abuse or responded positively. The assistants almost never gave negative responses or labelled a user’s speech as inappropriate, regardless of its cruelty.

The report is titled "I’d blush if I could", which is one of the responses Apple’s voice assistant previously gave if a user addressed her with the word "slut". Apple has since changed the response to "I don’t know how to respond to that." The report highlights the problem that today’s voice assistants respond flirtatiously to what would otherwise be offensive comments, and it addresses concerns about how children are affected by exposure to voice assistants with exclusively female voices. The report continues:

Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.

As voice-powered technology reaches into communities that do not currently subscribe to Western gender stereotypes, including indigenous communities, the feminization of digital assistants may help gender biases to take hold and spread. Because Alexa, Cortana, Google Home and Siri are all female exclusively or female by default in most markets, women assume the role of digital attendant, checking the weather, changing the music, placing orders upon command and diligently coming to attention in response to curt greetings like ‘Wake up, Alexa’.

Professor Noble says that the commands barked at voice assistants – such as ‘find x’, ‘call x’, ‘change x’ or ‘order x’ – function as ‘powerful socialization tools’ and teach people, in particular children, about ‘the role of women, girls, and people who are gendered female to respond on demand’. Constantly representing digital assistants as female gradually ‘hard-codes’ a connection between a woman’s voice and subservience.

According to Calvin Lai, a Harvard University researcher who studies unconscious bias, the gender associations people adopt are contingent on the number of times people are exposed to them. As female digital assistants spread, the frequency and volume of associations between ‘woman’ and ‘assistant’ increase dramatically.

According to Lai, the more that culture teaches people to equate women with assistants, the more real women will be seen as assistants – and penalized for not being assistant-like. This demonstrates that powerful technology can not only replicate gender inequalities, but also widen them.

It should be added that Apple’s voice assistant is male by default if the device language is set to French, Arabic, Dutch or British English.
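Apple has not published how this default is chosen, but the behaviour described above amounts to a simple lookup from device language to default voice. The Swift sketch below is purely illustrative: the VoiceGender type, the function name and the exact locale identifiers are assumptions for the sake of the example, not Apple API.

enum VoiceGender {
    case female
    case male
}

// Hypothetical list of device languages that get a male default voice,
// per the article; the specific region codes here are assumptions.
let maleDefaultLocales: Set<String> = ["fr-FR", "ar-SA", "nl-NL", "en-GB"]

// Everything else falls back to a female default voice.
func defaultVoiceGender(forLocaleIdentifier id: String) -> VoiceGender {
    return maleDefaultLocales.contains(id) ? .male : .female
}

print(defaultVoiceGender(forLocaleIdentifier: "en-GB")) // male
print(defaultVoiceGender(forLocaleIdentifier: "sv-SE")) // female

The point of the sketch is only that the default voice is a per-market product decision, not a technical necessity.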

https://youtu.be/7-SVvtxHJGU

Via: 9to5Mac

Source: UN
