The publication, produced in collaboration with the German Government and the EQUALS Skills Coalition – an alliance of public and private sector partners that encourages the involvement of women and girls in scientific and digital technology sectors – is called “I’d Blush If I Could.”
The title refers to the standard answer given by the default female voice of Apple’s digital assistant, Siri, in response to insults from users. Beyond Siri, other “female” voice assistants also express submissive traits, a reflection of the gender bias built into Artificial Intelligence (AI) products as a result of what UNESCO calls the “stark gender imbalances in skills, education and the technology sector.”
The study makes several recommendations, including ending the practice of making digital assistants female by default; programming them to discourage gender-based insults and abusive language; and developing the advanced technical skills of women and girls so they can steer the creation of new technologies alongside men.
Given the explosive growth of voice assistants, the report says, there is an urgent need to help more women and girls cultivate strong digital skills.
Bridging the digital gender gap is an issue for all countries
Today, women are extremely under-represented in teams developing AI tools: women make up only 12 percent of AI researchers, six percent of software developers, and are 13 times less likely to file ICT (information and communication technology) patents.
“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” says Saniye Gülser Corat, Director of Gender Equality at UNESCO. “Their hardwired subservience influences how people speak to female voices and models how women respond to requests and express themselves. To change course, we need to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”