AI-powered voice assistants from Google, Amazon, Apple, and others could be perpetuating harmful gender biases, according to a recent UN report. The report, titled “I’d blush if I could” — Siri’s ...
The Feminist Internet, a non-profit working to prevent biases creeping into AI, has created F’xa — a feminist voice assistant that teaches users about AI bias and suggests how they can avoid ...
Voice assistants have historically been female. From Siri and Alexa to Cortana and the Google Assistant, most computerized versions of administrative assistants launched with a female voice and, in ...
A typical after-work scene at my house goes something like this. “Alexa,” I say. She chimes, then lights up. “Play the new Jenny Lewis album.” “Playing Jerry Lee Lewis Essentials from Apple Music.” ...
Artificial intelligence voice assistants with female voices reinforce existing gender biases, according to a new United Nations report. The new report from UNESCO, entitled “I’d Blush If I Could,” ...
LONDON (Thomson Reuters Foundation) - Barking orders at a digital device that responds in a woman's voice can reinforce sexist stereotypes, according to academics and creatives who launched the first ...
Female-sounding default voices perpetuate antiquated, harmful ideas about women.
Psychologists and biologists have investigated whether voice pitch can influence how female faces are evaluated. Their conclusion: a higher voice does indeed influence how the corresponding face is ...