The purpose behind this infographic is not to argue that digital assistants shouldn’t have female voices, but to get people asking whether how we treat AI can affect how we treat people. It has been shown that female voices are better received than male voices, so the decision to make assistants female is good business and likely just a reflection of past views of women. The troubling part is not only that digital assistants continue to have predominantly female voices, but, more importantly, how they respond to users. When Siri doesn’t understand you, she responds with “I’m sorry, I don’t understand.” Now we have AI, given female voices, that are completely and totally resilient to your abuse.
As technology improves, we see more and more realistic representations of AI. What happens when we have sexually active robots with female AI that are just as accepting of abuse as Siri? Will this translate to how we treat real people? Parents are already concerned with how their kids treat digital assistants, since the kids have no incentive to say please and thank you, but how will it affect them if they can treat a physical representation the same way? This is not just an important issue for female AI, but for male AI as well. Perhaps it would be good to keep faces off of AI, or to find some way to definitively distinguish AI from real people.