The A.I. Google uses in Gmail to auto-suggest words as users type will not suggest "gender-based pronouns," for fear of misgendering people and causing offense, according to a report.
Reuters reported on Tuesday that Google’s A.I. “will not suggest gender-based pronouns because the risk is too high that its ‘Smart Compose’ technology might predict someone’s sex or gender identity incorrectly and offend users.”
“Consumers have become accustomed to embarrassing gaffes from autocorrect on smartphones,” explained Reuters. “But Google refused to take chances at a time when gender issues are reshaping politics and society, and critics are scrutinizing potential biases in artificial intelligence like never before.”
The change was reportedly made after the A.I. suggested the pronoun "him" in a message about an investor whose gender had not been specified.
“Not all ‘screw ups’ are equal,” declared Gmail product manager Paul Lambert, who claimed misgendering is “a big, big thing” to get wrong.
Google's Senior Vice President of Engineering, Prabhakar Raghavan, added, "The only reliable technique we have is to be conservative."
“You need a lot of human oversight,” Raghavan continued. “In each language, the net of inappropriateness has to cover something different.”
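To illustrate the kind of "conservative" approach Raghavan describes, here is a minimal sketch of how a per-language pronoun blocklist might be applied to a model's predicted completions. This is an illustrative assumption, not Google's actual Smart Compose code; the pronoun lists, function name, and example suggestions are all hypothetical.

```python
# Illustrative sketch only -- not Google's actual Smart Compose code.
# Rather than trying to guess the right pronoun, the "conservative"
# approach drops any predicted completion that contains a gendered
# pronoun at all.

import re

# Hypothetical per-language blocklists; as Raghavan notes, the "net of
# inappropriateness" has to cover something different in each language.
GENDERED_PRONOUNS = {
    "en": {"he", "him", "his", "she", "her", "hers"},
    "de": {"er", "ihm", "ihn", "sie", "ihr"},
}

def filter_suggestions(suggestions: list[str], lang: str = "en") -> list[str]:
    """Keep only candidate completions free of gendered pronouns."""
    blocklist = GENDERED_PRONOUNS.get(lang, set())
    kept = []
    for text in suggestions:
        words = {w.lower() for w in re.findall(r"[a-zA-Z]+", text)}
        if words.isdisjoint(blocklist):
            kept.append(text)
    return kept

# Example: a completion proposing "him" for a person of unspecified
# gender is suppressed entirely, leaving only the neutral phrasing.
print(filter_suggestions(["Do you want to meet him?", "Do you want to meet?"]))
# -> ['Do you want to meet?']
```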
In May, Google encouraged its employees attending the I/O developers conference to wear preferred pronoun stickers, which included non-binary options such as “They/Them,” and “Ze/Hir.”
According to a study, just 12 percent of A.I. researchers are women, while women hold 21 percent of tech roles at Google.
Google has had problems with its artificial intelligence before, and the company apologized in 2015 after its A.I. tagged a picture of a black couple as "gorillas."
As with the pronoun issue, Google tried to fix the underlying mistake, but ultimately just removed the "gorillas" tag altogether.
Charlie Nash is a reporter for Breitbart Tech. You can follow him on Twitter @MrNashington, or like his page at Facebook.