Google disables some Gmail smart suggestions because it can’t fix AI gender bias

Read more on The Next Web

Contributions

  • How can we ensure that technology is democratizing opportunity instead of reinforcing existing barriers? As we transition to a more automated world, we need to ensure that algorithms liberate us from bias instead of codifying it.

    To be honest, I had assumed that Google would be smart enough to figure this out. But I guess turning it off is better than getting it wrong.

  • Machines learn from humans. And humans are awful.

  • I don’t agree with “humans are terrible and AI is learning from us...” I don’t think humans are all that bad. The AI doesn’t know any better, just as a child wouldn’t unless someone corrected and taught them better.

  • This bias has been shown repeatedly in other areas, including, far more troublingly, racial bias in criminal sentencing. At the end of the day, humans are completely biased, and programs created by humans can’t do much better because they’re based on algorithms created by those very humans.

  • I object to autocorrect because its choices offend my intentions. Can I get that disconnected?

  • Kudos. It's a step in the right direction, and being gender neutral isn't the worst thing in the world. It may even be better.

  • I find it presumptuous and it often proposes actions or verbs nowhere near the direction my note was intended. It's more annoying than helpful.

  • The bias is troublesome, but I’ve been impressed by Google’s smart suggestions thus far. Huge time saver!

  • AI catering to preexisting human biases is further proof of the axiom, "Garbage In, Garbage Out."

  • Setting the right metric plays a key role in reducing the inherent bias.

    The big question is why this bias arose in our history.

  • This is a nice explanation of the problem, with a decent outline of the way Google is/is not addressing it.

  • Who cares if it offends? You going to quit Google? Good luck with that.

  • We have to ensure that algorithms, whether AI or machine learning, recognize what the biases are, even if that is very difficult. Honestly, I had assumed that Google would be smart enough to figure this out. However, they couldn’t be sure their fix was correct.

  • I’m not sure it’s technically possible, but couldn’t they create a “gender neutral naming/pronoun corrector” and run the results of the first model through it before piping the output to the user? Kind of like humans do when they were raised with a bias that they are working to neutralize. (See the sketch after this list.)

  • Data-driven AI is basically an inductive inference machine. When it needs to be fixed, it’s society and human beings that need to be fixed, too. Please, let’s not amplify our own stupidity.

  • As far as it goes, disabling part of the so-called smart suggestions could be the best solution (for now). Artificial intelligence learns from all the data it collects, and thus when gender biases appear repeatedly, this convenient function may turn into an electronic sexist. Gender bias is only one displeasing consequence; next time it could be racial bias. Since we are fighting for an equal society, we‘d better train our intelligent assistants (products built on artificial intelligence) to be neutral and equal.

  • I've always understood that our biases were a shortcut to protect us, based on prior learned experiences. To remove issues of race, sex and class from the issue, one might say that one is biased against venomous snakes, for example.

    I'd like to hear the input of a psychologist on this issue. I believe that we should tread very carefully indeed in this field, for fear of unintended consequences.

  • I’m not a fan of “smart suggestions.” They make me challenge my own syntax and intentions mid-sentence. Now I have to worry about my AI turning misogynistic. Just let me type my own sentences! I’m good.

  • Gender is a reality. Why is this stuff hard to accept? Water is wet. Clouds are above us. Ice is cold. Seems easy.

  • And what about their Chinese spy search engine?
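The comment above about a post-hoc “gender neutral naming/pronoun corrector” describes a simple pipeline: generate the suggestion, pass it through a neutralizing filter, then show it to the user. Below is a minimal, hypothetical sketch of that idea in Python, assuming a purely rule-based substitution pass. A production system would need real linguistic analysis (for instance, to disambiguate “her” or fix verb agreement), and none of these names come from Google’s actual implementation.

    import re

    # Hypothetical map of gendered pronouns to neutral forms. "her" is
    # deliberately omitted: it is ambiguous (possessive vs. object) and
    # would need part-of-speech tagging to resolve correctly.
    NEUTRAL_FORMS = {
        "he": "they", "she": "they",
        "him": "them",
        "his": "their", "hers": "theirs",
        "himself": "themself", "herself": "themself",
    }

    def neutralize_pronouns(suggestion: str) -> str:
        """Rewrite gendered pronouns in a generated suggestion,
        preserving the capitalization of each matched word."""
        def swap(match: re.Match) -> str:
            word = match.group(0)
            neutral = NEUTRAL_FORMS[word.lower()]
            return neutral.capitalize() if word[0].isupper() else neutral

        pattern = r"\b(" + "|".join(NEUTRAL_FORMS) + r")\b"
        return re.sub(pattern, swap, suggestion, flags=re.IGNORECASE)

    # The corrector runs on the model's output, not on the user's own text:
    raw_suggestion = "Ask him if his report is ready."
    print(neutralize_pronouns(raw_suggestion))
    # -> "Ask them if their report is ready."

The design point is simply where the filter sits: between the suggestion model and the user, exactly as the commenter proposes, so the underlying model can remain biased while its visible output does not.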