Machines judged a beauty contest last year and the winners, chosen from photographs women had submitted, were overwhelmingly white. Meanwhile, a Microsoft chatbot went from wide-eyed naïf to misogynist Nazi sympathizer after just a day among trolls on Twitter.
It’s clear that artificial intelligence can learn to be racist or sexist from human interactions. And AI doesn’t just absorb bias; it can propagate it, for example by recreating traditional gender roles in service-oriented software.
It’s worth thinking about this in the context of the recent #MeToo campaign, where people shared their experiences and thoughts on sexual harassment and assault.
Perhaps because of the way social feeds work, it can appear that men have kept relatively silent in the conversations that have followed the allegations of decades of sexual harassment and assault by Hollywood executive Harvey Weinstein (a spokeswoman for Weinstein has said he unequivocally denies allegations of nonconsensual sex). It can be scary to join a conversation that in some way seems to be “not about you,” or when you fear being told that what you think and feel is wrong. Some men say they’ve been motivated to keep quiet and listen by the idea that men dominate conversations in real life (and in pop culture).
But think of the traces those decisions leave—or rather don’t leave. We think of “big data” as something amorphous and separate from humans. But data is formed by millions of our daily interactions. For example, in the future, some algorithms may allow artificial intelligence to group together a bunch of tweets under the label “sexual abuse,” while other algorithms allow it to understand that the Twitter handles attached to these tweets largely include names considered “female.” What will the machine make of that?
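To make the scenario concrete, here is a minimal, purely hypothetical sketch of the two steps the paragraph describes: keyword-based topic labeling of tweets, and inferring a perceived gender from the handle. Every keyword, name, and handle below is an illustrative assumption, not a real dataset or production system; real pipelines use trained classifiers and large name lists, with the same underlying limitations.

```python
from typing import Optional

# Illustrative topic keywords (assumption, not a real taxonomy).
KEYWORDS = {"metoo", "harassment", "assault", "abuse"}

# Toy first-name-to-perceived-gender lookup; real systems use large
# census-derived name lists, which share this approach's blind spots.
NAME_GENDER = {"sarah": "female", "emily": "female", "james": "male"}

def label_tweet(text: str) -> Optional[str]:
    """Tag a tweet 'sexual abuse' if it mentions any topic keyword."""
    words = {w.strip("#.,!?").lower() for w in text.split()}
    return "sexual abuse" if words & KEYWORDS else None

def perceived_gender(handle: str) -> str:
    """Guess a gender from a first name embedded in the handle."""
    for name, gender in NAME_GENDER.items():
        if name in handle.lower():
            return gender
    return "unknown"

# Hypothetical tweets to run the two steps end to end.
tweets = [
    ("@sarah_k", "#MeToo — harassment at work is everywhere"),
    ("@james99", "Great game last night!"),
]
for handle, text in tweets:
    print(handle, label_tweet(text), perceived_gender(handle))
```

The point of the sketch is what it *doesn’t* capture: a voice that stays silent produces no row at all, so the machine’s picture of who talks about abuse is built only from those who spoke.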
Communication may still be a human endeavor, but we live in a world that is increasingly shaped by machine intelligence, and it’s worth thinking about what machines may learn about humanity from what we say—and the gaps they may fill in from what we don’t.