[Photo caption: "As if computers understand race" (AP Photo/Virginia Mayo)]
SHARE THE BLAME

Google search isn’t racist, society is

By Ananya Bhattacharya

Tech reporter

The computer brain learns everything from its human users—and so it learns all the stereotyping too.

An 18-year-old high school senior from Virginia, Kabir Alli, shared a video on Twitter showing the discrepancy between searching for “three black teenagers” versus “three white teenagers” on Google. While the former pulled up police mugshots, changing one word—“black” to “white”—yielded smiling, youthful photos.

The video generated a tweetstorm, with over 67,000 retweets and comments from people calling Google out on the racial bias. This isn’t the first time Google search results have irked people. Another Twitter user had noticed similarly discriminatory results when searching for professional and unprofessional hairstyles for work.

The search giant was also criticized for allowing an anti-Semitic Chrome plugin to remain available, as Mic discovered. In the past, Google Photos had also incorrectly categorized black people as gorillas.

Alli does not believe Google is racist, but he does expect the company to bear some responsibility. “I understand it’s all just an algorithm, based on most visited pages, but Google should be able to have more control over something like that,” he told USA Today.

The Silicon Valley behemoth maintains that its search results have very little to do with the company and its programmers—it’s all about the algorithm. “Algorithms rely on more than 200 unique signals or ‘clues’ that make it possible to guess what you might really be looking for,” the company says in a blog post, listing factors such as the freshness of content, your region, and PageRank.

“Our image search results are a reflection of content from across the web, including the frequency with which types of images appear and the way they’re described online,” a Google spokesperson said. “This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query. These results don’t reflect Google’s own opinions or beliefs–as a company, we strongly value a diversity of perspectives, ideas and cultures.”

Google is a mirror: the algorithm works with what it’s given. A persistent bias in society manifests itself in the online landscape through the way images are captioned, tagged, and described.
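To see how that mirroring can happen, here is a deliberately simplified sketch (not Google’s actual system, which uses hundreds of signals): a ranker that scores images purely by how often the query’s words appear in each image’s surrounding text. All the file names and descriptions below are invented for illustration.

```python
# Toy illustration of frequency-based image ranking. This is NOT Google's
# algorithm; it is a minimal model showing how biased labels in a corpus
# surface in results. All image names and tags are invented.
from collections import Counter

# Hypothetical mini-corpus: each "image" is represented by its
# surrounding descriptive text, as a crawler might collect it.
corpus = {
    "img_a.jpg": "three black teenagers mugshot arrest police",
    "img_b.jpg": "three black teenagers mugshot booking photo",
    "img_c.jpg": "three black teenagers smiling at graduation",
    "img_d.jpg": "three white teenagers smiling stock photo",
}

def rank(query):
    """Score each image by how many query terms appear in its text."""
    terms = query.lower().split()
    scores = {}
    for name, text in corpus.items():
        words = Counter(text.lower().split())
        scores[name] = sum(words[t] for t in terms)
    # Highest-scoring images first (Python's sort is stable on ties)
    return sorted(scores, key=scores.get, reverse=True)

results = rank("three black teenagers")
print(results)
```

Because the invented corpus happens to attach “mugshot” to more of the matching images, those images dominate the top of the list. The ranker has no concept of race or offense; it only reflects the frequency of the labels it was fed, which is the point Google’s statement makes about its real results.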

Alli acknowledged the shortcomings in his interview with USA Today. “It shouldn’t be so difficult to find normal non-offensive pictures of three black teenagers. That search sort of portrays us as a whole and those pictures are not us. We have a lot to offer and that search does not do us any kind of justice.”
