Should search algorithms be moral? A conversation with Google’s in-house philosopher

Image: Fanqiao Wang

When you have a question for the universe, where do you turn first? Google, of course.

We all expect search engines to provide the best, smartest information out there. Everyone knows someone who trusts search results above their doctor’s expertise. We even feel smarter when we’re “in search mode,” according to a 2015 Yale University study (whether we find what we were looking for or not).

But one of the first search results when you Google “What happened to the dinosaurs?” is a website called Answers in Genesis. It explains that “the Bible gives us a framework for explaining dinosaurs in terms of thousands of years of history, including the mystery of when they lived and what happened to them.” Scroll down, and you’ll find a collection of acerbic articles in response to the Biblical theory and another collection of articles responding to those responses. Only further down, nearly forgotten, do the scientific explanations of what really happened to the dinosaurs finally appear (a ten-kilometer-wide asteroid slamming into the Gulf of Mexico).

Luciano Floridi, known as “the Google Philosopher,” thinks that’s fine.

Floridi is a philosophy professor at the University of Oxford and the sole ethicist on the advisory board Google formed to help devise strategy for the European Union’s new “Right to be Forgotten” ruling. When I spoke to him, Floridi was quick to call the creationist view on dinosaurs “totally insane.” But in spite of his own convictions, he says he wouldn’t advocate that Google remove its faith-based answer.

“You can imagine how much I like truth as a philosopher,” said Floridi, “but Google’s a business and we have to be careful about who is managing truths. If you put too much power in the hands of a private company, that’s very dangerous. Google should not be in charge of what is true today or what is true tomorrow.”

Search is indifferent to the truth

Google has never claimed to deliver the best information. Google’s search algorithm is designed for efficiency: to provide the results users are most likely looking for and to move them on to their next site as quickly as possible. To do that, it pools and analyzes our every digital fidget in order to serve up the content we’re most likely to welcome and click on.

From the algorithm’s perspective, the Answers in Genesis page is the optimal result for our particular search. First of all, its headline, “What Really Happened to the Dinosaurs?” differs by only one word from the search terms. Not to mention, a plurality of Americans believe in creationism, as Gallup polls have found consistently since 1982, so it’s bound to be a popular option.

By contrast, if you search “Where did the dinosaurs go?” the algorithm will recommend a children’s song with that very title. Its opening lines:

Sixty-five million years ago
On the Yucatan peninsula in the Gulf of Mexico
There crashed a mighty asteroid of ten miles wide, or so
Sixty-five million years ago

In other words, Google’s search engine—and that of Bing, DuckDuckGo, and most of the other tools out there—is indifferent to truth.

Indifference to the truth happens to be part of the established philosophical definition of bullshit. “Bullshit is grounded neither in a belief that it is true nor, as a lie must be, in a belief that it is not true,” wrote Princeton philosophy professor Harry Frankfurt in his seminal 1986 essay, “On Bullshit.” “It is just this lack of connection to a concern with truth—this indifference to how things really are—that I regard as of the essence of bullshit.”

By this measure, even in the best of circumstances and overlooking SEO-manipulation and competitor discrimination, Google’s search engine is a bullshit engine. Search itself is a bullshit endeavor.

Yet we users treat digital search as though it were designed to provide the truth. In Feb. 2012, the Pew Internet & American Life Project found that “73% of search engine users say that most or all the information they find as they use search engines is accurate and trustworthy.”

A Reuters Institute report earlier this year indicated that “search” is an increasingly popular “starting point” for finding news online in the twelve countries surveyed, with particularly pervasive usage in countries like Italy, Japan, Germany, and Spain. And according to a 2007 study published in the Journal of Computer-Mediated Communication, people using search engines put more trust in results that appear higher on the page, for no good reason.

The “dozen doughnuts” problem

Projects by Google and other search providers have tried to close this gap between user trust and the actual trustworthiness of search results.

In Mar. 2015, Google researchers proposed a new search model based on a knowledge-based trust (KBT) index, in which the search engine would extract the facts stated on a given website and compare them to a “knowledge vault” in order to “reliably compute the true trustworthiness levels of the sources.”
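To make the idea concrete, here is a minimal, hypothetical sketch of a KBT-style score: the share of a source’s extracted facts that a reference fact base confirms. The triple format, the vault contents, and the function names below are assumptions for the sake of illustration, not the researchers’ actual system, which operates at web scale.

```python
# A toy, hypothetical sketch of a knowledge-based-trust-style score: the
# fraction of a source's extracted facts confirmed by a reference
# "knowledge vault." Names and data here are illustrative, not Google's.

# Facts modeled as (subject, predicate, object) triples.
KNOWLEDGE_VAULT = {
    ("dinosaur extinction", "caused by", "asteroid impact"),
    ("dinosaur extinction", "occurred", "about 66 million years ago"),
    ("chicxulub crater", "located in", "gulf of mexico"),
}

def kbt_score(extracted_facts: set) -> float:
    """Share of a source's facts that the vault confirms (0.0 to 1.0)."""
    if not extracted_facts:
        return 0.0
    confirmed = sum(1 for fact in extracted_facts if fact in KNOWLEDGE_VAULT)
    return confirmed / len(extracted_facts)

# A page whose extracted claims match the vault would score high...
science_page = {
    ("dinosaur extinction", "caused by", "asteroid impact"),
    ("dinosaur extinction", "occurred", "about 66 million years ago"),
}
# ...while a page asserting unsupported claims would score low.
creationist_page = {
    ("dinosaur extinction", "caused by", "the flood"),
    ("dinosaur extinction", "occurred", "thousands of years ago"),
}

print(kbt_score(science_page))       # 1.0
print(kbt_score(creationist_page))   # 0.0
```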

Another initiative headed by Google News chief Richard Gingras and ethicist Sally Lehrman is called the Trust Project. Launched in Oct. 2014, the Trust Project aims to identify the qualities of a news story that correlate with high standards of journalism, to make it easy for automatic aggregators like Google News to highlight the best reporting.

But whether these measures will find widespread adoption remains to be seen. What’s evident, as The New York Times reported in July, is that machine-learning algorithms trained exclusively on human behavior will reflect human biases in their answers. In other words, we are part of the problem.

While we search users may believe we prefer the best and most truthful information, our clicks tell a different story. Steven Levy, reporting from Facebook’s content lab in Knoxville, Tennessee, dubs this the “dozen doughnuts problem”:

“Many people conscious of their weight know it’s not a good idea to eat a doughnut every day,” Levy writes. “But once that delicacy is in front of you … Oh, what the hell!”

In the case of search, we prefer media doughnuts: links that promise such swift satisfaction that clicking is irresistible. Compare that to a ruminative exposition that puts accuracy and attention to detail above all else and takes at least an hour to digest; that would be media kale.

A few media companies generate lots of cash pushing media doughnuts on Facebook. Meanwhile, the forms of reporting that earn the bulk of industry awards, like investigative journalism, are struggling to stay alive. “The impact of digital media and dramatic shifts in audience and advertising revenue have undermined the financial model that subsidized so much investigative reporting during the economic golden age of newspapers, the last third of the 20th century,” wrote Leonard Downie Jr. for the Washington Post on the 40th anniversary of the Watergate scandal, in 2012.

Although a legion of nonprofit news organizations with hard-hitting missions has emerged over the last decade, they too are struggling to break even and in recent years have been “losing traffic share to commercial news organizations,” according to Nikki Usher and Matthew Hindman’s analysis of the Knight Foundation’s 2015 report on the subject. “Nonprofit news is never going to fill the news gap in the states and communities that need it most,” they concluded.

Cultural critics have long contended that society can only improve when fields like investigative journalism flourish, so that complex issues gain exposure and drive informed debate. We often admire our friends who actually read the difficult, sourced stuff (and we’ve all been guilty of pretending to read it ourselves once in a while). This deep-rooted sense of virtuous consumption demands that we consume more media kale than we actually do.

Sticking to a healthy information diet

So should the Facebooks and Googles of the world—our few, major content distributors—put users on a healthy diet, by filtering search results to reflect the truest truths and News Feeds to promote nuanced, substantial discourse?

“The day Facebook does that, we start living in a utopia,” Floridi opined to me.

With regard to search, Floridi concedes that undisputed falsehoods (that 2 + 2 = 5, for example) should be monitored. “Controlling the truth value of what is being circulated, information quality,” he says. “Absolutely, it should be something that gets checked.”

But that doesn’t mean we should entrust Google to fact-check itself. “Now we have the impression that Google provides us with true information, but the day we’re told ‘your truth has been deleted by a Google official,'” he explains, “I’m going to be really scared. That is the ultimate Big Brother.”

One way to fix search without inviting a new Big Brother is to treat the knowledge it delivers as a basic, common necessity, and as such, one that requires democratically empowered quality control. Just as governments put books in schools to make us smarter, add fluoride to the water to make us healthier, subsidize green tech to save the environment, and police the streets to prevent us from harming ourselves, the lords of search could keep us honest by responding to user demand for truth on certain issues.

Of course, Google and Facebook (and Apple with its upcoming Apple News app, among others; this article could have focused on countless tech companies) already study consumer preferences and privately reshape themselves based on their research. But we already know that what users unconsciously tend to select (the doughnut) and what we know we should have (the kale) are two different things.

Democratic governments, on the other hand, ask citizens to name their preferences and publicly reshape themselves based on elections. In theory, transparent checks and balances ensure universal accountability. If private tech companies made similar decisions with the same accountability—that is to say, if they made decisions on search mechanisms out in the open, and honestly engaged with users—we might actually end up with the truth we say we want.

Some day, perhaps, in the perfect marriage of good and useful, there could even be a competitive process for technocrats—Floridi suggests a universal search engine that presents competing companies’ results side by side. “For once you’d get different answers to your question … maybe not all of them would confirm that dinosaurs disappeared because of some kind of creationist, insane sort of theory.”
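To picture what such a meta-search might look like, here is a minimal, hypothetical sketch: the same query sent to several providers, with each provider’s results shown in its own column rather than merged into one ranked list. The provider names and canned results below are placeholders, not real engines or their APIs.

```python
# A minimal, hypothetical sketch of the "universal search engine" Floridi
# imagines: one query sent to several providers, with their top results
# printed side by side instead of as a single ranked list. The provider
# names and canned results are placeholders; a real version would call
# each engine's own API.
from itertools import zip_longest

def fetch_results(provider: str, query: str) -> list:
    """Placeholder for querying one provider; returns canned results."""
    canned = {
        "Engine A": ["What Really Happened to the Dinosaurs? (Answers in Genesis)"],
        "Engine B": ["Cretaceous-Paleogene extinction event (encyclopedia)"],
        "Engine C": ["The Chicxulub impact and the end of the dinosaurs"],
    }
    return canned.get(provider, [])

def side_by_side(query: str, providers: list) -> None:
    """Print each provider's results in its own column, row by row."""
    columns = [fetch_results(p, query) for p in providers]
    print(" | ".join(f"{p:<55}" for p in providers))
    for row in zip_longest(*columns, fillvalue=""):
        print(" | ".join(f"{r:<55}" for r in row))

side_by_side("what happened to the dinosaurs", ["Engine A", "Engine B", "Engine C"])
```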

But the first step, of course, is to have an active, thoughtful digital demos. “If we could have better users, more intelligent human beings who would require a better service, those human beings would move the market and information providers like Facebook would have to react,” he said.

In other words, before inventing a moral search algorithm, we need moral users.