Big Tech has exactly one job to do in a pandemic

Protect yourself.
Image: Massimo Pinca/Reuters

As the reality of life under quarantine in the age of coronavirus has crystallized, so has Big Tech’s top job in this new era: combating a plague of misinformation. So far, things are off to a rocky start.

Even prior to the World Health Organization’s decision on March 11 to label Covid-19 a pandemic, WHO director-general Tedros Adhanom Ghebreyesus warned that the world wasn’t just fighting an epidemic, but an “infodemic” as well.

“Fake news spreads faster and more easily than this virus and is just as dangerous,” he said in February. Warnings about this “infodemic” have accompanied the WHO’s numerous status updates on the crisis ever since.

A tsunami of misinformation has indeed followed the spread of infection. Fake cures involving colloidal silver, vitamins, and essential oils, unfounded posts warning about the use of anti-inflammatory drugs, and false claims that drinking excessive amounts of water could ward off the disease are just a few examples. As Sylvie Briand, the architect of the WHO’s strategy to counter the infodemic, recently told the Lancet, “Now with social media this phenomenon is amplified, it goes faster and further, like the viruses that travel with people and go faster and further.”

Over the past few years, technology firms have come under increasing pressure to tackle just this kind of misinformation. Covid-19 is proving to be not only a problem of an entirely different magnitude, but one that may be deeply difficult to solve in the long term.

Big Tech appears aware of the scrutiny it’s under at the moment. In an unprecedented move, Facebook, Google, LinkedIn, Microsoft, Reddit, Twitter, and YouTube jointly vowed to fight coronavirus-related misinformation. “We are working closely together on Covid-19 response efforts,” reads a March 16 joint statement. “We’re combating fraud and misinformation about the virus, elevating authoritative content on our platforms, and sharing critical updates in coordination with government healthcare agencies around the world.”

What does that mean, exactly?

Google and YouTube are now promoting information from the WHO, the Centers for Disease Control and Prevention (CDC), and the New York Times when people search for virus-related information, and Google set up a dedicated informational website.

Twitter, meanwhile, is cracking down on messages that it deems to increase “the chance that someone contracts or transmits the virus.” And during a conference call with reporters on March 18, Facebook CEO Mark Zuckerberg outlined plans to install a coronavirus information center with resources from the WHO and CDC at the top of Facebook users’ news feed. This is likely to help at least in part because, as a survey by the PR giant Edelman recently confirmed, most people find doctors as well as experts from the CDC and WHO to be the most trustworthy sources.

“What we are seeing is tech companies realizing that public perception of their actions is important,” says Diana Bossio, director of the media and communication postgraduate program at Swinburne University of Technology in Australia.

Still, Covid-19 is shaping up to be a misinformation problem of greater proportions than even the meddling that marked the 2016 US presidential election and the Brexit referendum in the UK.

To understand why, consider the fundamental laws of rumor established in the late 1940s by Harvard University professors Gordon W. Allport and Leo Postman. In The Psychology of Rumor, they put forward a mathematical formula describing the mechanics of falsity: the number of rumors in circulation grows with the importance of the subject multiplied by the ambiguity of the evidence pertaining to the topic.
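Allport and Postman’s relation is often summarized as a rough product law (the notation below follows the common shorthand for their argument, not a verbatim quotation from the book):

```latex
R \sim i \times a
```

where $R$ is the amount of rumor in circulation, $i$ is the importance of the subject to those hearing it, and $a$ is the ambiguity of the available evidence. Because the relation is multiplicative rather than additive, a topic that is either unimportant or well-evidenced generates little rumor; one that scores high on both does not merely add the two effects, it compounds them.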

The coronavirus pandemic is life-threatening and of high concern to pretty much all of us. But not much is known about the disease. It is important yet ambiguous. In 1947, when Allport and Postman were studying the word-of-mouth spread of rumors about the war, this would have been a perfect storm. What it amounts to today, with millions of people trapped indoors refreshing their feeds over and over, is something altogether worse.

“The difference between political misinformation and what we have now is that face-to-face interactions are not as possible with social distancing, so lots of people are filling their social needs through Twitter and Facebook,” says Monica Stephens, an assistant professor at University at Buffalo who studies social media and misinformation.

Big Tech’s early efforts have already hit snags

On March 18, Facebook incorrectly flagged coronavirus news stories and other accurate information as spam or violations of community standards—the result of a bug, according to the company. Google’s efforts, meanwhile, have been repeatedly mischaracterized or overblown by the White House. And Twitter declined to remove a tweet from Elon Musk that seems to pretty clearly violate the site’s brand-new misinformation policy. The Tesla CEO, who has repeatedly downplayed the coronavirus threat, tweeted that children are “essentially immune” despite early evidence that, while on average younger people seem to be less susceptible to Covid-19, it can still cause major health problems for children and infants.

“This challenge [of preventing misinformation] becomes compounded when we lose trust in ‘official’ sources—e.g. government agencies charged with managing the response,” argues Kate Starbird, an engineering professor at the University of Washington, in an analysis she posted on Twitter. “When elected leaders share dubious info and contradict their own agencies and scientists, this foments distrust and diminishes our collective ability to find the best information at this time—increasing uncertainty and anxiety, and even causing people to take the wrong actions.”

This state of affairs is sure to continue. An election or referendum, even a hugely consequential one, is a relatively brief moment in time with limited options for possible outcomes. Covid-19, by contrast, is a problem that has so far only multiplied and metastasized. This suggests the focus on how well Big Tech battles online misinformation is unlikely to dissipate anytime soon.

Silicon Valley has long lionized leadership in times of crisis, whether in the parable of Apple co-founder Steve Jobs after he was fired from his own company or in the famous pivots of the likes of YouTube and Twitter. When the industry takes stock of its work during the coronavirus pandemic, whose leadership will be lionized then?