North Carolina has recently been hit with its biggest chickenpox outbreak in two decades, with 36 children falling ill as of Nov. 16. The flare-up was most likely preventable: an effective vaccine for the disease, which can be fatal, has been available in the United States since the 1990s. Meanwhile, the rise of social media has ushered in new ways of spreading misinformation, including rumors about the harms of vaccination.
Social media platforms provide the perfect environment for unfounded medical information to thrive. Here are some ways in which this happens:
Simple vaccination-related searches on Facebook surface multiple “anti-vaxxer” pages with tens of thousands of followers, as The Daily Beast reported earlier this year. And that’s on top of countless general-interest health groups and pages, like this one, where this kind of content lurks as well.
A new working paper published by the National Bureau of Economic Research (NBER) argues that Facebook groups function as anti-vaccination echo chambers, reinforcing misleading information. A small number of users post a disproportionately large amount of content in these groups. Sharing by members then spreads that content beyond the groups to the wider community, as happens on Facebook in various other contexts.
Facebook told Quartz in October that anti-vaccination content does not in itself go against Facebook’s community guidelines, and that the company does not believe simply removing this kind of content is a productive way to combat the misinformation. But, a spokesperson said, health-related content on pages is eligible for fact-checking by its network of partners, which means that if it is determined to be false, it could be demoted in users’ feeds (this does not apply to groups). The spokesperson also said the company was exploring ways to surface educational information about vaccines, but it does not currently do so.
The NBER paper also suggests that after Facebook banned fake news websites from advertising, the number of shares anti-vaxxing content received dropped dramatically.
In North Carolina, the outbreak was limited to one private school, where parents were requesting religious exemptions from the vaccination requirements, reportedly in part because of fears over vaccine safety. In other places, parents have been organizing old-school “pox parties” to expose their children to the disease so that they would be immune by adulthood.
“Pox parties” were once a common way for parents to expose their children to chickenpox. These days, given that there’s an effective vaccine for the disease, pox parties have gone somewhat underground.
Parents who believe vaccinations are dangerous and ineffective form Facebook groups, and invite each other to these events. In one alternative-health group, a parent asked: “Could you help me with advice about relieving the symptoms of Chicken pox [sic]? Here in the US [chickenpox] vaccine is the norm so we have secret pox parties for our kiddos. I’m going to my first one tomorrow so want to get prepared.”
The social-media pox parties aren’t a new phenomenon. The New York Times reported in 2011 that selling chickenpox-infected items (like lollipops or even spit) was thriving on Facebook, as were pox party invites. Apparently they are still going strong.
“While vaccine-hesitant parents are a very small subset of the overall population, they tend to be more vocal,” Neal Goldstein, assistant research professor at Drexel University’s Dornsife School of Public Health, told Quartz. “With Facebook and other social media sites, it is easy to spread your message to a broad audience.”
One study showed that between 2013 and 2016, just over 2% of parents in the US refused to vaccinate their kids.
“Chickenpox can cause serious complications, and, in rare cases, death,” Goldstein said.
On Twitter, misinformation about vaccinations is spread by bots as well as people, according to researchers from several US universities. And since vaccination is such a polarizing issue, foreign actors have used bots and trolls to sow discord among the American public, the researchers have shown. The same Russian trolls that were used in election interference efforts in 2016 were tweeting about vaccinations. They took up, sometimes clumsily, both sides of the debate, creating the impression that it was more contentious among the public than it actually is.