A digital funnel drives people to commit hate crimes in real life

The man who killed 11 people in a Pittsburgh synagogue posted radical comments on the website Gab.
Image: Reuters/Cathal McNaughton

The gunman accused of killing 11 people in a Pittsburgh synagogue over the weekend appears to have passed through the final stage of a funnel of online extremism.

Robert Bowers was reportedly riled by Donald Trump’s baseless claims that a “migrant caravan” slowly approaching the US-Mexico border was filled with “criminals and Middle Easterners.” He wrote on online platform Gab that “[Jewish organization] HIAS likes to bring invaders in that kill our people. I can’t sit by and watch my people get slaughtered. Screw your optics, I’m going in.”

While we don’t yet know exactly how Bowers’s mindset evolved, a model of the online radicalization process can be found in Georgetown professor Fathali M. Moghaddam’s “Staircase to Extremism,” says Ben Decker, a research fellow in online disinformation at Harvard University’s Shorenstein Center. Here’s how the typical journey works, according to that model:

Tier 1—Anger

Upset by a story, whether true or false, you seek to air your grievances and right the situation. For some, that might mean writing to their member of Congress. For others, it can mean going on Facebook, Twitter, or YouTube and arguing about it there.

Tier 2—Blame

If that hasn’t resolved your anger, you move to a second tier where “anger leads to identifying a target to blame,” Decker says. On Facebook, this can mean obsessing over the topic as the platform’s recommendation algorithms take you to more and more extreme content, most of it targeting the emotions. You end up in hyper-partisan and/or conspiratorial groups that “provide a boogeyman, a target to direct your aggression at,” Decker says.

Tier 3—Self-marginalizing

People in such likeminded fringe groups might lament that extreme figures such as the race-baiting conspiracy theorist Alex Jones have been removed from Facebook, Twitter, and YouTube. They post content from fringe sites like Gab, Discord, 4chan, or 8chan, and encourage you to check it out. “All of a sudden, you’re in a digital filter bubble where the only content is on the fringes or not acceptable for mainstream media,” Decker says. “There are other voices that offer moral equivalence to acts of violence.”

You start to identify as part of a group that feels marginalized, is rife with conspiracies, and is intent on bringing justice to society at large. “You’ve gone from concern about the caravan, to blaming George Soros, to believing he and the Jewish people are the source of a whole host of things that have made life difficult,” says Decker.

Tier 4—Evangelizing your obsession

You’re now radicalized and want to take active steps to further your group’s beliefs in the outside world. Alongside other members of the group, you coordinate various actions—all of which mean returning to the platforms you started on. These might involve:

  1. Coordinated harassment of high-profile people from ethnic, religious, or sexual minorities. You go back onto a site like Twitter and send vile racist abuse to, say, a Jewish journalist whose views you disagree with.
  2. Recruiting people to your favorite extremist site.
  3. Spreading disinformation.

There are various techniques for amplifying your message. You might use bots to “megaphone the content, to give it the illusion of popularity—with the intent to create a bandwagon effect,” says Samuel Woolley, research director at the Institute for the Future’s DigIntel Lab and author of a recent report on anti-Semitic online harassment.

On Twitter, the aim is not to communicate with real people, but to get the subject picked up by Twitter’s “trending” algorithm, Woolley says. If real people start replying, a human takes over the bot account and has a conversation with them. On Facebook, the idea is to “drop seeds of disinformation”—like a false news story—into a group, and then wait for real people to spread them, Woolley says.

Tier 5—Dehumanization

People from the group you blame no longer seem human to you. You’ve become so angry and so certain of the conspiracy that you are ready to take violent action against anyone from that group, even if you’ve never exchanged a word with them.

The websites that radicalized you will disavow the violence, but their users will continue to spread racist messages there and idolize you through memes and hashtags.