In May 2016, two groups of protesters gathered outside a mosque in Houston. One, bearing guns and Confederate flags, was railing against the “Islamization of Texas.” The other was protesting the protest.
Both sides had one thing in common: They had unwittingly gone there at the behest of Russian trolls on Facebook.
A tool launched today aims to give new insight into how this kind of manipulation—whether stemming from US or foreign actors—spreads online, by monitoring over 1,000 of the most active extremist accounts on Twitter. The Exploring Hate Online dashboard tracks the most popular topics, hashtags, articles, and links being shared and discussed by far-right accounts. It was developed by the New America think tank and the Anti-Defamation League, and shared exclusively with Quartz.
“These are influencers. These are people who are reaching millions of people on Twitter and other networks, and we need to understand what they’re talking about in real-time so we can start to have an impact and blunt their tactics and nefarious operations,” says Dipayan Ghosh, a fellow at Harvard’s Kennedy School and former tech policy advisor in the Obama administration, who helped develop the dashboard.
The tool is not aimed at enforcement, but at helping communities—ranging from academics to government to major tech companies—better understand extremist networks. “The dashboard will help us see when things are spiking—whether they’re anti-Muslim, anti-Semitic, or anti-LGBTQ,” says Robert McKenzie, a senior fellow at New America. The tool works similarly to Hamilton 68, a dashboard run by the German Marshall Fund think tank that shows analysts what Russia-linked Twitter accounts are focusing on.
The dashboard was deliberately launched ahead of the 2018 US midterm elections. “We’re going to see a spike in the spread of disinformation,” Ghosh says. “It’s really insane what’s happening.”