As the US midterms approach, disinformation on social media is ramping up on all sides.
Last week, US law enforcement announced that Russia’s infamous Internet Research Agency troll farm is actively targeting the midterms—and that it nearly doubled its budget in early 2018. Bangladeshi trolls are exploiting divisions in US politics to sell T-shirts. And researchers say the biggest increase in propaganda at the moment is coming from domestic troll and bot networks.
There’s no way to be sure your social media diet is propaganda-free. The biggest tech companies have struggled to shut the networks down; when Twitter published the histories of thousands of Russia-linked accounts last week, independent researchers quickly pointed out that it had missed vast swaths of them. But there are ways to work out what narratives Russia and others are pushing.
Disinfo 2018: The dashboard, run by internet security firm New Knowledge, tracks 1,055 Russian Twitter accounts and 2,020 right-wing US accounts. It charts the accounts’ top hashtags, keywords, domains, and posts per hour. You can toggle between US-based accounts, Russian accounts focused on the US, and Russian accounts focused on global politics. However, you can’t see the individual accounts.
Hamilton 68: Tracks a different set of 600 Russia-linked Twitter accounts, providing similar information. Run by the German Marshall Fund think tank and built with the help of New Knowledge.
Bot Sentinel: Tracks tens of thousands of Twitter bots and trolls. Bot Sentinel has a much looser verification system and doesn’t try to establish where the operators are based. However, you can see much more about individual accounts and who’s saying what. It also has a browser extension that rates the likelihood that any given account is a bot, troll, or “fake news” purveyor—and actively flags notable culprits. Some of these are politically controversial: President Donald Trump is flagged.
Hoaxy: Visualizes how articles have spread on Twitter, and indicates whether the accounts pushing those articles are more likely to be humans or bots. It’s a bit clunky to use, but you can see in real time where and how information moves. Run by Indiana University.
Researchers readily admit that the above tools cover only a small fraction of the social media space. “There are still surprisingly so few resources for understanding this problem,” says Jonathon Morgan, CEO of New Knowledge.
One huge gap: There’s no way for a layperson to see how Facebook is being manipulated in real time. Monitoring disinformation on the platform “just doesn’t work” in a public-facing dashboard format, Morgan says—most reports come out after the fact, such as when Facebook removed hundreds of domestic accounts in early October.
So, no matter how much troll tracking you do, bear in mind that social media just isn’t the best place to form your opinions about politics, as Morgan points out. “That’s ultimately a pretty superficial way to talk about things that are as important as elections,” he says.
“Maybe it’s better to form that opinion based on conversations in person, God forbid.”