It starts with a single video. A YouTube search for keywords like “crisis actor,” a term used last week to discredit the outspoken survivors of a school shooting in Parkland, Florida, can drag viewers down a rabbit hole of conspiracy-related content, a researcher found.
Professor and data journalist Jonathan Albright looked at 256 videos returned by a search for “crisis actor” on YouTube’s API and found that the “next up” recommendations for each result led to nearly 9,000 conspiracy-related videos with almost 4 billion views in total, he detailed in a Medium post on Sunday. It’s a snapshot of the conspiracy-related content the platform has been grappling with.
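Albright’s exact pipeline isn’t spelled out in the post, but a minimal sketch of that two-step collection, assuming the YouTube Data API v3 via google-api-python-client and its since-retired relatedToVideoId parameter, might look like this; the API key and result counts below are placeholders, not his actual settings:

```python
# Sketch of the two-step collection: a keyword search for seed videos,
# then the videos YouTube "relates" to each seed, approximating the
# "next up" recommendations. Assumes the YouTube Data API v3; the
# relatedToVideoId parameter existed at the time but has since been retired.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # placeholder; a real key comes from the Google Cloud console
youtube = build("youtube", "v3", developerKey=API_KEY)

# Step 1: seed video IDs from a keyword search. Reaching 256 results
# would require paging with pageToken; one page of 50 is shown here.
search = youtube.search().list(
    q="crisis actor",
    part="id",
    type="video",
    maxResults=50,
).execute()
seed_ids = [item["id"]["videoId"] for item in search["items"]]

# Step 2: for each seed, fetch the related videos and record the edges,
# building the kind of recommendation network the research describes.
related = {}
for vid in seed_ids:
    resp = youtube.search().list(
        relatedToVideoId=vid,
        part="snippet",
        type="video",
        maxResults=25,
    ).execute()
    related[vid] = [item["id"]["videoId"] for item in resp["items"]]
```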
Some of the videos uncovered in Albright’s search pushed the crisis-actor conspiracy or led into popular theories about other subjects like 9/11, the JFK assassination, the moon landing, Pizzagate, the murder of JonBenét Ramsey, last year’s Las Vegas shooting, or celebrity conspiracies. Video titles included “Michael Jackson Seen At His own Memorial” and “Retired Expert Pilot John Lear – No Planes Hit the Towers on 9/11,” based on a Quartz review of the data, which Albright published alongside his Medium post.
Not all the videos Albright found promoted or tied back to conspiracies. Some were from mainstream news organizations like CNN and MSNBC, or from news and commentary programs like The Young Turks that fact-checked and debunked rumors and hoaxes. Others were from late-night or satirical shows like Jimmy Kimmel Live and The Daily Show with Trevor Noah, including unrelated segments such as a bit of Eminem teaching Jimmy Kimmel how to rap.
But the broader body of results showed how easy it is to fall down a rabbit hole of conspiracies after watching one video. The combined view count for 50 of the top mass-shooting-related conspiracy videos was around 50 million, wrote Albright, research director at Columbia University’s Tow Center for Digital Journalism. He added:
Every time there’s a mass shooting or terror event, due to the subsequent backlash, this YouTube conspiracy genre grows in size and economic value. The search and recommendation algorithms will naturally ensure these videos are connected and thus have more reach.
In other words, due to the increasing depth of the content offerings and ongoing optimization of YouTube’s algorithms, it’s getting harder to counter these types of campaigns with real, factual information.
Albright’s findings did not detail how many of the conspiracy videos earned money from advertising on YouTube. He told BuzzFeed News he was working with researchers to learn more about how conspiracy videos are monetized on the platform.
YouTube, meanwhile, has been scrambling to keep up with the spread of rumors and false information on its platform without censoring content altogether. It recently removed a video posted to one of the accounts run by the popular far-right site InfoWars, The Alex Jones Channel, which claimed that student David Hogg, who survived the Parkland shooting, was an actor. A similar video found its way into YouTube’s Trending tab last week before YouTube removed it from the site for violating its harassment and bullying policies. The company also altered its algorithms last year to prioritize videos from credible news outlets in search results, particularly around breaking news.
The company did not immediately return Quartz’s request for comment on the latest research.