Photos of Donald Trump getting arrested by police officers went viral this week, after the former US president announced it was likely he would be indicted by the Manhattan district attorney on Tuesday (March 21).
The problem? The photos were actually fakes created in an AI art generator by Eliot Higgins, founder of the investigative outlet Bellingcat, using simple prompts like “Donald Trump falling down while being arrested.”
According to Higgins, he was just experimenting with the generator’s capabilities and did not mean to mislead anyone into thinking Trump had actually been arrested. That did not stop the photos from being shared across the web, however, where they were viewed over 5 million times.
Now, Trump himself has gotten involved, re-sharing an artificially created image of himself solemnly praying, with a caption demanding that Trump supporters “pray for this man.” While the photo is not identified as AI-generated, it bears many of the tell-tale signs, like a finger missing from Trump’s right hand and poorly proportioned figures in the background.
An earlier post on Twitter with the same image shows thousands of replies from Trump’s base, apparently believing it was an actual photo of the former president praying, with comments reading “love this picture” and “that’s powerful.”
Higgins didn’t think the photos posted in his Twitter thread would fool anyone. With irregularities like extra limbs and out-of-proportion bodies, as well as increasingly ridiculous situations (in one, Trump is cleaning a jail bathroom in an orange jumpsuit), the photos are clearly fake. But they still went viral, along the way likely confusing a lot of people, at least at first glance.
“I had assumed that people would think a three-legged Trump was out of the ordinary, but whether an average person would notice that I couldn’t say,” Higgins told Quartz via email.
Higgins said it’s doubtful that photos like the ones he posted would ever be used in a courtroom, or even by a news outlet, as they would not pass most authentication processes. However, he concedes they are likely to be used to sow confusion in fast-paced, unregulated corners of the internet.
“There are images as evidence and images as information, or in the case of AI-generated images, images as disinformation,” Higgins said. “If anything, the thread I posted proves how quickly images that appeal to individuals’ interests and biases can become viral, and fact-checking is something that takes a lot more time than a retweet.”
Midjourney, the company that created the AI art generator, banned Higgins from using the service. It did so without explanation, Higgins said, but presumably considered his photos a violation of company policies. The decision raises the question: Who is ultimately responsible for the consequences of fake images like these?
As AI technology rapidly outpaces regulation, culpability for fake images is still a gray area and largely left up to the discretion of private companies.
But US senator Mark Warner, the Virginia Democrat who chairs the Senate Intelligence Committee, wants companies that make photo generators—or other tools for creating false information—to know that they could be held responsible for their content. Developers “should already be on notice: if your product directly enables harms that are reasonably foreseeable, you can be held potentially liable,” Warner told the Washington Post after the image went viral.
Meta, for its part, said it had labeled Higgins’ images as “misleading” on Facebook and Instagram. But fake photos of Trump in an orange jumpsuit remained easily findable by searching under the hashtag “#trumparrested.”
Meanwhile, the series of photos posted by Higgins—clearly marked as false—remains up on Twitter.