Photoshop is the first program that Adobe has “deeply integrated” with AI. Six weeks ago, the software giant unveiled Firefly, a set of generative AI models designed for its Creative Cloud suite.


These developments come at a time when fears are running high about the ability to spot manipulated images. On May 22, a fake photo of an explosion near the Pentagon circulated on social media, sowing confusion and even causing a momentary dip in the S&P 500 index.

Fake news, Twitter, and market jitters

The fake Pentagon image was shared by several verified Twitter accounts, including one called @BloombergFeed, which isn’t associated with Bloomberg News (the account has since been suspended). Wall Street news account @zerohedge, which has more than 1.6 million followers, tweeted and deleted the image, as did Russian state media outlet RT.


The image was debunked within an hour, with the Arlington County Fire Department tweeting that no explosion or incident had taken place. But netizens and investors had already been spooked.


The rapid spread of disinformation on Twitter, which scrapped its old verification system in March, showed the potential for fake images on the platform to wreak havoc.

Adobe is promoting Content Credentials as a solution to fake images

In November 2019, Adobe announced it was partnering with Twitter and the New York Times Co. to create the Content Authenticity Initiative. Its aim: “developing an industry standard for digital content attribution.”


Since then, Adobe has developed so-called Content Credentials, which it describes as “nutrition labels” attached to images. Each tag is intended to contain a tamper-proof record of an image’s provenance and history of changes. The feature is still in development.
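The core idea behind such a tamper-evident record can be illustrated with a toy sketch. This is a hypothetical illustration, not Adobe's actual implementation: real Content Credentials use cryptographic signatures and manifests embedded in the file, and all names below are invented for the example. Here, a record simply pairs a hash of the image bytes with a list of edits, so any later change to the pixels breaks the match:

```python
import hashlib

def make_credential(image_bytes: bytes, tool: str, actions: list) -> dict:
    """Create a minimal provenance record ("nutrition label") for an image."""
    return {
        "content_hash": hashlib.sha256(image_bytes).hexdigest(),
        "tool": tool,
        "actions": actions,  # recorded history of edits
    }

def verify_credential(image_bytes: bytes, credential: dict) -> bool:
    """Check that the image still matches the hash recorded in its credential."""
    return hashlib.sha256(image_bytes).hexdigest() == credential["content_hash"]

original = b"\x89PNG...image data..."
cred = make_credential(original, tool="Photoshop", actions=["crop", "generative fill"])

print(verify_credential(original, cred))         # True: image matches the record
print(verify_credential(original + b"x", cred))  # False: image was altered afterward
```

A hash alone only detects changes; the production system additionally signs the record so that a forger cannot simply regenerate a matching label for an altered image.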

