Adobe Photoshop can now spot images that have been “photoshopped”
The advent of digital image editing has made it easy for people to create and edit images. This has unfortunately led to the proliferation of “fake” images that have been edited, or “photoshopped,” on the internet.
Now Adobe Photoshop, the same program that helped fuel this digital image editing revolution, is getting a tool to help identify edited and faked images.
As part of its open source Content Authenticity Initiative (CAI), Adobe has unveiled a new attribution tool for Photoshop that will help consumers better understand the authenticity of images while also giving credit to their creators.
According to Adobe, the attribution tool will create a “tamper evident” paper trail for an image. This will allow users to more easily identify authentic versus edited or deepfaked images and even see how they were created.
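Adobe has not published the mechanism in detail, but a tamper-evident record generally works by binding a cryptographic hash of the content to its attribution data, so any later change to the pixels invalidates the record. A minimal illustrative sketch of that idea (not Adobe's actual implementation; the record format here is invented for demonstration):

```python
import hashlib

def make_record(image_bytes: bytes, credits: list[str]) -> dict:
    # Bind the image's SHA-256 digest to its attribution data.
    return {
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "credits": credits,
    }

def verify_record(image_bytes: bytes, record: dict) -> bool:
    # Any edit to the image changes the digest, exposing tampering.
    return hashlib.sha256(image_bytes).hexdigest() == record["sha256"]

original = b"...image data..."
record = make_record(original, ["photographer", "compositor"])
print(verify_record(original, record))            # True: image untouched
print(verify_record(original + b"edit", record))  # False: image altered
```

A production system would also cryptographically sign the record so the metadata itself cannot be silently rewritten, which is what makes the trail "tamper evident" rather than merely informational.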
Adobe first launched CAI with Twitter and The New York Times after numerous instances of fake and altered images proliferated online. Since then, the company has brought on additional partners, including Microsoft, Qualcomm and the BBC, to build the prototype tool.
In a recent video, Adobe demonstrated how the tool works. With attribution enabled, Photoshop automatically tagged the edited photo with credits for the original photographer and for the creator who produced the composite, and even recorded the specific editing activities used, such as AI assistance for key imported assets and transformations.
Adobe noted, however, that the attribution tool can be turned off; it only tracks content when creators opt in.
As expected, attribution also carries over to images exported to Behance, Adobe’s own creative showcase platform, which displays the same information seen in Photoshop. People looking for more detail can get a full report on Adobe’s new website, which even offers a split-screen comparison of the original stock photo and the edited image.
Right now Adobe’s new tool only works for images. The company and its partners, however, do plan to eventually expand it into other types of media, including video.
The latter is the more interesting proposition, as it could help weed out deepfakes. These AI-generated videos have become even more problematic than ‘shopped images: they can make a person appear to do or say things they never did, convincing viewers of things that aren’t true.
Before Adobe can work on that, however, it needs to finish the image attribution tool. It is currently in testing and will become available to select customers in Photoshop and Behance in the coming weeks.
Even then, it will take time for the tool to gain the widespread adoption among artists, publishers and even rivals to Photoshop that it needs to become truly useful. That said, Adobe seems confident in its technology, calling its launch a “huge leap forward.” Whether that proves true, only time will tell.