- Fake images of Donald Trump being arrested and of Pope Francis in a puffer coat recently fooled the internet.
- Images generated by AI programs such as Midjourney, DALL-E 2, and Stable Diffusion are proliferating.
- An AI expert shared four tips for identifying deepfakes, including reverse image search and fact-checking.
Recently, deepfake images of Donald Trump and Pope Francis generated by artificial intelligence have gone viral online.
One viral image of the Pope in a stylish white puffer coat and bejeweled cross was actually created by an AI program called Midjourney, which David Holz founded last year.
The program, which creates images from user-provided text descriptions, has been used to make misleading images of famous people, including fake photos of Donald Trump being arrested.
Midjourney has since suspended free trials “due to a combination of exceptional demand and trial abuse,” Holz said.
But that doesn’t mean the end of fake images, according to Henry Ajder, an AI expert and presenter who sits on Meta’s Reality Labs European Advisory Board. Tools such as OpenAI’s DALL-E 2 and Stable Diffusion offer similar capabilities.
He said that the kind of realistic fakery we now see every day was once possible only in Hollywood studios; today, ordinary “people have the power of a Hollywood studio.”
He warned that the results of deepfake images ranged from fake news about politicians to non-consensual pornographic images.
Most recently, a face-swapping app called Facemega was used to promote sexually suggestive ads using Emma Watson’s face.
However, Ajder said it’s not just the sensational fakes that people need to worry about. Subtler ones, like the Pope Francis image, can “slowly erode our trust in visual media, making the truth harder to navigate.”
He and another expert provided four tips to help distinguish AI-generated images from the real thing.
Some AI-generated images have a “plastic” appearance
One tell-tale sign that an image was made with Midjourney is a “plasticky” look, though the platform may address this as it improves.
Ajder says Midjourney was developed with artists in mind, which gives its output a distinct aesthetic compared with other AI platforms, so keep that in mind when judging images.
Beware of aesthetic contradictions
Ajder pointed out that AI programs generally struggle with semantic consistency in details such as lighting, shapes, and subtle features.
Examples include checking whether the lighting on a person matches the rest of the image, whether someone’s head is slightly too big, or whether eyebrows and bone structure look exaggerated.
Another giveaway is an image of someone smiling with their bottom teeth showing, since people usually smile with their top teeth, not their bottom teeth.
Not every fake image has these signs, but they are useful pointers.
Alexey Khitrov, founder of biometric security firm ID R&D, said the image of Pope Francis contains “completely unnatural” artifacts and some “physically impossible” features.
For example, the crucifix the Pope appears to be wearing has its chain attached on only one side.
Other errors included the odd shape of his ear and a mismatch between his glasses and the shadow they cast on his face.
Context is key
Aesthetics alone are not always sufficient to identify deepfakes, especially as AI tools start to become more sophisticated.
Khitrov recommends asking questions about suspicious images and searching for them the way you would fact-check any other information you receive.
Ajder agreed that context is important and it’s worth trying to find “authoritative sources.”
He noted that if something looks outrageous or sensational, there is a good chance something is wrong, and that tracking down authoritative sources is the way to go.
Try a reverse image search
If all else fails, Ajder suggested using a reverse image search tool to find the context of an image.
He said a reverse image search for the images of Trump being arrested could lead you to the news websites that shared [the image] in their articles, “like a mind map” spreading out from that image.
For reverse image searches, Ajder recommended Google Lens or Yandex’s visual search.
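For readers comfortable with a little scripting, the lookup step can be automated. Below is a minimal Python sketch that builds reverse-image-search links for a publicly hosted image. Note that the `lens.google.com/uploadbyurl` and Yandex `imageview` URL patterns are widely used conventions rather than official, documented APIs, so the services may change them at any time.

```python
from urllib.parse import quote


def reverse_image_search_urls(image_url: str) -> dict:
    """Build reverse-image-search links for a publicly hosted image.

    These query-string patterns are common conventions, not official
    APIs; they may change without notice.
    """
    # Percent-encode the whole URL so it survives as a query parameter.
    encoded = quote(image_url, safe="")
    return {
        # Google Lens lookup by image URL
        "google": f"https://lens.google.com/uploadbyurl?url={encoded}",
        # Yandex visual search lookup by image URL
        "yandex": f"https://yandex.com/images/search?rpt=imageview&url={encoded}",
    }


if __name__ == "__main__":
    # Hypothetical example image; substitute the URL of the image you
    # want to investigate.
    urls = reverse_image_search_urls("https://example.com/suspicious-photo.jpg")
    for engine, url in urls.items():
        print(engine, url)
```

Opening either link in a browser shows which sites have published the image, which is exactly the “mind map” of sources Ajder describes.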