Big move by YouTube! With AI now detecting celebrity deepfakes, creators of fake videos are in for trouble.

(YouTube’s new feature protects against deepfake videos)

In a major step, video platform YouTube has expanded its similarity detection technology to the entertainment industry. The goal is to identify AI-generated fake videos, especially deepfakes, and protect the identities of celebrities.

What is this similarity detection technology?

This technology uses AI to scan uploaded videos for faces that match a registered person’s likeness. Simply put, the system can detect when a celebrity’s face is being misused. It works much like YouTube’s long-standing Content ID system for identifying copyrighted content; the difference is that instead of matching against copyrighted works, it matches against a person’s face and identity.
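YouTube has not published how the system works internally, but likeness detection is commonly described as comparing face embeddings (numeric vectors produced by a face-recognition model) against reference embeddings of enrolled people. The Python sketch below is a hypothetical illustration of that general idea only; the threshold value, the cosine_similarity and flag_matches helpers, and the registry structure are all invented for this example and do not reflect YouTube’s actual implementation.

```python
import numpy as np

# Hypothetical sketch: likeness matching via cosine similarity of face
# embeddings. YouTube has not disclosed its method; everything below
# (threshold, names, data shapes) is an assumption for illustration.

SIMILARITY_THRESHOLD = 0.85  # assumed cutoff for treating faces as a match


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def flag_matches(frame_embeddings, registry):
    """Compare face embeddings extracted from video frames against a
    registry of {person_name: reference_embedding} entries; return the
    names whose likeness appears in the video."""
    flagged = set()
    for emb in frame_embeddings:
        for name, reference in registry.items():
            if cosine_similarity(emb, reference) >= SIMILARITY_THRESHOLD:
                flagged.add(name)
    return flagged


# Toy usage: random vectors stand in for real face embeddings.
rng = np.random.default_rng(seed=0)
reference = rng.normal(size=128)
registry = {"Example Celebrity": reference}
frames = [
    reference + rng.normal(scale=0.1, size=128),  # near-duplicate face
    rng.normal(size=128),                         # unrelated face
]
print(flag_matches(frames, registry))  # {'Example Celebrity'}
```

A production system would of course operate at vastly larger scale, likely using approximate nearest-neighbor indexes rather than a pairwise loop, but the matching principle is the same.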

Why is this feature important?

Fake advertisements and scams using celebrity faces are already widespread, and unauthorized use of celebrities’ identities has become a major problem. The new technology aims to curb such incidents so that no one can exploit a celebrity’s image.

Who benefits?

The feature was initially tested with a limited number of creators, but has now been expanded to include talent agencies, management companies, and their affiliated artists. Interestingly, celebrities don’t need to have their own YouTube channel to use it.

What choices do affected people have?

If a video is found to misuse someone’s likeness, the affected person or organization has several options:

  • request removal of the video,
  • file a copyright claim, or
  • take no action.

However, YouTube has clarified that not all flagged videos will be removed, since content such as parody and satire is allowed on the platform.

What will change next?

Currently, the technology supports only visual (facial) recognition; audio (voice) recognition is planned for the future. YouTube is also working to make this kind of protection part of the law: the company supports the US NO FAKES Act, which aims to regulate the misuse of a person’s voice or likeness through AI.


