Cornell researchers work on a light-coding tactic to detect AI-manipulated videos

AI Video & Visuals


The coded light source carries a “secret code” that can be used to check the reliability of a video and to see whether its visuals have been manipulated [File]
| Photo credit: Reuters

Cornell researchers have proposed a way to help forensic experts distinguish AI-manipulated videos from real ones: fitting major events with specially designed light sources that reveal when a video has been altered.

The paper, titled “Noise-Coded Illumination for Forensic and Photometric Video Analysis,” describes how to secretly embed codes in the light sources illuminating a scene through small, noise-like variations in brightness. Essentially, the protection travels with the light source itself, rather than having to be applied individually to every video shot at the event to keep those clips from being altered.
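The encoding idea can be illustrated with a minimal sketch. This is not the paper's implementation; all names and parameters (code length, flicker amplitude, the ±1 code) are illustrative assumptions about how a light's brightness might carry a pseudo-random signature that viewers do not notice:

```python
import numpy as np

# Hypothetical sketch: modulate a light's brightness with a small
# pseudo-random "noise code" known only to the verifier.
# Parameters below are illustrative, not from the paper.
rng = np.random.default_rng(42)            # seed acts as the light's secret key
n_frames = 240                             # ~8 s of video at 30 fps
code = rng.choice([-1.0, 1.0], n_frames)   # pseudo-random +/-1 sequence

base_brightness = 1.0
amplitude = 0.02                           # ~2% flicker, too subtle to see
brightness = base_brightness + amplitude * code

# The light drives its output with `brightness`; any camera in the room
# then records a scene whose illumination carries this hidden signature.
print(brightness.min(), brightness.max())  # stays within 0.98..1.02
```

The key design point the article describes: because the signal lives in the lighting, every video shot under it is covered automatically, with no per-video processing.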

These coded light sources carry “secret codes” that can be used to check the reliability of a video and to see whether its visuals have been manipulated.

Computer science graduate student Peter Michael led the work on noise-coded illumination (NCI).

“Our approach effectively adds a temporal watermark to video recorded under coded lighting. However, rather than encoding a specific message, this watermark encodes an image of the unmanipulated scene as it would appear lit only by the coded lighting,” the paper says.

This tactic lets forensic experts compare a suspect video against an easily recoverable version of the original, instead of manually hunting for source material.

“When adversaries manipulate video captured under coded lighting, they unknowingly alter the code images embedded in it. By knowing the codes used in each light source, these code images can be recovered and examined.”
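The recovery step described above can be sketched with a toy model. This is an illustrative assumption, not the paper's algorithm: a noise-free scene lit by one coded light plus ambient light, where correlating each pixel's time series with the known zero-mean code isolates the code image (the scene as lit only by that light):

```python
import numpy as np

# Hypothetical toy model of code-image recovery; all names and
# parameters are illustrative, not from the paper.
rng = np.random.default_rng(0)
n_frames, h, w = 240, 4, 4

# Balanced +/-1 code (sums exactly to zero), shuffled pseudo-randomly.
code = np.tile([1.0, -1.0], n_frames // 2)
rng.shuffle(code)

scene = rng.uniform(0.2, 1.0, (h, w))   # true code image (per-pixel gain)
ambient = 0.5                           # uncoded background light
amplitude = 0.02                        # flicker depth of the coded light

# Toy video: each frame = ambient + scene modulated by the code.
video = ambient + scene[None] * (1.0 + amplitude * code[:, None, None])

# Correlate frames with the known code; terms uncorrelated with the
# code average out, leaving only the coded contribution.
recovered = np.tensordot(code, video, axes=(0, 0)) / (amplitude * n_frames)

# In this noise-free model, `recovered` matches `scene` exactly; in a
# tampered video, the edited region would no longer match.
assert np.allclose(recovered, scene)
```

Manipulating frames would corrupt `video` in the edited region, so the recovered code image would disagree with the rest of the scene there, which is the inconsistency a forensic examiner would look for.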

The paper notes that such an approach could be useful at public events and interviews, to prevent clips of these important meetings from being altered.

However, the venture's success depends on widespread adoption of the specially designed lights.

As AI-generated videos and AI-morphed clips become more realistic, experts are looking for better ways to verify original content. The need of the hour, however, is a watermarking method that even malicious attackers cannot remove from the videos that carry it.



