An artificial intelligence (AI)-generated video purporting to show a young Indian man from Haryana fighting for the Russian army and desperately pleading for help from the Indian government is circulating online as a genuine video.
In the 15-second clip, a young man speaks in Hindi and appeals to people to spread the video as widely as possible so it can reach the government. The young man in the video claims that 10 of his companions have already died and that he is the only survivor.
BOOM has found that the clip is not genuine footage. Multiple AI detection tests confirmed that the video was created using artificial intelligence.
Claim
The video was posted on X by the handle @Baba_Thoka with the following caption: “With the cost of studying in India becoming so expensive, young people from Haryana rushed to Russia in search of a future, but after 10 days of ‘training’ they were thrown into a combat zone. Now some families have received partial bodies, some sons are missing, and one survivor is seeking help in New Delhi.”
Click here to view an archive of the post.
The same video is also circulating with similar captions in Hindi. Click here to see the post and here to see its archive.
What we found: Viral video is AI-generated
AI detection tools flag manipulation: We first tried to trace the video but could not find any reliable source confirming that it showed a real appeal from an Indian youth in a Russian war zone. This prompted us to run the video through the AI detection tool Hive Moderation, which concluded with 99% confidence that the footage was likely AI-generated or deepfake content.
We also tested the video with DeepFake-O-Meter, an AI video detection tool developed by the University at Buffalo. We ran the video through multiple detection models on the tool, several of which indicated with high confidence that it was AI-generated and contained synthetic visuals.
A test using the AI audio detection tool Hiya likewise concluded that the audio in the video was likely a deepfake.
