Pro-Russia Disinformation

According to a new study released last week, the campaign is driving a “content explosion” that focuses on inflaming existing tensions around global elections, Ukraine, and immigration, using consumer-grade artificial intelligence tools.
Known as Operation Overload and Matryoshka (other researchers have also tied it to Storm-1679), the campaign has been running since 2023 and has been attributed to the Russian government by multiple groups, including Microsoft and the Institute for Strategic Dialogue. The campaign spreads false narratives by impersonating media outlets with the apparent aim of sowing division in democratic societies. The campaign targets audiences around the world, including the US, but its main target has been Ukraine. Hundreds of AI-manipulated videos from the campaign have sought to promote pro-Russia narratives.
The report outlines how the amount of content produced by the people running the campaign increased dramatically between September 2024 and May 2025, and how that content is receiving millions of views worldwide.
In their report, the researchers identified 230 unique pieces of content promoted by the campaign between July 2023 and June 2024, including photos, videos, QR codes, and fake websites. Over the past eight months, however, Operation Overload has churned out a total of 587 unique pieces of content, the majority of them created with the help of AI tools, the researchers said.
Researchers said the spike in content was driven by consumer-grade AI tools that are available online for free. This easy access helped fuel the campaign's tactic of “content amalgamation,” in which AI tools allowed its operatives to create multiple pieces of content pushing the same story.
“This illustrates a shift toward more scalable, multilingual, and increasingly sophisticated propaganda tactics,” researchers at Reset Tech, a London-based nonprofit that tracks disinformation campaigns, wrote in the report. “The campaign has significantly ramped up the production of new content over the past eight months, signaling a shift toward faster, more scalable methods of content creation.”
The researchers were also surprised by the variety of tools and types of content the campaign was pursuing. “What surprised me was the diversity of the content, the different types of content that they started using,” Aleksandra Atanasova, lead open-source intelligence researcher at Reset Tech, tells Wired. “They seem to have diversified their palette, as if they are trying to catch as many different angles of those stories as possible. They're layering up different types of content, one after another.”
Atanasova added that the campaign did not appear to be using any custom AI tools to achieve its goals, instead relying on AI-powered generators, such as voice generators, that are accessible to anyone.
While it was difficult to identify all the tools the campaign operatives were using, the researchers were able to narrow their focus down to one tool in particular: Flux AI.
Flux AI is a text-to-image generator developed by Black Forest Labs, a Germany-based company founded by former employees of Stability AI. Using the SightEngine image-analysis tool, the researchers found a 99 percent likelihood that a number of the fake images shared by the Overload campaign, which claimed to show Muslim immigrants rioting and starting fires in Berlin and Paris, were created with Flux AI.
