- By Jack Goodman and Maria Korenuk
- BBC Global Disinformation Team
Footage filmed by Ukrainian Ihor Zakharenko was quickly removed
The BBC has found that evidence of potential human rights violations can be lost after being removed by technology companies.
Platforms often use artificial intelligence to remove graphic videos, but footage that could aid prosecution may be removed without being archived.
Meta and YouTube say they aim to balance their duty to bear witness against their obligation to protect users from harmful content.
But Alan Rusbridger, a member of Meta’s Oversight Board, said the industry had been “overly cautious” in its moderation.
The platforms say they make exceptions for graphic material when it is in the public interest, but when the BBC attempted to upload footage documenting attacks on civilians in Ukraine, it was quickly removed.
Artificial intelligence (AI) can remove harmful and illegal content at scale. But when it comes to moderating the violent imagery of war, machines lack the nuance needed to identify human rights violations.
Human rights groups say there is an urgent need for social media companies to prevent this information from disappearing.
“You can see why they have developed and trained their machines so that, the moment they see something that looks difficult or traumatic, they take it down,” Rusbridger told the BBC. The Meta Oversight Board, of which he is a member, was set up by Mark Zuckerberg and acts as a sort of independent “supreme court” for the company that owns Facebook and Instagram.
“I think the next question for them is how do we develop machinery, whether human or AI, to make more reasonable decisions,” added Rusbridger, a former editor-in-chief of The Guardian.
Beth Van Schaack, US Ambassador for Global Criminal Justice, said no one denies tech companies’ right to police content: “I think the concern comes when that information suddenly disappears.”
Former travel journalist Ihor Zakharenko encountered this first-hand in Ukraine. He has been documenting attacks on civilians since the Russian invasion.
The BBC met him on the outskirts of Kyiv, where a year earlier men, women and children had been shot dead by Russian forces as they tried to flee occupation.
He filmed at least 17 bodies and burnt-out cars.
Video of Russian attack on civilians deleted within minutes
He posted the videos online so the world could see what had happened, hoping to refute the Kremlin’s narrative. But as soon as he uploaded them to Facebook and Instagram, they were taken down.
“The Russians themselves said it was fake, [that] they hadn’t touched civilians, they fought only the Ukrainian army,” Ihor said.
We used dummy accounts to upload Ihor’s footage to Instagram and YouTube.
Instagram deleted three of the four videos within a minute.
YouTube initially applied age restrictions to the same three videos, but removed all of them ten minutes later.
We tried again, but the uploads failed entirely. An appeal to restore the videos on the grounds that they contained evidence of war crimes was rejected.
The atrocities of war are documented on social media, and this material can serve as evidence in war crimes prosecutions. But the BBC has found that large social media companies remove such content, and has spoken to people affected by violent conflict.
YouTube and Meta say that under their public-interest exemptions for graphic war footage, content that would normally be removed can remain online, restricted to adult viewers. But our experiment with Ihor’s videos suggests otherwise.
Meta says it responds to “valid legal requests from law enforcement agencies around the world” and that, in line with its legal and privacy obligations, it continues to explore further ways to support international accountability processes.
YouTube says that while it has exemptions for graphic content in the public interest, the platform is not an archive. “Human rights organisations, activists, human rights defenders, researchers, citizen journalists and others documenting human rights abuses (or other potential crimes) should observe best practices for securing and preserving their content.”
The BBC also spoke to Imad, who ran a pharmacy in Aleppo, Syria, until 2013, when a Syrian government barrel bomb landed nearby.
He recalled how the explosion filled the room with dust and smoke. Hearing cries for help, he went outside to the marketplace, where he saw hands, legs and bodies covered in blood.
A local TV crew filmed these scenes. The footage was posted on YouTube and Facebook, but has since been removed.
Syrian journalists told the BBC that, amid the chaos of the conflict, their own copies of the original footage had also been destroyed by bombing.
Years later, when Imad was applying for asylum in the EU, he was asked to provide documents proving he was on the scene.
“I was sure I had my pharmacy on camera. But when I went online, the videos had been deleted.”
In the wake of such incidents, Mnemonic, a Berlin-based human rights organisation, set out to archive footage before it disappears.
Mnemonic has developed a tool that automatically downloads and stores evidence of human rights abuses, first in Syria and now in Yemen, Sudan and Ukraine.
It has saved more than 700,000 images from war zones before they were removed from social media, including three videos showing the attack near Imad’s pharmacy.
Each image can hold a vital clue about what really happened on the battlefield: a location, a date, a potential perpetrator.
But organizations like Mnemonic cannot cover all conflict areas around the world.
It is incredibly difficult to prove that a war crime has been committed, which is why as many sources as possible are needed.
“Verification is like solving a puzzle: combining seemingly unrelated pieces of information to build a bigger picture of what happened,” says BBC Verify’s Olga Robinson.
The task of archiving open-source material, available to almost anyone on social media, often falls to people with a personal mission: those with relatives caught up in violent conflict.
Rahwa says it is her “duty” to archive open-source material on the conflict in Ethiopia’s Tigray region
Rahwa lives in the United States and has family in Ethiopia’s Tigray region, where violence has raged in recent years and the authorities tightly control the flow of information.
But social media means there are visual records of conflicts that might otherwise remain hidden from the outside world.
“It was our duty,” says Rahwa. “You spend hours researching, trying to verify each piece of content that trickles in with every open-source intelligence tool available to you, all while not knowing whether the people in it are your own family.”
Human rights campaigners say there is an urgent need for a formal system to collect and safely store deleted content, including verifying it and preserving its metadata to prove it has not been tampered with.
Van Schaack, the US Ambassador for Global Criminal Justice, said: “We need to create mechanisms that can preserve information for future accountability. Social media platforms should be proactive in making arrangements with accountability bodies around the world.”
