AI - War crimes evidence erased by social media platforms

Platforms remove graphic videos, often using artificial intelligence - but footage that may help prosecutions can be taken down without being archived.

Meta and YouTube say they aim to balance their duties to bear witness and protect users from harmful content.

But Alan Rusbridger, who sits on Meta’s Oversight Board, says the industry has been “overcautious” in its moderation.

The platforms say they do have exemptions for graphic material when it is in the public interest - but when the BBC attempted to upload footage documenting attacks on civilians in Ukraine, it was swiftly deleted.

Artificial intelligence (AI) can remove harmful and illegal content at scale. When it comes to moderating violent images from wars, however, machines lack the nuance to identify human rights violations.
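
To make that claim concrete, here is a minimal, purely illustrative sketch of a score-threshold moderation rule. The Upload fields, the classifier score, and the REMOVAL_THRESHOLD value are hypothetical and not taken from any platform's real pipeline; the point is only that a decision driven by a visual-violence score alone cannot see the evidentiary or public-interest context of the footage.

```python
# Illustrative sketch only: a toy auto-moderation rule, not any platform's real system.
# The classifier score, threshold, and field names below are hypothetical.

from dataclasses import dataclass

@dataclass
class Upload:
    title: str
    graphic_violence_score: float  # assumed output of an image/video classifier, 0.0-1.0
    public_interest_flag: bool     # context a human reviewer might add; the model cannot infer it

REMOVAL_THRESHOLD = 0.8  # hypothetical cut-off for automatic takedown

def auto_moderate(upload: Upload) -> str:
    """Return the action a purely score-based system would take.

    The decision depends only on the visual score, so footage documenting
    atrocities is treated the same as gratuitous gore unless a human applies
    a public-interest exemption.
    """
    if upload.graphic_violence_score >= REMOVAL_THRESHOLD:
        return "remove"          # no nuance: evidentiary value is invisible to the model
    if upload.graphic_violence_score >= 0.5:
        return "age-restrict"
    return "allow"

if __name__ == "__main__":
    footage = Upload("Civilian casualties, Kyiv suburb", 0.93, public_interest_flag=True)
    print(auto_moderate(footage))  # -> "remove", despite the public-interest context
```

Under these assumptions, the only way the public-interest flag matters is if a human review step is added after the automatic decision, which is exactly the gap the article describes.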

Human rights groups say there is an urgent need for social media companies to prevent this information from vanishing.

No-one would deny tech firms’ right to police content, says US Ambassador-at-Large for Global Criminal Justice Beth Van Schaack: “I think where the concern happens is when that information suddenly disappears.”

Ihor Zakharenko, a former travel journalist, encountered this in Ukraine. Since the Russian invasion he has been documenting attacks on civilians.

The BBC met him in a suburb of Kyiv where one year ago men, women and children had been shot dead by Russian troops while trying to flee occupation.

He filmed the bodies - at least 17 of them - and burnt-out cars.

Videos documenting Russian attacks on civilians were taken down within minutes

He wanted to post the videos online so the world would see what happened and to counter the Kremlin’s narrative. But when he uploaded them to Facebook and Instagram they were swiftly taken down.

“Russians themselves were saying those were fakes, [that] they didn’t touch civilians, they fought only with the Ukrainian army,” Ihor said.

We uploaded Ihor’s footage on to Instagram and YouTube using dummy accounts.

Instagram took down three of the four videos within a minute.

At first, YouTube applied age restrictions to the same three, but 10 minutes later removed them all.

We tried again - but they failed to upload altogether. An appeal to restore the videos on the basis that they included evidence of war crimes was rejected.

A(nother) contentious issue … :thinking:


This might have less to do with AI than with policy: all the platforms have an appeals process, so content removed by AI in error can be reinstated.

Going by the article, though, reinstatement is not possible - the content has been erased.

Re-uploading also fails, and appeals are rejected.
