Facebook is Deleting Evidence of War Crimes

AngeryPenguin
Badges: 17
#1
Report Thread starter 2 weeks ago
Algorithms that take down “terrorist” videos could hamstring efforts to bring human-rights abusers to justice.

In the late summer of 2017, Khatib and his colleagues at the Syrian Archive were systematically building a case against the regime of Bashar al-Assad in much the same way ICC investigators pursued Werfalli. They had amassed scores and scores of citizens’ accounts, including video and photos that purportedly showed Assad was targeting hospitals and medical clinics in bombing campaigns. “We were collecting, archiving, and geolocating evidence, doing all sorts of verification for the case,” Khatib recalled. “Then one day we noticed that all the videos that we had been going through, all of a sudden, all of them were gone.”

It wasn’t a sophisticated hack attack by pro-Assad forces that wiped out their work. It was the ruthlessly efficient work of machine-learning algorithms deployed by social networks, particularly YouTube and Facebook.

Tech giants in Silicon Valley have taken on the role of prosecutors, judges, and juries in decisions about which words and images should be banished from the public’s sight.

Some of what governments ask of tech giants, such as suppressing violent content, counters other goals, such as bringing warlords and dictators to justice. Balancing these priorities is hard enough when humans are making judgments in accordance with established legal norms. In contrast, tech giants operate largely in the dark. They are governed by opaque terms-of-service policies that, more and more, are enforced by artificial-intelligence tools developed in-house with little to no input from the public. “We don’t even know what goes into the algorithms, what kind of in-built biases and structures there are,” Ní Aoláin said in an interview.

YouTube pulled 33 million videos off its network in 2018, roughly 90,000 a day; 73 percent were removed so fast that no community member ever saw them. From October 2017 to September 2018, Facebook removed 15 million pieces of content it deemed "terrorist propaganda", and a mere 0.5 percent of the purged material was first reported by users.

Human-rights advocates worry about the decisions tech giants and their algorithms will make under such outside pressure. “The danger is that governments will often get the balance wrong,” argued Ní Aoláin. “But actually we have the methods and means to challenge governments when they do so. But private entities? We don’t have the legal processes. These are private companies. And the legal basis upon which they regulate their relationships with their users, whether they’re in conflict zones or not, is determined by [the company’s] terms of service. It’s neither transparent nor fair. Your recourse is quite limited.”

https://www.theatlantic.com/ideas/ar...harder/588931/
Notoriety
Badges: 21
#2
Report 2 weeks ago
You are so dramatic.
username4454836
Badges: 20
#3
Report 1 week ago
Why didn't they download the so-called evidence? Surely that would have been the sensible move when collecting evidence.
Fullofsurprises
Badges: 20
#4
Report 1 week ago
Facebook has far, far too much power now and it is all unaccountable.

Zuckerberg and his inner circle at Facebook have become the global supremos of world information and they clearly intend not to change that happy situation without legal force being applied.

It's also a huge monopoly that, along with Google, has devastated local news publishing around the world and placed news gathering and distribution in the hands of fakers and charlatans. Its monopoly power is not being restrained, under the bogus pretence that the service is 'free'; of course it is not free to the advertisers, who receive users and user data bundled up. We humans are mere commodities to Facebook.
AngeryPenguin
Badges: 17
#5
Report Thread starter 1 week ago
(Original post by Notoriety)
You are so dramatic.
I didn't write the article; I merely summarised it.
Napp
Badges: 22
#6
Report 1 week ago
It's also apparently creating pages and content for terrorists and other dubious groups, go figure...
Retired_Messiah
Badges: 20
#7
Report 1 week ago
(Original post by Decahedron)
Why didn't they download the so-called evidence? Surely that would have been the sensible move when collecting evidence.
I don't think downloading video off Facebook is very easy to do. That said, you'd have thought they'd have considered that content in direct violation of the ToS might disappear at some point, and worked to account for that.
username4454836
Badges: 20
#8
Report 1 week ago
(Original post by Retired_Messiah)
I don't think downloading video off Facebook is very easy to do. That said, you'd have thought they'd have considered that content in direct violation of the ToS might disappear at some point, and worked to account for that.
It wouldn't be that hard to do video and audio capture.
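For anyone curious what that capture might look like, here's a minimal sketch in Python. It assumes the yt-dlp command-line tool is installed (it can fetch many public Facebook and YouTube videos; private or already-removed posts obviously won't work), and the manifest format is invented for illustration. The point is that evidence archiving also needs an integrity record, not just a copy of the file:

```python
import datetime
import hashlib
import json
import pathlib
import subprocess


def sha256_of(path: pathlib.Path) -> str:
    """Hash a file so its integrity can be verified later (chain of custody)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def archive(url: str, dest: pathlib.Path) -> list[dict]:
    """Download a public video with yt-dlp, then write a manifest recording
    source URL, file hash, and retrieval time for each saved file."""
    dest.mkdir(parents=True, exist_ok=True)
    # yt-dlp picks the filename from the video's ID and container format.
    subprocess.run(
        ["yt-dlp", "-o", str(dest / "%(id)s.%(ext)s"), url],
        check=True,
    )
    entries = [
        {
            "source_url": url,
            "file": f.name,
            "sha256": sha256_of(f),
            "retrieved_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        }
        for f in sorted(dest.iterdir())
        if f.name != "manifest.json"
    ]
    (dest / "manifest.json").write_text(json.dumps(entries, indent=2))
    return entries
```

Whether Facebook's servers keep a deleted post retrievable is a separate question, which is exactly why researchers would want local copies plus hashes before takedown.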