The Guardian Site Reviews Documents Used By Facebook Executives To Moderate Content
Monday, May 22, 2017
The Guardian news site in the United Kingdom (UK) published the findings of its review of "The Facebook Files" -- a collection of documents comprising the rules used by executives at the social site to moderate (e.g., review, approve, and delete) content posted by the site's members. Reporters at The Guardian reviewed:
"... more than 100 internal training manuals, spreadsheets and flowcharts that give unprecedented insight into the blueprints Facebook has used to moderate issues such as violence, hate speech, terrorism, pornography, racism and self-harm. There are even guidelines on match-fixing and cannibalism.
The Facebook Files give the first view of the codes and rules formulated by the site, which is under huge political pressure in Europe and the US. They illustrate difficulties faced by executives scrabbling to react to new challenges such as “revenge porn” – and the challenges for moderators, who say they are overwhelmed by the volume of work, which means they often have “just 10 seconds” to make a decision..."
The Guardian also summarized what it learned about Facebook's revenge porn rules for moderators.
Reportedly, Facebook moderators reviewed as many as 54,000 cases in a single month related to revenge porn and "sextortion." In January of 2017, the site disabled 14,000 accounts due to this form of sexual violence. Previously, these rules were not available publicly. Findings about other rules are available at The Guardian site.
Other key findings from The Guardian's document review:
"One document says Facebook reviews more than 6.5m reports a week relating to potentially fake accounts – known as FNRP (fake, not real person)... Many moderators are said to have concerns about the inconsistency and peculiar nature of some of the policies. Those on sexual content, for example, are said to be the most complex and confusing... Anyone with more than 100,000 followers on a social media platform is designated as a public figure – which denies them the full protections given to private individuals..."
The social site struggles with how to handle violent language:
"Facebook’s leaked policies on subjects including violent death, images of non-sexual physical child abuse and animal cruelty show how the site tries to navigate a minefield... In one of the leaked documents, Facebook acknowledges “people use violent language to express frustration online” and feel “safe to do so” on the site. It says: “They feel that the issue won’t come back to them and they feel indifferent towards the person they are making the threats about because of the lack of empathy created by communication via devices as opposed to face to face..."
Some industry watchers in Europe doubt that Facebook can do what it has set out to accomplish, argue that it lacks sufficient staff to effectively moderate content posted by almost 2 billion users, and say that Facebook management should be more transparent about its content moderation rules. Others believe that Facebook and other social sites should be heavily fined "for failing to remove extremist and hate-crime material."
To learn more, visit The Guardian site, which includes at least nine articles about its review of The Facebook Files.