Week in Review: April 27, 2018

by Dan Bateyko

Facebook releases its internal Community Standards enforcement guidelines

Facebook released its 27-page internal enforcement guidelines this week, the document Facebook moderators use to determine which content violates the company’s policies, reports The Guardian. Facebook also announced a new appeals process allowing users to contest flagged-content decisions, a move widely seen as a transparency effort following the Cambridge Analytica data misuse scandal.

A Wired article notes that the document “describes with often explicit levels of granularity how Facebook defines more than 20 different offenses, from harassment and graphic violence to false news and fake accounts.” According to an accompanying blog post, Facebook uses a “combination of artificial intelligence and reports from people to identify posts, pictures or other content that likely violates our Community Standards,” which are then reviewed by a Community Operations team. Users who contest a takedown can now appeal by requesting an additional review: the team will assess the takedown decision within 24 hours, and if it determines that a mistake was made, the post, photo, or video will be restored.
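Neither the guidelines nor the blog post goes into implementation detail, but the workflow Facebook describes — automated and user-generated flags feeding a human review queue, with an appeal triggering a second review within 24 hours — reads like a simple state machine. The sketch below is a hypothetical illustration of that flow in Python; the class, method, and state names are invented for this example and do not reflect Facebook’s actual systems.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Status(Enum):
    LIVE = auto()      # visible on the platform
    FLAGGED = auto()   # surfaced by AI or a user report
    REMOVED = auto()   # initial review upheld the flag
    APPEALED = auto()  # user requested additional review
    RESTORED = auto()  # second review found the removal mistaken

@dataclass
class ContentItem:
    content_id: str
    status: Status = Status.LIVE
    history: list = field(default_factory=list)

    def _move(self, new_status: Status, note: str) -> None:
        self.history.append((self.status, new_status, note))
        self.status = new_status

    def flag(self, source: str) -> None:
        # source is "ai" or "user_report", per the blog post's description
        if self.status is Status.LIVE:
            self._move(Status.FLAGGED, f"flagged by {source}")

    def review(self, violates_standards: bool) -> None:
        if self.status is Status.FLAGGED:
            self._move(Status.REMOVED if violates_standards else Status.LIVE,
                       "initial review")

    def appeal(self) -> None:
        if self.status is Status.REMOVED:
            self._move(Status.APPEALED, "user contested takedown")

    def re_review(self, mistake_was_made: bool) -> None:
        # Facebook says appeals are assessed within 24 hours
        if self.status is Status.APPEALED:
            self._move(Status.RESTORED if mistake_was_made else Status.REMOVED,
                       "appeal decision")

# Example: a post is flagged, removed, appealed, then restored.
post = ContentItem("photo-123")
post.flag("ai")
post.review(violates_standards=True)
post.appeal()
post.re_review(mistake_was_made=True)
print(post.status)  # Status.RESTORED
```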

Citizen Lab releases new report on Netsweeper Internet filtering

The Toronto-based Citizen Lab released a four-part report this week on the global proliferation of Netsweeper, an Internet filtering system manufactured by a Canadian company of the same name. The report found Netsweeper installations in 30 countries and presented 10 country case studies in which Netsweeper is used to filter political, LGBT, and news content. The findings suggest that Netsweeper may be contributing to adverse human rights impacts abroad, underscoring the need for corporate social responsibility.
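Identifying installations like these generally involves fetching test URLs and checking whether the response is the requested page or an injected block page. The sketch below illustrates that general idea only; the signature strings are placeholders invented for this example, not Citizen Lab’s actual fingerprints or methodology.

```python
import urllib.request

# Placeholder markers; real Netsweeper block-page fingerprints are
# documented in Citizen Lab's report and are not reproduced here.
BLOCK_PAGE_SIGNATURES = ["/blocked/notice", "This site has been blocked"]

def looks_filtered(url: str, timeout: float = 10.0) -> bool:
    """Fetch a test URL and scan the response body for block-page markers."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read(65536).decode("utf-8", errors="replace")
    except OSError:
        # Timeouts and resets can also indicate filtering, but are
        # ambiguous on their own, so treat them as inconclusive here.
        return False
    return any(sig in body for sig in BLOCK_PAGE_SIGNATURES)
```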

In its discussion of the report, Citizen Lab recommends export controls and greater transparency in licensing decisions, and advocates for Canada to take steps towards drafting statutes “tailored to harms and human rights violations caused by Canadian corporate practices.” Citizen Lab has previously discussed the use of Netsweeper in reports on Internet filtering in Bahrain, Pakistan, and Somalia.

YouTube releases its first Community Guidelines enforcement report

YouTube released its first transparency report on how it enforces its Community Guidelines to handle abuse and remove videos, reports TechCrunch. According to the report, over 8 million videos were removed in the last three months of 2017, the majority of them flagged by YouTube’s automated abuse-detection systems.

In an accompanying blog post, YouTube also announced a new Reporting History dashboard, which lets users track the status of videos they’ve flagged for review. According to the report, the majority of removed videos qualified as spam or adult content, and of the 6.7 million videos flagged by Google’s automated systems, 76% were removed before receiving a single view. YouTube also describes its heavy reliance on automated systems for detecting and removing extremist content: “As of December 2017, 98% of the videos we removed for violent extremism were identified by our machine learning algorithms.” YouTube released a video, “The Life of a Flag,” to further illustrate its content moderation process.
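Taken together, the report’s figures imply that automated flagging accounted for the bulk of removals. A quick back-of-the-envelope check, assuming the 6.7 million automated flags fall within the roughly 8 million total removals:

```python
total_removed = 8_000_000  # videos removed, Oct-Dec 2017 ("over 8 million")
auto_flagged  = 6_700_000  # removals first flagged by automated systems
pre_view_rate = 0.76       # share of auto-flagged removals with zero views

# ~84% of removals were machine-flagged; ~5.1 million videos were
# taken down before any viewer saw them.
print(f"automated share of removals: {auto_flagged / total_removed:.0%}")
print(f"removed before a single view: {auto_flagged * pre_view_rate:,.0f}")
```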