The opaque standards of Facebook

[Photo: Aboriginal women from the remote Central Australian community of Ampilatwatja performing at a public ceremony in 2010 to protest the Northern Territory intervention.]

Facebook recently caused an uproar when it suspended several accounts that had posted a photo of Aboriginal women in traditional dress. The photo, posted for International Women's Day, was classified by Facebook as sexually explicit.

The incident offers a glimpse into a system that affects every Facebook user, one that most only notice when something goes wrong, and one that operates with almost no transparency. The Facebook Community Standards are a set of guiding principles intended to outline how the company moderates posts and responds to user reports. But as many have noted, enforcement varies widely.

At its core, enforcement of these standards falls to two systems, one automated and one human. Facebook, like other internet companies, uses automated systems to recognize content that has previously been removed and block it from reappearing. But the majority of the work is still done by an army of content moderators in cubicles around the world. As Wired found, their jobs expose them to the worst of the internet: a daily stream of horrific images and videos.
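One common way such a re-upload filter works, shown here as a minimal Python sketch, is to keep fingerprints of content that moderators have already removed and check each new upload against them. Production systems rely on perceptual hashes (in the style of Microsoft's PhotoDNA) that survive resizing and re-encoding; the exact byte hash and the function names below are simplifying assumptions for illustration, not Facebook's actual implementation.

    import hashlib

    # Fingerprints of content that moderators have already removed.
    blocked_hashes: set[str] = set()

    def fingerprint(data: bytes) -> str:
        # A real system would use a perceptual hash so that resized or
        # re-encoded copies still match; SHA-256 only catches exact copies.
        return hashlib.sha256(data).hexdigest()

    def register_removed(data: bytes) -> None:
        """Record a fingerprint when a moderator removes a piece of content."""
        blocked_hashes.add(fingerprint(data))

    def is_blocked(upload: bytes) -> bool:
        """Reject an upload if it matches previously removed content."""
        return fingerprint(upload) in blocked_hashes

The appeal of this design is that each human decision scales: once a moderator removes an item, the system can refuse identical copies automatically, leaving only genuinely new material for human review.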

Yet their jobs also involve a great deal of judgment. Unlike Amazon, BuzzFeed, or any number of internet companies that run country-specific versions of their sites, there is just one Facebook, intended to serve all users. What one culture deems empowerment another may consider pornographic, and it is all the more complicated when the person making the call comes from a third. Beyond its “universal” Community Standards, Facebook also abides by country-by-country content restriction laws: in Germany, material denying the Holocaust is subject to removal. But the same takedown mechanisms can be used for repression in other countries.

Between January and June of 2015, Facebook complied with government requests to block access to some 20,000 links, posts, and videos in specific countries. Roughly 15,000 of those requests came from India and 4,500 from Turkey. Other governments simply block the social network entirely.

In response to the outrage, Facebook reiterated its position that the image was not fit for the platform.
