Onlinecensorship.org is a project of the Electronic Frontier Foundation and Visualizing Impact. It was founded in 2012 by Ramzi Jaber and Jillian C. York, both of whom had begun to notice posts disappearing from their friends’ Facebook pages.
Onlinecensorship.org seeks to encourage social media companies to operate with greater transparency and accountability toward their users as they make decisions that regulate speech. We’re collecting reports from users in an effort to shine a light on what content is taken down, why companies make certain decisions about content, and how content takedowns are affecting communities of users around the world.
Unfriending Censorship: Insights from four months of crowdsourced data on social media censorship
Source: Jessica Anderson, Matthew Stender, Sarah Myers West, and Jillian C. York, Onlinecensorship.org, March 2016
From the summary:
The report draws on data gathered directly from users between November 2015 and March 2016.
We asked users to send us reports when their content or accounts were taken down on six social media platforms: Facebook, Flickr, Google+, Instagram, Twitter, and YouTube. We have aggregated and analyzed the collected data across geography, platform, content type, and issue area to highlight trends in social media censorship. All the information presented here is anonymized, except for case study examples we obtained with the user's prior approval.
Here are some of the highlights:
– This report covers 161 submissions from 26 countries, regarding content in 11 languages.
– Facebook was the most frequently reported platform, and account suspension was the most commonly reported type of takedown.
– Nudity and false identity were the most frequent reasons given to users for the removal of their content.
– Appeals seem to present a particular challenge. A majority of users (53%) did not appeal the takedown of their content; of those, 50% said they didn’t know how, and 41.9% said they didn’t expect a response. Content was restored in only four cases, while in 50 cases the user received no response at all.
– We received widespread reports that flagging is being used as a tool of censorship: 61.6% of respondents believed flagging by other users was the cause of their content takedown.