The downside of Facebook as a public space: Censorship

The benefits of being on Facebook are fairly obvious by now: you can connect to friends and family and share things with them no matter where they are — and it’s all free! This quasi-public space is also owned and controlled by a corporate entity, however, and it has its own views about what kinds of behavior should be allowed. That inevitably raises questions about whether the site is engaging in what amounts to censorship — questions that resurfaced this week after a page belonging to film critic Roger Ebert disappeared, and a group of protesters in Britain found their content blocked. Who is watching the watchmen?

Ebert triggered a storm of criticism on Monday with his response to the death of Jackass co-star Ryan Dunn, who was killed in a single-vehicle accident early Monday morning. Police said that speed was likely a factor in the crash, and there have also been suggestions that the TV actor may have been drinking before the incident. Ebert — who took to Twitter after a cancer operation led to the loss of his lower jaw, and now has 475,000 followers — posted that “friends don’t let jackasses drive drunk,” a comment that drew attacks from Dunn’s co-stars and from celebrity blogger Perez Hilton.

The film critic later tweeted that his Facebook page had been taken down (even though his comments about Dunn appeared only on Twitter, never on the page itself), replaced by an error message citing violations of Facebook’s terms of use, which ban content that is hateful, threatening or obscene, or that attacks an individual or group. In response, Ebert said his page was harmless and asked: “Why did you remove it in response to anonymous jerks? Makes you look bad.”


Facebook later said that the page was taken down “in error” and it was reinstated. But as Jillian York of the Electronic Frontier Foundation and Global Voices Online noted in a blog post, it’s not clear what kind of error led to the page being removed. Was it taken down automatically after being flagged as abusive? York — who has written in the past about Facebook removing pages set up by political dissidents in the Middle East and elsewhere — says the company has denied removing pages automatically. So was there human error involved? And if so, what steps is Facebook taking to prevent that in the future?

If critics of his Twitter comments attacked Ebert’s page by repeatedly flagging it, they effectively took the same approach some governments have taken in trying to shut down dissent: Foreign Policy magazine columnist Evgeny Morozov said recently that he knows of at least one government that flags dissident group pages as pornography in order to get them removed. Facebook has also removed pages in the past that were seen as anti-Islam or anti-Israel — in some cases reinstating them later — and has taken down more innocuous content as well, such as pages about the benefits of breastfeeding.

And it’s not just taking down pages that Facebook users are concerned about: According to a blog post from one of the organizers of a recent public anti-government protest in Britain, a number of users reported that Facebook not only blocked them from linking to a website set up by the group, but from linking to a blog post about it as well. A spokesman for the social network said this too was an error that was later corrected — but, again, what kind of error it was isn’t clear. Nor is it clear what criteria Facebook uses to make these decisions.

As the British blogger notes in his post on the incident, Facebook is “increasingly the space within which people receive their information, including civic information.” We are living more and more of our public lives and getting more of our information through networks such as Facebook, and while that can be a very powerful thing — as we’ve seen with events such as the Arab Spring uprisings in Tunisia and Egypt — it also means that more of our information is being filtered by a corporate entity, with its own desires and rules, not all of which are obvious. The implications of that are profound.

Post and thumbnail photos courtesy of Flickr user David Reece

