On Sunday, Aug. 14, liberal page administrators and bloggers in my network started spreading the word that they had been blocked from posting ANY content on ANY other wall for 15 days. No prior warning was given.
The offenders are charged with posting links to news/opinion articles — such as this liberal’s guide to Republican talking points — a small number of times to LIKE-MINDED pages with which they regularly interact. (Some reported making posts to as few as 4 other pages before being suspended — for 15 days — with no warning!) I’ve also been alerted that Facebook has revoked some administrators’ posting privileges for sharing links to their liberal Facebook page on the walls of other liberal pages.
Facebook says these activists have been posting “spam and irrelevant” content. In other words, Facebook appears to be deciding for community pages what is spam or irrelevant before the actual page administrators are ever able to see — or re-share — the content.
A Facebook spokesperson has offered the following response to us via email:
Facebook is not — and has never been — in the business of disabling accounts or removing content simply because people are discussing controversial topics. On the contrary, we want Facebook to be a place where people can openly express their views and opinions, even if others don’t agree with them. It’s also incorrect to assume that in every case where a person’s account is disabled, or a piece of content is blocked or removed, it’s because of the nature of the content itself. There are a few types of content that we don’t allow, such as nudity and pornography, hate speech, and threats of real physical violence, but sensitive topics are not against our policies.
When we take action on an account or piece of content, it’s nearly always for one of the following two reasons:
1. The content or behavior associated with the account was reported to us by people on Facebook, and we reviewed it and determined that it violated our community standards (http://www.facebook.com/communitystandards).
2. It was flagged by one of our automated systems for preventing spam and other annoying behavior (explanation of these systems here: http://blog.facebook.com/blog.php?post=403200567130).
Examples of behavior that might be flagged include having a high percentage of friend requests ignored or marked as spam, or sending lots of friend requests or messages to members of the opposite sex who are not your friends. We’re constantly building and refining these systems to protect the people who use Facebook. Of course, no system is perfect, and ours occasionally make mistakes. When this happens, we work quickly to fix it and learn from it, and to apologize to those affected.
Facebook has a history of owning up to mistakes, apologizing, and restoring content that was deleted unfairly.
The company outlines its content policies in its statement of rights and responsibilities, which prohibits users from posting anything that is “hateful, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence.”
Now here’s what the AddictingInfo blog thinks Facebook should do — let us know whether you agree with these suggestions:
- Undo the 15-day posting suspensions for sharing news articles on the welcoming walls of like-minded community pages.
- Adopt a more lenient policy for how often users can post articles to other community pages before being labeled as a spammer.
- Consider investigating users who frequently report content as spam or abuse.
- Set spam triggers for posting to community pages based on whether a significant number of page administrators label content as spam, not based on reports from random visitors to the page who might be gaming the system.
- Audit the content that Facebook employees ban. If, as the company says, someone is reviewing the content I’ve linked to here, deciding whether it meets community standards, and then ruling to ban it, there’s a good case that they are doing so for political motives.