Facebook’s photo censors strike again: The latest mistake happened on the official Nirvana fan page.
The iconic album cover of the naked floating baby chasing a dollar bill on a fishing line was allegedly pulled from the fan page yesterday, but when we checked today, the photo was back up as the page’s profile picture.
The Nirvana fan page admin changed the profile photo to the “Nevermind” album cover to celebrate the album’s 20th anniversary and reissuance this fall.
Facebook provided this statement: “The photo in question does not violate our policies. Our team reviews thousands of pieces of reported content every day, and we occasionally make a mistake. When we’re notified of a mistake and can confirm it on our end, we act quickly to fix it, apologize to the person or people affected, and take appropriate steps to improve our systems and processes. We take these responsibilities very seriously, but believe our error rates rival those of any company in any industry.”
We suspect Facebook staff must have had a change of heart. The profile picture is back, and there was even another photo of the naked floating baby posted on the wall.
If this sounds like déjà vu, that’s because it has happened before. No, not with Nirvana’s “Nevermind” album or floating babies in water, but to a birth photographer whose account was disabled for posting photos of women moments after giving birth. Facebook later apologized to the woman and fully restored her account.
In May, Facebook took down a photo of two men kissing at a park. The story of the censorship went viral, and the social site apologized for “erroneously” taking the picture down, admitting it didn’t actually violate its Statement of Rights and Responsibilities.
It seems that photo approvals and denials on Facebook leave a lot of room for human error and subjectivity: whether an image is deemed inappropriate comes down to the judgment of whoever reviews it.
The cycle goes something like this: A user complains about a photo, a Facebook staffer decides the complainant is right and yanks it, then another Facebook user complains about the censorship, so Facebook issues an apology calling the deletion an error. Rewind. Repeat.
What do you think Facebook could do to stop this problem? Is there a solution here or will we continue to see this pattern?