Facebook Finally Admits It May Never Become Transparent Or Consistent On Matters Of Appropriate Content
In the most frank, and arguably the most defeatist, admission by any social network employee I have ever spoken to, a member of a Facebook content review team today told me the world’s biggest social network has “failed miserably” in its attempt to ban violent or offensive content without suppressing the free sharing of information it says it wants to encourage.
“We are fighting a losing battle to walk this particular line,” the team member said. “I doubt very much that we will ever become fully transparent or consistent on matters of appropriate content.”
I would be surprised if this whistleblower is alone in their rather damning indictment of a review system that has been bolstered to the eyeballs with publishing guidelines, yet appears to be in worse shape than it was five years ago, when Facebook set up its new global review teams.
I only wish Facebook would stand up and admit that the reasoning behind its decisions to block or allow content is often opaque and inconsistent. When it disapproves an ad or rejects a post, wouldn’t it be great to see the following disclaimer?
Facebook’s content review team is often conflicted, and its decisions are not always consistent or indeed fair. It is entirely possible that you will see content far less appropriate than your banned content left untouched on this social network. Decisions are often arbitrary.
One can understand why terrorist organisations like the Islamic State have long been banned from Facebook. But I do wonder how many hours social media managers in ISIS’s marketing department have spent trying to decipher vague messages from the Facebook review team: “Your advert was not approved. It contains things that your audience may find offensive. This could include pornographic images or other restricted content. Please edit the ad and try again.”
One of Facebook’s most stringent publishing guidelines is the updated rule banning “supporting or praising groups involved in ‘violent, criminal or hateful behavior’”, although, according to my source, Facebook content review teams often fail to enforce it. “We are told to take into account the full context of content. For example, some people share information about atrocities in the world as a way of raising public awareness, but this is where we often get it wrong,” my source said.
“We ban all Islamic State videos, but graphic photos and videos of other atrocious acts of extreme violence and murder make it through the review process. We are making judgement calls often with impaired judgement.”
Here is a case in point: a video of a brutal street killing posted to my feed. The actual footage is extremely graphic.
So it’s OK to post content like this, but Facebook will remove photographs of people displaying genitals or focusing in on fully exposed buttocks. It also restricts some images of female breasts if the nipple shows, like this newspaper photo of topless protesters marching in Manhattan against a law to rid Times Square of nearly nude women.
This screenshot from Rust, a survival-themed online video game, was also banned.
As was this image on a vintage postcard.
A recent post I created for a client, advertising a couples-only adult party in Venice, was also disapproved as an advert.
But if you are Peter Stringfellow, you can advertise adult parties.
And violence and porn are not the only areas where Facebook’s consistency breaks down. In December, it blocked a page in Russia that was promoting an antigovernment protest, then allowed copycat pages to stay up. And in October, it created an exception to its requirement that people use their real names on the service, allowing San Francisco’s drag queens to use their stage names while continuing to crack down on others using false names.
If you enjoyed reading this, you may want to read my last article on the inconsistency and injustice of search and social policy.