
Leaks of Facebook’s content moderation policies reveal company’s shortfalls

FACEBOOK has been under fire for what many critics see as an irresponsible approach to content moderation – a reputation that only soured further after an investigation by The Guardian unearthed a trove of leaked policies on what users can and cannot post on the site, as well as guidelines for its content moderators.

The so-called “Facebook Files” offer a glimpse into the world of a company that has lately been plagued by calls from netizens – particularly those in Europe and the United States – to exert more control over the kinds of content circulating across at least two billion timelines, especially posts that are violent in nature.

The Guardian’s investigation uncovered more than 100 documents, including “internal training manuals, spreadsheets and flowcharts” used by Facebook moderators to assess and cull content relating to the most controversial issues, such as hate speech, terrorism, child pornography, self-harm and racism.


At times, the lines drawn between issues seem arbitrary and inconsistent.

For example, speech calling for the harming of a “head of state” – a protected category that includes US President Donald Trump – must be removed, yet comments such as “To snap a b***h’s neck, make sure to apply all your pressure to the middle of her throat”, “f**k off and die” or “I hope someone kills you” should not be regarded as credible threats.

“People commonly express disdain or disagreement by threatening or calling for violence in generally facetious and unserious ways.” – leaked document in the Facebook Files

The site came under fire in late April when a Thai man filmed himself murdering his daughter via Facebook’s livestreaming tool, and moderators failed to remove the video even after 24 hours.

However, the documents reviewed by The Guardian note that videos of violent deaths may be left up to raise awareness of issues such as mental illness. They also say videos of self-harm need not be censored, as the company “doesn’t want to censor or punish people in distress”.

The sheer volume of paperwork dedicated to parsing the issue of content is evidence of the internal struggle Facebook has been undergoing as it tries to strike the right balance between free speech arguments and calls for better moderation of disturbing content that could otherwise validate violence and gross spectacle.

The company is also having to grapple with fast-developing challenges that have barely been acknowledged by legal bodies, such as “revenge porn” and “catfishing”.

Facebook CEO Mark Zuckerberg speaks during a panel discussion held as part of the China Development Forum. Source: AP.

Moderators speaking to The Guardian expressed frustration and feelings of being overwhelmed, pointing out they usually get mere seconds to judge the validity of a piece of content and decide where exactly to draw moral lines. Facebook reportedly receives more than 6.5 million reports every week regarding potentially fake accounts.

“Facebook cannot keep control of its content,” said one source to The Guardian. “It has grown too big, too quickly.”

Some said that, despite the reams of paper, the rules Facebook implemented were still too nebulous and inconsistent to be useful.

And Facebook knows this is all a huge problem.

In one of the leaked documents, the social media titan admitted the site had become a kind of safe space for users to vent frustration in violent language, since there is little fear of retribution and no face-to-face interaction.

“We should say violent language is most often not credible until specificity of language gives us a reasonable ground to accept there is no longer simply an expression of emotion, but a transition to a plot or design,” one of the documents reads.

“From this perspective, language such as ‘I’m going to kill you’ or ‘F**k off and die’ is not credible and is a violent expression of dislike and frustration.”

“We have a really diverse global community and people are going to have very different ideas about what is OK to share,” Facebook’s head of global policy management Monika Bickert told The Guardian.

“It’s absolutely our responsibility to keep on top of it. It’s a company commitment. We will continue to invest in proactively keeping the site safe, but we also want to empower people to report to us any content that breaches our standards.

“We feel responsible to our community to keep them safe and we feel very accountable.”

The recent spate of violent content readily available on the site has prompted critics from all over the world to call for Facebook to be subject to publishing regulations, much as broadcasters and publishers are.

Facebook contends, however, that it is not a traditional media company and that its product is a platform for news rather than a source of news itself, as Bickert noted to The Guardian.


It has become quite the tangle, content moderation expert Sarah Roberts says, as the site’s two billion-strong user base has grown far beyond the shared codes and principles that could reasonably be expected of a smaller community of like-minded people.

Facebook has muddied the waters further by throwing money into the mix – after all, virality spawns clicks that bring in the cash.