Facebook thinks that the message “to snap a b—h’s neck make sure to apply all your pressure to the middle of her throat” can be permissible. Likewise, it won’t take down video livestreams of people self-harming. Images of animal abuse? Also ok.
We know all this because The Guardian has got its hands on the manuals and internal documents used to train Facebook’s hidden army of moderators, who police the platform for material that falls foul of its community standards.
Facebook’s standards for moderation have previously attracted heavy criticism. Take the time it censored the iconic Vietnam War photo “The Terror of War,” and censured Aftenposten, Norway’s biggest newspaper, for publishing it. Or its banning of an image of a Renaissance-era Italian statue for being “sexually explicit.” Or the time it suspended users who posted a photo of Aboriginal women in traditional dress.
Facebook has extraordinary, unparalleled power to shape the flow of information in the world today. The “community standards” it sets control how more than a billion people communicate, while its opaque algorithms decide what is noteworthy and deserving of human attention and amplification.
And it does all this with almost no oversight, beholden only to its shareholders — and its CEO Mark Zuckerberg, who holds a controlling stake in the company.
The publication of Facebook’s internal moderation rules is welcome — but it’s scandalous that this is the only way users, lawmakers, and journalists will get to see them. Facebook does publish public “community standards” outlining what is and isn’t ok on the social network. But previous moderation scandals have only highlighted their shortfalls, without shedding light on why these failings keep happening.
That’s the real value of The Guardian’s leak. The documents provide clarity on how one of the world’s most powerful companies — one with an unmatched ability to shape public debate and set social norms — really operates.
(Facebook did not immediately respond to Business Insider’s request for comment.)
Over the last year or so, people have increasingly woken up to the power that Facebook wields. Former moderators have accused the social network of suppressing conservative news. Fake news has propagated on its platform, spreading misinformation, and some critics have accused the company of having a hand in the election of Donald Trump.
More recently, Wired reports, the company boasted to advertisers of its ability to target young users when they were feeling “insecure,” “anxious,” like a “failure,” and during other times of psychological crisis.
“We have now arrived at the point where Facebook, by controlling what they show to more than 1 billion people every day, has aggregated so much editorial power, that Zuckerberg must acknowledge his responsibility and take part in the discussion,” Aftenposten editor Espen Egil Hansen wrote in a column for The Guardian last year. “The alternative, a continued passive approach to this debate, will be bad for democracy, bad for the conversations our communities rest on, and maybe even bad for Facebook themselves in the long term.”
In short: Facebook is not just another app maker or tech company. Zuckerberg may insist that he doesn’t intend to run for president, but he sounds more and more like a politician every day, publishing an epic nearly-6,000-word manifesto in February 2017 about his intention to build a “global community.”
If Facebook wants to live up to that awesome responsibility, it needs to commit to proactively releasing far more guidance on what it allows and how it handles its users’ data — and if not, governments should compel it to do so.
This column does not necessarily reflect the opinion of Business Insider.