Facebook's leaked rules on censoring hate speech protect 'white men' but not 'black children'

Facebook CEO Mark Zuckerberg. (Photo: Justin Sullivan/Getty Images)

A bombshell investigation by ProPublica sheds new light on how Facebook trains its thousands of content moderators to police hate speech, including a rule that calls for the blocking of attacks on “white men” but not “black children.”

The report, which includes internal training documents used by Facebook moderators, details how the social network’s policies on hate speech “tend to favour elites and governments over grassroots activists and racial minorities.”

Here are the highlights:

  • Facebook trains its thousands of content moderators to delete hate speech directed at people on the basis of so-called “protected categories” such as race, sex, gender identity, and religious affiliation. Hate speech aimed at “subsets” of protected categories, such as “radicalized Muslims,” does not get the same protection.
  • To illustrate the difference between protected categories and unprotected subsets, a training slide shown to moderators indicates that attacks on white men should be deleted, while attacks on female drivers and black children should not, because “drivers” and “children” are not protected categories (see the sketch after this list).
  • President Donald Trump’s 2016 Facebook posts about banning Muslims from entering the US violated Facebook’s internal rules on hate speech. But CEO Mark Zuckerberg personally intervened to keep the posts from being deleted.
  • Facebook uses the US State Department’s list of designated terrorist groups and other similar databases to help it monitor hate speech. But it also keeps a “secret list” of designated “hate organisations” that it bans.
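
As ProPublica describes it, the underlying rule is a simple check: a group counts as protected only when every attribute used to describe it is itself a protected category, and adding any unprotected attribute (such as age or occupation) turns the group into an unprotected “subset.” The following is a minimal Python sketch of that reported logic; the category lists, group labels, and function name are illustrative assumptions, not Facebook’s actual lists or code.

```python
# Hypothetical illustration of the moderation rule described in ProPublica's report.
# The category sets and names below are assumptions for illustration only.

PROTECTED_CATEGORIES = {"race", "sex", "gender identity", "religious affiliation"}
# Attributes like age ("children") or occupation ("drivers") are not protected.

def is_protected_group(attribute_types):
    """A group is protected only if every attribute defining it is a
    protected category; any unprotected attribute makes it a 'subset'."""
    return all(attr in PROTECTED_CATEGORIES for attr in attribute_types)

# The three groups from the training slide, described by the categories
# of the attributes that define them:
groups = {
    "white men": {"race", "sex"},             # both attributes protected
    "female drivers": {"sex", "occupation"},  # occupation is not protected
    "black children": {"race", "age"},        # age is not protected
}

for name, attrs in groups.items():
    verdict = "protected (delete attacks)" if is_protected_group(attrs) else "unprotected subset"
    print(f"{name}: {verdict}")

# Output:
# white men: protected (delete attacks)
# female drivers: unprotected subset
# black children: unprotected subset
```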

When asked for comment on ProPublica’s report, a Facebook spokesperson directed Business Insider to a post on Facebook’s company blog about moderating hate speech published a day earlier. In the post, VP of public policy Richard Allan said Facebook had deleted around 66,000 posts reported as hate speech per week over the previous two months.

“But it’s clear we’re not perfect when it comes to enforcing our policy,” he wrote. “Often there are close calls — and too often we get it wrong.”

You can read ProPublica’s full investigation on its website. The report comes after The Guardian detailed Facebook’s rules on moderating sex and violence last month.
