SAN FRANCISCO — Mark Zuckerberg published a Facebook post late Friday outlining some of the steps the social network is taking to deal with its massive fake news problem.
The Facebook CEO is apparently looking to show that the site is taking the problem seriously. He initially appeared to dismiss concerns about fake news last week, saying the idea that Facebook was a hotbed for such content was “pretty crazy.”
“The problems here are complex, both technically and philosophically,” Zuckerberg wrote Friday night. “We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible.”
He added: “We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties” to help moderate content on the site.
To that end, Zuckerberg outlined a few solutions that are in the works:
- Stronger detection: better technical systems to detect what people will flag as false before they do it themselves.
- Easier reporting: making it much simpler for people to flag stories as fake, to catch more misinformation faster.
- Third-party verification: working with “respected fact-checking organisations.”
- Warnings: labelling stories that have been flagged as false by third parties and the Facebook community, and showing warnings when people read or share them.
- Industry input: working with journalists and others in the news industry to better understand their fact-checking systems and learn from them.
Facebook was sharply criticised during the last days of the US presidential election for hosting false news stories that were widely shared on the platform. Much of that content was election-related, leading to suggestions that the fake reports potentially influenced the election’s outcome.
President Barack Obama in recent days has been the highest-profile critic, first calling fake news articles on Facebook a “dust cloud of nonsense,” and later saying false reports on the internet amount to “active misinformation.”