On Tuesday, Twitter cofounder Ev Williams neatly pointed out how big Facebook’s fake news problem is with one screenshot.
Williams was reading Facebook CEO Mark Zuckerberg’s November 18 Facebook post, which updated the world on Facebook’s progress and goals in tackling fake news, when he noticed two suspicious sponsored posts in the sidebar. Both turned out to be fake news.
Here’s the screenshot (with our added annotations in red) that Williams posted on Medium, the publishing platform where he is CEO:
“Despite appearances, the first one doesn’t point to espn.com,” Williams wrote in a Medium post. “It goes to espn.com-magazine.online and attempts to sell a muscle-building supplement using ESPN branding and a fake news story. The CNN-branded ad goes to less work. It just takes you to a site called Fine the Racers with an exclusive offer for a 12-week program to strengthen your toes (?).”
What’s particularly worrying about this incident is that these posts are ads (note the “Sponsored” label), and Facebook has already pledged to ban fake news sites from its ad network.
“We do not integrate or display ads in apps or sites containing content that is illegal, misleading or deceptive, which includes fake news,” Facebook said in a statement to Reuters on November 15.
However, that looks to be easier said than done.
The spread of fake news on Facebook became a focal point of discussion after the election of Donald Trump, drawing comments from the likes of President Obama. A recent study by BuzzFeed showed that in the lead-up to the election, the top fake-news stories on Facebook outperformed legitimate news stories shared by some of the most popular media companies. One fake-news writer even said he thought he might have helped Trump win the election.
Facebook has said it’s working on fixing the problem. In his November 18 post (the one cited by Williams), Zuckerberg outlined a few things Facebook was working on to try to combat fake news. Here’s a summary:
- Stronger detection: better technical systems to detect content that people are likely to flag as false before they report it themselves.
- Making it much easier for people to report stories as fake to catch more misinformation faster.
- Third-party verification via “respected fact-checking organisations.”
- Warnings: labelling stories that have been flagged as false by third parties and the Facebook community, and showing warnings when people read or share them.
- Working with journalists and others in the news industry to get input and better understand their fact-checking systems and learn from them.
But it seems that Facebook has a long way to go.
It’s worth noting that Williams has a potential bias here: he founded Medium, a publishing platform that in some ways competes with Facebook, and cofounded Twitter. But his point is certainly still well-taken.
Additional reporting by Bryan Logan.