In November 2016, Facebook CEO Mark Zuckerberg said the idea that fake news could influence the US election was “crazy.”
But seven months down the line, the social network has announced an all-out war on misinformation and hoaxes in the UK ahead of the general election in June.
Facebook is tweaking its algorithm to try to halt the spread of fake news, giving users tips on identifying fake news, and it even ran full-page adverts in British newspapers on Monday reiterating how to spot misinformation online. (You can see the full advert below.)
The PR-friendly campaign is welcome — but overdue. And it’s the latest in a chain of examples of Facebook doing too little, too late to battle very serious issues it has had a crucial hand in creating.
Copyright theft and livestreamed suicides
A few years ago, Facebook CEO Mark Zuckerberg decided that the future of the web wasn't text but video. Since then, Facebook has heavily pushed video on its platform, prioritising it with its algorithm and even paying news outlets (including Business Insider) to create it. If you've wondered why your News Feed is so full of videos these days, this is why.
But as Facebook pushed to grow its video efforts, it developed a major problem: "Freebooting." That is, video theft. Unscrupulous users were ripping viral videos from professional producers on YouTube and republishing them on Facebook, building pages with millions of followers off the back of stolen content. "Facebook is prioritising growth (and the perception of growth) over acting decently," YouTuber Hank Green told The Verge in 2015.
The social network eventually took steps to address the problem by introducing Rights Manager, software to help track and take down infringing videos, in April 2016. But it took more than a year of complaints from video creators to get there.
Another example: Facebook Live. In January 2016, Facebook let American users (with iPhones) broadcast live videos from their phones, and has since gradually rolled the feature out to users in the rest of the world. But in the intervening months, it has developed a grisly reputation.
A student committed suicide. A two-year-old accidentally killed themselves and another man. An 18-year-old crashed her car and died. Two journalists were gunned down. An Ohio man killed an elderly man and bragged about allegedly having killed others. These are just a few of the examples of what has been broadcast to the world on Facebook Live.
Facebook is now hiring 3,000 extra moderators to review content on the social network — after a man killed his 11-month-old daughter and the videos stayed online for more than a day.
Facebook is finally facing up to its fake news problem
And then there’s fake news.
Concerns about politicised misinformation spreading on Facebook surged around the US presidential election. Some argue that it played a role in the election of Donald Trump, and by some metrics fake news stories even outperformed the most popular pieces from mainstream news outlets. A story (falsely) alleging that the Pope endorsed Donald Trump was shared almost a million times, for example.
But Zuckerberg was initially dismissive of the idea that it could have any impact, even as the company’s ad sales team boasted of the impact its adverts can have on elections.
The 32-year-old CEO has since evolved his position on the issue. Earlier in 2017, he wrote a mammoth 6,000-word manifesto, addressing fake news alongside other issues: "While we have more work to do on information diversity and misinformation, I am even more focused on the impact of sensationalism and polarization, and the idea of building common understanding." In March, Facebook started to flag potentially fraudulent stories as "disputed," with the help of fact-checking services like PolitiFact and Snopes.
Now, as the UK general election approaches, Facebook is determined to look like it’s tackling the problem head on.
The social network is running full-page ads in British print newspapers from Monday. "Investigate the source," it recommends, and "be sceptical of the headline." It is also introducing a tweak to its algorithm that tries to identify misleading articles being shared on its platform, tracking down fake accounts, and working with fact-checking organisations.
“People want to see accurate information on Facebook and so do we,” UK policy director Simon Milner said in a statement provided to the media (via The Independent). “That is why we are doing everything we can to tackle the problem of false news.
“We have developed new ways to identify and remove fake accounts that might be spreading false news so that we get to the root of the problem. To help people spot false news we are showing tips to everyone on Facebook on how to identify if something they see is false.
“We can’t solve this problem alone so we are supporting third party fact checkers during the election in their work with news organisations, so they can independently assess facts and stories.”
Facebook’s attempts are positive — but they’re late
Print newspaper ads are a good way to be seen by politicians and media executives to be doing something, but the real-world impact is questionable. If you're buying a print newspaper, you're in the minority: circulation is dropping as the vast majority of Brits turn online for their news.
More broadly, though, these efforts are — of course — a good thing. It’s just a shame Facebook was unwilling to face up to the reality of its impact until overwhelming public pressure (and the threat of legal action) forced it to.
Facebook used to live by the motto “Move Fast and Break Things.” It emphasised speed and innovation — and if there were a few problems from time to time, so be it. That’s not such a bad way to do things, if you’re a social network for college buddies and you’re willing to tolerate a few bugs.
But when you’re the world’s most powerful media and publishing company, with a global community of two billion and unparalleled impact on public discourse and the political process, greater care and attention is vital.
This column does not necessarily reflect the opinion of Business Insider.