Facebook has found itself in the middle of a firestorm of criticism over allegations it is suppressing conservative news.
Reports from Gizmodo alleged that the curators who control the “Trending” bar on the side of the social network routinely choose not to feature conservative topics and news outlets, while favouring others like activist movement Black Lives Matter.
In a Facebook post published early Tuesday morning, the social network’s VP of search Tom Stocky finally denied the reports, writing that “we take these reports extremely seriously, and have found no evidence that the anonymous allegations are true.” (Scroll down for the full statement.)
Whatever the truth of the allegations over conservative suppression, the incident points to a broader truth — that Facebook is becoming more and more like a traditional media company, and needs to embrace the thorny issues that come with the territory.
There may not be guidelines telling curators not to select certain topics, but people have implicit biases. It’s natural, and to an extent unavoidable. Curators will necessarily pick topics they know about, and frame them in a way that reflects their mindset. The Gizmodo article discusses exactly this (emphasis ours):
“Depending on who was on shift, things would be blacklisted or trending,” said the former curator. This individual asked to remain anonymous, citing fear of retribution from the company. The former curator is politically conservative, one of a very small handful of curators with such views on the trending team. “I’d come on shift and I’d discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn’t be trending because either the curator didn’t recognise the news topic or it was like they had a bias against Ted Cruz.”
This is precisely what happens in any media organisation: Writers and editors gravitate towards subjects of interest — and the combined weight of these biases gives each organisation its (deliberate or otherwise) slant, from the left-leaning Guardian to the populist Daily Mail.
The big difference is that Facebook doesn’t frame itself as a media company — and generally tries to avoid the necessary debates about bias and slant that media companies have to tackle. Facebook’s Trending section is presented as a neutral reflection of organically popular stories on the site: I’d wager that many Facebook users don’t even realise there are human curators.
But as Mathew Ingram writes over at Fortune, these problems are now catching up with the social network. “Facebook routinely says that it doesn’t see itself as a media entity, and doesn’t see its algorithmic choices as being of any concern to anyone outside the company — even when those choices help influence the way people think and behave, like whether they decide to vote and how they see political issues … At some point, however, Facebook is going to have to grapple with these kinds of issues, or at least acknowledge that they exist and that people have a right to be concerned about them.”
One potential remedy is to do what newspapers and media organisations have done for decades — use bylines and mastheads.
The vast, vast majority of the media business credits authors and producers on stories, as well as the organisation’s editorial leadership. This allows for public accountability. Sure, the writer might be biased — and many writers embrace their biases, arguing against the falseness of forced neutrality — but the byline means there is editorial accountability.
If Facebook adopted an optional masthead that showed who had curated topics, and which editors were on shift, it wouldn’t solve the problem of unconscious biases. But it would alert users to them, allowing them to be better informed. It would be an acknowledgement that yes, the social network is grappling with these issues, and it’s owning up to that.
However, to do so would require Facebook embracing its role and responsibility as a dominant media organisation to a degree that it has previously shown no inclination to do.
And even if it did, it might only be a temporary fix: Almost all of the former Facebook curators Gizmodo talked to told the tech site that they believed “they were there not to work, but to serve as training modules for Facebook’s algorithm.”
In short: It’s only a matter of time until the curators are completely replaced by opaque algorithms — and we are once again none the wiser as to how the most powerful media company in the world decides what to show us.