Now that Donald Trump is officially going to be the next president of the United States, many people who were vehemently opposed to him are looking around and saying, “What happened?”
And some of them are answering that in a word, “Facebook.”
Facebook’s news feed algorithm is tuned to show you things you like. It doesn’t distinguish between fact and fiction. The result has been called the “filter bubble,” a phrase coined by Eli Pariser in his 2011 best seller, “The Filter Bubble: What the Internet Is Hiding from You.”
In the book, Pariser even predicts the effects the filter bubble could have on democracy.
“Ultimately, democracy works only if we citizens are capable of thinking beyond our narrow self-interest. But to do so, we need a shared view of the world we cohabit. We need to come into contact with other people’s lives and needs and desires. The filter bubble pushes us in the opposite direction — it creates the impression that our narrow self-interest is all that exists. And while this is great for getting people to shop online, it’s not great for getting people to make better decisions together.”
And so if someone shares a story about a politician that fits your world view, but the story is completely false, you’ll see it in your news feed (and perhaps in your Google searches, since Google is another company that uses algorithms to surface things it predicts you’ll like). Your news feed may then skip showing you stories that debunk the false one. More than that, it may also skip stories about all kinds of world events that could inform, if not influence, your world view in a different way.
While there are certainly plenty of news sources any curious individual could read to become informed about the issues, the truth is that Americans are increasingly leaving the curation to Facebook.
A recent Pew study found that a majority of U.S. adults — 63% — use Facebook as a source of news on events and issues, not just as a place to see photos posted by friends and family.
And yet Facebook has not been willing to accept responsibility for this role, refusing to acknowledge that it is acting like a news organisation, a media company. CEO Mark Zuckerberg said in August:
“We’re a technology company. We’re not a media company. When you think about a media company, you know, people are producing content, people are editing content, and that’s not us,” he said. “We exist to give you the tools to curate and have the experience that you want, to connect with the people and businesses and institutions in the world that you want.”
But waking up to the results of this election has made some people rethink the role of Facebook, and not in a good way.
That’s the leader millions of Americans never honestly saw because Facebook prioritises engagement over truth.
— James Hamblin (@jameshamblin) November 9, 2016
Hillary supporters did gather in “secret, private facebook sites” because they were mocked mercilessly on social media this entire cycle
— Charlotte Alter (@CharlotteAlter) November 9, 2016
Thank you Facebook for those cool algorithms that show people fake news catered to their biases. Super helpful!! Now facts don’t exist yay!
— Alex Hirsch (@_AlexHirsch) November 9, 2016
1/2 I believe social media has had a major impact on us… we read Facebook & twitter rather than NY Times, Newsweek, Time… rather than
— Stephanie J. Block (@StephanieJBlock) November 9, 2016
No one will be working as hard today as Facebook’s Unfollow button.
— De Nada Donna (@Donna_McCoy) November 9, 2016
I know it’s complicated but I can’t avoid the conclusion that Facebook played a huge role in this, in Brexit.
— Nat Friedman (@natfriedman) November 9, 2016