- Facebook has been trying to combat the spread of misinformation on its platform through measures like putting a “disputed” flag next to fake news.
- The company will ditch this particular measure after finding it could actually entrench people more strongly in false beliefs.
- The social-media firm will stick to pointing people who read or share fake news to fact-checked, contextual articles.
Stopping people from reading fake news is proving tougher than expected, Facebook has acknowledged.
The company says it will no longer stick the word “disputed” next to misleading information, one of its original measures to tackle the spread of intentionally false information on Facebook.
But the tactic didn’t account for human pigheadedness: telling people their opinion is wrong simply entrenches them further in that opinion.
Facebook wrote in two explanatory blog posts that it would drop the “disputed” tag and provide contextual information next to fake news through its existing “Related articles” feature.
According to one of the posts (emphasis ours):
“Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs – the opposite effect to what we intended. Related Articles, by contrast, are simply designed to give more context, which our research has shown is a more effective way to help people get to the facts. Indeed, we’ve found that when we show Related Articles next to a false news story, it leads to fewer shares than when the Disputed Flag is shown.”
In a separate Medium post, the three Facebook staffers leading the firm’s efforts against fake news wrote that giving people more context meant they shared less fake news. Unfortunately, neither the disputed tags nor the additional context stopped people from actually clicking on fake news.
“During these tests, we learned that although click-through rates on the hoax article don’t meaningfully change between the two treatments, we did find that the Related Articles treatment led to fewer shares of the hoax article than the disputed flag treatment.”
The trio cited academic research showing that giving context to fake news helped “reduce misperceptions.”
Facebook will keep some of the other tactics it has been testing: using fact-checkers to determine the accuracy of articles, reducing the distribution of fake news, and sending alerts with extra context to people who have shared disputed stories.