For example, it will no longer rely on a list of external websites to decide whether a topic surfaced by its algorithm is important.
Facebook launched the investigation after several Gizmodo reports earlier this month sparked widespread controversy. The reports alleged that former news “curators” routinely omitted articles from conservative news sources in the Trending section on the right side of its main News Feed.
Although the company’s investigation revealed no evidence of the systematic bias described in the reports, Facebook says it couldn’t “fully exclude” the possibility that some of its curators exhibited unintentional bias or took isolated improper actions.
So, it decided to make a number of changes to minimise the risk of human judgment interfering.
Here are the tweaks Facebook made to its rules, via its letter to Senate Commerce Committee chairman John Thune:
- We have already updated terminology in our Guidelines and conducted refresher training for all reviewers that emphasised that content decisions may not be made on the basis of politics or ideology.
- We will institute additional controls and oversight around the review team, including robust escalation procedures.
- We will expand our Help Center content related to the Trending Topics feature to provide additional information about this feature.
- We will eliminate our reliance on external websites and news outlets to identify, validate, or assess the importance of trending topics. This means that we will discontinue use of the top-10 list of news outlets, the Media 1K list, and the RSS feed.
As a reminder, the issue is not about which news articles appear in your Facebook News Feed, which shows stories shared by your friends and the Pages you’ve liked. The bias controversy relates only to the selection of links in the upper right corner, known as the Trending Topics section, which looks like this: