Mark Zuckerberg reportedly signed off on a Facebook algorithm change that favoured established news sites and made feeds 'nicer' after the election

  • In the days after the US election, Facebook’s CEO Mark Zuckerberg signed off on an algorithm change to the platform’s news feed in an effort to fight misinformation, the New York Times reports.
  • The change meant established news sites were promoted, while content from highly engaged but partisan sites was demoted.
  • Some employees asked in a company meeting if this “nicer news feed” could become permanent, the Times reported.

In the days following the US presidential election, Facebook made some emergency changes to its algorithm.

The New York Times reports the company changed the way it promoted and demoted content on people’s news feeds after employees noticed an alarming amount of misinformation spreading on the platform.

The change meant the algorithm assigned more importance to posts from established news organisations such as CNN, The New York Times, and NPR. This was based on an internal metric the company uses called “news ecosystem quality,” abbreviated to NEQ.

Publishers with a high NEQ benefitted from the change, while partisan but highly engaged pages, such as Breitbart and Occupy Democrats, were demoted.

Three people with knowledge of the matter told the Times that Facebook’s CEO Mark Zuckerberg personally signed off on the change.

The change was reportedly popular with some employees at Facebook, and two sources said that at a meeting held one week after the election, employees asked whether the “nicer news feed” could become a permanent feature.

It is not clear from the Times’ report whether, or when, the change was rolled back.

According to the Times, Facebook also conducted research earlier in November into whether the content shared on its platform was predominantly “good for the world” or “bad for the world.” The company surveyed users, asking them to categorise specific posts as one or the other.

The Facebook researchers found a correlation between posts gaining lots of views and being categorised as “bad for the world.”

Facebook then trained an algorithm to predict which posts users would find “bad for the world” and to make those posts less visible in people’s feeds. An internal report found the algorithm achieved this fairly well, but the change correlated with users opening Facebook less often.

“The results were good except that it led to a decrease in sessions, which motivated us to try a different approach,” a summary of the report’s findings viewed by the Times said.

Subsequently, the company tweaked the algorithm to demote “bad for the world” content less stringently.

Facebook was not immediately available for comment on the report when contacted by Business Insider.
