According to a recent Wall Street Journal report, internal Facebook documents show that a 2018 change to Facebook’s algorithm, designed to make the platform a nicer place, appeared to backfire by boosting content that promoted outrage.
In an article titled “Facebook Tried to Make Its Platform a Healthier Place. It Got Angrier Instead,” the Wall Street Journal reports that a major 2018 change to Facebook’s News Feed algorithm appeared to promote outrageous and negative content on the platform. When informed of this, top executives, including CEO Mark Zuckerberg, were reportedly hesitant to address the issue.
In 2018, Facebook announced plans to change its News Feed algorithm to boost “meaningful social interactions” between friends and family, according to internal documents. Facebook CEO Mark Zuckerberg said the aim of the change was to strengthen the bonds between users and to improve their overall well-being.
Within the company, however, staffers were sounding alarm bells, warning that the change to the algorithm was having the opposite of the desired effect: Facebook interactions were becoming angrier and more vitriolic. Researchers at the social media giant found that publishers and political parties received more comments and interactions when they posted content aimed at outrage and sensationalism.
A team of data scientists wrote: “Our approach has had unhealthy side effects on important slices of public content, such as politics and news. This is an increasing liability.” The scientists concluded that the new algorithm placed more importance on reshared content in the News Feed and made negative comments more visible. “Misinformation, toxicity, and violent content are inordinately prevalent among reshares,” researchers said.
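To illustrate the dynamic the researchers describe, below is a minimal, purely hypothetical sketch of a reshare-weighted ranking score. The signal names and weights are invented for illustration; Facebook’s actual formula has not been published, and this is not its code.

```python
# Purely illustrative sketch of an engagement-weighted ranking score.
# All signal names and weights here are hypothetical, chosen only to
# show how heavily weighting reshares and comments can favor
# outrage-bait over calmer content.

def engagement_score(post):
    """Rank a post by a weighted sum of engagement signals."""
    weights = {
        "likes": 1,      # hypothetical baseline weight
        "comments": 15,  # comments weighted far above likes
        "reshares": 30,  # reshares weighted highest
    }
    return sum(weights[signal] * post.get(signal, 0) for signal in weights)

# Example: a provocative post with fewer total interactions still
# outranks a benign post once reshares and comments dominate the score.
benign = {"likes": 500, "comments": 20, "reshares": 5}
outrage = {"likes": 100, "comments": 80, "reshares": 60}
print(engagement_score(benign))   # 500 + 300 + 150 = 950
print(engagement_score(outrage))  # 100 + 1200 + 1800 = 3100
```

Under weights like these, a post that provokes angry comment threads and long reshare chains rises in the feed even when far more people quietly liked something else, which is the “unhealthy side effect” the data scientists flagged.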
The Wall Street Journal writes:
Anna Stepanov, who led a team addressing those issues, presented Mr. Zuckerberg with several proposed changes meant to address the proliferation of false and divisive content on the platform, according to an April 2020 internal memo she wrote about the briefing. One such change would have taken away a boost the algorithm gave to content most likely to be reshared by long chains of users.
“Mark doesn’t think we could go broad” with the change, she wrote to colleagues after the meeting. Mr. Zuckerberg said he was open to testing the approach, she said, but “We wouldn’t launch if there was a material tradeoff with MSI impact.”
Last month, nearly a year and a half after Ms. Stepanov said Mr. Zuckerberg nixed the idea of broadly incorporating a similar fix, Facebook announced it was “gradually expanding some tests to put less emphasis on signals such as how likely someone is to comment or share political content.” The move is part of a broader push, spurred by user surveys, to reduce the amount of political content on Facebook after the company came under criticism for the way election protesters used the platform to question the results and organize protests that led to the Jan. 6 riot at the Capitol in Washington.
Facebook spokesman Andy Stone said in a written statement to the Wall Street Journal: “Is a ranking change the source of the world’s divisions? No. Research shows certain partisan divisions in our society have been growing for many decades, long before platforms like Facebook even existed.”
Read more at the Wall Street Journal here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or contact via secure email at the address lucasnolan@protonmail.com