Facebook's Algorithm 'Fostered Rage' by Preferring Emoji Reactions to Likes, Internal Docs Show
In what Mark Zuckerberg has already blasted as a "coordinated" attempt to cast a shadow on Facebook's reputation, a consortium of 17 US news agencies triggered a series of stories based on internal Facebook documents leaked by a whistleblower that show how the social media platform fails to protect its users from harmful content and disinformation.
Facebook's five-year-old "emoji reactions" feature, which expanded ways of engaging with posts beyond a simple like to a choice of "love", "haha", "wow", "sad", and "angry", appears to have been "worth" more than a thumbs up in another sense as well: it helped foster rage and misinformation on the platform, internal documents cited by The Washington Post in a Tuesday report showed.
According to a new portion of "The Facebook Papers", emoji reactions were weighted five times more than a regular like in the news feed's ranking, so posts that drew them received greater engagement, feedback, and distribution.
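As a rough illustration of the arithmetic described in the documents, here is a minimal sketch of such a weighted scoring scheme. Only the 5x reaction weight comes from the report; the function name, structure, and sample counts are illustrative assumptions.

```python
# Weight applied to each signal when ranking a post. Per the leaked
# documents, every emoji reaction initially counted five times as much
# as a like; everything else here is a hypothetical simplification.
WEIGHTS_2017 = {
    "like": 1,
    "love": 5,
    "haha": 5,
    "wow": 5,
    "sad": 5,
    "angry": 5,
}

def engagement_score(counts: dict, weights: dict) -> float:
    """Sum each signal's count multiplied by its ranking weight."""
    return sum(counts.get(signal, 0) * w for signal, w in weights.items())

# Under this weighting, a provocative post with 100 angry reactions
# outranks an agreeable one with 400 likes.
provocative = {"like": 20, "angry": 100}
agreeable = {"like": 400}
print(engagement_score(provocative, WEIGHTS_2017))  # 520
print(engagement_score(agreeable, WEIGHTS_2017))    # 400
```

The toy numbers show the dynamic the engineers worried about: a relatively small burst of strong reactions could outweigh a much larger volume of ordinary likes.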
This caused concern among employees, with many Facebook engineers suggesting that such a feature could provide additional promotion for posts that are negative, provocative, deliberately fake, or hate-instigating.
A screenshot of the internal correspondence cited by The Washington Post shows one employee even giving an example of how this could play out.
"Quick question to play devil's advocate: will weighting Reactions 5x stronger than Likes lead to News Feed having a higher ratio of controversial than agreeable content?", the employee writes.
One colleague replies with "it's possible", adding that the new option will have to be evaluated, and assuring that the firm's engineers are "working" to "mitigate the potential integrity impact of the launch". Other Facebook staffers noted that anger is a "core human emotion" and that anger-generating posts could be essential to protest movements in different countries.
"Anger and hate is the easiest way to grow on Facebook", Frances Haugen, a former Facebook employee and the whistleblower behind "The Facebook Papers", told the British Parliament on Monday.
Additionally, the social media platform's CEO Mark Zuckerberg himself once recommended that users use the anger emoji as a "dislike" button - with the reaction still boosting the post the user felt the urge to dislike.
Moreover, according to The Washington Post, Facebook's engineers have been conducting multiple experiments to evaluate the effectiveness of the different measures, emoji reactions included. In 2019, the platform came up with a mechanism to "demote" content that was receiving disproportionately angry reactions, although the leaked documents do not specify how exactly it worked.
Later, a proposal emerged to cut the value of certain emojis down to a like or even equate them to nothing. According to internal research, potentially toxic content was estimated to be inviting more "angry", "wow", or "haha" emojis. Reactions like "love" and "sad" were suggested to be worth four likes, since they were safer.
Per one staffer, this would have been an "easy fix" with "fewer policy concerns" that could follow the attempts to identify toxic content, but the proposal ended up being ditched at the last minute.
"The voice of caution won out by not trying to distinguish different reaction types and hence different emotions", one staffer wrote later.
Yet, as time passed, additional research revealed that users did not like when their posts received "angry" reactions, and Facebook slashed the value of all reactions to one and a half likes. Eventually, the social media platform removed the "angry" reaction from the equation and stopped using it as an indicator of what users wanted.
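The re-weighting changes the same arithmetic dramatically. This sketch applies the 1.5x weight and the zeroed-out "angry" reaction the report describes; as before, the names and sample counts are illustrative assumptions.

```python
# Revised weights per the report: all reactions cut to one and a half
# likes, with "angry" removed from the ranking signal entirely.
WEIGHTS_LATER = {
    "like": 1.0,
    "love": 1.5,
    "haha": 1.5,
    "wow": 1.5,
    "sad": 1.5,
    "angry": 0.0,  # no longer counted at all
}

def engagement_score(counts: dict, weights: dict) -> float:
    """Sum each signal's count multiplied by its ranking weight."""
    return sum(counts.get(signal, 0) * w for signal, w in weights.items())

# The same provocative post from before now scores far below the
# agreeable one, since its 100 angry reactions contribute nothing.
provocative = {"like": 20, "angry": 100}
agreeable = {"like": 400}
print(engagement_score(provocative, WEIGHTS_LATER))  # 20.0
print(engagement_score(agreeable, WEIGHTS_LATER))    # 400.0
```

Under the original 5x weighting the provocative post would have won; zeroing out "angry" inverts the ranking, which is consistent with the drop in disturbing content the documents describe.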
After the move, according to the documents cited by WaPo, users began to receive less disturbing content and misinformation, while the levels of their activity and engagement remained unaffected.
Now that this and many other revelations about Facebook are being unleashed in the media, Zuckerberg has said that "The Facebook Papers" leak is nothing but a "coordinated effort" by news agencies and whistleblowers to paint a "false picture" of his company. He said that the leaked materials were used "selectively" in order to cast a shadow over Facebook, while his company is burdened with the task of "balancing different difficult social values" as it is being accused of focusing on making money rather than upholding its principles.