Authored by the network's Chief Security Officer Alex Stamos and colleagues, Information Operations and Facebook describes an expansion of the company's security focus from traditional abusive behavior, such as hacking, malware, spam and financial scams, to more "subtle and insidious forms of misuse," including "attempts to manipulate civic discourse and deceive."
"Facebook is dedicated to combating misinformation and information operations. Read our report here: https://t.co/xuImdH5DdS" — Alex Stamos (@alexstamos), April 27, 2017
The move seems an odd one given founder Mark Zuckerberg's repeated insistence that "99 percent" of the content users see on the social network is authentic, and his firm dismissal as "crazy" of the notion that alleged fake news on the site in any way swayed voters in the US Presidential election.
Still, the paper claims the company is taking the issue very seriously indeed. Facebook, the paper said, identified three main components of information operations — targeted data collection, content creation and false amplification. This triumvirate is used to steal and expose private information, spread false stories to third parties via fake accounts, and manipulate political discussion using those same fake accounts.
Although the report attributes such behavior to specific, ideologically driven "sides" in politics — countries, candidates or candidates' supporters — it also concedes that independent groups of bad actors are often interested simply in using social media to cause chaos for their own amusement.
"Facebook sits at a critical juncture. The reality is not everyone shares our vision, and some will seek to undermine it — but we are in a position to help constructively shape the emerging information ecosystem by ensuring our platform remains a safe and secure environment for civic engagement," the paper said.
To deal with this apparent threat, the company is investing heavily in machine learning applications to weed out fake accounts and stories — money perhaps misspent, given that computer scientists from the University of California and the Swiss Federal Institute of Technology have found that simple code can identify fake news 99 percent of the time.
Facebook will also add features to the service to make it easy for users to report fake news — although this raises the issue of whether users will be able to discern fake from real news.
In any event, in the current media environment, many individuals have been conditioned to view mainstream media outlets as the sole arbiters of real information, despite these same outlets having been intimately involved in the dissemination of false information in the past, such as in the run-up to the Iraq war. Bias and lies from public figures, official reports and mainstream news can be difficult if not impossible to detect, especially when a number of outlets all say the same thing.
Professor Elig Skogerbo, an expert in media at the University of Oslo, says it's difficult to say whether the strategy will be effective.
"I don't know whether the strategy represents a U-turn, but it seems like the company is getting proactive about the problem. Facebook typically takes a 'step back' and refuses to get involved in editorial issues, and it's difficult to tell whether this is viable or not. It's good they've acknowledged it's a problem — and that it's an age-old issue too, not a new one — but these measures may affect ordinary users," Professor Skogerbo told Sputnik.
The paper also makes no mention of whether Facebook will be similarly cracking down on articles about astrology, climate change denial, conspiracy theories, homeopathy, intelligent design and supernatural powers, among other nonsense that routinely filters through its network — or indeed the deceptive marketing and stealth advertising that Facebook features throughout its site, which it depends on for revenue.
Given Zuckerberg's oft-stated public skepticism of the power and scale of the alleged fake news epidemic, the paper, and the company's strategies, are arguably a mere public relations exercise, designed to create the appearance of action on a perceived problem.
The report itself acknowledges the reach of "false amplifiers" is minuscule, with content posted by these accounts accounting for one tenth of one percent of overall civic engagement on Facebook.
"Facebook conducted research into overall civic engagement during [the US Presidential election]… while we acknowledge the ongoing challenge of monitoring and guarding against information operations, the reach of known operations during the election was statistically very small compared to overall engagement on political issues," the report said.
In an ironic twist, few, if any, mainstream media organizations that have covered the paper's release have acknowledged these elements — perhaps unsurprising, given how frequently these same outlets have spoken of the immense threat to democracy posed by fake news. Evidently, fake news isn't quite the crisis it's made out to be.