While almost every Western government has talked tough on the alleged "fake news" epidemic, Germany is one of the few to have taken serious, concrete measures against it — or, at least, attempted to.
After much debate, in June, national lawmakers passed a controversial law that would see social media platforms fined over US$50 million for failing to remove content flagged as misleading or as hate speech within 24 hours — despite warnings from experts that the measures would likely be entirely unenforceable.
While the legislation applies to any company identifiable as a social network, it is Facebook that has drawn the bulk of political ire on the issue of "fake news" — perhaps understandably, as it is far and away the world's largest.
Such is the pressure being applied to the company that, in April, Facebook published a report detailing its plans to tackle "fake news" — a phenomenon it defined as a key plank of "information operations" carried out by both government and non-state actors, which seek to use the social media platform "to distort domestic or foreign political sentiment" in order to achieve "a strategic and/or geopolitical outcome."
Facebook is dedicated to combating misinformation and information operations. Read our report here: https://t.co/xuImdH5DdS — Alex Stamos (@alexstamos) April 27, 2017
It claimed the company had identified three main components involved in "information operations" — targeted data collection, content creation and false amplification, a triumvirate used to steal and expose private information, spread false stories to third parties via fake accounts, and manipulate political discussion.
Nonetheless, some critics suggested the publication was a mere public relations exercise, designed to offer the facade of being proactive about an issue widely touted as a problem.
In a little-acknowledged section of the report, the authors themselves admitted the reach of both "false amplifiers" and "fake news" was minuscule, with such content accounting for one tenth of one percent of overall civic engagement on Facebook.
The paper also made no mention of whether Facebook would be cracking down on articles about astrology, climate change denial, conspiracy theories, homeopathy, intelligent design, the paranormal and other such content of questionable veracity and value that routinely filters through its network — or indeed on misleading or stealth advertising, which Facebook sanctions for display across its network and which keeps the company's lights on.
Moreover, the next month, Facebook founder Mark Zuckerberg himself seemed to row back somewhat on the company's determined commitment to preventing the proliferation of misleading content in a 6,000-word open letter.
"We know there is misinformation on Facebook, [but] there is not always a clear line between hoaxes, satire and opinion. In a free society, it's important people have the power to share their opinion, even if others think they're wrong. Our approach will focus less on banning misinformation, and more on surfacing additional perspectives and information," he wrote.
Even if only indirectly, the Facebook founder's missive acknowledged the contradiction at the core of the "fake news" debate — who decides what's "real" news and what's not, and how and why, is just as important as the very question of what content is "fake" and what's not. Yet, politicians, journalists and social media platforms themselves have failed to consider this inconsistency, much less offer a satisfactory remedy — the universal solution is said to be the introduction of fact-checking units.
These teams are charged with identifying potentially dubious content and investigating its reliability. The issue of who fact-checks the fact-checkers has gone unremarked upon, much less explored.
Facebook rolled out just such a tool in Germany in April, which allows users to flag potential "fake news" — the content is then sent to an independent verification team for analysis.
The only potential problem with this approach is that studies suggest it's doubtful non-experts can tell the difference between fake and non-fake content — and furthermore, highlighting a story as untrue can actually make people remember the story as true.
Going one step further, German investigative journalism center Correctiv started a dedicated website, "Echtjetzt" ("Really now?"), to debunk false information circulated during the election campaign.
The system also requires a second team of fact-checkers to verify Correctiv's work, meaning it can be hours — if not days or weeks — before a story is conclusively demonstrated to be untrue. Quite some undertaking, especially given the content may simply stay on Facebook anyway.
Facebook has decided to prevent modification of link previews, in order to combat "fake news" pic.twitter.com/IdLTaBT0iP — Anthony De Rosa 🗽 (@Anthony) July 13, 2017
In any event, the apparent risk to Western elections posed by "fake news" and cyberattackers is demonstrably overblown. In the lead-up to almost every major Western election in 2017, politicians and the media claimed that hostile foreign governments — particularly and perhaps specifically Russia — were sharpening their proverbial knives, preparing to tip votes in their favor.
Most notably, despite a glaring lack of evidence, it was claimed Russian cyberattackers were actively meddling in the Dutch and French electoral processes — allegations that evaporated hastily when the "right" results were actually achieved.
Claims that Russia had attacked President Emmanuel Macron's campaign were enthusiastically seized upon by the mainstream media prior to the French second-round runoff — yet when the chief of France's National Agency of Information Systems Security admitted in June that there were no traces of Russia's hand in the election, the same outlets largely ignored it.
In the end, Germany's alarmism over "fake news" and the like may well prove to be the dog that didn't bark.
Come September 25, when Merkel's Union almost inevitably emerges triumphant, it's likely the phenomenon will be hastily forgotten, until the next time.