10:45 GMT, 22 February 2020

    The majority of Brits believe that social media companies like Facebook should be responsible for preventing their platforms from being used to spread misinformation, according to a new YouGov poll.

    In light of the US election and the Brexit referendum, Facebook has faced allegations that fake stories spread on its site influenced the election of President-elect Donald Trump.

    On Saturday, November 12, Facebook CEO Mark Zuckerberg dismissed accusations that fake news on the social media platform had influenced the US election. In a personal blog post, he insisted that 99% of news links shared on the site are legitimate.

    "After the election, many people are asking whether fake news contributed to the result, and what our responsibility is to prevent fake news from spreading. These are very important questions and I care deeply about getting them right. I want to do my best to explain what we know here.

    "Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics.

    "Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other," Zuckerberg wrote.

    Two days later, Facebook announced that it had updated its advertising policies to make explicit that its ban on misleading content applies to fake news. In other words, Facebook hopes to prevent fake news sites from earning money through Facebook advertising.

    That's two denials on fake news in three days.

    Keeping the pressure up, a new YouGov poll has found that 72% of Brits believe that social media companies, like Facebook, should take responsibility for fake news shared on their sites.

    So, why is it so hard for social media to police fake news?

    Former Facebook engineer Lars Backstrom explained, in a blog post in August 2013, that it comes down to how Facebook's news feed algorithm works.

    Without Facebook's algorithm, there would be, on average, thousands of potential stories a day from friends and people users follow that could appear in each user's feed.

    "With so many stories, there is a good chance people would miss something they wanted to see if we displayed a continuous, unranked stream of information," Backstrom said.

    Facebook's algorithm boils this down to around 300 stories, which it "prioritizes" using factors like how often you interact with a friend, page or public figure; how many likes, shares and comments individual posts have received; and how much you have interacted with that kind of post in the past.
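The ranking described above can be sketched in a few lines of Python. This is a purely illustrative toy, assuming made-up signal names and weights; Facebook's real scoring model and weights are not public.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float   # how often the viewer interacts with this friend/page (0-1)
    likes: int
    shares: int
    comments: int
    type_engagement: float   # viewer's past engagement with this kind of post (0-1)

def score(post: Post) -> float:
    # Hypothetical weights: comments and shares count more than likes,
    # scaled by how close the viewer is to the author and how much they
    # engage with this kind of content.
    engagement = post.likes + 2 * post.comments + 3 * post.shares
    return post.author_affinity * engagement * (0.5 + post.type_engagement)

def rank_feed(candidates: list[Post], limit: int = 300) -> list[Post]:
    # Boil thousands of candidate stories down to the top ~300 by score.
    return sorted(candidates, key=score, reverse=True)[:limit]
```

Note that nothing in the scoring function looks at whether a story is true; a fake story with high engagement outranks an accurate one with low engagement, which is the dynamic the article goes on to describe.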

    Facebook insists that users like having this tailored social media experience. However, it means that if users engage more with fake news than real news, as seems possible, then Facebook's algorithm will promote the fake news.

    The consequences are far-reaching.

    Critics of this kind of filtering say that it creates an artificial news bubble: public discourse today is not as diverse as it should be, because influential news sources like Facebook ensure we mainly see what we want and hide what we don't.

    This kind of "echo chamber" is significant. Throughout the 2016 US election and this year's UK Brexit referendum campaign, political pundits liked to use words like "polarized" or "more divided than ever."

    It's possible that social media's news bubble is contributing to this.

    A Pew Research study last year found that 61% of US millennials cited Facebook as their most common source for political news. That's 17 points higher than the next most consumed source, CNN, at 44%. It's undeniable that, for millions of people, Facebook is not just for sharing baby photos and inane cat videos; it's a legitimate news source.

    Mark Zuckerberg has repeatedly insisted that Facebook is doing enough to prevent fake news from being shared, and so "legitimized" on his platform.

    However, many remain unconvinced.

    Moving forward, Facebook could try to emulate Google News, with more sophisticated vetting algorithms.

    In October, Google News started attaching a "fact check" label to dubious stories and linking to trusted sites that debunked them — all achieved by an algorithm.

    Or it could go the human route and, like more traditional news sources, hire journalistic editors to vet stories.

    In the meantime, Facebook earns around 80% of its revenue from advertising. So, regardless of the veracity of news stories, the more we share, the more they earn.

