10:14 GMT +3, 18 December 2017

    Facebook Racism Scandal 'Shows Power of Voluntary Big Brother We've Created'

    Opinion

    Facebook is in damage control mode after a series of racist and anti-Semitic ad categories prompted it to shut down the user-generated fields in its advertising system. Speaking to Radio Sputnik, Jewish and human rights activist Rabbi Abraham Cooper outlined what can be done to help companies like Facebook eradicate racism and anti-Semitism online.

    Speaking to Radio Sputnik, Rabbi Abraham Cooper, director of the Global Social Action Agenda at Simon Wiesenthal Center, a leading Jewish and human rights organization, outlined the potential for radical groups, including terrorists, to use social media to spread their hateful messages, and what can be done to deal with the problem.

    Cooper explained that his organization has been monitoring these sorts of activities for two decades now, and has found that terrorist groups certainly use social media, enjoying particular success with Twitter until just recently, to establish "marketing beachhead[s]" of sorts.

    "So when you think about the power of the internet and especially social media, if you're an extremist group that wants to make your way into the mainstream, if you want to target your audience as we just saw with the ads used on Facebook, or even target your enemies, it's an extremely powerful tool, and whatever the target may be…that is part of the power of the internet, and part of the reality of the internet."

    "As far as the Facebook incident is concerned, while it was short lived, it's extremely disconcerting," Cooper said, referring to the algorithms scandal. 

    "It could have been targeting any individual or any group of people; that shows the power of these algorithms, and if you will the power of the voluntary Big Brother that we've created collectively by giving our personal information out to these various companies in the US and around the world."

    Cooper suggested that a big part of the problem often comes down to companies' appetite for profit, pursued without considering the other implications. "When you combine the potential financial windfall, and just rely on technology, and leaving ethics and the human element out, you can have that kind of result," he said.

    Ultimately, the activist stressed that right now it's not enough for Facebook to simply apologize. Instead, he said, the company should turn to its teams of programmers to create safeguards to eradicate this kind of hatred online. "They clearly failed to do so, and that's not the first time. We're very disappointed with what happened. We're very concerned because others may pick up that mantle, and it's hopefully a wakeup call to both users and companies globally, that just because some engineering and marketing department comes up with a great idea that doesn't need human input, and you can just make it into another ATM machine, another monetary stream, that [when you do so] you're playing with fire here."

    Cooper noted that this was a global problem, affecting not just Facebook and Twitter, but other social media platforms as well, including Russia's VKontakte. "We would certainly like to see them tighten up their own rules."

    Finally, Cooper said that his organization's effort to help deal with this issue has been "to try to spend less time with governments, and more time with the creative geniuses that are giving us social media and the rest of the internet; to have them set transparent rules, have them put staff in-house to create digital-electronic tripwires to catch these kinds of individuals and trends, and less of a focus on governments." The latter, he said, "generally know much, much less about the technological breakthroughs, niceties and visions" in the internet, and about the fine balance "between becoming censors, and allowing for free expression on the internet."

    Facebook shut down its user-generated ad-targeting fields after ProPublica, a US non-profit newsroom, found anti-Semitic, racist, and far-right groups among the categories advertisers could use to direct ads. After being alerted, the company removed the categories and issued a statement on its tough anti-hate speech policy.

    Related:

    Racism Scandal: 'Facebook Must Realize Great Power Means Great Responsibility'
    Techies Trolling! Huawei Takes a Bite at Apple iPhone X (VIDEOS)
    'Can’t Catch Me': Wanted Man Trolls North Wales Cops on Facebook
    iPhone X: The Most Vulnerable Smartphone Ever?
    'Jew Haters', 'KKK': Facebook Launches Probe Into Site's Ad Targeting System
    Tags:
    Anti-Semitic, expert commentary, algorithms, racism, Facebook