19:01 GMT, 12 August 2020

    As part of a continued push to ensure internet users can access only information deemed acceptable by Western governments, Stanford University’s Internet Observatory is pressing tech giants to surrender more user data than ever - and, a Sputnik journalist has found, admitting some of its dark secrets in the process.

    The Stanford Internet Observatory (SIO) is the newest enforcer of the US State Department’s official line on acceptable news, under the direction of former Facebook security chief Alex Stamos and former New Knowledge head researcher Renee DiResta.

    By winning special backdoor access to user data, the SIO is able to police search engines and social media platforms like Facebook, Google and Bing, pressuring them to remove or hide alternative points of view, such as those found at Sputnik and RT. It presents the dissemination of viewpoints that disagree with Washington’s or London’s as the nefarious work of state actors running coordinated disinformation networks.

    Sputnik News journalist Morgan Artyukhina, the author of a Thursday story exposing the SIO’s dark secrets, told Radio Sputnik’s Loud and Clear on Friday that groups like the SIO use Russiagate scare tactics in order to “police what people see in the interest of defending democracy.”

    “I first started looking into this story back in December, when I saw this report that came out from the Observatory attacking Bing - or taking it to task - for failing to de-rank Sputnik and RT in the results for certain specific search terms: ‘novichok,’ ‘MH17,’ ‘Skripal,’ I think, were the three that they mentioned,” Artyukhina told Sputnik. “And they showed side by side that, when you search these things, [Sputnik] and RT links rank higher on Bing than on Google. And this, to them, was a problem, as if Bing wasn’t doing enough to tackle disinformation on its platform.”

    “So, I’m very familiar with that kind of dynamic of policing - I think we all, here at Sputnik, are familiar with that dynamic - so I looked into ‘who is this group, I’ve never heard of it,’ and started recognizing names and relationships, and the rabbit hole went deeper,” Artyukhina told hosts Brian Becker and John Kiriakou.

    “Stanford has long been a center of computer research and things like this. When ARPANET was turned on for the first time, Stanford was one of its original anchors; Larry Page and Sergey Brin, the co-founders of Google, met while doing their PhDs in Stanford’s computer science program,” they noted. “If you were going to do it anywhere, this is kind of where you would do it.”

    “The Observatory started up last summer, more or less, and it’s headed by … some people who really are very hostile to the idea of user privacy and also very big on the idea of pruning the information that you see on social media in the interests of forwarding a State Department line,” Artyukhina said.

    “One of the big things about this that I talk about in the story is that, not only does the Observatory basically admit to this fact, but also stories that have come out in the last year or so have really proven that, even though Google claims it doesn’t manipulate search results, there’s now a whole lot of proof that they do,” they noted.

    “A year or two ago, for example, [Google CEO] Sundar Pichai was testifying before Congress and said, ‘We don’t manually prune [search results], it’s an algorithm, we don’t determine it,’ and whatever. But documents leaked to the Daily Caller, and a report in the Wall Street Journal, showed that actually, they do manually prune the blacklists that govern whether search results appear higher or lower. They call that first page of search results the ‘10 blue links,’ and something like 99% of people click on one of those first 10 links - usually the first five or six or so. So it really matters whether you’re the 20th result or the fifth result.”

    “The impetus for a lot of this policing of information - or fake news, or disinformation, or whatever - has been the fallout of claims by the Director of National Intelligence and so on that Russians meddled in the election, whether it was ads or whatever have you. So that was kind of the impetus.”

    “And the really interesting thing is that most of the article talks about their work with Facebook, because the director of the Observatory is Alex Stamos, the former cybersecurity chief at Facebook during the Cambridge Analytica scandal - this enormous information-gathering operation by a political consultancy firm that grabbed the information of about 87 million Facebook users, most of whom didn’t even know it was happening to them. And Facebook paid a £4 billion fine for that.”
    Stanford Internet Observatory Director Alex Stamos (left) and Research Manager Renée DiResta (right)

    “But the interesting thing is that Facebook [initially] said, ‘No, no, there was no disinformation on our platform,’” the journalist noted. “And then after steady pressure from the US government and the security agencies, they eventually said, ‘Oh, we looked again, and we did find it that time.’ So they really, then, played into this idea that ‘well, we need to police what people see in the interest of defending democracy’ or whatever.”

    “So that’s what really helps to fuel this kind of thing. It’s the same thing that we see with these periodic takedowns of thousands of accounts on Twitter or wherever, where it’s ‘disinformation’ because they’re spreading links that agree with the Iranian government’s line or the Russian government’s line, or that just happen to not follow the US or the Western line,” Artyukhina noted. “As with [Sergei and Yulia] Skripal or something like that, where there’s a specific narrative that governments are trying to forward, and if you present a different point of view than that, then that’s now termed ‘disinformation’ and is attacked.”

    “And it’s really dark in a real way, because they categorize this so-called ‘disinformation’ or ‘fake news’ alongside pseudoscientific stuff, you know, like that vaccines cause autism, alongside conspiracy theory stuff like Pizzagate; it’s all in the same big bucket of stuff we don’t want people to see.”

    The views and opinions expressed in the article do not necessarily reflect those of Sputnik.

