"Clearly there are issues with the pages but we are very cautious about political speech," Simon Milner, Director Public Policy, Facebook, told MPs on Tuesday, December 19.
Britain First was founded in 2011 as a right-wing political party, although none of its members have ever been elected to a political post in the UK.
The Twitter accounts of Britain First's leader, Paul Golding, and deputy leader, Jayda Fransen, were suspended after Ms. Fransen was charged in Northern Ireland with inciting racial hatred over a speech in which she criticized Islam.
Ms. Fransen was catapulted to global notoriety when US President Donald Trump retweeted several of her online posts which were roundly condemned as Islamophobic.
Mr. Milner said Facebook was now employing 7,500 people to moderate content globally.
He was called before the Home Affairs Committee along with Dr. Nicklas Berild Lundblad, Vice President Public Policy, Europe, Middle East and Africa (EMEA), Google, and Sinead McSweeney, Vice President, Public Policy and Communications, EMEA, Twitter.
Simon Milner from @facebook says they share challenge in the @PublicStandards report and need to rise to it. Lots to learn from and appreciate that it’s recognised as a collective endeavour #publiclife — BCS (@bcs) 13 December 2017
"Our focus has been on global terrorist organizations. One of the issues with this is content from videos like this can be used by news organizations to highlight their activities," said Mr. Milner.
"With this material, context really matters. There is a chance we are taking down important journalism," he added.
The committee published a report in May about "hate speech" and asked the three online giants to return to update them on the progress they had made.
'Sea-Change' at Google
Google's vice-president of public policy Dr. Nicklas Lundblad said there had been a "sea-change" in the way they worked with troubling content and said they were going to use a form of artificial intelligence which would be "five times" more effective than human moderators.
He insisted they were proactively looking for content to be removed from Google, rather than just responding reactively.
McSweeney on Twitter’s verification system: “It became clear recently it was broken. People became verified who should have never been verified"— Mark Di Stefano 🤙🏻 (@MarkDiStef) 19 December 2017
Mr. Milner was asked about legislation proposed in Germany, which would impose large fines on social networks that fail to delete illegal content.
"The German legislation is not yet in action. It is asking us to decide what is illegal, not courts, and we think that is problematic," he replied.
Racist Video 'Not Removed by YouTube'
The committee's chairman, Yvette Cooper, claimed YouTube had failed to remove a racist video which she had repeatedly flagged over a period of eight months.
"It took eight months of the chair of the select committee raising it with the most senior people in your organization to get this down. Even when we raise it and nothing happens, it is hard to believe enough is being done," said Ms. Cooper.
The MPs said the firms had not made enough progress on removing "hate speech".
"Is it not simply that you are actively recommending racist material into people's timelines? Your algorithms are doing the job of grooming and radicalizing," Ms. Cooper asked Dr. Lundblad.
He said Google did not want people to "end up in a bubble of hate."
Ms. Cooper, a former Shadow Home Secretary, asked Ms. McSweeney about racist tweets aimed at the current Shadow Home Secretary, Diane Abbott, and about death threats tweeted at anti-Brexit Conservative MP Anna Soubry.
"Where we see someone getting a lot of abusive content, we are increasingly communicating to them within the platform," said Ms. McSweeney.
"Right now, I can't say what you'd see. You can clean a street in the morning and it can still be full of rubbish by 10pm," Ms. McSweeney added.