Facebook, Twitter, and YouTube are “consciously failing to combat the use of their sites” for promoting terrorism, British politicians have argued in a highly critical report published Thursday.
“These companies are hiding behind their supranational legal status to pass the parcel of responsibility,” the scathing report on radicalisation from the Home Affairs Select Committee reads.
The social networks are, MPs assert, “refusing to act responsibly in case they damage their brands.”
But the tech sector has pushed back against the report, with industry body TechUK arguing that it “paints an inaccurate picture of the commitment of tech companies to tackle online extremism.” Reached for comment, the companies singled out say that they take their responsibilities seriously, and comply with lawful requests from the authorities.
Extremism has been thrust into the spotlight following a spate of deadly terror attacks across Europe, with authorities questioning how best to combat online radicalisation. Keith Vaz MP, chair of the Committee, said that “companies’ failure to tackle this threat has left some parts of the internet ungoverned, unregulated and lawless.”
Here’s the key passage from the report’s Conclusions and Recommendations, emphasis ours:
“The internet has a huge impact in contributing to individuals turning to extremism, hatred and murder. Social media companies are consciously failing to combat the use of their sites to promote terrorism and killings. Networks like Facebook, Twitter and YouTube are the vehicle of choice in spreading propaganda and they have become the recruiting platforms for terrorism. They must accept that the hundreds of millions in revenues generated from billions of people using their products needs to be accompanied by a greater sense of responsibility and ownership for the impact that extremist material on their sites is having. There must be a zero tolerance approach to online extremism, including enticement to join extremist groups or commit attacks of terror and any glorification of such activities. Manuals for terrorists and extremists should be removed from the internet. It is therefore alarming that these companies have teams of only a few hundred employees to monitor networks of billions of accounts and that Twitter does not even proactively report extremist content to law enforcement agencies. These companies are hiding behind their supranational legal status to pass the parcel of responsibility and refusing to act responsibly in case they damage their brands. If they continue to fail to tackle this issue and allow their platforms to become the ‘Wild West’ of the internet, then it will erode their reputation as responsible operators.”
It’s strong stuff — but the companies in question take issue with it.
In response, TechUK policy director Charlotte Hollway said that “the report paints an inaccurate picture of the commitment of tech companies to tackle online extremism. As a number of companies made clear in their evidence to the committee, responsibilities to tackle online extremism are a serious and ongoing priority, backed by significant resources, a zero-tolerance approach, and decisive and fast action when needed.”
Meanwhile, a YouTube spokesperson said in a statement: “We take our role in combatting the spread of extremist material very seriously. We remove content that incites violence, terminate accounts run by terrorist organisations and respond to legal requests to remove content that breaks UK law.”
And Facebook UK’s director of policy said that “as I made clear in my evidence session, terrorists and the support of terrorist activity are not allowed on Facebook and we deal swiftly and robustly with reports of terrorism-related content. In the rare instances that we identify accounts or material as terrorist, we’ll also look for and remove relevant associated accounts and content.”
Reached for comment, a Twitter spokesperson pointed Business Insider towards an announcement made by the company on Friday that it had suspended 235,000 accounts for promoting terrorism over the last six months. “We strongly condemn these [terrorist] acts and remain committed to eliminating the promotion of violence or terrorism on our platform,” it said in a blog post.