MPs are calling on the Government to introduce heavy fines for social media giants who fail to remove illegal or extremist content online.
The Home Affairs Select Committee said in a report that many websites are “shamefully far” from tackling the issue, with some putting profits before safety.
The inquiry, launched last year following the murder of the Labour MP Jo Cox by a far-right gunman, calls for a review of UK laws around the removal of illegal material, and wants fines of up to £44 million to be introduced for companies that do not promptly remove harmful content.
The committee is also calling for social media firms to publish quarterly reports outlining their safeguarding strategy.
“Social media companies’ failure to deal with illegal and dangerous material is a disgrace,” said Labour MP Yvette Cooper, chair of the committee, according to Sky News.
“They have been asked repeatedly to come up with better systems to remove illegal material… yet repeatedly they have failed to do so. It is shameful. These are among the biggest, richest and cleverest companies in the world.”
The Guardian reported that “swift action is taken to remove content found to infringe copyright rules” but added that the report shows “a ‘laissez-faire’ approach is adopted when it involves hateful or illegal content.”
In recent months, a number of violent crimes have been broadcast on social media platforms, including Facebook.
Last week, a Swedish court sentenced three young men to prison for gang-raping a woman and livestreaming the attack on Facebook.
Jiranuch Trirat, a 22-year-old from Phuket, also watched on Facebook Live as her boyfriend Wuttisan Wongtalay hanged their 11-month-old daughter Natalie from the side of an abandoned building before taking his own life.
According to The Guardian, the investigation by the Home Affairs Select Committee found that extremist, hate crime or abuse-related material, including terror recruitment videos for banned jihadi and neo-Nazi groups or material encouraging child abuse, remained accessible online on sites like Facebook, Twitter, and YouTube even after MPs and journalists had complained about them, or despite the fact they had been the subject of a previous committee report.
Meanwhile, in many cases Google has failed to prevent paid ads from reputable companies from appearing next to YouTube videos posted by extremists.
According to The Guardian, Google told the committee that it plans to extend its “trusted flagger” programme and invest in its alert procedures.
Facebook also said that it is “reviewing how it handles violent videos and other objectionable material after a video of a murder in the United States remained on its service for more than two hours.”
However, The Guardian reported that Google, Facebook and Twitter all declined to share information regarding the number of staff they employ to monitor and remove harmful content.