Prime Minister Scott Morrison has called for live-streaming to be suspended on social media as politicians around the world condemned big technology companies such as Facebook, Twitter and YouTube for failing to stop the spread of video footage of Friday’s terrorist attacks in Christchurch.
New Zealand Prime Minister Jacinda Ardern plans to take up the issue directly with Facebook, which said it had removed 1.5 million videos of the footage within 24 hours of the shootings.
“This is an issue that goes well beyond New Zealand but that doesn’t mean we can’t play an active role in seeing it resolved,” Ms Ardern said on Sunday. “This is an issue I will look to be discussing directly with Facebook.”
Mr Morrison said that “the capacity [for Facebook, Google and Twitter] to actually assist fully is very limited on the technology side. In the past, they have suspended this sort of Facebook live-streaming, and assurances were given that when it was put back up it could avoid this. Clearly it hasn’t.”
Facebook was used by Brenton Tarrant to live-stream the attack, and new versions of the video and the shooter’s manifesto spread quickly across Twitter and YouTube.
Facebook, Google – which owns YouTube – and Twitter struggled over the weekend to deal with the footage and the terrorist’s manifesto being repeatedly reposted.
“You really need to do more @YouTube @Google @facebook @Twitter to stop violent extremism being promoted on your platforms,” the UK Home Secretary, Sajid Javid, tweeted on Friday. “Take some ownership. Enough is enough.”
Mr Morrison said that “there is very real discussions that have to be had about how these facilities and capabilities, as they exist on social media, can continue to be offered where there can’t be the assurances given at a technology level. Once these images get out there, it is very difficult to prevent them.”
Facebook, Google and Twitter have had staff across time zones working around the clock, in tandem with their automated detection systems, to track down and remove the video.
The complete removal of the content is not easy and goes to the core of what these platforms were set up to do – give people an unrestrained voice to share their views. The gunman’s various social media accounts were removed and the technology giants were also proactively searching for accounts being set up in his name to prevent impersonation and further spreading.
The technology being used to track down the video takes a visual footprint of the footage, meaning that specific version can be banked and blocked across a platform. The problem is the video is being downloaded and edited then uploaded and shared again, creating a new visual footprint that the likes of Facebook, YouTube and Twitter need to track down and block.
Each edit can modify colour, add watermarks or captions, making the task challenging. Facebook, YouTube and Twitter have removed and blocked hundreds of different versions of the video.
Essentially, it’s a game of whack-a-mole.
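The "visual footprint" approach described above is a form of perceptual hashing. A minimal sketch of the idea follows, using a simple average hash on a toy 8x8 greyscale frame; this is an illustration of the general technique, not any platform's actual algorithm, and the frame data is invented for the example. It shows why an exact re-upload is easy to catch while a small edit, such as a watermark, produces a different footprint:

```python
def average_hash(pixels):
    """Hash an 8x8 greyscale frame: one bit per pixel, set if the
    pixel is brighter than the frame's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")

# A toy "known banned" frame: a simple brightness gradient.
frame = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]

# An exact re-upload hashes identically, so it is trivially blocked.
assert hamming_distance(average_hash(frame), average_hash(frame)) == 0

# But overlaying a bright watermark in one corner flips bits in the
# hash, so a blocklist keyed on the original footprint no longer
# matches and the edited copy must be fingerprinted afresh.
edited = [row[:] for row in frame]
for r in range(2):
    for c in range(2):
        edited[r][c] = 255  # simulated watermark patch
print(hamming_distance(average_hash(frame), average_hash(edited)))  # nonzero
```

In practice, platforms mitigate this by treating copies within a small Hamming distance of a banned hash as matches rather than requiring exact equality, which is why each heavily edited variant still has to be found and banked separately.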
“We are deeply saddened by the shootings in Christchurch on Friday,” a Twitter spokesman said.
“Twitter has rigorous processes and a dedicated team in place for managing exigent and emergency situations such as this. We also co-operate with law enforcement to facilitate their investigations as required. We have dedicated government and law enforcement reporting channels for illegal content. We have a specially trained team that reviews each report against the Twitter Rules and our Terms of Service, and determines whether or not it is in violation. We remain committed to working with governments around the world, including in Australia and New Zealand, to encourage healthy behaviour on the platform.”
Facebook director of policy in Australia and New Zealand Mia Garlick said: “We continue to work around the clock to remove violating content from our site using a combination of technology and people. In the first 24 hours, we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload. Out of respect for the people affected by this tragedy and the concerns of local authorities, we’re also removing all edited versions of the video that do not show graphic content.”
As Peter Kafka noted in Recode, quoting a 2017 post by Facebook boss Mark Zuckerberg about the spread of Russian misinformation on the platform, these platforms put up no roadblocks before content is published, whether it is hate speech or, now, footage of a terrorist attack.
“We don’t check what people say before they say it, and frankly, I don’t think society should want us to. Freedom means you don’t have to ask for permission first, and by default, you can say what you want,” Kafka quoted from a Zuckerberg response in 2017.