- YouTube removed 8.3 million videos that violated its terms in the final three months of 2017.
- More than half were spam or porn, while others contained violent or extremist material.
- YouTube said 6.7 million of the inappropriate videos were spotted by a machine.
YouTube has, for the first time, revealed the scale of the task it faces in removing videos that violate its terms.
The Google-owned platform pulled down 8.3 million videos between October and December 2017, with more than half of the removed videos being spam or sexual content. Here’s the full breakdown:
- Sexual: 30.1%
- Spam or misleading: 26.4%
- Hateful or abusive: 15.6%
- Violent or repulsive: 13.5%
- Harmful or dangerous acts: 7.6%
- Child abuse: 5.2%
- Promotes terrorism: 1.6%
In a blog detailing the work it’s doing to enforce its community guidelines, YouTube said the removed videos represented a “fraction of a per cent of YouTube’s total views” during the final three months of last year.
The company added that 6.7 million of the problem videos were rooted out by machines rather than humans. The rest were identified by trusted flaggers, users, NGOs, and government agencies.
“Automated flagging enables us to act more quickly and accurately to enforce our policies,” YouTube said, adding that since introducing the system, more than half of videos removed for violent extremism had fewer than 10 views.
YouTube is under huge pressure to stay on top of inappropriate content, with the company being criticised for carrying videos promoting terrorism and child abuse. Inappropriate videos have also surfaced on YouTube Kids, including weird conspiracy videos.