WhatsApp has a child porn problem

  • WhatsApp has become a platform for users to “openly” share pictures and videos of child pornography, the Financial Times reports.
  • A group of Israeli researchers found dozens of active WhatsApp groups where people frequently sent messages with media depicting sexual abuse of children.
  • While WhatsApp’s end-to-end encryption keeps the company and governments from being able to see the contents of messages sent and received on the app, the researchers point out that many of the groups made their intentions clear with publicly visible references to child porn or explicit profile photos.

Child pornography is “openly shared” in dozens of groups on WhatsApp, even after researchers brought the problem to the company’s attention, the Financial Times reports.

Israeli researchers told FT that they discovered “extensive child abuse material” in dozens of WhatsApp groups earlier this year. The group reported its findings in September to Facebook, which owns WhatsApp, but FT was able to find “several” of these groups this week that were still extremely active.

“It is a disaster: this sort of material was once mostly found on the darknet, but now it’s on WhatsApp,” one of the researchers told FT.

The illegal content – “videos and pictures of children being subjected to sexual abuse” – was discovered in WhatsApp groups that were “easy to find and join,” according to the researchers, who work at Israeli charities focused on improving online safety.

WhatsApp told FT it has “techniques” to scan users and groups for illegal content, and that thousands of accounts are banned each day. However, the Israeli researchers say that some of the groups they monitored made their purpose clear, with names like “cp,” an abbreviation for child porn, and explicit profile pictures.

WhatsApp has had end-to-end encryption since 2016. The feature provides users with an extra layer of privacy from any potential cybersecurity threats or government surveillance. However, that same protection also means that WhatsApp and law enforcement are unable to see the contents of messages suspected to contain illegal or abusive activity. In other words, the same detection tools that Facebook uses to monitor its site and Instagram can’t be used for WhatsApp.

Police and government officials have long criticised hardware and apps that utilise strong encryption as a hindrance to their investigations, while privacy activists tout the technology as one of the only ways to truly communicate privately in an era of widespread government surveillance.

A moderation problem

While Facebook employs thousands of content moderators, WhatsApp only has 300 employees to monitor its 1.5 billion users globally, FT reports.

WhatsApp has come under fire this year for allowing the widespread circulation of hoaxes and viral fake news with severe real-world consequences. In India, the killings of 31 people this year are attributed to viral false rumours and fake videos that spread on WhatsApp and incited lynch mobs to take violent action. In Brazil, WhatsApp was used to spread disinformation and misleading information amidst the country’s contentious election.

But WhatsApp isn’t the only social platform that’s come under fire for its inability to keep child porn off its network. Tumblr was removed from Apple’s App Store for a month because child porn was slipping past the platform’s filters.
