Apple reportedly threatened to boot Facebook from the App Store over human trafficking concerns

Apple CEO Tim Cook and Facebook CEO Mark Zuckerberg. Visual China Group; Reuters
  • Apple threatened to remove Facebook from its App Store after a report about an online slave market.
  • The BBC in 2019 reported that human traffickers were using Facebook’s services to sell domestic workers.
  • The Wall Street Journal reports that Facebook knew about the practice even before Apple made its threat.

Apple threatened to kick Facebook off its App Store after a 2019 BBC report detailed how human traffickers were using Facebook to sell victims, according to The Wall Street Journal.

The paper viewed internal company documents showing that a Facebook investigation team had been tracking a human trafficking market in the Middle East whose organizers were using Facebook's services. What appeared to be employment agencies were advertising domestic workers whom they could supply against their will, per the Journal.

The BBC published a sweeping undercover investigation of the practice, prompting Apple to threaten to remove Facebook from its store, the paper said.

An internal memo shows that Facebook was aware of the practice even before then. In a report dated 2019, a Facebook researcher asked, "was this issue known to Facebook before BBC inquiry and Apple escalation?," per the Journal.

Beneath the question, the answer reads: "Yes. Throughout 2018 and H1 2019 we conducted the global Understanding Exercise in order to fully understand how domestic servitude manifests on our platform across its entire life cycle: recruitment, facilitation, and exploitation."

Apple and Facebook did not immediately respond to requests for comment.

The Wall Street Journal on Thursday also reported that Facebook's AI content moderation systems cannot detect most of the languages used on the platform, a capability the company needs if it is to monitor content in the foreign markets where it has expanded.

The paper found that many human moderators don't speak the languages used in those markets, leaving a blind spot in the company's efforts to crack down on harmful content.

One result: drug cartels and human traffickers have used the platform to conduct their business, per the Journal.