- YouTube has pledged $US1 million towards “efforts to address social injustice” following the protests erupting across the US.
- YouTube was not specific about where the money will be going, and the firm drew accusations of hypocrisy.
- Critics pointed out YouTube has allowed white supremacy to proliferate on its platform over the years.
YouTube tweeted on Saturday that it will be donating $US1 million towards social justice in the wake of the protests raging across the US sparked by the killing of George Floyd, a 46-year-old black man who died after a white police officer knelt on his neck.
“We stand in solidarity against racism and violence. When members of our community hurt, we all hurt. We’re pledging $US1M in support of efforts to address social injustice,” YouTube said in an official statement. It was not specific about exactly where this money would be directed. A YouTube spokeswoman declined to comment further when contacted by Business Insider.
YouTube’s announcement drew a fair amount of backlash from people pointing out the platform’s history of allowing racist and white supremacist speech.
“Awesome to see how much positive feedback you’re getting from donating about half a day’s income without specifics when it took you years to enforce your rules with white supremacists Richard Spencer, David Duke, Jared Taylor or Red Ice. Hypocrisy is a hell of a drug,” tweeted activist Twitter account Sleeping Giants, which pressures advertisers to withdraw funding or ads from, usually, right-wing outlets.
YouTube has a reputation among the tech platforms for being particularly poor at discouraging far-right radicalisation. A 2018 report from Data & Society detailed an “alternative influence network” of far-right creators who were flourishing on YouTube. A 2019 study from Swansea University found YouTube frequently recommended far-right content via its recommendation algorithms.
In June 2019, YouTube announced it was cracking down on neo-Nazi and white supremacist content on its platform and would make an effort to stop its recommendation algorithms from promoting “borderline” videos.
The platform still struggles to enforce these policies, however. In March of this year, British politician Yvette Cooper said that after searching for a far-right group named in a report by Hope Not Hate, she was recommended neo-Nazi videos.
“They [YouTube] have become an organ of radicalisation instead of taking responsibility, for that is the way their algorithms work,” Cooper said.