- YouTube introduced new rules on its platform Thursday following a pro-Trump insurrection at the US Capitol on Wednesday.
- Channels whose videos violate YouTube’s policies will receive a “strike.” If they get three strikes in a 90-day period, they will be permanently removed.
- YouTube said the move was in response to the “disturbing event that transpired yesterday” and comes as internet platforms continue to react to misinformation online, including from President Trump himself.
- Trump shared a video of him espousing baseless election fraud claims on Wednesday, a video that was removed by Facebook, Twitter, and YouTube.
- Facebook has since banned Trump “indefinitely” from using his account to post, and Twitter locked Trump’s account until he removed tweets that were in violation of the company’s policies. He removed them early on Thursday.
YouTube introduced new rules on Thursday that will dole out strikes to channels whose videos violate its policies.
Users will first receive a warning if their video violates YouTube’s policies, the company said. A first strike will cut users off from posting for one week, and a second strike within 90 days will prevent users from posting for two weeks.
If a channel receives three strikes within the same 90-day period, it will be permanently removed from the platform, according to YouTube. The strikes will still stand even if users delete the content that warranted one in the first place.
A YouTube spokesperson told Insider that the move was made in response to the “disturbing event that transpired yesterday,” when pro-Trump rioters stormed past federal officers into the US Capitol and vandalized its halls while lawmakers worked to certify the 2020 presidential election results. Officials including Vice President Mike Pence were evacuated, and the siege resulted in the deaths of four rioters, including a woman who was shot in the chest.
The National Guard and state police were called in to quell the unrest.
As many people called on Trump to publicly demand that his followers retreat from the Capitol on Wednesday, he released a recorded video telling them to “go home, we love you, you’re very special,” but he did not condemn their actions. He used the majority of the video to reiterate his baseless claims that the election was fraudulent and that he won in a “landslide.” He had earlier tweeted, in a post that remains visible on his account, for “everyone at the U.S. Capitol to remain peaceful. No violence!”
Trump shared the video on Twitter, and the platform initially placed a label on it warning that users would not be able to recirculate it “due to an increased risk of violence.” It later removed the tweet and two other tweets containing election misinformation and locked Trump’s account until they were deleted. He removed them early Thursday.
Facebook removed the video on Wednesday, as did YouTube.
As the aftermath of the siege continues to unfold, many have ramped up pressure on internet platforms over their role in providing a microphone to President Donald Trump and his far-right supporters, asking why it took an insurrection at the nation’s Capitol to prompt action from tech firms. YouTube’s new rules are just some of the steps companies are taking following the chaos.
Facebook said Thursday it was banning Trump “indefinitely,” for at least the next two weeks “until the peaceful transition of power is complete.” That means Trump will not be able to access his account until after he exits the presidency. President-elect Joe Biden will be sworn in on January 20.