Experts say Facebook and Twitter were 'indirectly involved' in the US Capitol siege since they gave President Trump and his far-right supporters a place to congregate for years

Samuel Corum/Getty Images — Pro-Trump supporters storm the US Capitol following a rally with President Donald Trump on January 6, 2021 in Washington, DC.
  • After pro-Trump extremists stormed the US Capitol on Wednesday, reports have shown how the organisers took to social media websites to plan the riot.
  • President Donald Trump himself simultaneously fanned the flames by perpetuating more baseless claims of election fraud on sites like Facebook, Twitter, and YouTube.
  • These companies responded by cracking down on Trump’s accounts, but many critics have admonished the firms for not reacting more harshly sooner.
  • Experts told Insider that internet platforms aren’t solely to blame for the insurrection, but they are “indirectly involved” because for years they justified hands-off policies on moderating misinformation while violent far-right content proliferated.
  • They said smaller tech companies like Parler that don’t fact-check could grow in popularity if Trump is permanently kicked off all the big platforms.
  • “Cut off the big head, seven new heads sort of spring out, right?” one expert told Insider.

A group of pro-Trump extremists stormed the US Capitol on Wednesday, and while some experts have told Insider platforms like Facebook and Twitter aren’t “directly implicit” in the violence that ensued, they’re not completely absolved of their role in it either.

Reports have detailed how violence escalated in far-right corners of the internet leading up to January 6, including on Twitter and Facebook, and President Donald Trump himself has used those same platforms to stoke the flames by further spouting baseless claims of election fraud.

“Directly implicit? No. Were they indirectly involved because they haven’t policed him accordingly over the past four years? Yes,” Ari Lightman, professor of digital media at Carnegie Mellon and social media expert, told Business Insider.

Lightman said Wednesday’s event is an example of what can result from false information propagated on social platforms.

“This isn’t something that happened based on Trump’s speech this week — this has been going on for months and months…This literal sort of building up of pent-up angst and anger around the election. This was very systematic,” Lightman said.

Since Wednesday, the tech giants have rolled out unprecedented changes to crack down on Trump’s misleading posts — Facebook and Amazon-owned Twitch have banned him until at least the Inauguration, and Twitter permanently suspended Trump’s account late Friday.

But many have lamented that these changes are too little, too late, including some Twitter employees who wrote a letter demanding that the platform suspend Trump permanently before the company agreed to do so. For years, Trump and his allies have been able to utilise online platforms and their algorithms — which favour sensationalistic content — to sow misinformation campaigns.

That made it easier for people to become radicalised in online groups before hearing their beliefs echoed by Trump and his allies on Twitter, Facebook, and other platforms, Emma Ruby-Sachs, executive director of the consumer watchdog group SumOfUs, explained to Insider. All of this content likely spread even faster, since algorithms favour the salacious posts typical of far-right communities online, she said.

“The movement that stormed the Capitol was born out of ‘Stop the Steal'” — Trump’s baseless campaign that the 2020 presidential election was stolen from him — “which not only started on Facebook and Twitter but grew, aided by the algorithm that those platforms use to make money,” Ruby-Sachs said.

Misinformation could find a new home on smaller, niche social platforms

Platforms continuing to simply add warning banners to posts containing misinformation likely won’t do much, Lightman warned.

He called it the implied truth effect: By selectively labelling false news and misinformation, platforms elevate the apparent credibility or legitimacy of other news stories, even if they are fake. Indeed, Twitter has done more harm than good in inconsistently implementing its moderation policies, like when it banned the URL of an article about Joe Biden’s son without explanation in October.

One scenario was sites like Twitter shutting down Trump’s account, which happened on Friday. That may not accomplish much, though, since he will likely redirect to sites like Gab and Parler, which don’t fact-check and are already crawling with his supporters, including Sen. Ted Cruz.

In that case, Lightman said, Trump could be even more apt to spread misinformation, conspiracy theories, and falsehoods that his hundreds of thousands of loyal fans will be all too happy to engage with, unfettered, on those alternative social platforms.

Lightman likened it to the mythological monster Hydra: “Cut off the big head, seven new heads sort of spring out, right?”

Ruby-Sachs said these smaller sites don’t worry her as much: their backbone is not an algorithmic system that elevates certain viewpoints over others, partly because they’re small and haven’t yet found their way into the advertising world.

Incidents like Wednesday’s siege will keep fuelling the broader debate about finally reining in the tech industry

The tech industry is overdue for a regulatory body that understands the ins and outs of these platforms and their societal impact, experts told Insider.

The changes that tech companies have made in the last two days are just more examples of why they can’t regulate themselves, Ruby-Sachs said. Users like Trump should be banned, she said, but firms also need to address a broader problem that gives fake news spreaders a voice in the first place.

“When you’re in a house and it catches fire, you need to put out the fire in front of you, but you also have to figure out why the house caught on fire in the first place,” Ruby-Sachs said.

Fake, violent content will always spread faster while these companies’ current profit-driven algorithms are in place, winning those users more followers and, in turn, a bigger platform to “call for violence and undermine democracy,” according to Ruby-Sachs.

“That’s not going to change,” she said. “Trump has been exploiting that system brilliantly and to the detriment of the rest of the country, but others will exploit it after him unless Congress steps in and regulates.”

