There's a basic problem with calling Facebook and Twitter 'platforms' — and it took Trump for us to see it

  • The so-called deplatforming of Donald Trump has provoked misguided cries of censorship and injustice.
  • Thinking about powerful, for-profit corporations like Facebook, Twitter and Amazon as platforms is a fundamental misconception that lets the companies off the hook.
  • These internet services are publishers and they have a right, and a duty, to exercise judgment.

After years of poisoning the civic well with his unique brand of lies, insults, distortions, narcissism, and overall poor manners, Donald Trump has found himself deplatformed.

My spellchecker doesn’t like this word, and I don’t like it either, at least in reference to the actions Facebook, Twitter, YouTube, and others have taken to silence, ban, suspend, penalise, and otherwise restrict the soon-to-be-ex-President and others from publishing on their respective properties.

Deplatforming, in contrast to these plain-English descriptions of Trump’s plight, is a deliciously Orwellian construct that gives credence to a misconception at the heart of the matter of what society expects from these powerful corporations. The word has its origins in academia, referring to movements at American and British universities beginning in the 1940s to disinvite — or not offer a platform to — certain campus speakers.

The notion of referring this way to the banishment of a particular user emerged in recent years because we have come to think of these behemoth technology companies as platforms for various things, primarily commerce and “social” media. The three biggies all began with whimsical communitarian uses: Twitter as a forum for random and brief observations, Facebook as a digital student handbook, and YouTube as a broadcaster of personal videos.

That was then. Now they are full-blown media companies, each with a multi-billion-dollar advertising business and plenty of professionally produced content that runs alongside oodles of entertaining and informative ephemera. Referring to them as platforms, rather than as the publishers they are, grants them the protections due to mere tech concerns. Media companies are liable for what they publish; platforms are mere conduits for other people’s deeds. Put another way, a platform deplatforms; a media company exercises judgement in what it publishes from the outset.

If the tech companies had stuck to cat videos and the like, all would have been fine. But all manner of vile speech has appeared too, and the companies have laboured, under great criticism, to clean up their acts. The problem, as Jonathan Knee, a Columbia Business School professor and media banker, puts it, is that they are not well suited to the task. “The fact that there’s media going on here is secondary to how they think about themselves,” he says. “They are first-rate technologists and third-rate media companies. And they’re really flailing around, trying to figure it out.”

It’s their own fault, of course. For all their short lives, these media giants operated under a fiction codified in law, the infamous Section 230 of the Communications Decency Act of 1996. It contains a provision that shields supposedly unique and fledgling digital companies from liability for content posted to their sites by someone else, the once shiny-new concept of user-generated content.

Facebook CEO Mark Zuckerberg. Photo by Chip Somodevilla/Getty Images

Bad actors flocked to the sites alongside middle-aged people reading about their high-school chums. Belatedly and inadequately, the sleeping giants, no longer scrappy startups, woke up. Jonathan Greenblatt, head of the Anti-Defamation League, a group that battles hate speech, last year helped organise a ‘Stop Hate for Profit’ campaign, urging the companies to better police their businesses. He praises Facebook in particular for measures like agreeing to an audit of hate speech and adding a top civil-rights-focused executive. “We wished it had been pro-active,” he says. “But they did it.”

Still, these are baby steps. The sites largely are playing a hapless game of Whac-A-Mole. Twitter, for example, put ineffectual labels on inaccurate Trump posts before banning him. Amazon, under pressure for months from a prominent epidemiologist to stop selling books promoting anti-vaccine messages, just this month began placing barely noticeable links to Centers for Disease Control sites atop search results for books about “vaccines.” (The message recently appeared above titles including “Anyone Who Tells You Vaccines Are Safe and Effective Is Lying” and “Corona False Alarm?”)

The way forward for the giants isn’t simple, given their damned-if-they-do-or-don’t moment. Critics now range from Angela Merkel and Alexei Navalny to the ACLU. The ACLU, which has worked tirelessly for four years to see Trump impeached, said “it should concern everyone when companies like Facebook and Twitter wield the unchecked power to remove people from platforms that have become indispensable for the speech of billions.”

The answer, then, is to clip their power. Antitrust actions, currently focused especially on Facebook and YouTube parent Google, will take years. Tweaking of Section 230 could happen far more quickly. Remove the liability shield, at least for the mega-caps, and suddenly they will be as responsible for what they publish as other media companies are.

Roger McNamee, the Facebook investor-turned-scourge, believes the giants finally acted against Trump not so much because he had gone over the line but because they knew they were in trouble. “Every tech law department said, ‘We enabled this and profited from it,'” says McNamee. “They were concerned with being complicit, like a getaway driver. This was pure self-interest.” Unless they are regulated like other dangerous industries in the past — he cites historic shifts in the behaviour of the building trades and chemical makers — they will do only what they need to get by.

The tech goliaths are well within their rights to remove bad actors from their premises. (The First Amendment restricts what the government, not a private company, can do.) In turn, the citizens of the countries where these companies make their money have a responsibility to hold them accountable for their actions. Removing Donald Trump and a bevy of white supremacists is only a start.

Adam Lashinsky is a Business Insider contributor and former executive editor at Fortune Magazine, where he spent 19 years. He is the author of two books: “Inside Apple” (about Apple) and “Wild Ride” (about Uber).
