- Parler claimed in a court filing Wednesday that Amazon said it was “definitely in this journey with” the social media company as it encountered abuse reports.
- Parler alleged that Amazon hadn’t raised concerns about Parler’s content moderation system until January 8, disputing Amazon’s claims that it had issued repeated warnings since mid-November.
- Parler sued Amazon earlier this week, alleging it unlawfully broke its contract with Parler and violated antitrust laws by intentionally trying to harm the company to the advantage of Twitter, a larger Amazon customer and Parler competitor.
- Amazon booted Parler off AWS, its web-hosting service, amid reports that rioters who seized the US Capitol last week had used Parler to organise and incite violence.
Parler fired back at Amazon on Wednesday in an escalating legal dispute between the two companies over Amazon’s decision to cut Parler off from Amazon Web Services, its web-hosting service, amid reports that rioters used the controversial social-media app to organise and incite violence at the US Capitol last week.
In a court filing, Parler disputed claims made by Amazon earlier this week that it had repeatedly warned Parler that violent content on its site, and the company’s lax approach to removing it, were grounds for Amazon to suspend Parler’s AWS contract.
Parler claimed that Amazon, in effect, terminated its contract completely, rather than simply suspending it, and did not warn the social-media company about potential contract breaches until after the Capitol riots, even as it continued trying to sell Parler additional services as late as December.
“At no time before January 9, 2021, did AWS notify Parler that Parler was in material breach of the Agreement, thus blindsiding Parler,” the filing said. “And in the period up until then, AWS implicitly assured Parler that the two companies had a positive relationship that would continue into 2021.”
Amazon said in mid-December, Parler claimed, that it was “definitely in this journey with” Parler over abuse reports regarding problematic content on its platform, and that other customers including Twitter had received similar complaints.
An AWS spokesperson declined to comment on Parler’s latest filing but referenced an earlier statement which said, in part: “There is no merit to these claims. AWS provides technology and services to customers across the political spectrum, and we respect Parler’s right to determine for itself what content it will allow.”
Amazon had claimed in its Tuesday filing that it notified Parler “repeatedly” beginning in mid-November about content that violated the terms of the two companies’ contract but that Parler “was both unwilling and unable” to remove it.
But Parler alleged that “never before January 8, 2021, did AWS express any concerns” with its “reactive” content moderation system, and that while Amazon “occasionally” flagged problematic content, Parler had “investigated and removed” those items.
Parler also claimed that Amazon had OK’d its content moderation approach as late as January 8, and even told Parler on January 6, the day of the riots, that it had adequately “resolved” a report about problematic content.
Parler also claimed that in mid-December, once Amazon knew Trump would likely create a Parler account, it tried to “sell Parler proprietary software that would permanently attach Parler to Amazon’s hip,” and that the two companies had even discussed Parler adopting Amazon’s AI systems starting in 2021 to “pre-screen” content.
Amazon in its response had pushed back against Parler’s claims that its actions were politically motivated and violated antitrust laws by deliberately favouring Twitter, which also uses AWS, and not taking similar action against it.
“AWS does not host Twitter’s feed, so of course it could not have suspended access to Twitter’s content,” Amazon said in the Tuesday filing, noting that Twitter eventually blocked the violent content, while alleging Parler refused to take similar steps.
Amazon also cited Section 230 of the Communications Decency Act, which gives companies that operate an “interactive computer service” the legal right to remove content as they see fit.
Parler responded in its Wednesday filing that Section 230 “is not a complete bar to liability for all wrongdoing.”