Parler was used at least in part to plan the Capitol Hill attack, pushing tech companies to reckon with the part they played in the riot.

Stephen Shankland, Edward Moyer, Ian Sherr

Amazon, Apple and Google have banned the Parler social networking app from their respective services and app stores in the wake of Wednesday's attack on the US Capitol by a mob of Trump supporters.

Parler has been rife with violent comments since before the attack on the Capitol.

Parler CEO John Matze posted on his app late Saturday that Amazon had informed him it would no longer host his app on its Amazon Web Services platform. The move followed earlier announcements that Apple and Google would be pulling the app from their respective app stores.

"This was a coordinated attack by the tech giants to kill competition in the market place," Matze wrote, adding that his service had become "too successful too fast." He didn't address his platform's comparatively lax moderation rules or its use by extremists ahead of the Capitol Hill riot. He also didn't mention increasing concerns that social media apps, including Parler, were being used to organize another attack in the coming weeks.

Neither Amazon nor Parler immediately responded to requests for comment.

Earlier in the day, Apple said in a statement that it had banned Parler from its App Store because it had failed to appropriately police content posted by users. Apple has "always supported diverse points of view being represented on the App Store, but there is no place on our platform for threats of violence and illegal activity," the company said. "Parler has not taken adequate measures to address the proliferation of these threats to people's safety. We have suspended Parler from the App Store until they resolve these issues."

The App Store is the only way to distribute apps to iPhones, so banishment poses a serious challenge to online services, though they can often still be reached through websites.
Apple's move followed Google's decision to remove Parler's Android app from its Play Store on Friday for similar reasons. "We're aware of continued posting in the Parler app that seeks to incite ongoing violence in the US," Google said. "We recognize that there can be reasonable debate about content policies and that it can be difficult for apps to immediately remove all violative content, but for us to distribute an app through Google Play, we do require that apps implement robust moderation for egregious content."

Google's ban won't affect Parler as much as Apple's, because Android users can "sideload" apps without going through the Play Store if they choose. The ability is disabled by default, however.

Deplatforming a platform

The modern internet provides an abundance of platforms for communicating directly with millions of people, and it has proved challenging to balance the benefits of online discussion with the drawbacks.

Parler CEO Matze had posted warnings that his app might be removed from Amazon's web services after a group of employees called on the company to act. "We cannot be complicit in more bloodshed and violent attacks on our democracy," Amazon employees wrote in a tweet. Less than a day later, they declared victory. "We demanded Amazon deplatform white supremacists using tech we work on as a bullhorn to incite violence and attack our democracy," the group said.

"Enough is enough. Amazon hosts Parler on @awscloud. As Amazon workers, we demand Amazon deny Parler services until it removes posts inciting violence, including at the Presidential inauguration. We cannot be complicit in more bloodshed and violent attacks on our democracy." — Amazon Employees For Climate Justice (@AMZNforClimate), January 9, 2021

In Apple's case, the iPhone maker sent Parler a warning letter on Friday, according to BuzzFeed, demanding the app improve its moderation.
"We have received numerous complaints regarding objectionable content in your Parler service, accusations that the Parler app was used to plan, coordinate, and facilitate the illegal activities in Washington D.C. on January 6, 2021 that led (among other things) to loss of life, numerous injuries, and the destruction of property. The app also appears to continue to be used to plan and facilitate yet further illegal and dangerous activities," Apple reportedly said to Parler. "If we do not receive an update compliant with the App Store Review Guidelines and the requested moderation improvement plan in writing within 24 hours, your app will be removed from the App Store."

In a follow-up letter Saturday to Parler's developers, Apple said it was still seeing unacceptable content on Parler. "In your response, you referenced that Parler has been taking this content 'very seriously for weeks,'" Apple wrote. "However, the processes Parler has put in place to moderate or prevent the spread of dangerous and illegal content have proved insufficient. Specifically, we have continued to find direct threats of violence and calls to incite lawless action."

And an apparent plan put forward by Parler didn't satisfy Apple. "Your response also references a moderation plan 'for the time being,' which does not meet the ongoing requirements" in the App Store's guidelines, Apple wrote. "While there is no perfect system to prevent all dangerous or hateful user content, apps are required to have robust content moderation plans in place to proactively and effectively address these issues. A temporary 'task force' is not a sufficient response given the widespread proliferation of harmful content."

Parler didn't respond to a request for comment on Apple's ban either. In a Parler post on Friday, CEO Matze challenged Apple's position and said Apple doesn't hold Twitter or Facebook to the same standard.
"Apparently they believe Parler is responsible for ALL user generated content on Parler," he said. "By the same logic, Apple must be responsible for ALL actions taken by their phones. Every car bomb, every illegal cell phone conversation, every illegal crime committed on an iPhone, Apple must also be responsible for."

Apple didn't respond to a request for comment on Matze's remarks.

Content crackdown on social media

The biggest example of deplatforming happened Friday, Jan. 8, when Twitter permanently suspended President Donald Trump's account "due to the risk of further incitement of violence."

After the insurrection at the Capitol, which led to deaths, vandalism and property damage -- not to mention the insult to a national and international symbol of democracy -- social media sites have been taking a harder stance against activity they see as dangerous. Facebook and Instagram blocked Trump from new posts "indefinitely." Reddit cut off The_Donald, a major right-wing discussion forum, and Twitter banned several high-profile accounts associated with the right-wing, bogus QAnon conspiracy theory.

In a Friday tweet, Rep. Alexandria Ocasio-Cortez, a prominent New York Democrat, had called for Google and Apple to take action after reported calls for violence on Parler.

Parler's growing importance

Parler has grown in importance to right-wing activists as Twitter, Facebook and Instagram have put the kibosh on Trump's social media accounts after loyalists stormed the Capitol on Wednesday.

"Our investigation has found that Parler is not effectively moderating and removing content that encourages illegal activity and poses a serious risk to the health and safety of users in direct violation of your own terms of service," Apple reportedly told Parler on Friday, citing a handful of examples purportedly showing violent threats.
"Content of this dangerous and harmful nature is not appropriate for the App Store. As you know from prior conversations with App Review, Apple requires apps with user generated content to effectively moderate to ensure objectionable, potentially harmful content is filtered out. Content that threatens the well being of others or is intended to incite violence or other lawless acts has never been acceptable on the App Store."