On Friday, as pressure grew for Apple and Google to remove it from their stores, Google suspended Parler from the Play Store, citing “continued posting in the Parler app that seeks to incite ongoing violence in the U.S.”
In order to protect user safety on Google Play, our longstanding policies require that apps displaying user-generated content have moderation policies and enforcement that removes egregious content like posts that incite violence. All developers agree to these terms and we have reminded Parler of this clear policy in recent months. We’re aware of continued posting in the Parler app that seeks to incite ongoing violence in the U.S. We recognize that there can be reasonable debate about content policies and that it can be difficult for apps to immediately remove all violative content, but for us to distribute an app through Google Play, we do require that apps implement robust moderation for egregious content. In light of this ongoing and urgent public safety threat, we are suspending the app’s listings from the Play Store until it addresses these issues.
With the suspension, Parler is no longer available in Google Play, though people who have previously installed the app can continue to use it.
Apple also threatened a ban if the company doesn’t rein in the violent threats. Apple emailed Parler CEO John Matze saying that “Parler is not effectively moderating and removing content that encourages illegal activity,” BuzzFeed News reported. The iPhone maker gave him 24 hours to create a “moderation improvement plan.” Matze has previously said he disagrees with other platforms’ moderation practices.
“Apparently they believe Parler is responsible for ALL user generated content on Parler,” Matze wrote in a response posted on Parler. “Therefor [sic] by the same logic, Apple must be responsible for ALL actions taken by their phones.”
Parler, which spiked in popularity following the November election as Twitter and Facebook cracked down on election misinformation, has been called out for its role in the violence in DC this week. Apple said in its letter that it had “received numerous complaints regarding objectionable content in your Parler service, accusations that the Parler app was used to plan, coordinate, and facilitate the illegal activities in Washington D.C. on January 6, 2021 that led (among other things) to loss of life, numerous injuries, and the destruction of property.”
Users have also turned to the app in the days since the riot to make disturbing and violent threats about future plans. Screenshots of individuals calling for “firing squads” and threatening an armed response to Joe Biden’s inauguration have been circulating on Twitter, along with calls for Apple and Google to ban the app. (Notably, Twitter cited “plans for future armed protests” in its decision to permanently suspend Trump.)
When pressed by The New York Times this week, Matze — who in the past has decried the “censorship” from Twitter and Facebook — repeatedly insisted he hadn’t observed Parler users using the app for illegal purposes. “If people are breaking the law, violating our terms of service, or doing anything illegal, we would definitely get involved,” Matze said. “But if people are just trying to assemble or they’re trying to put together an event… there’s nothing particularly wrong about that.” Parler didn’t respond to a request for comment.
As The Verge points out, both companies have pulled apps associated with the far right in the past. Chat app Gab was removed from Google Play for hate speech in 2017. And Apple booted Alex Jones’ InfoWars app in 2018. (Google removed the app last year for spreading coronavirus misinformation.)
This story has been updated with a statement from Google.
source: Engadget.com by Karissa Bell