Parler needs Apple so much it’s actually moderating more content

Below a post from Parler CEO John Matze saying “Hello world, is this thing on?” an error message warns of technical difficulties on the platform.

Parler—the right-wing social media platform that bills itself as a haven from oppressive content moderation—is starting to embrace some of the top-down moderation practices it used to bemoan.

The impetus for this change of heart is that Parler now recognizes it needs Apple, one of the Big Tech companies it used to rail against. As recently as Jan. 9, former Parler CEO John Matze called Apple one of “those authoritarians who hate free speech,” after the iPhone-maker removed Parler from the App Store over its role in the Jan. 6 insurrection at the US Capitol. Matze has since been fired. Now the platform is waging a campaign to get back in Apple’s good graces—and back into its all-important App Store.

Apple rejects Parler again

It hasn’t gone well so far. On March 10, Apple denied Parler’s application to return to the App Store, saying the platform hadn’t done enough to strengthen its moderation practices after the Capitol riots. Bloomberg reports that along with its rejection letter, Apple sent Parler a litany of screenshots of its users posting hate speech. “As you know, developers are required to implement robust moderation capabilities to proactively identify, prevent, and filter this objectionable content to protect the health and safety of users,” Apple reportedly wrote.

In a statement the next day, Parler’s chief policy officer, Amy Peikoff, stressed the lengths to which the platform has gone in recent months to implement new moderation practices to appease Apple. “We worked tirelessly to adopt enhanced protocols for identifying and removing this type of content,” Peikoff wrote. “We have since engaged Apple to show them how we’ve incorporated a combination of algorithmic filters and human review to detect and remove content that threatens or incites violence.”

Peikoff also wrote that Parler has rolled out all-new moderation features that weed out “personal attacks based on immutable and irrelevant characteristics such as race, sex, sexual orientation, or religion.” Mainstream social platforms like Facebook and Twitter have long since adopted algorithms to block exactly these forms of hate speech (although they tend to use language like “protected categories” rather than “immutable and irrelevant characteristics”). The main difference is that Parler gives its users the ability to opt out of its hate speech filter so they can “curate their own feeds as they choose.”

None of that was enough to win Apple over. But Peikoff seemed to indicate that Parler would make more changes to placate the tech giant. “Parler expects and hopes to keep working with Apple to return to the App Store,” she wrote.

Parler changes its tune

The other major shift lies in the language Parler uses to describe Apple. In a break from Matze’s tirades against “the horrible double standard Apple and their big tech pack apply to the community,” Peikoff struck a positive note, writing about her hope that “Apple will continue to differentiate itself from other ‘Big Tech’ companies by supporting its customers’ choice to ‘think different.’”

Whether they like it or not, the fortunes of platforms like Parler are deeply entwined with the policies of tech giants like Apple and Google, crucial gatekeepers that decide which apps users of the iOS and Android operating systems can download. Parler twice reached the number one spot on the App Store: in November, at the height of Donald Trump's lies about the integrity of US elections, and in January, shortly after Trump was banned from Facebook and Twitter.

Now, following a months-long ban, Parler understands that it can’t mount a comeback without Apple’s blessing—and that means bending to the reality that a social network can’t survive without some form of top-down content moderation.