Does Ye know Parler does content moderation too?

Under pressure from Google and Apple, Parler recently upped its moderation efforts

Ye, the rapper formerly known as Kanye West, announced that he is buying Parler, the conservative social media website, on Oct. 17.

In a statement posted on Twitter, of course, Parler CEO George Farmer said that Ye “will never have to fear being removed from social media again.” Ye also hailed it as a victory for conservative speech: “In a world where conservative opinions are considered to be controversial we have to make sure we have the right to freely express ourselves.”

The announcement is sudden and unexpected, especially because Ye has never used Parler before and, before last week, hadn’t used Twitter in two years. But the deal for an undisclosed amount of money comes just one week after the rapper was suspended from Instagram and Twitter for posting antisemitic comments and violating their respective policies prohibiting hate speech.

But Ye’s new platform, like all major social media platforms, moderates content despite its broad pronouncements about free speech. Since 2021, Parler has steadily increased its moderation efforts to appease app store requirements from Apple and Google.

Every social media platform does content moderation

Parler bills itself as “the premier global free speech platform,” an answer to purported censorship of conservative voices on mainstream social media websites like Facebook and Twitter. (This narrative isn’t supported by the evidence: Studies consistently show that mainstream platforms don’t discriminate against conservative ideology and, in fact, many of their most influential users are right-wing.)

In recent years, social media companies have added new rules to reduce hate speech, limit threats of real-world violence, and curb the spread of misinformation about sensitive topics like elections, vaccines, and covid. After Donald Trump incited violence at the US Capitol Building on Jan. 6, 2021, he was banned from Facebook, Twitter, and many other online platforms. In response, Trump started Truth Social, a conservative social media platform with scant content moderation. Recently, the company that owns Truth Social has faced serious allegations of securities fraud tied to a troubled deal to go public via a separate blank-check company, or SPAC.

But Twitter will soon find itself owned by someone with a similar aversion to content moderation. Billionaire Tesla CEO Elon Musk, whose path to buying the company is winding its way through the courts, has promised to roll back some speech controls, promote “free speech” on the app, and reinstate Trump on Twitter.

On Oct. 8, Musk welcomed his friend Ye back to the platform right before the rapper went on an antisemitic screed—promising to “death con 3 On JEWISH PEOPLE”—and was suspended. Musk, who didn’t have control over the decision because he doesn’t yet own Twitter, tweeted that he spoke to Ye and “expressed [his] concerns,” which he said the rapper “took to heart.”

Of course, Parler has its own content controls

Parler touts a hands-off policy toward content moderation. The reality hasn’t been so simple: After the insurrection at the US Capitol, Apple and Google removed Parler from the App Store and the Google Play Store, respectively, for failing to adhere to their content moderation policies.

Removal from these two app marketplaces is a death knell for any mobile app. When an app is removed from a major marketplace, users have to turn to a desktop or mobile web version of the app, which is often clunkier, less secure, and far less appealing to mass audiences. Since its expulsion, Parler has desperately tried to get back in Apple and Google’s good graces by removing more hateful and violent content.

In March 2021, Apple denied Parler’s application to rejoin the App Store, sending the company’s executives screenshots of its users posting hate speech. In response, Parler’s chief policy officer Amy Peikoff outlined all of the app’s content moderation efforts to weed out rule-breaking content: “We worked tirelessly to adopt enhanced protocols for identifying and removing this type of content,” Peikoff wrote. “We have since engaged Apple to show them how we’ve incorporated a combination of algorithmic filters and human review to detect and remove content that threatens or incites violence.” She added that Parler has new tools to clamp down on “personal attacks based on immutable and irrelevant characteristics such as race, sex, sexual orientation, or religion.”

Apple re-added Parler to the App Store in May 2021 after “months of productive dialogue,” with some content excluded from the mobile version of the service, and Google allowed Parler back in September 2022 after it agreed to “implement robust moderation practices.” (Even after reappearing on Apple’s and Google’s app stores, Parler has struggled to gain traction with a wide audience. In the first half of 2022, it had about 725,000 monthly active users, according to the analytics firm data.ai. By comparison, Twitter had 238 million users last quarter and Facebook parent Meta had 2.9 billion across its apps. In September, Parler announced it was pivoting to “uncancelable” cloud services, presumably as a way to generate new revenue.)

As the new owner, Ye will surely generate interest and an influx of users. He may even strip away whatever content controls Parler has put in place to get back on the app stores. But it will prove tough for the hip-hop superstar to deliver on his version of free speech that promises the “right to freely express ourselves” with few, if any, conditions: Apple’s and Google’s dominance of the mobile app economy ensures they set the rules.