By now, most of the internet has seen the exchange that laid bare the indefensibility of Facebook’s political advertising policy when CEO Mark Zuckerberg testified before Congress on Oct. 23.
Things got particularly heated when Democratic representative Alexandria Ocasio-Cortez, who is fast becoming Congress’ most incisive interrogator, questioned Zuckerberg, who struggled to defend the company’s decision to allow politicians to publish false information through Facebook advertising.
In a critical moment, Ocasio-Cortez asked Zuckerberg whether a politician could pay to spread disinformation on Facebook by falsely claiming that election day had moved.
Ocasio-Cortez: […] You announced recently that the official policy of Facebook now allows politicians to pay to spread disinformation in 2020 elections and in the future. So I just want to know how far I can push this in the next year. Under your policy, using census data as well, could I pay to target predominantly black zip codes and advertise them the incorrect election date?
Zuckerberg responded no, as that would be voter suppression. Ocasio-Cortez then asked him whether she could run an ad targeted at Republican voters falsely claiming that a Republican candidate supports the Green New Deal. After hesitating, Zuckerberg acknowledged that such an ad could “probably” run on the platform.
Zuckerberg: Yes, in most cases, in a democracy, I believe that people should be able to see for themselves what politicians, that they may or may not vote for, are saying.
Zuckerberg testified that he believes allowing constituents to see what a politician is advertising, true or untrue, is more valuable than enforcing a policy requiring claims to be substantiated.
Where Facebook has claimed it is drawing a bright line, this exchange reveals a slippery slope that smears that line beyond all recognition.
Yet by his own testimony, Zuckerberg revealed that Facebook does reject ads when they meet the company’s narrow definition of “imminent physical harm, or voter or census suppression.”
But things that fall outside Facebook’s definition, such as lying about a political candidate’s policy positions, personal peccadillos, or other matters that inflame people’s passions, are themselves forms of voter suppression and pose imminent harm to democracy.
That was the point Ocasio-Cortez drove home, and it’s at the heart of why Facebook must either revise its policy or get out of the political ad game entirely.
As entrepreneurs and executives, we have built and run ad agencies and tech startups, overseen millions of dollars of advertising spend, crafted ad policy for massive digital platforms including MySpace and the NFL, and have advised numerous Fortune 500 companies on digital marketing strategy.
With the rise of social media platforms, we’ve had front row seats to the radical shift away from responsibility and toward a laissez-faire approach that is eroding the policies that underpin and protect American democracy.
In many ways, moderating advertising used to be relatively simple. TV networks had limited airtime, newspapers had limited pages, radio stations could only sell so many spots.
There were constraints on how many ads could be run, so there were only so many ads submitted. Reviewing those ads was also relatively straightforward; judgment calls were based on clearly defined policies on whether an ad violated the standards set by CBS or NBC, CNN or FOX News, or your local radio station. These standards invariably included some requirement to substantiate claims, or reserved the right to refuse ads promulgating falsehoods.
Then came the internet credo: Scale.
And with it, self-serve advertising on digital platforms was born. Software started assisting in generating thousands of versions of an ad to test what combinations of images, copy, colors, calls to action, etc., perform best with which audience segments. At that point, software applications largely determined whether or not an ad could run.
In fairness to the social media platforms, it is incredibly difficult for algorithms to determine when a post violates standards. Literally billions of pieces of content are posted to Facebook, YouTube, and Twitter every single day. Everyone from well-meaning people to nefariously programmed bots to autocratic states is breaking the rules. It’s a game of Whack-a-Mole to take down those posts, even with tens of thousands of reviewers on the job.
That said, even with the complexity posed by today’s head-spinning volume, the rules are easier to enforce for paid ads than for regular contributor content.
If Facebook can discern when an ad meets one definition of voter suppression, such as this ad it pulled that urged voters to support adding a discriminatory question about citizenship to the census, it should be able to identify others.
The questions are where the lines get drawn, how the algorithms are written and trained, what level of human review is required, and where that review sits in the priority queue.
Will it be imperfect? Yes. Will Facebook get a lot wrong? Most likely.
But with its current policy, Facebook is batting zero. That’s unacceptable. When it comes to political and issue-based advertising, the stakes could not be higher.
Facebook has acknowledged that Russia and Iran are already in the process of attempting to undermine the next election with ads that “mislead” American voters. The Trump campaign is already spending millions on ads targeted to voters in the nine states that are likely to swing the election.
This is not the time to use the American political system as a petri dish for poorly considered policy, well-intentioned as it may be.
The proliferation of disinformation online is starting to look like digital Ebola for democracy. We have to stop the epidemic before it’s too late.
There are 374 days until the 2020 presidential election, and we have a few vaccines ready to go: We can put stricter policies in place; consider an extended political ads blackout period prior to the election (as is done in France and Australia); or, for the safest option, Facebook can forgo paid political advertising entirely.
At least, until we can get this right.