On Thursday, Facebook’s chief executive, Mark Zuckerberg, announced the company’s “New Steps to Protect the U.S. Elections.” They include blocking new political ads in the week leading up to Election Day and attaching labels to posts containing misinformation, specifically posts related to the coronavirus and posts from politicians declaring victory before all the results are counted.
One can, and many will, debate just how effective these measures will be at preventing election night chaos during a pandemic. (So far Facebook’s “misleading post” labels are vague to the point of causing additional confusion for voters. Similarly, blocking new political ads one week out from the vote ignores the vast amounts of disinformation Americans are subjected to year after year.) But what seems beyond debate is just how deeply Facebook has woven itself into the fabric of democracy.
Reading Mr. Zuckerberg’s election security blog post reminded me of a line from a seminal 2017 article by the journalist Max Read. Three years ago, Mr. Read was struck by a similar pledge from Mr. Zuckerberg to “ensure the integrity” of the German elections. The commitment was admirable, he wrote, but also a tacit admission of Facebook’s immense power. “It’s a declaration that Facebook is assuming a level of power at once of the state and beyond it, as a sovereign, self-regulating, suprastate entity within which states themselves operate.”
That power is consolidated in the decisions of its chief executive, who has voting control over the company. Here’s how Facebook’s co-founder Chris Hughes described Mr. Zuckerberg’s iron grip on the company in The Times last year:
Mark’s influence is staggering, far beyond that of anyone else in the private sector or in government. He controls three core communications platforms — Facebook, Instagram and WhatsApp — that billions of people use every day. Facebook’s board works more like an advisory committee than an overseer, because Mark controls around 60 percent of voting shares. Mark alone can decide how to configure Facebook’s algorithms to determine what people see in their News Feeds, what privacy settings they can use and even which messages get delivered. He sets the rules for how to distinguish violent and incendiary speech from the merely offensive, and he can choose to shut down a competitor by acquiring, blocking or copying it.
If Mr. Hughes’s description feels hyperbolic, it may be because such a consolidation of power is actually hard to comprehend.