Last Tuesday, Facebook vice president Nick Clegg announced that Facebook was going to give politicians more leeway than other users to post offensive speech, and that their assertions would not be fact-checked. That sent Dave Willner over the edge. Two nights later, Willner posted a long explanation—on Facebook, of course—attacking the policy. The 35-year-old tech worker described the social network’s new stance as “foolish, wrong, and a significant betrayal of the original democratizing ideals of Facebook.”
That essay is notable not just for its well-argued points but for who wrote it: Dave Willner is Facebook’s former head of content standards. Over 10 years ago, as part of the team monitoring content on the nascent social network, he took an ad hoc list of no-no’s and helped create a document that is the foundation for the company’s content standards. (Though the current version is longer and more detailed, Willner says Facebook’s hate speech rules haven’t changed that much in the last decade. “What has changed is the willingness of politicians to say things that are clearly racist, sexist, etc.,” he says.) Willner left Facebook in 2013 and now heads community policy at Airbnb. His wife Charlotte, who worked with him at Facebook, heads Pinterest’s trust and safety team—making Willner half of online speech moderation’s First Couple.
Though Facebook says it will still remove content from politicians that encourages violence or harm, Willner argues that allowing hate speech—whether it comes from a politician or a private-citizen white supremacist—can create a dangerous atmosphere. To back up his claim, he cites research from the Dangerous Speech Project, which studies the types of public speech that spark violence. He also charges that Facebook’s exception now makes politicians a privileged class, enjoying rights denied to everyone else on the platform. Not only is Facebook avoiding hard choices, Willner says, it is betraying the safety of its users to placate the politicians who have threatened to regulate or even break up the company. “Restricting the speech of idiot 14-year-old trolls while allowing the President to say the same thing isn’t virtue,” writes Willner. “It’s cowardice.”
Willner intended his post, distributed to “friends-only,” to generate discussion and attention from former colleagues, both inside and outside Facebook. That it did. One top executive who engaged with Willner was Andrew “Boz” Bosworth, Facebook’s outspoken vice president of AR/VR and one of the engineers who created the News Feed. Taking pains to note that he is not part of the content standards team, Bosworth described Facebook’s decision as a reasonable balance between maintaining standards and letting people know what their political leaders really think. “I just feel you aren’t paying enough respect to the newsworthiness case,” he wrote in a comment under Willner’s post. “I’m not convinced the horror of the speech is greater than the horror of it going unnoticed.”
Another participant in the fray was Paul C. Jeffries, Facebook’s former head of legal operations. Speaking like someone who has spent a lot of time with lawyers—though he holds an engineering degree—Jeffries says of Facebook that “they aren’t really applying different rules to regular folks and politicians. It’s the same rule, it just evaluates [sic] different.”
The conflict is genuinely knotty. As Bosworth notes, it’s a classic tradeoff between two values. Twitter gives a pass to Donald Trump’s hateful tweets, because CEO Jack Dorsey believes it’s in the public good to document what the president says. And Facebook has used newsworthiness as a moderation factor for several years. In December 2015, Facebook’s top executives debated what to do with Trump’s denigration of Muslims, which some employees thought clearly violated its standards. They allowed it to stand. A year later, Facebook took down a post from a Norwegian writer that included the iconic “Terror of War” photo, which depicts children fleeing a napalm attack during the Vietnam War, because it showed a naked young girl. Responding to the outcry that the company was censoring the Pulitzer Prize–winning image, Facebook reversed itself. Thereafter, “newsworthiness” formally became a factor in treating speech that otherwise violated the company’s standards.
What triggered Willner was Clegg’s speech, given in Washington, DC, on Tuesday and posted in the Facebook newsroom. Clegg, a former UK politician himself, had firmly placed his employer’s thumb on the “newsworthiness” scale. “We do not submit speech by politicians to our independent fact-checkers, and we generally allow it on the platform even when it would otherwise breach our normal content rules,” Clegg said. The exception is when the words of a politician endanger people. Facebook will also continue to enforce its stricter standards on paid political ads.
Clegg used the analogy of a tennis tournament: Facebook’s role is to provide a level court, a firm net, and well-painted lines. And it then stands back, Clegg argues, while the players volley.
But Facebook doesn’t stand back. It acts as a referee, calling some speech out of bounds. Under the new rule, the referee takes a hike when favored players take the court, refusing to call them out even if they hit the ball into the stands. Willner levels a different complaint: He thinks comparing the problem to sports belittles the seriousness of the issue, which has real consequences for people.
But what saddens him most about his former employer’s decision is that the company will now allow people like former Klansman David Duke to violate the rules that Facebook deems essential to protecting the well-being of its users. “It sure as hell doesn’t make the world more open and connected,” Willner writes.
Facebook, for its part, has been scrambling to clarify Clegg’s speech. One question has been: Who qualifies as a politician in Facebook’s standards? Facebook’s spokespeople say that the company will take the loftiness of someone’s position into account: Prime minister more than city councilperson. Dogcatchers maybe not so much. It also says that in certain countries where officials have been known to indulge in dangerous hate speech, it will be less forgiving than it is with, say, Donald Trump. (Though one wonders if the president’s recent tweet that seemingly condoned a US civil war qualifies as dangerous.) In other words, Facebook will make judgment calls. It’s ironic that in its attempt to stand back and let public figures speak freely, Facebook now finds itself having to define who qualifies as a politician, and which politicians aren’t granted the exemption.
Willner’s objection is significant because he isn’t one of Facebook’s apostate critics like Chris Hughes or Roger McNamee. He still has close ties to the company. When he and his wife hosted a wildly successful fund-raiser for the immigrant aid group RAICES, top Facebook executives praised them and contributed to the campaign. So his strong attack on the politician exemption to Facebook’s standards was not made lightly. After all, he helped create those standards.
“I’m one of very few people qualified to speak up,” he explained to me when I contacted him to find out why he posted. “Facebook often, and rightly, pushes back on critics for not understanding the challenges of scale. I understand those challenges intimately, and in the context of their own systems. So I felt compelled to write.” Willner was OK with my writing about his friends-only thread, because “this decision is dangerous for vulnerable people.”
One other note: The discussion on this post has been one of the most substantial, respectfully waged debates I’ve seen online in a long time. While grappling with a tough issue, Willner, Bosworth, Jeffries, and the other participants have shown that serious discussions can actually be conducted on the News Feed. Willner’s post, and the ensuing back-and-forth, blissfully distracted me from impeachment rants, someone’s vacation I could never afford, and ads for the Johnny Cash socks I had looked for on the web a few minutes earlier.