OCTOBER 24, 2021
Facebook has long been accused of playing favorites across the political spectrum, and it’s now clear just how much of that uproar extends to the company’s own ranks. A leak to The Wall Street Journal reportedly shows Facebook leaders and staff have clashed numerous times over the social network’s approach to conservative content, particularly outlets like Breitbart. Rank-and-file employees have accused Facebook of making “special exceptions” to policies for right-wing outlets, while senior-level staff warned of potential pitfalls.
Workers argued that Facebook kept Breitbart in a second tier of the News Tab, a section meant to focus on reliable news, despite very low trust and quality scores as well as misinformation violations. Facebook was not only making exceptions, one employee said, but “explicitly” endorsing outlets like this by including them as trusted partners. Staff claimed Facebook was “scared of political backlash” if it enforced policies equally, and believed the site let conservative influencers Diamond and Silk lobby fact checkers to avoid punishment for spreading misinformation.
Higher-ups countered with justifications for those decisions. They argued, for instance, that booting a news outlet over trust scores would risk booting more mainstream outlets like CNN. When staff asked Facebook to intervene over Breitbart’s alleged attempts to dodge sites’ advertising blocks, a director said Facebook had to resist the urge and “rely on our principles and policies.”
Facebook repeated its familiar stance in a response to the Journal, maintaining that it limited access to low-quality material to “improve people’s experiences,” not due to political leanings. A spokesperson added that Facebook studied the effects of potential changes before implementing them, and that publishers like Breitbart still met requirements for honoring rules against misinformation and hate speech.
The revelations likely won’t satisfy people on either side of the American political spectrum. Liberals may be concerned that Facebook is knowingly allowing the spread of heavily spun and outright false claims, while the right wing may see this as evidence of a claimed anti-conservative bias. Still, the insights reveal a more conflicted internal approach to political content than either side might assume. They also underscore the importance of tools meant to automatically limit the reach of misinformation: such tools could minimize internal debates by curbing fake news without requiring as much human input.