Facebook Oversight Board launches review of company’s XCheck system

The XCheck program was initially intended as a quality-control measure for actions taken against high-profile accounts, including celebrities, politicians and journalists (Photo: Reuters)

Summary

Inquiry prompted by Wall Street Journal investigation into social-media giant’s treatment of high-profile users

Facebook Inc.’s Oversight Board said it is reviewing the company’s practice of holding high-profile users to separate sets of rules, citing apparent inconsistencies in the way the social-media giant makes decisions.

The inquiry follows an investigation by The Wall Street Journal into the system, known internally as “cross-check” or “XCheck.” The Oversight Board, an outside body that Facebook created to ensure the accountability of the company’s enforcement systems, said it has reached out to the company and expects a briefing in coming days.

The XCheck program was initially intended as a quality-control measure for actions taken against high-profile accounts, including celebrities, politicians and journalists. It grew to include millions of accounts, according to documents viewed by the Journal. In addition, some users were “whitelisted,” rendering them immune from enforcement actions, the documents showed.

A 2019 internal Facebook review found that the practice of whitelisting was “not publicly defensible.”

The company had previously told the Oversight Board in writing that its system for high-profile users was only used in “a small number of decisions.”

In a blog post on Tuesday, the board said it was looking into whether Facebook has “been fully forthcoming in its responses in relation to cross-check, including the practice of whitelisting.”

The post continued: “This information came to light due to the reporting of the Wall Street Journal, and we are grateful to the efforts of journalists who have shed greater light on issues that are relevant to the Board’s mission. These disclosures have drawn renewed attention to the seemingly inconsistent way that the company makes decisions, and why greater transparency and independent oversight of Facebook matter so much for users.”

Facebook didn’t immediately respond to a request for comment.

A Facebook spokesman has previously said that criticism of how it executed the system was fair, but added that it was designed “for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.” He said the company is phasing out the practice of whitelisting.

The details about the XCheck program and whitelisting were part of a series of articles published in the Journal last week detailing how Facebook’s platforms harm teen mental health, how its algorithm fosters discord, and how drug cartels and human traffickers use its services openly.

In response, Facebook vice president of global affairs Nick Clegg on Saturday published a blog post saying the articles “have contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees.”

The post didn’t cite any factual inaccuracies.

In a separate post on Tuesday, the company highlighted that it now has 40,000 employees working on safety and security, and that it invested more than $13 billion in these areas since 2016.

“How technology companies grapple with complex issues is being heavily scrutinized, and often, without important context,” said the unsigned post. “What is getting lost in this discussion is some of the important progress we’ve made as a company and the positive impact that it is having across many key areas.”

The Oversight Board, in its blog post, said it planned to release details on what it heard from Facebook on the matter in October as part of its quarterly transparency report.

“The choices made by companies like Facebook have real-world consequences for the freedom of expression and human rights of billions of people across the world,” the group said. “By having clear rules and enforcing them consistently, platforms can give users the confidence that they’ll be treated fairly. Ultimately, that benefits everyone.”

The Oversight Board has previously requested information on the company’s XCheck program, asking it to explain how the system works, to share the criteria for adding pages and accounts to the system, and to disclose the system’s error rates.

In its response, Facebook provided an explanation but did not elaborate on the criteria for adding pages and accounts to the system, and declined to provide reporting on error rates.
