Facebook secret program let ‘celebrities avoid moderation’

FILE PHOTO: A 3D-printed Facebook logo is seen placed on a keyboard in this illustration taken March 25, 2020. REUTERS/Dado Ruvic/Illustration/File Photo

Facebook maintains a secret program that exempts athletes, politicians, and other high-profile users from its typical moderation process, according to The Wall Street Journal. The program is reportedly meant to prevent “PR fires” — the bad press caused by pulling down photos, posts, and other content from high-profile users that should have been allowed to stay up. In practice, the report says, the program simply lets these users break rules that would get most people into trouble.

The program is known as XCheck, or “cross-check,” and it is ostensibly meant to provide additional quality control around moderation of high-profile users, according to the Journal. Posts from users flagged for XCheck are supposed to be routed to a set of better-trained moderators to ensure Facebook’s rules are properly enforced. But the program reportedly covered 5.8 million people as of 2020, and just 10 per cent of posts that hit XCheck actually get reviewed, according to a document seen by the Journal.

High-profile users protected by the program include former President Donald Trump, Donald Trump Jr., Senator Elizabeth Warren, and Candace Owens, according to the report. Users are usually unaware that they’re being given special treatment, the report says.

Facebook told the Journal that criticism of XCheck was warranted and the company is working to fix the program. The system is meant to “accurately enforce policies on content that could require more understanding,” a spokesperson said. They added that “Facebook itself identified the issues with cross-check and has been working to address them.”

While it’s a bad look for Facebook, which has promised even enforcement of its rules, none of this is particularly surprising. Facebook has a long and detailed set of moderation policies, but it has always been clear that those policies are enforced at Facebook’s discretion, with leeway often granted to major names, or to questionable content whose removal might create problems for the company. With the Journal’s report, it’s evident that in some cases Facebook’s own system, by design or not, is helping keep some of those posts online.