Oversight Board tells Meta to fix how it treats VIP users over content violations

The independent Oversight Board on Tuesday asked Meta (formerly Facebook) to come clean about treating VIP accounts differently from those of ordinary users when they violate its content policies, making 32 recommendations in its review of the company's cross-check programme.

Last year, following disclosures about Meta's cross-check programme in the Wall Street Journal, the Oversight Board accepted a request from the company to review cross-check and make recommendations for how it could be improved.

The report revealed that Facebook operated a two-tiered content moderation system in which ordinary users were subject to the platform's stated rules while VIP users were secretly enrolled in a programme called "cross-check."

When users on Meta's cross-check lists posted rule-violating content, it was not immediately removed, as it would be for most people, but was left up pending further human review.

"For years, cross-check allowed content from a select group of politicians, business partners, celebrities, and others to remain on Facebook and Instagram for several days when it would have otherwise been removed quickly," the Oversight Group said in a statement.

Because the volume of content selected for cross-check can exceed Meta's review capacity, the programme operated with a backlog that delayed decisions.

In the review, the Board found several shortcomings in Meta's cross-check programme.

"We found that the programme appears more directly structured to satisfy business concerns. We also found that Meta has failed to track data on whether cross-check results in more accurate decisions, and we expressed concern about the lack of transparency around the programme," the Board emphasised.

Some content that fell under cross-check remained up for seven months before Meta decided to remove it.

The metrics that Meta currently uses to measure cross-check's effectiveness do not capture all key concerns.

Meta did not provide the Board with information showing that it tracks whether decisions made through cross-check are more or less accurate than those made through its normal quality-control mechanisms.

"Without this, it is difficult to know whether the programme is meeting its core objectives of producing correct content moderation decisions, or to measure whether cross-check provides an avenue for Meta to deviate from its policies," said the Board.

The Board was also concerned about the limited information that Meta has provided to the public and its users about cross-check.

As part of its recommendations, the Board said that Meta should measure, audit and publish key metrics around its cross-check programme so that it can tell whether the programme is working effectively.

"The company should set out clear, public criteria for inclusion in its cross-check lists, and users who meet these criteria should be able to apply to be added to them," it added.

"Content identified as violating during Meta's first assessment that is high severity should be removed or hidden while further review is taking place," said the Board.
