Oversight Board overturns Meta’s original decision in 'Colombian police cartoon' case (2022-004-FB-UA)
The Oversight Board has overturned Meta’s original decision to remove a Facebook post of a cartoon depicting police violence in Colombia. The Board is concerned that Media Matching Service banks, which can automatically remove images that violate Meta’s rules, can amplify the impact of incorrect decisions to bank content. In response, Meta must urgently improve its procedures to quickly remove non-violating content from these banks.
About the case
In September 2020, a Facebook user in Colombia posted a cartoon resembling the official crest of the National Police of Colombia, depicting three figures in police uniform holding batons over their heads. They appear to be kicking and beating another figure who is lying on the ground with blood beneath their head. The text of the crest reads, in Spanish, “República de Colombia - Policía Nacional - Bolillo y Pata.” Meta translated the text as “National Police – Republic of Colombia – Baton and Kick.”
According to Meta, in January 2022, 16 months after the user posted the content, the company removed the content because it matched an image in a Media Matching Service bank. These banks automatically find and remove images that human reviewers have previously assessed as violating the company’s rules. As a result of the Board selecting this case, Meta determined that the post did not violate its rules and restored it. The company also restored other pieces of content featuring this cartoon which had been incorrectly removed by its Media Matching Service banks.
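Conceptually, such a bank works like a lookup table of content fingerprints: each banked image is stored as a hash, and new uploads are hashed and checked against the bank. The sketch below is purely illustrative and is not Meta's implementation; it assumes an exact-hash bank for simplicity, whereas production systems use perceptual hashes so that near-duplicate images also match. All names here are hypothetical.

```python
import hashlib


class MediaMatchingBank:
    """Illustrative stand-in for a media matching bank.

    Stores fingerprints of images judged violating by human
    reviewers; later uploads matching a fingerprint can then be
    removed automatically, without fresh human review.
    """

    def __init__(self):
        self._banked = set()

    @staticmethod
    def fingerprint(image_bytes: bytes) -> str:
        # Exact hash for simplicity; real systems use perceptual
        # hashing so visually similar images match too.
        return hashlib.sha256(image_bytes).hexdigest()

    def bank(self, image_bytes: bytes) -> None:
        self._banked.add(self.fingerprint(image_bytes))

    def unbank(self, image_bytes: bytes) -> None:
        self._banked.discard(self.fingerprint(image_bytes))

    def matches(self, image_bytes: bytes) -> bool:
        return self.fingerprint(image_bytes) in self._banked


bank = MediaMatchingBank()
cartoon = b"...cartoon image bytes..."
bank.bank(cartoon)            # one reviewer's mistaken decision
print(bank.matches(cartoon))  # True: every re-upload now auto-matches
bank.unbank(cartoon)
print(bank.matches(cartoon))  # False: unbanking stops the cascade
```

The point of the sketch is the asymmetry the Board highlights: a single banking decision propagates to every future matching upload, so one reviewer error is amplified at platform scale until the item is removed from the bank.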
Key findings
As Meta has now recognized, this post did not violate its policies. Meta was wrong to add this cartoon to its Media Matching Service bank, which led to a mass and disproportionate removal of the image from the platform, including the content posted by the user in this case. Despite 215 users appealing these removals, and 98% of those appeals being successful, Meta still did not remove the cartoon from this bank until the case reached the Board.
This case shows how, by using automated systems to remove content, Media Matching Service banks can amplify the impact of incorrect decisions by individual human reviewers. The stakes of mistaken additions to such banks are especially high when, as in this case, the content consists of political speech criticizing state actors.
In response, Meta should develop mechanisms to quickly remove any non-violating content which is incorrectly added to its Media Matching Service banks. When removals of banked content are frequently overturned on appeal, that should immediately trigger a review that can remove the content from the bank.
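The trigger the Board describes could be as simple as a threshold check over appeal outcomes for each banked item. A minimal sketch, with hypothetical thresholds (the minimum-appeal count and overturn rate below are illustrative assumptions, not Meta's figures):

```python
def needs_bank_review(appeals: int, overturned: int,
                      min_appeals: int = 50,
                      overturn_rate_threshold: float = 0.5) -> bool:
    """Flag a banked item for human re-review when removals of
    matching content are frequently overturned on appeal.
    Thresholds here are illustrative, not Meta's."""
    if appeals < min_appeals:
        return False  # too few appeals to be a reliable signal
    return overturned / appeals >= overturn_rate_threshold


# In this case: 215 appeals, roughly 98% overturned -- far past
# any reasonable threshold, so the cartoon should have been
# re-reviewed long before the case reached the Board.
print(needs_bank_review(appeals=215, overturned=211))  # True
```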
The Board is particularly concerned that Meta does not measure the accuracy of Media Matching Service banks for specific content policies. Without this data, which is crucial for improving how these banks work, the company cannot tell whether this technology works more effectively for some Community Standards than others.
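Once appeal reversals are logged against the policy under which each bank-driven removal was made, the per-policy accuracy the Board asks for is a straightforward aggregation. An illustrative sketch (the record shape and policy names are assumptions, not Meta's schema):

```python
from collections import defaultdict


def error_rates_by_policy(removals):
    """Given records of (policy, was_overturned) for bank-driven
    removals, return the share later found non-violating, per
    Community Standard. Record shape is a hypothetical example."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for policy, was_overturned in removals:
        totals[policy] += 1
        if was_overturned:
            errors[policy] += 1
    return {p: errors[p] / totals[p] for p in totals}


log = [("Violence and Incitement", True),
       ("Violence and Incitement", True),
       ("Adult Nudity", False),
       ("Adult Nudity", False)]
print(error_rates_by_policy(log))
# {'Violence and Incitement': 1.0, 'Adult Nudity': 0.0}
```

Broken down this way, the data would show exactly what the Board says Meta cannot currently see: whether the banks perform worse for some Community Standards than for others.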
The Oversight Board’s decision
The Oversight Board overturns Meta’s original decision to remove the content.
The Board recommends that Meta:
- Ensure that content with high rates of appeal and high rates of successful appeal is reassessed for possible removal from its Media Matching Service banks.
- Limit the time between when banked content is identified for additional review and when, if deemed non-violating, it is removed from the bank, ensuring that non-violating content leaves Media Matching Service banks quickly.
- Publish error rates for content mistakenly included in Media Matching Service banks of violating content, broken down by content policy, in its transparency reporting.
For further information:
To read the full decision, click here.
To read a synopsis of public comments for this case, please click the attachment below.