
Oversight Board Overturns Meta's Decision in Haitian Police Station Video Case


December 2023

The Oversight Board has overturned Meta’s decision to take down a video from Facebook showing people entering a police station in Haiti, attempting to break into a cell holding an alleged gang member and threatening him with violence. The Board finds the video did violate the company’s Violence and Incitement policy. Nonetheless, the majority of the Board disagrees with Meta’s assessment of how the newsworthiness allowance applied in this case. For the majority, Meta’s nearly three-week delay in removing the content meant the risk of offline harm had diminished sufficiently for a newsworthiness allowance to be applied. Moreover, the Board recommends that Meta assess the effectiveness and timeliness of its responses to content escalated through the Trusted Partner program.

About the Case

In May 2023, a Facebook user posted a video showing people in civilian clothing entering a police station, attempting to break into a cell holding a man – who is a suspected gang member, according to Meta – and shouting “we’re going to break the lock” and “they’re already dead.” Towards the end of the video, someone yells “bwa kale na boudaw,” which Meta interpreted as a call for the group “to take action against the person ‘bwa kale style’ – in other words, to lynch him.” Meta also interpreted “bwa kale” as a reference to the civilian movement in Haiti that involves people taking justice into their own hands. The video is accompanied by a caption in Haitian Creole that includes the statement, “the police cannot do anything.” The post was viewed more than 500,000 times and the video around 200,000 times.

Haiti is experiencing unprecedented insecurity, with gangs taking control of territory and terrorizing the population. With police unable to address the violence and, in some instances, said to be complicit, a movement has emerged that has seen “more than 350 people [being] lynched by local people and vigilante groups” in a four-month period this year, according to the UN High Commissioner for Human Rights. Gangs have retaliated against those believed to be part of, or sympathetic to, the movement.

A Trusted Partner flagged the video to Meta as potentially violating 11 days after it was posted, warning the content might incite further violence. Meta’s Trusted Partner program is a network of non-governmental organizations, humanitarian agencies and human rights researchers from 113 countries. Meta told the Board that the “greater the level of risk [of violence in a country], the higher the priority for developing relationships with Trusted Partners,” who can report content to the company. About eight days after the Trusted Partner’s report in this case, Meta determined the video included both a statement of intent to commit and a call for high severity violence and removed the content from Facebook. Meta referred this case to the Board to address the difficult moderation questions raised by content related to the “Bwa Kale” movement in Haiti. Meta did not apply the newsworthiness allowance because the company found the risk of harm was high and outweighed the public interest value of the post, noting the ongoing pattern of violent reprisals and killings in Haiti.

Key Findings

The Board finds the content did violate Facebook’s Violence and Incitement Community Standard because there was a credible threat of offline harm to the person in the cell as well as to others. However, the majority of the Board disagrees with Meta on the application of the newsworthiness allowance in this case. Given the delay of nearly three weeks between posting and enforcement, Meta should have applied the newsworthiness allowance to keep the content up. The Board concludes that the risk of harm and the public interest weighed in any newsworthiness analysis should be assessed at the time Meta is considering issuing an allowance, rather than at the time the content is posted, and finds that Meta should update the language of the newsworthiness allowance to make this clear to users.

For the majority of Board Members, Meta’s nearly three-week delay in removing the content meant the risk of offline harm had diminished sufficiently for a newsworthiness allowance to be applied. This group considered the context in Haiti, the extent and reach of the post, and the likelihood of harm given the delay in enforcement. By that time, when the video had already been viewed around 200,000 times, any risk the content posed had likely materialized. Furthermore, in a situation of protracted widespread violence and a breakdown in public order, sharing information becomes even more important in allowing communities to react to events, and the video had the potential to inform people both in Haiti and abroad about the realities in the country.

However, a minority of Board Members find Meta was right not to apply the allowance. Since the content was posted during a period of heightened risk, the threat of the video leading to additional and retaliatory violence had not passed when Meta reviewed the content. These Board Members consider removal necessary to address these risks.

The Board is concerned about Meta’s ability to moderate content in Haiti in a timely manner during this period of heightened risk. The delay in this case appears to be the result of the company’s failure to invest adequate resources in moderating content in Haiti. Meta was not able to provide a timely assessment of the report from its Trusted Partner. Reports from Trusted Partners are one of the main tools Meta relies on in Haiti to identify potentially violating content. A recent report by a Trusted Partner found that Meta does not adequately resource its own teams to review content identified by Trusted Partners and there is significant irregularity in response times.

Finally, the Board notes Meta failed to activate its Crisis Policy Protocol in Haiti. While Meta told the Board it already had risk-mitigation measures in place, the Board is concerned the lengthy delay in this case indicates that existing measures are inadequate. If the company fails to use this protocol in such situations, it will not deliver timely or principled moderation, undermining the company’s and the public’s ability to assess the effectiveness of the protocol in meeting its aims.

The Oversight Board's Decision

The Oversight Board overturns Meta's decision to take down this content, requiring the post to be restored.

The Board recommends that Meta:

  • Assess the timeliness and effectiveness of its responses to content escalated through the Trusted Partner program, to address the risk of harm, particularly where Meta has no or limited proactive moderation tools, processes or measures to identify and assess content.

The Board also takes this opportunity to remind Meta of a previous recommendation, from the Russian Poem case, calling for the company to make public an exception to its Violence and Incitement policy. This exception allows content that “condemns or raises awareness of violence,” but Meta requires the user to make clear they are posting the content for either of these two reasons.

For Further Information

To read the full decision, click here.

To read a synopsis of public comments for this case, please click the attachment below.

Attachments

Haitian Police Station Video Public Comments Appendix