Oversight Board upholds Meta’s decision in ‘Sudan graphic video’ case (2022-002-FB-MR)
The Oversight Board has upheld Meta’s decision to restore a Facebook post depicting violence against a civilian in Sudan. The content raised awareness of human rights abuses and had significant public interest value. The Board recommended that Meta add a specific exception on raising awareness of or documenting human rights abuses to the Violent and Graphic Content Community Standard.
About the case
On December 21, 2021, Meta referred a case to the Board concerning a graphic video which appeared to depict a civilian victim of violence in Sudan. The content was posted to the user’s Facebook profile page following the military coup in the country on October 25, 2021.
The video shows a person lying next to a car with a significant head wound and a visibly detached eye. Voices can be heard in the background saying in Arabic that someone has been beaten and left in the street. A caption, also in Arabic, calls on people to stand together and not to trust the military, with hashtags referencing documenting military abuses and civil disobedience.
After being identified by Meta’s automated systems and reviewed by a human moderator, the post was removed for violating Facebook’s Violent and Graphic Content Community Standard. After the user appealed, however, Meta issued a newsworthiness allowance on October 29, 2021, exempting the post from removal. Due to an internal miscommunication, Meta did not restore the content until nearly five weeks later. When Meta restored the post, it placed a warning screen on the video.
The Board agrees with Meta’s decision to restore this content to Facebook with a warning screen. However, Meta’s Violent and Graphic Content policy is unclear on how users can share graphic content to raise awareness of or document abuses.
The rationale for the Community Standard, which sets out the aims of the policy, does not align with the rules of the policy. While the policy rationale states that Meta allows users to post graphic content “to help people raise awareness” about human rights abuses, the policy itself prohibits all videos (whether shared to raise awareness or not) “of people or dead bodies in non-medical settings if they depict dismemberment.”
The Board also concludes that, while it was used in this case, the newsworthiness allowance is not an effective means of allowing this kind of content on Facebook at scale. Meta told the Board that it “documented 17 newsworthy allowances in connection with the Violent Graphic Content policy over the past 12 months (12 months prior to March 8, 2022). The content in this case represents one of those 17 allowances.” By comparison, Meta removed 90.7 million pieces of content under this Community Standard in the first three quarters of 2021.
The Board finds it unlikely that, over one year, only 17 pieces of content related to this policy should have been allowed to remain on the platform as newsworthy and in the public interest. To ensure such content is allowed on Facebook, the Board recommends that Meta amend the Violent and Graphic Content Community Standard to allow videos of people or dead bodies when shared to raise awareness of or document abuses.
Meta must also be prepared to respond quickly and systematically to conflicts and crisis situations around the world. The Board’s decision on “Former President Trump’s Suspension” recommended that Meta “develop and publish a policy that governs Facebook’s response to crises.” While the Board welcomes the development of this protocol, which Meta says it has adopted, the company must implement the protocol more quickly and provide as much detail as possible on how it will operate.
The Oversight Board’s decision
The Oversight Board upholds Meta’s decision to restore the post with a warning screen that prevents minors from seeing the content.
As a policy advisory opinion, the Board recommends that Meta:
- Amend the Violent and Graphic Content Community Standard to allow videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human rights abuses. This content should be allowed with a warning screen so that people are aware that content may be disturbing.
- Undertake a policy development process to develop criteria for identifying videos of people or dead bodies shared for the purpose of raising awareness of or documenting human rights abuses.
- Make explicit in its description of the newsworthiness allowance all the actions it may take (for example, restoration with a warning screen) based on this policy.
- Notify users when it takes action on their content based on the newsworthiness allowance, including restoring content or applying a warning screen. The user notification may link to the Transparency Center explanation of the newsworthiness allowance.
For further information:
To read the full decision, click here.
To read a synopsis of public comments for this case, please click the attachment below.