UPHELD
2022-002-FB-MR

Sudan graphic video

The Oversight Board has upheld Meta’s decision to restore a Facebook post depicting violence against a civilian in Sudan.
Policies and topics
News events, Safety
Violent and graphic content
Region and countries
Sub-Saharan Africa
Sudan
Platform
Facebook

Case summary

The Oversight Board has upheld Meta’s decision to restore a Facebook post depicting violence against a civilian in Sudan. The content raised awareness of human rights abuses and had significant public interest value. The Board recommended that Meta add a specific exception on raising awareness of or documenting human rights abuses to the Violent and Graphic Content Community Standard.

About the case

On December 21, 2021, Meta referred a case to the Board concerning a graphic video which appeared to depict a civilian victim of violence in Sudan. The content was posted to the user’s Facebook profile page following the military coup in the country on October 25, 2021.

The video shows a person lying next to a car with a significant head wound and a visibly detached eye. Voices can be heard in the background saying in Arabic that someone has been beaten and left in the street. A caption, also in Arabic, calls on people to stand together and not to trust the military, with hashtags referencing documenting military abuses and civil disobedience.

After being identified by Meta’s automated systems and reviewed by a human moderator, the post was removed for violating Facebook’s Violent and Graphic Content Community Standard. After the user appealed, however, Meta issued a newsworthiness allowance exempting the post from removal on October 29, 2021. Due to an internal miscommunication, Meta did not restore the content until nearly five weeks later. When Meta restored the post, it placed a warning screen on the video.

Key findings

The Board agrees with Meta’s decision to restore this content to Facebook with a warning screen. However, Meta’s Violent and Graphic Content policy is unclear on how users can share graphic content to raise awareness of or document abuses.

The rationale for the Community Standard, which sets out the aims of the policy, does not align with the rules of the policy. While the policy rationale states that Meta allows users to post graphic content “to help people raise awareness” about human rights abuses, the policy itself prohibits all videos (whether shared to raise awareness or not) “of people or dead bodies in non-medical settings if they depict dismemberment.”

The Board also concludes that, while it was used in this case, the newsworthiness allowance is not an effective means of allowing this kind of content on Facebook at scale. Meta told the Board that it “documented 17 newsworthy allowances in connection with the Violent Graphic Content policy over the past 12 months (12 months prior to March 8, 2022). The content in this case represents one of those 17 allowances.” By comparison, Meta removed 90.7 million pieces of content under this Community Standard in the first three quarters of 2021.

The Board finds it unlikely that, over one year, only 17 pieces of content related to this policy should have been allowed to remain on the platform as newsworthy and in the public interest. To ensure such content is allowed on Facebook, the Board recommends that Meta amend the Violent and Graphic Content Community Standard to allow videos of people or dead bodies when shared to raise awareness or document abuses.

Meta must also be prepared to respond quickly and systematically to conflicts and crisis situations around the world. The Board’s decision on “Former President Trump’s Suspension” recommended that Meta “develop and publish a policy that governs Facebook’s response to crises.” While the Board welcomes the development of this protocol, which Meta says it has adopted, the company must implement the protocol more quickly and provide as much detail as possible on how it will operate.

The Oversight Board’s decision

The Oversight Board upholds Meta’s decision to restore the post with a warning screen that prevents minors from seeing the content.

As a policy advisory opinion, the Board recommends that Meta:

  • Amend the Violent and Graphic Content Community Standard to allow videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human rights abuses. This content should be allowed with a warning screen so that people are aware that content may be disturbing.
  • Undertake a policy development process that develops criteria to identify videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human rights abuses.
  • Make explicit in its description of the newsworthiness allowance all the actions it may take (for example, restoration with a warning screen) based on this policy.
  • Notify users when it takes action on their content based on the newsworthiness allowance, including the restoration of content or application of a warning screen. The user notification may link to the Transparency Center explanation of the newsworthiness allowance.

*Case summaries provide an overview of the case and do not have precedential value.

Full Case Decision

1. Decision summary

The Oversight Board upholds Meta’s decision to restore to Facebook a post containing a video, with a caption, that depicts violence against a civilian in Sudan. The post was restored under the newsworthiness allowance with a warning screen marking the content as sensitive, making it generally inaccessible to minors, and requiring all other users to click through to see the content. The Board finds that this content, which sought to raise awareness of or document human rights abuses, had significant public interest value. While the initial removal of the content was in line with the rules in the Violent and Graphic Content Community Standard, Meta’s decision to restore the content with a sensitivity screen is consistent with its policies, values, and human rights responsibilities.

The Board notes, however, that Meta’s use of the newsworthiness allowance is not an effective means to keep up or restore content such as this at scale. The Board therefore recommends that Meta add a specific exception on raising awareness of or documenting abuses to the Violent and Graphic Content Community Standard. The Board further urges Meta to prioritize implementation of its earlier recommendation to introduce a policy on collection, preservation, and sharing of content that may evidence violations of international law.

2. Case description and background

On December 21, 2021, Meta referred a case to the Board concerning a graphic video which appeared to depict a civilian victim of violence in Sudan. The content was posted to the user's Facebook profile page on October 26, 2021, following a military coup in the country on October 25, 2021, and the start of protests against the military takeover of the government.

The video shows a person with a significant head wound and a visibly detached eye lying next to a car. Voices can be heard in the background saying in Arabic that someone has been beaten and left in the street. The post includes a caption, also in Arabic, calling on the people to stand together and not to trust the military, with hashtags referencing documenting military abuses and civil disobedience.

Meta explained that its technology identified the content as potentially violating its Violent and Graphic Content Community Standard on the same day that it was posted, October 26, 2021. Following human review, Meta determined that it violated Facebook’s Violent and Graphic Content policy and removed it. The content creator subsequently disagreed with the decision. On October 28, 2021, the content was escalated to policy and subject matter experts for their additional review. Following the review, Meta issued a newsworthiness allowance exempting the post from removal under the Violent and Graphic Content policy on October 29, 2021. However, due to an internal miscommunication, Meta did not actually restore the content until December 2, 2021, nearly five weeks later. When it restored the content, it also placed a warning screen on the video marking it as sensitive and requiring users to click through to view the content. The warning screen prohibits users under the age of 18 from viewing the video.
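The enforcement sequence described above (automated detection, human review and removal, escalation, a newsworthiness allowance, and delayed restoration behind an age-gated warning screen) can be pictured as a simple state machine. The sketch below is purely illustrative: the names (`Post`, `State`, `apply_newsworthiness_allowance`) are hypothetical and do not depict Meta's actual systems.

```python
# Illustrative model of the enforcement lifecycle described in this case.
# All names here are hypothetical; this is not Meta's code.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class State(Enum):
    LIVE = auto()
    REMOVED = auto()
    RESTORED_WITH_WARNING = auto()

@dataclass
class Post:
    state: State = State.LIVE
    min_viewer_age: Optional[int] = None  # age gate attached to the warning screen

def remove_for_violation(post: Post) -> None:
    # Automated detection followed by human review led to removal in this case.
    post.state = State.REMOVED

def apply_newsworthiness_allowance(post: Post) -> None:
    # On escalation, the allowance restores the post with the mitigations
    # used here: a click-through warning screen restricted to adults.
    post.state = State.RESTORED_WITH_WARNING
    post.min_viewer_age = 18

def can_view(post: Post, viewer_age: int) -> bool:
    if post.state == State.REMOVED:
        return False
    return post.min_viewer_age is None or viewer_age >= post.min_viewer_age

post = Post()
remove_for_violation(post)            # October 26: removed after review
apply_newsworthiness_allowance(post)  # decided October 29, applied December 2
assert not can_view(post, viewer_age=17)  # minors cannot see the video
assert can_view(post, viewer_age=30)      # adults can click through
```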

The post was viewed fewer than 1,000 times and no users reported the content.

The following factual background is relevant to the Board’s decision. Following the military takeover of the civilian government in Sudan in October 2021 and the start of civilian protests, security forces in the country fired live ammunition, used tear gas, and arbitrarily arrested and detained protesters, according to the UN High Commissioner for Human Rights. Security forces have also targeted journalists and activists, searching their homes and offices. Journalists have been attacked, arrested, and detained.

According to experts consulted by the Board, with the military takeover of state media and crackdown on Sudanese papers and broadcasters, social media became a crucial source of information and venue to document the violence carried out by the military. The military shut down the internet simultaneously with the arrest of civilian leadership on October 25, 2021, and consistent access to the internet since then has been regularly disrupted across the country.

3. Oversight Board Authority and Scope

The Board has authority to review decisions that Meta submits for review (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1.1). The Board may uphold or overturn Meta’s decision (Charter Article 3, Section 5), and this decision is binding on the company (Charter Article 4). Meta must also assess the feasibility of applying its decision in respect of identical content with parallel context (Charter Article 4). The Board’s decisions may include policy advisory statements with non-binding recommendations that Meta must respond to (Charter Article 3, Section 4; Article 4).

4. Sources of authority

The Oversight Board considered the following sources of authority:

I. Oversight Board decisions:

In previous decisions, the Board has considered and provided recommendations on Meta’s policies and processes. The most relevant include:

  • Case decision 2021-010-FB-UA (“Colombia Protests”): In this case, the Board noted that Meta does not make its criteria for escalating content to be reviewed for the newsworthiness allowance (which is the only way for the standard to be applied) publicly available and stated that “in an environment where outlets for political expression are limited, social media has provided a platform for all people, including journalists, to share information about the protests.” The Board recommended that Meta “develop and publicize clear criteria for content reviewers to escalate for additional review of public interest content... These criteria should cover content depicting large protests on political issues, in particular in contexts where states are accused of violating human rights and where maintaining public record of events is of heightened importance.”
  • Case decision 2021-001-FB-FBR (“Former President Trump’s Suspension”): In this case, the Board recommended that Meta “develop and publish a policy that governs Facebook’s response to crises or novel situations where its regular processes would not prevent or avoid imminent harm.” In January 2022, Meta held a Policy Forum on the “Crisis Policy Protocol” which was developed in response to the Board’s recommendation. However, Meta confirmed in one of its responses to questions posed by the Board that this protocol was not in place at the time of the coup in Sudan and subsequent content removal and restoration decisions in this case.

II. Meta’s content policies:

Facebook’s Community Standards:

Under the rationale for its Violent and Graphic Content policy, Meta states that it removes any content that "glorifies violence or celebrates suffering" but allows graphic content "to help people raise awareness." The rules of the policy prohibit posting "videos of people or dead bodies in non-medical settings if they depict dismemberment." According to its newsworthiness allowance, Meta allows violating content on its platforms if it is newsworthy and "if keeping it visible is in the public interest."

III. Meta’s values:

Meta's values are outlined in the introduction to Facebook's Community Standards. The values relevant to this case are those of “Voice,” “Safety,” “Privacy,” and “Dignity.” The value of "Voice" is described as "paramount":

The goal of our Community Standards has always been to create a place for expression and give people a voice. [We want] people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.

Meta limits “Voice” in service of four other values, and three are relevant here:

“Safety”: Content that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook.

“Privacy”: We’re committed to protecting personal privacy and information. Privacy gives people the freedom to be themselves, choose how and when to share on Facebook and connect more easily.

“Dignity”: We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.

IV. International human rights standards:

The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. In 2021, Meta announced its Corporate Human Rights Policy, where it reaffirmed its commitment to respecting human rights in accordance with the UNGPs. The Board's analysis of Meta’s human rights responsibilities in this case was informed by the following human rights standards:

  • The right to freedom of expression, including the ability to seek and receive information: Article 19, International Covenant on Civil and Political Rights (ICCPR); General Comment No. 34, Human Rights Committee, 2011; UN Special Rapporteur on freedom of opinion and expression, reports A/HRC/38/35 (2018) and A/73/348 (2018).
  • The best interest of the child: Articles 13 and 17, Convention on the Rights of the Child (CRC); General Comment No. 25, Committee on the Rights of the Child, 2021, on children’s rights in relation to the digital environment.
  • The right to privacy: Article 17, ICCPR.
  • Access to effective remedy: Article 2, ICCPR; General Comment No. 31, Human Rights Committee, (2004); UNGPs, Principles 22, 29, 31.

5. User submissions

Following Meta's referral and the Board's decision to accept the case, the user was sent a message notifying them of the Board's review and providing them with an opportunity to submit a statement to the Board. The user did not submit a statement.

6. Meta’s submissions

In its referral, Meta stated that the decision on this content was difficult because it highlights the tension between the public interest value of documenting human rights violations and the risk of harm associated with sharing such graphic content. Meta also highlighted the importance of allowing users to document human rights violations during a coup and the shutdown of internet access in the country.

Meta informed the Board that immediately after the military coup occurred in Sudan, Meta created a crisis response cross-functional team to monitor the situation and communicate emerging trends and risks. According to Meta, this team observed “spikes in relation to reports of content depicting Graphic Violence and Violence and Incitement at times when protests were most active. [The team] was instructed to escalate requests to allow instances of graphic violence that would otherwise violate the Graphic and Violent Content policy, including content depicting state-backed human rights abuses.”

Meta noted that this video was taken in the context of widespread protests and real concerns regarding press freedom in Sudan. Meta also noted that this type of content could “warn users in the area of a threat to their safety and is particularly important during an internet blackout where journalists’ access to the location may be limited.”

Meta also stated that the decision to restore the content was in line with its values, especially the value of "Voice," which is paramount. Meta cited prior Board decisions stating that political speech is central to the value of "Voice." Case decisions 2021-010-FB-UA (“Colombia Protests”); 2021-003-FB-UA (“Punjabi concern over the RSS in India”); 2021-007-FB-UA (“Myanmar Bot”); and 2021-009-FB-UA (“Shared Al Jazeera post”).

Meta told the Board that it determined that its initial decision to remove the content was inconsistent with Article 19 of the ICCPR, specifically with the principle of necessity. Therefore, it restored the content pursuant to the newsworthiness allowance. To mitigate any potential risk of harm involved in allowing the graphic content to remain on the platform once restored, Meta restricted access to it to people over the age of 18 and applied a warning screen. Meta also noted in its case rationale that the decision to reinstate the content was consistent with the report of the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, particularly “the right to access information on human rights violations.”

Meta also noted that because it applied a warning screen that prevents users under the age of 18 from seeing the content, it also considered the impact of the decision on the rights of the child. Meta told the Board that, in making its decision, it considered Article 13 of the Convention on the Rights of the Child and General Comment No. 25 on children’s rights in relation to the digital environment, in protecting the child’s right to freedom of expression, including the “freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers.” Meta explained in its case rationale that its decision to restrict the visibility of the content to adults served the legitimate aim of protecting the safety of minors and was proportional to that aim.

The Board asked Meta 21 questions. Meta responded to 17 fully and 4 partially. The partial responses concerned questions about measuring the impact of Meta’s automated systems on content on the platform and about why the Violent and Graphic Content Community Standard does not contain a raising-awareness exception.

7. Public comments

The Board received five public comments for this case. Two comments were from Europe, one from Sub-Saharan Africa, and two from the United States and Canada.

The submissions covered the following themes: the need to adopt a more context-sensitive approach that would set a higher threshold for removal of content in regions subject to armed conflicts, so that less content is removed; the need to preserve materials for potential future investigations or to hold violators of human rights accountable; and that the newsworthiness allowance is likely to be applied in an ad hoc and contestable manner and that this practice should be reconsidered.

In March 2022, as part of ongoing stakeholder engagement, the Board spoke with approximately 50 advocacy organization representatives and individuals working on reporting and documenting human rights abuses, academics researching ethics, human rights, and documentation, and stakeholders interested in engaging with the Board on issues arising from the Violent and Graphic Content Community Standard and its enforcement in crisis or protest contexts. These ongoing engagements are held under the Chatham House Rule to ensure frank discussion and to protect participants. The discussion touched on a number of themes, including: the vital role of social media in countries controlled by repressive regimes for documenting human rights violations and bringing international media and public attention to state-sanctioned violence; concerns that a universal standard on violent and graphic content is, in practice, a US-focused standard; and observations that warning screens usefully address the real problem of trauma, though some organizations reported that they may limit the reach of their content.

8. Oversight Board analysis

The Board looked at the question of whether this content should remain on the platform through three lenses: Meta's content policies, the company's values, and its human rights responsibilities.

8.1. Compliance with Meta’s content policies

I. Content rules

The Board agrees with Meta’s decision to restore this content to the platform with a warning screen and age restriction, but notes that there is a lack of clarity in Meta’s content policies and no effective means of implementing this response to similar content at scale.

Meta’s initial decision to remove the content was consistent with the rules within its Violent and Graphic Content Community Standard – the content violated the policy by depicting human dismemberment in a non-medical setting (a person with a significant head wound and a visibly detached eye). However, the policy rationale of the Violent and Graphic Content Community Standard states that “[Meta] allow[s] graphic content (with some limitations) to help people raise awareness about issues. [Meta] know[s] that people value the ability to discuss important issues such as human rights abuses or acts of terrorism.” Despite this reference in the policy rationale, the specific rules within the Community Standard do not include a “raising awareness” exception. Meta’s internal standards for its reviewers also do not include an exception for content seeking to raise awareness of or document human rights abuses.

In the absence of a specific exception within the Community Standard, the Board agrees with Meta’s decision to restore the content using the newsworthiness allowance. Meta states that it allows violating content to remain on the platform under the newsworthiness allowance if it determines that it is newsworthy and “keeping it visible is in the public interest [and] after conducting a balancing test that weighs the public interest against the risk of harm.”
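One way to picture the balancing test Meta describes is as a comparison of scores. The sketch below is a minimal illustration assuming an additive scoring scheme: the factor names echo the policy language quoted in this decision, but the weights, the threshold, and the function name are hypothetical, as Meta has not published how the test is operationalized.

```python
# Hypothetical sketch of the newsworthiness balancing test; not Meta's code.
# Factor names echo the policy language; weights and scoring are invented.
PUBLIC_INTEREST_WEIGHTS = {
    "surfaces_imminent_threat_to_health_or_safety": 2,
    "documents_human_rights_abuses": 2,
    "gives_voice_to_political_debate": 1,
}

def keep_visible(is_newsworthy: bool, factors: set, risk_of_harm: int) -> bool:
    """Allow otherwise-violating content to stay up only if it is newsworthy
    and the public interest in keeping it visible outweighs the risk of harm."""
    if not is_newsworthy:
        return False
    public_interest = sum(PUBLIC_INTEREST_WEIGHTS.get(f, 0) for f in factors)
    return public_interest > risk_of_harm

# The post in this case: documents abuses and warns of an imminent threat.
print(keep_visible(True, {"documents_human_rights_abuses",
                          "surfaces_imminent_threat_to_health_or_safety"}, 3))
```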

II. Enforcement action

The Board notes that although Meta made the decision to issue a newsworthiness allowance and restore the post with a warning screen on October 29, 2021, the post was not actually restored to the platform until nearly five weeks later, on December 2, 2021. Meta said that communication about the final decision on the content occurred outside of its normal escalation management tools, “leading to the delay in taking appropriate action on the content.” The Board finds this explanation and the delay extremely troubling and emphasizes the importance of Meta taking timely action on decisions such as this one, in the context of a public crisis and when freedom of the press has been severely restricted. When Meta initially removed this content, it applied a 30-day feature limit preventing the user from creating new content, during a period when protesters in the streets and journalists reporting on the coup and the military crackdown were being met with severe violence and repression.

8.2. Compliance with Meta’s values

The Board concludes that keeping this content on the platform with a warning screen is consistent with Meta’s values of “Voice” and “Safety.”

The Board recognizes the importance of “Dignity” and “Privacy” in the context of protecting victims of human rights abuses. The content affects the dignity and privacy of the injured person in the video and their family; the person depicted is identifiable and they, or their family or loved ones, may not have wished for this type of footage of them to be broadcast.

The Board also notes the relevance of “Safety” in this context, which aims to protect users from content that poses a “risk of harm to the physical security of persons.” On one hand, the user sought to raise awareness of the ongoing coup, which could contribute to improving safety of persons in that region. On the other hand, the content may also create risks for the person shown in the video and/or their family.

The Board concludes that in a context where civic space and media freedom are curtailed by the state, the value of “Voice” becomes even more important. Here, “Voice” also serves to enhance the value of “Safety” by ensuring people have access to information and state violence is exposed.

8.3. Compliance with Meta’s human rights responsibilities

The Board finds that keeping the content on the platform with a warning screen is consistent with Meta’s human rights responsibilities. However, the Board concludes that Meta’s policies should be amended to better respect the right to freedom of expression for users seeking to raise awareness of or document abuses.

Freedom of expression (Article 19 ICCPR)

Article 19 of the ICCPR provides broad protection for freedom of expression, including the right to seek and receive information. However, the right may be restricted under certain specific conditions, known as the three-part test of legality (clarity), legitimacy, and necessity and proportionality. Although the ICCPR does not create obligations for Meta as it does for states, Meta has committed to respecting human rights as set out in the UNGPs. This commitment encompasses internationally recognized human rights as defined, among other instruments, by the ICCPR.

I. Legality (clarity and accessibility of the rules)

Any restriction on freedom of expression should be accessible and clear enough to provide guidance as to what is permitted and what is not. The Board concludes that the Violent and Graphic Content policy does not make clear how Meta permits users to share graphic content to raise awareness of or document abuses. The rationale for the Community Standard, which sets out the aims of the policy, does not align with the rules of the policy. The policy rationale states that Meta allows users to post graphic content “to help people raise awareness about” human rights abuses but the policy prohibits all videos (whether it is shared to raise awareness or not) “of people or dead bodies in non-medical settings if they depict dismemberment.” While Meta correctly relied on the broader newsworthiness allowance to restore this content, the Violent and Graphic Content Community Standard does not make clear whether this type of content will be allowed on the platform.

The Board also concludes that the newsworthiness allowance does not make clear when content documenting human rights abuses or atrocities will benefit from the allowance. While we agree with Meta that determining newsworthiness can be “highly subjective,” the rule in question does not even define the term. The policy states that the company assigns “special value to content that surfaces imminent threats to public health or safety or that gives voice to perspectives currently being debated as part of a political process.” Emblematic examples and clear principles should guide the exercise of discretion in applying this allowance. Absent those, its use is likely to be inconsistent and arbitrary. Furthermore, the newsworthiness allowance makes no reference to the use of warning screens (or interstitials) for content that otherwise violates Meta’s policies.

Lastly, the Board, in a previous case (“Colombia Protests”), recommended that Meta “develop and publicize clear criteria for content reviewers to escalate for additional review public interest content.” Meta responded that it has already publicized the criteria for escalation through the Transparency Center article on newsworthiness. However, this article focuses on the factors Meta considers in applying the newsworthiness allowance, not the criteria provided to moderators for when to escalate content (i.e., send it for additional review). If the newsworthiness allowance is intended to be part of the company's scaled content moderation system, processes for escalation and use ought to facilitate that aim. The Board notes that the lack of clarity surrounding when, and how, the newsworthiness allowance is applied is likely to invite arbitrary application of this policy.

II. Legitimate aim

Restrictions on freedom of expression should pursue a legitimate aim, which includes the protection of the rights of others, such as the right to privacy of the depicted individual (General Comment 34, para. 28) and the right to physical integrity. Meta also notes in the rationale for the policy that “content that glorifies violence or celebrates the suffering or humiliation of others...may create an environment that discourages participation.” The Board agrees that the Violent and Graphic Content policy pursues several legitimate aims.

III. Necessity and proportionality

Restrictions on expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; [and] they must be proportionate to the interests to be protected” (General Comment 34, para. 34).

In this case, the Board concludes that placing a warning label on the content was a necessary and proportionate restriction on freedom of expression. The warning screen does not place an undue burden on those who wish to see the content while informing others about the nature of the content and allowing them to decide whether to see it or not. The warning screen also adequately protects the dignity of the individual depicted and their family.

The Board also notes that, as discussed in Section 8.1, Meta’s restoration of the post was delayed by nearly five weeks. This delay had a disproportionate impact on freedom of expression in the context of ongoing violence and the restricted media environment in Sudan. A delay of this length undermines the benefits of this speech, which is to provide a warning to civilians and to raise awareness.

The Board also concludes that because the newsworthiness allowance is used infrequently, it is not an effective mechanism through which to allow content documenting abuses or seeking to raise awareness on the platform at scale. Meta told the Board that it “documented 17 newsworthy allowances in connection with the Violent Graphic Content policy over the past 12 months (12 months prior to March 8, 2022). The content in this case represents one of those 17 allowances.” By comparison, Meta removed 90.7 million pieces of content under this Community Standard in the first three quarters of 2021 alone, on the order of one allowance for every five million removals. The Board finds it unlikely that only 17 pieces of content related to this policy, globally, over a year, should have been allowed to remain on the platform as newsworthy and in the public interest. The newsworthiness allowance does not provide an adequate mechanism for preserving content of this nature on the platform. In order to avoid censoring protected expression, Meta should amend the Violent and Graphic Content policy itself to allow such content to remain on the platform.

In contexts of war or political unrest, there will be more graphic and violent content captured by users and shared on the platform for the purpose of raising awareness of or documenting abuses. This content is important for promoting accountability. The Board, in the "Former President Trump’s Suspension" case, noted that Meta has a responsibility to “collect, preserve and, where appropriate, share information to assist in the investigation and potential prosecution of grave violations of international criminal, human rights and humanitarian law by competent authorities and accountability mechanisms.” The Board also recommended that Meta clarify and state in its Corporate Human Rights Policy protocols for how to make previously public content available to researchers while respecting international standards and data protection laws. In response, Meta committed to briefing the Board on ongoing efforts to address the issue. Since the Board published this recommendation on May 5, 2021, Meta has not reported any progress on this issue. The Board finds a delay of this length and the lack of progress concerning, given the role the platform plays in situations of violent conflict (e.g. the current war in Ukraine where users are documenting abuses through social media) and political unrest around the globe.

Finally, the Board recalls its recommendation from the "Former President Trump’s Suspension" case for Meta to “develop and publish a policy that governs Facebook’s response to crises or novel situations where its regular processes would not prevent or avoid imminent harm.” Meta reported in the Q4 2021 Update on the Oversight Board that it has prepared a proposal for a new Crisis Protocol in response to the Board’s recommendation and that the proposal has been adopted. Meta also stated that it will soon provide information on this protocol in the Transparency Center. Meta informed the Board that this protocol was not in place at the time of the coup in Sudan, nor was it operational during the review of this case. The company plans to launch the protocol later in 2022. A well-designed protocol should guide Meta in developing and implementing necessary and proportionate responses in crisis situations. Meta should move more quickly to implement this protocol and provide as much detail as possible on how it will operate and interact with existing Meta processes. Meta’s platforms play a prominent role in conflicts and crisis situations around the world, and the company must be prepared to respond quickly and systematically to prevent mistakes.

9. Oversight Board decision

The Oversight Board upholds Meta's decision to leave up the content with a screen that restricts access to those over 18.

10. Policy advisory statement

Content policy

1. Meta should amend the Violent and Graphic Content Community Standard to allow videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human rights abuses. This content should be allowed with a warning screen so that people are aware that content may be disturbing. The Board will consider this recommendation implemented when Meta updates the Community Standard.

2. Meta should undertake a policy development process that develops criteria to identify videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human rights abuses. The Board will consider this recommendation implemented when Meta publishes the findings of the policy development process, including information on the process and criteria for identifying this content at scale.

3. Meta should make explicit in its description of the newsworthiness allowance all the actions it may take (for example, restoration with a warning screen) based on this policy. The Board will consider this recommendation implemented when Meta updates the policy.

Enforcement

4. To ensure users understand the rules, Meta should notify users when it takes action on their content based on the newsworthiness allowance, including the restoration of content or application of a warning screen. The user notification may link to the Transparency Center explanation of the newsworthiness allowance. The Board will consider this implemented when Meta rolls out this updated notification to users in all markets and demonstrates that users are receiving this notification through enforcement data.
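A structured notification could carry the information recommendation 4 calls for. The sketch below shows one hypothetical shape for such a notice; the class, field names, and placeholder values are illustrative, not an actual Meta API.

```python
# Hypothetical user-notification payload for recommendation 4; the class,
# field names, and placeholder values are illustrative, not a real Meta API.
from dataclasses import dataclass

@dataclass
class NewsworthinessNotice:
    content_id: str
    action: str           # which allowance action was taken on the content
    explanation_url: str  # link to the Transparency Center explanation

notice = NewsworthinessNotice(
    content_id="<post-id>",
    action="content_restored_with_warning_screen",
    explanation_url="<Transparency Center newsworthiness page>",
)
```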

*Procedural note:

The Oversight Board’s decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.

For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg, drawing on a team of more than 50 social scientists on six continents as well as more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context. The Board was also assisted by Duco Advisors, an advisory firm focusing on the intersection of geopolitics, trust and safety, and technology.

Policies and topics
News events, Safety
Violent and graphic content
Region and countries
Subsaharan Africa
Sudan
Platform
Facebook
Policies and topics
News events, Safety
Violent and graphic content
Region and countries
Subsaharan Africa
Sudan
Platform
Facebook

Case summaryCase summary

The Oversight Board has upheld Meta’s decision to restore a Facebook post depicting violence against a civilian in Sudan. The content raised awareness of human rights abuses and had significant public interest value. The Board recommended that Meta add a specific exception on raising awareness of or documenting human rights abuses to the Violent and Graphic Content Community Standard.

About the case

On December 21, 2021, Meta referred a case to the Board concerning a graphic video which appeared to depict a civilian victim of violence in Sudan. The content was posted to the user’s Facebook profile page following the military coup in the country on October 25, 2021.

The video shows a person lying next to a car with a significant head wound and a visibly detached eye. Voices can be heard in the background saying in Arabic that someone has been beaten and left in the street. A caption, also in Arabic, calls on people to stand together and not to trust the military, with hashtags referencing documenting military abuses and civil disobedience.

After being identified by Meta’s automated systems and reviewed by a human moderator, the post was removed for violating Facebook’s Violent and Graphic Content Community Standard. After the user appealed, however, Meta issued a newsworthiness allowance exempting the post from removal on October 29, 2021. Due to an internal miscommunication, Meta did not restore the content until nearly five weeks later. When Meta restored the post, it placed a warning screen on the video.

Key findings

The Board agrees with Meta’s decision to restore this content to Facebook with a warning screen. However, Meta’s Violent and Graphic Content policy is unclear on how users can share graphic content to raise awareness of or document abuses.

The rationale for the Community Standard, which sets out the aims of the policy, does not align with the rules of the policy. While the policy rationale states that Meta allows users to post graphic content “to help people raise awareness” about human rights abuses, the policy itself prohibits all videos (whether shared to raise awareness or not) “of people or dead bodies in non-medical settings if they depict dismemberment.”

The Board also concludes that, while it was used in this case, the newsworthiness allowance is not an effective means of allowing this kind of content on Facebook at scale. Meta told the Board that it “documented 17 newsworthy allowances in connection with the Violent Graphic Content policy over the past 12 months (12 months prior to March 8, 2022). The content in this case represents one of those 17 allowances.” By comparison, Meta removed 90.7 million pieces of content under this Community Standard in the first three quarters of 2021.

The Board finds it unlikely that, over one year, only 17 pieces of content related to this policy should have been allowed to remain on the platform as newsworthy and in the public interest. To ensure such content is allowed on Facebook, the Board recommends that Meta amends the Violent and Graphic Content Community Standard to allow videos of people or dead bodies when shared to raise awareness or document abuses.

Meta must also be prepared to respond quickly and systematically to conflicts and crisis situations around the world. The Board’s decision on “Former President Trump’s Suspension” recommended that Meta “develop and publish a policy that governs Facebook’s response to crises.” While the Board welcomes the development of this protocol, which Meta says it has adopted, the company must implement the protocol more quickly and provide as much detail as possible on how it will operate.

The Oversight Board’s decision

The Oversight Board upholds Meta’s decision to restore the post with a warning screen that prevents minors from seeing the content.

As a policy advisory opinion, the Board recommends that Meta:

  • Amend the Violent and Graphic Content Community Standard to allow videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human rights abuses. This content should be allowed with a warning screen so that people are aware that content may be disturbing.
  • Undertake a policy development process that develops criteria to identify videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human rights abuses.
  • Make explicit in its description of the newsworthiness allowance all the actions it may take (for example, restoration with a warning screen) based on this policy.
  • Notify users when it takes action on their content based on the newsworthiness allowance including the restoration of content or application of a warning screen. The user notification may link to the Transparency Center explanation of the newsworthiness allowance.

*Case summaries provide an overview of the case and do not have precedential value.

Full Case DecisionFull Case Decision

1. Decision summary

The Oversight Board upholds Meta’s decision to restore to Facebook a post containing a video, with a caption, that depicts violence against a civilian in Sudan. The post was restored under the newsworthiness allowance with a warning screen marking the content as sensitive, making it generally inaccessible to minors, and requiring all other users to click through to see the content. The Board finds that this content, which sought to raise awareness of or document human rights abuses, had significant public interest value. While the initial removal of the content was in line with the rules in the Violent and Graphic Content Community Standard, Meta’s decision to restore the content with a sensitivity screen is consistent with its policies, values, and human rights responsibilities.

The Board notes, however, that Meta’s use of the newsworthiness allowance is not an effective means to keep up or restore content such as this at scale. The Board therefore recommends that Meta add a specific exception on raising awareness of or documenting abuses to the Violent and Graphic Content Community Standard. The Board further urges Meta to prioritize implementation of its earlier recommendation to introduce a policy on collection, preservation, and sharing of content that may evidence violations of international law.

2. Case description and background

On December 21, 2021, Meta referred a case to the Board concerning a graphic video which appeared to depict a civilian victim of violence in Sudan. The content was posted to the user's Facebook profile page on October 26, 2021, following a military coup in the country on October 25, 2021 and the start of protests against the military takeover of the government.

The video shows a person with a significant head wound and a visibly detached eye lying next to a car. Voices can be heard in the background saying in Arabic that someone has been beaten and left in the street. The post includes a caption, also in Arabic, calling on the people to stand together and not to trust the military, with hashtags referencing documenting military abuses and civil disobedience.

Meta explained that its technology identified the content as potentially violating its Violent and Graphic Content Community Standard on the same day that it was posted, October 26, 2021. Following human review, Meta determined that it violated Facebook’s Violent and Graphic Content policy and removed it. The content creator subsequently disagreed with the decision. On October 28, 2021, the content was escalated to policy and subject matter experts for their additional review. Following the review, Meta issued a newsworthiness allowance exempting the post from removal under the Violent and Graphic Content policy on October 29, 2021. However, due to an internal miscommunication, Meta did not actually restore the content until December 2, 2021, nearly five weeks later. When it restored the content, it also placed a warning screen on the video marking it as sensitive and requiring users to click through to view the content. The warning screen prohibits users under the age of 18 from viewing the video.

The post was viewed fewer than 1,000 times and no users reported the content.

The following factual background is relevant to the Board’s decision. Following the military takeover of the civilian government in Sudan in October 2021 and the start of civilian protests, security forces in the country fired live ammunition, used tear gas, and arbitrarily arrested and detained protesters, according to the UN High Commissioner for Human Rights. Security forces have also targeted journalists and activists, searching their homes and offices. Journalists have been attacked, arrested, and detained.

According to experts consulted by the Board, with the military takeover of state media and crackdown on Sudanese papers and broadcasters, social media became a crucial source of information and venue to document the violence carried out by the military. The military shut down the internet simultaneously with the arrest of civilian leadership on October 25, 2021, and consistent access to the internet since then has been regularly disrupted across the country.

3. Oversight Board Authority and Scope

The Board has authority to review decisions that Meta submits for review (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1.1). The Board may uphold or overturn Meta’s decision (Charter Article 3, Section 5), and this decision is binding on the company (Charter Article 4). Meta must also assess the feasibility of applying its decision in respect of identical content with parallel context (Charter Article 4). The Board’s decisions may include policy advisory statements with non-binding recommendations that Meta must respond to (Charter Article 3, Section 4; Article 4).

4. Sources of authority

The Oversight Board considered the following sources of authority:

I.Oversight Board decisions:

In previous decisions, the Board has considered and provided recommendations on Meta’s policies and processes. The most relevant include:

  • Case decision 2021-010-FB-UA (“Colombia Protests”): In this case, the Board noted that Meta does not make its criteria for escalating content to be reviewed for the newsworthiness allowance (which is the only way for the standard to be applied) publicly available and stated that “in an environment where outlets for political expression are limited, social media has provided a platform for all people, including journalists, to share information about the protests.” The Board recommended that Meta “develop and publicize clear criteria for content reviewers to escalate for additional review of public interest content... These criteria should cover content depicting large protests on political issues, in particular in contexts where states are accused of violating human rights and where maintaining public record of events is of heightened importance.”
  • Case decision 2021-001-FB-FBR (“Former President Trump’s Suspension”): In this case, the Board recommended that Meta “develop and publish a policy that governs Facebook’s response to crises or novel situations where its regular processes would not prevent or avoid imminent harm.” In January 2022, Meta held a Policy Forum on the “Crisis Policy Protocol” which was developed in response to the Board’s recommendation. However, Meta confirmed in one of its responses to questions posed by the Board that this protocol was not in place at the time of the coup in Sudan and subsequent content removal and restoration decisions in this case.

II.Meta’s content policies:

Facebook’s Community Standards:

Under the rationale for its Violent and Graphic Content policy, Meta states that it removes any content that "glorifies violence or celebrates suffering" but allows graphic content "to help people raise awareness." The rules of the policy prohibit posting "videos of people or dead bodies in non-medical settings if they depict dismemberment." According to its newsworthiness allowance, Meta allows violating content on its platforms if it is newsworthy and "if keeping it visible is in the public interest."

III. Meta’s values:

Meta's values are outlined in the introduction to Facebook's Community Standards. The values relevant to this case are those of “Voice,” “Safety,” “Privacy,” and “Dignity.” The value of "Voice" is described as "paramount":

The goal of our Community Standards has always been to create a place for expression and give people a voice. [We want] people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.

Meta limits “Voice” in service of four other values, and three are relevant here:

“Safety”: Content that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook.

“Privacy”: We’re committed to protecting personal privacy and information. Privacy gives people the freedom to be themselves, choose how and when to share on Facebook and connect more easily.

“Dignity”: We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.

IV: International human rights standards:

The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. In 2021, Meta announced its Corporate Human Rights Policy, where it reaffirmed its commitment to respecting human rights in accordance with the UNGPs. The Board's analysis of Meta’s human rights responsibilities in this case was informed by the following human rights standards:

  • The right to freedom of expression, including the ability to seek and receive information: Article 19, International Covenant on Civil and Political Rights ( ICCPR), General Comment No. 34, Human Rights Committee, 2011, UN Special Rapporteur on freedom of opinion and expression, reports: A/HRC/38/35 (2018) and A/73/348 (2018).
  • The best interest of the child: Articles 13 and 17, Convention on the Rights of the Child ( CRC); General Comment No. 25, Committee on the Rights of the Child, 2021, on children’s rights in relation to the digital environment.
  • The right to privacy: Article 17, ICCPR.
  • Access to effective remedy: Article 2, ICCPR; General Comment No. 31, Human Rights Committee, (2004); UNGPs, Principles 22, 29, 31.

5. User submissions

Following Meta's referral and the Board's decision to accept the case, the user was sent a message notifying them of the Board's review and providing them with an opportunity to submit a statement to the Board. The user did not submit a statement.

6. Meta’s submissions

In its referral, Meta stated that the decision on this content was difficult because it highlights the tension between the public interest value of documenting human rights violations and the risk of harm associated with sharing such graphic content. Meta also highlighted the importance of allowing users to document human rights violations during a coup and the shutdown of internet access in the country.

Meta informed the Board that immediately after the military coup occurred in Sudan, Meta created a crisis response cross-functional team to monitor the situation and communicate emerging trends and risks. According to Meta, this team observed “spikes in relation to reports of content depicting Graphic Violence and Violence and Incitement at times when protests were most active. [The team] was instructed to escalate requests to allow instances of graphic violence that would otherwise violate the Graphic and Violent Content policy, including content depicting state-backed human rights abuses.”

Meta noted that this video was taken in the context of widespread protests and real concerns regarding press freedom in Sudan. Meta also noted that this type of content could “warn users in the area of a threat to their safety and is particularly important during an internet blackout where journalists’ access to the location may be limited.”

Meta also stated that the decision to restore the content was in line with its values, especially the value of "Voice," which is paramount. Meta cited prior Board decisions stating that political speech is central to the value of "Voice." Case decisions 2021-010-FB-UA (“Colombia Protests”); 2021-003-FB-UA (“Punjabi concern over the RSS in India”); 2021-007-FB-UA (“Myanmar Bot”); and 2021-009-FB-UA (“Shared Al Jazeera post”).

Meta told the Board that it determined that its initial decision to remove the content was inconsistent with Article 19 of the ICCPR, specifically with the principle of necessity. Therefore, it restored the content pursuant to the newsworthiness allowance. To mitigate any potential risk of harm involved in allowing the graphic content to remain on the platform once restored, Meta restricted access to it to people over the age of 18 and applied a warning screen. Meta also noted in its case rationale that the decision to reinstate the content was consistent with the report of the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, particularly “the right to access information on human rights violations.”

Meta also noted that because it applied a warning screen that does not permit users under the age of 18 from seeing the content, it also considered the impact of the decision on the rights of the child. Meta told the Board when making its decision that it considered Article 13 of the Convention on the Rights of the Child and General Comment No. 25 On Children’s Rights in Relation to the Digital Environment, in protecting the child’s right to freedom of expression, including the “freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers.” Meta explained in its case rationale that its decision to restrict the visibility of the content to adults served the legitimate aim of protecting the safety of minors and was proportional to that aim.

The Board asked Meta 21 questions. Meta responded to 17 fully and 4 partially. The partial responses were to do with questions on measuring the impact of Meta’s automated system on content on the platform and why the Violent and Graphic Content Community Standard does not contain a raising awareness exception.

7. Public comments

The Board received five public comments for this case. Two comments were from Europe, one from Sub-Saharan Africa, and two from the United States and Canada.

The submissions covered the following themes: the need to adopt a more context-sensitive approach that would set a higher threshold for removal of content in regions subject to armed conflicts, so that less content is removed; the need to preserve materials for potential future investigations or to hold violators of human rights accountable; and that the newsworthiness allowance is likely to be applied in an ad hoc and contestable manner and that this practice should be reconsidered.

In March 2022, as part of ongoing stakeholder engagement, the Board spoke with approximately 50 advocacy organization representatives and individuals working on reporting and documenting human rights abuses, academics researching ethics, human rights, and documentation, and stakeholders interested in engaging with the Board on issues arising from the Violent and Graphic Content Community Standard and its enforcement in crisis or protest contexts. These ongoing engagements are held under the Chatham House Rule in order to ensure frank discussion and to protect the participants. The discussion touched on a number of themes including the vital role of social media within countries controlled by repressive regimes for documenting human rights violations and bringing international media and public attention to state-sanctioned violence; shared concerns that a universal standard on violent and graphic content is in practice a US-focused standard; and observed that the use of warning screens is useful to address the real problem of trauma, though some organizations reported that warning screens may limit the reach of their content.

To read public comments submitted for this case, please click here.

8. Oversight Board analysis

The Board looked at the question of whether this content should remain on the platform through three lenses: Meta's content policies, the company's values, and its human rights responsibilities.

8.1. Compliance with Meta’s content policies

I. Content rules

The Board agrees with Meta’s decision to restore this content to the platform with a warning screen and age restriction, but notes that there is a lack of clarity in Meta’s content policies and no effective means of implementing this response to similar content at scale.

Meta’s initial decision to remove the content was consistent with the rules within its Violent and Graphic Content Community Standard – the content violated the policy by depicting human dismemberment in a non-medical setting (a person with a significant head wound and a visibly detached eye). However, the policy rationale of the Violent and Graphic Content Community Standard states that “[Meta] allow[s] graphic content (with some limitations) to help people raise awareness about issues. [Meta] know[s] that people value the ability to discuss important issues such as human rights abuses or acts of terrorism.” Despite this reference in the policy rationale, the specific rules within the Community Standard do not include a “raising awareness” exception. Meta’s internal standards for its reviewers also do not include an exception for content seeking to raise awareness of or document human rights abuses.

In the absence of a specific exception within the Community Standard, the Board agrees with Meta’s decision to restore the content using the newsworthiness allowance. Meta states that it allows violating content to remain on the platform under the newsworthiness allowance if it determines that it is newsworthy and “keeping it visible is in the public interest [and] after conducting a balancing test that weighs the public interest against the risk of harm.”

II. Enforcement action

The Board notes that although Meta made the decision to issue a newsworthiness allowance and restore the post with a warning screen on October 29, 2021, the post was not actually restored to the platform until nearly five weeks later, on December 2, 2021. Meta said that communication about the final decision on the content occurred outside of its normal escalation management tools, “leading to the delay in taking appropriate action on the content.” The Board finds this explanation and the delay extremely troubling and emphasizes the importance of Meta taking timely action in relation to decisions such as this one, in the context of a public crisis and when the freedom of the press has been severely restricted. When Meta initially removed this content, it applied a 30-day feature limit preventing the user from creating new content, during a period when protestors in the streets and journalists reporting on the coup and the military crackdown were being met with severe violence and repression.

8.2. Compliance with Meta’s values

The Board concludes that keeping this content on the platform with a warning screen is consistent with Meta’s values of “Voice” and “Safety.”

The Board recognizes the importance of “Dignity” and “Privacy” in the context of protecting victims of human rights abuses. The content affects the dignity and privacy of the injured person in the video and their family; the person depicted is identifiable and they, or their family or loved ones, may not have wished for this type of footage of them to be broadcast.

The Board also notes the relevance of “Safety” in this context, which aims to protect users from content that poses a “risk of harm to the physical security of persons.” On one hand, the user sought to raise awareness of the ongoing coup, which could contribute to improving the safety of people in the region. On the other hand, the content may also create risks for the person shown in the video and/or their family.

The Board concludes that in a context where civic space and media freedom are curtailed by the state, the value of “Voice” becomes even more important. Here, “Voice” also serves to enhance the value of “Safety” by ensuring that people have access to information and that state violence is exposed.

8.3. Compliance with Meta’s human rights responsibilities

The Board finds that keeping the content on the platform with a warning screen is consistent with Meta’s human rights responsibilities. However, the Board concludes that Meta’s policies should be amended to better respect the right to freedom of expression for users seeking to raise awareness of or document abuses.

Freedom of expression (Article 19 ICCPR)

Article 19 of the ICCPR provides broad protection for freedom of expression, including the right to seek and receive information. However, the right may be restricted under certain specific conditions, known as the three-part test of legality (clarity), legitimacy, and necessity and proportionality. Although the ICCPR does not create obligations for Meta as it does for states, Meta has committed to respecting human rights as set out in the UNGPs. This commitment encompasses internationally recognized human rights as defined, among other instruments, by the ICCPR.

I. Legality (clarity and accessibility of the rules)

Any restriction on freedom of expression should be accessible and clear enough to provide guidance as to what is permitted and what is not. The Board concludes that the Violent and Graphic Content policy does not make clear how Meta permits users to share graphic content to raise awareness of or document abuses. The rationale for the Community Standard, which sets out the aims of the policy, does not align with the rules of the policy. The policy rationale states that Meta allows users to post graphic content “to help people raise awareness about” human rights abuses, but the policy prohibits all videos (whether shared to raise awareness or not) “of people or dead bodies in non-medical settings if they depict dismemberment.” While Meta correctly relied on the broader newsworthiness allowance to restore this content, the Violent and Graphic Content Community Standard does not make clear whether this type of content will be allowed on the platform.

The Board also concludes that the newsworthiness allowance does not make clear when content documenting human rights abuses or atrocities will benefit from the allowance. While we agree with Meta that determining newsworthiness can be “highly subjective,” the rule in question does not even define the term. The policy states that the company assigns “special value to content that surfaces imminent threats to public health or safety or that gives voice to perspectives currently being debated as part of a political process.” Emblematic examples and clear principles should guide the exercise of discretion in applying this allowance. Absent those, its use is likely to be inconsistent and arbitrary. Furthermore, the newsworthiness allowance makes no reference to the use of warning screens (or interstitials) for content that otherwise violates Meta’s policies.

Lastly, the Board, in a previous case (“Colombia Protests”), recommended that Meta “develop and publicize clear criteria for content reviewers to escalate for additional review public interest content.” Meta responded that it has already publicized the criteria for escalation through the Transparency Center article on newsworthiness. However, this article focuses on the factors Meta considers in applying the newsworthiness allowance, not on the criteria provided to moderators for when to escalate content (i.e., send it for additional review). If the newsworthiness allowance is intended to be part of the company’s scaled content moderation system, processes for escalation and use ought to facilitate that aim. The Board notes that the lack of clarity surrounding when, and how, the newsworthiness allowance is applied is likely to invite arbitrary application of this policy.

II. Legitimate aim

Restrictions on freedom of expression should pursue a legitimate aim, which includes the protection of the rights of others, such as the right to privacy of the depicted individual (General Comment 34, para. 28) and the right to physical integrity. Meta also notes in the rationale for the policy that “content that glorifies violence or celebrates the suffering or humiliation of others...may create an environment that discourages participation.” The Board agrees that the Violent and Graphic Content policy pursues several legitimate aims.

III. Necessity and proportionality

Restrictions on expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; [and] they must be proportionate to the interests to be protected” (General Comment 34, para. 34).

In this case, the Board concludes that placing a warning screen on the content was a necessary and proportionate restriction on freedom of expression. The warning screen does not place an undue burden on those who wish to see the content, while informing others about its nature and allowing them to decide whether or not to see it. The warning screen also adequately protects the dignity of the individual depicted and their family.

The Board also notes that, as discussed in Section 8.1, Meta’s restoration of the post was delayed by nearly five weeks. This delay had a disproportionate impact on freedom of expression in the context of ongoing violence and the restricted media environment in Sudan. A delay of this length undermines the benefits of this speech, which are to warn civilians and to raise awareness.

The Board also concludes that because the newsworthiness allowance is used infrequently, it is not an effective mechanism for allowing content that documents abuses or seeks to raise awareness to remain on the platform at scale. Meta told the Board that it “documented 17 newsworthy allowances in connection with the Violent Graphic Content policy over the past 12 months (12 months prior to March 8, 2022). The content in this case represents one of those 17 allowances.” By comparison, Meta removed 90.7 million pieces of content under this Community Standard in the first three quarters of 2021. The Board finds it unlikely that only 17 pieces of content related to this policy, globally, over a year, should have been allowed to remain on the platform as newsworthy and in the public interest. The newsworthiness allowance does not provide an adequate mechanism for preserving content of this nature on the platform. To avoid censoring protected expression, Meta should amend the Violent and Graphic Content policy itself to allow such content to remain on the platform.
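As a rough, back-of-the-envelope illustration only (the two figures above cover different, only partially overlapping reporting periods), the scale of the disparity can be expressed as:

\[
\frac{17\ \text{allowances}}{90{,}700{,}000\ \text{removals}} \approx 1.9 \times 10^{-7} \approx 0.00002\%
\]

that is, roughly one newsworthiness allowance for every 5.3 million removals under this Community Standard.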

In contexts of war or political unrest, more graphic and violent content will be captured by users and shared on the platform for the purpose of raising awareness of or documenting abuses. This content is important for promoting accountability. The Board, in the “Former President Trump’s Suspension” case, noted that Meta has a responsibility to “collect, preserve and, where appropriate, share information to assist in the investigation and potential prosecution of grave violations of international criminal, human rights and humanitarian law by competent authorities and accountability mechanisms.” The Board also recommended that Meta clarify and state in its Corporate Human Rights Policy its protocols for making previously public content available to researchers while respecting international standards and data protection laws. In response, Meta committed to briefing the Board on ongoing efforts to address the issue. Since the Board published this recommendation on May 5, 2021, Meta has not reported any progress. The Board finds a delay of this length and the lack of progress concerning, given the role the platform plays in situations of violent conflict (e.g., the current war in Ukraine, where users are documenting abuses through social media) and political unrest around the globe.

Finally, the Board recalls its recommendation from the “Former President Trump’s Suspension” case that Meta “develop and publish a policy that governs Facebook’s response to crises or novel situations where its regular processes would not prevent or avoid imminent harm.” Meta reported in its Q4 2021 Update on the Oversight Board that it had prepared a proposal for a new Crisis Protocol in response to this recommendation and that the proposal had been adopted. Meta also stated that it would soon provide information on the protocol in the Transparency Center. Meta informed the Board that this protocol was not in place at the time of the coup in Sudan, nor was it operational during the review of this case; the company plans to launch it later in 2022. A well-designed protocol should guide Meta in developing and implementing necessary and proportionate responses in crisis situations. Meta should move more quickly to implement the protocol and provide as much detail as possible on how it will operate and interact with existing Meta processes. Meta’s platforms play a prominent role in conflicts and crisis situations around the world, and the company must be prepared to respond quickly and systematically to prevent mistakes.

9. Oversight Board decision

The Oversight Board upholds Meta's decision to leave up the content with a screen that restricts access to those over 18.

10. Policy advisory statement

Content policy

1. Meta should amend the Violent and Graphic Content Community Standard to allow videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human rights abuses. This content should be allowed with a warning screen so that people are aware that content may be disturbing. The Board will consider this recommendation implemented when Meta updates the Community Standard.

2. Meta should undertake a policy development process that develops criteria to identify videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human rights abuses. The Board will consider this recommendation implemented when Meta publishes the findings of the policy development process, including information on the process and criteria for identifying this content at scale.

3. Meta should make explicit in its description of the newsworthiness allowance all the actions it may take (for example, restoration with a warning screen) based on this policy. The Board will consider this recommendation implemented when Meta updates the policy.

Enforcement

4. To ensure users understand the rules, Meta should notify users when it takes action on their content based on the newsworthiness allowance, including the restoration of content or the application of a warning screen. The user notification may link to the Transparency Center explanation of the newsworthiness allowance. The Board will consider this implemented when Meta rolls out the updated notification to users in all markets and demonstrates through enforcement data that users are receiving it.

*Procedural note:

The Oversight Board’s decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.

For this case decision, independent research was commissioned on behalf of the Board from an independent research institute headquartered at the University of Gothenburg, which draws on a team of more than 50 social scientists on six continents, as well as more than 3,200 country experts from around the world. The Board was also assisted by Duco Advisors, an advisory firm focusing on the intersection of geopolitics, trust and safety, and technology.