Oversight Board Upholds Meta’s Decision in Weapons Post Linked to Sudan’s Conflict Case

The Oversight Board has upheld Meta’s decision to remove a post containing a graphic of a gun cartridge, accompanied by a caption providing instructions on how to create and throw a Molotov cocktail. The Board finds the post violated Facebook’s Violence and Incitement Community Standard, posing an imminent risk of harm that could exacerbate ongoing violence in Sudan. This case has raised broader concerns about Meta’s human rights responsibilities for content containing instructions for weapons shared during armed conflicts. To meet these responsibilities, Meta should ensure exceptions to its violence and incitement rules are clearer. Additionally, Meta should develop tools to correct its own mistakes when it has sent the wrong notification to users about which Community Standard their content violated.

About the Case

In June 2023, a Facebook user posted an illustration of a gun cartridge, with the components identified in Arabic. The post’s caption provides instructions on how to create a Molotov cocktail using the components and advises wearing a helmet when throwing the incendiary device. It concludes with a call for victory for the Sudanese people and the Sudanese Armed Forces (SAF).

Two months before the content was posted, fighting broke out in Sudan between the SAF and the Rapid Support Forces (RSF), a paramilitary group designated as dangerous by Meta in August 2023. Sudan’s armed conflict is ongoing and has spread across the country, with both sides having used explosive weapons in areas densely populated by civilians.

Meta’s automated systems detected the content, determining that it violated Facebook’s Violence and Incitement Community Standard. Meta removed the post, applying a standard strike to the user’s profile. The user immediately appealed. This led to one of Meta’s human reviewers finding that the post violated the Restricted Goods and Services policy. The user then appealed to the Board, after which Meta determined the content should have been removed but, as per its original decision, under the Violence and Incitement Community Standard.

Key Findings

The Board finds the post violated the Violence and Incitement policy in two ways. First, the combined effect of the image and caption violated the rule that prohibits “instructions on how to make or use weapons where there is language explicitly stating the goal to seriously injure or kill people.” Regardless of the intent of the person who created the post, the step-by-step guide on how to build a Molotov cocktail and the advice to “use a helmet” indicate the content is calling on people to act on the instructions. Second, the Violence and Incitement policy prohibits instructions on how to make weapons unless there is “context that the content is for a non-violent purpose,” and resorting to violence in support of the SAF during the ongoing armed conflict is not a non-violent purpose.

The rule that prohibits instructions on making and using weapons does include an exception for content shared for “recreational self-defense, military training purposes, commercial video games or news coverage.” Stakeholders consulted by the Board, as well as news reports, have claimed that Meta allows such instructions in the exercise of self-defense in some armed conflicts. Meta has denied this. The Board is not in a position to determine the truth of these competing claims.

What is essential, however, is that Meta’s rules on such an important issue are clear and enforced consistently and rigorously. Given that combatants and civilians use Meta’s platforms during conflicts to share information on the use of weapons, or violent content framed as self-defense, Meta should clarify what the “recreational self-defense” and “military training” exceptions mean. The Board disagrees with Meta that these terms, as they appear in the public language of the Violence and Incitement Community Standard, have “a plain meaning.” To improve clarity, Meta should specify which actors can benefit from the “recreational self-defense” exception and in which settings it applies. Additionally, the public language of the policy on instructions to make or use weapons or explosives does not expressly state that self-defense claims are not considered during armed conflicts.

This case also highlights another unclear exception to the Violence and Incitement Community Standard, which allows threats directed at terrorists and other violent actors. The exception is insufficiently clear because Meta does not say whether it applies to all organizations and individuals designated under its separate Dangerous Organizations and Individuals policy. This matters here because the RSF was designated during the period relevant to this case in 2023. However, it is impossible for users to know whether their post could be removed on this basis, since the list of designated organizations and individuals is not publicly available. The Board has already raised concerns about this lack of clarity in its Haitian Police Station Video decision.

The Board is also concerned that Meta’s notification system does not allow the company to rectify its own mistakes when it miscommunicates which Community Standard a user has violated. Correctly informing users of their violation is crucial to guaranteeing fairness, and incorrect notifications undermine a user’s ability to appeal and access remedy. In this case, the user was informed in error that their post had been removed for hate speech, even though it was taken down for violating the Violence and Incitement Community Standard. The Board therefore encourages Meta to explore technically feasible ways to correct user notifications.

The Oversight Board’s Decision

The Oversight Board has upheld Meta’s decision to remove the post.

The Board recommends that Meta:

  • Amend its Violence and Incitement policy to include a definition of “recreational self-defense” and “military training” as exceptions to its rules prohibiting users from providing instructions on making or using weapons, and clarify that it does not allow any self-defense exception for such instructions in an armed conflict.
  • Develop tools to rectify its own mistakes when sending users messages that notify them about the Community Standard they violated, so that users can correctly understand which policies their content violated.

For Further Information

To read the full decision, click here.

To read a synopsis of public comments for this case, please click here.
