PUBLISHED
2023-040-FB-UA, 2023-041-FB-UA

Mention of Al-Shabaab

In this summary decision, the Board reviewed two posts referring to the terrorist group Al-Shabaab.

2 cases included in this bundle

OVERTURNED

FB-41ERXHF1
Case about dangerous individuals and organizations on Facebook


Facebook
Dangerous individuals and organizations
Somalia
Published on 22nd November 2023

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors, and inform the public about the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not involve a public comment process, and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

In this summary decision, the Board reviewed two posts referring to the terrorist group Al-Shabaab. After the Board brought these two appeals to Meta’s attention, the company reversed its original decisions and restored both posts.

Case Description and Background

For the first case, in July 2023, a Facebook user, whose account appears to be a news outlet, posted a picture showing weapons and military equipment lying on the ground at soldiers' feet, with a caption saying that "Somali government forces" and "residents" had undertaken a military operation and killed Al-Shabaab forces in the Mudug region of Somalia.

For the second case, also in July 2023, a Facebook user posted two pictures with a caption. The first picture shows a woman painting over a blue pillar with black paint. The second picture shows a black Al-Shabaab emblem painted on the pillar. The caption says, "the terrorists that used to hide have come out of their holes, and the world has finally seen them."

Harakat al-Shabaab al-Mujahideen, popularly known as Al-Shabaab ("the Youth" in Arabic), is an Islamist terrorist group with links to al-Qa'ida that is working to overthrow the Somali government. The group mainly operates in Somalia and has carried out several attacks in neighboring countries.

Meta originally removed both posts from Facebook, citing its Dangerous Organizations and Individuals (DOI) policy, under which the company removes content that "praises," "substantively supports," or "represents" individuals and organizations the company designates as dangerous. However, the policy recognizes that "users may share content that includes references to designated dangerous organizations and individuals to report on, condemn or neutrally discuss them or their activities."

In their appeals to the Board, both users argued that their content did not violate Meta's Community Standards. The user in the first case described their account as a news outlet and stated that the post was a news report about the government operation against the terrorist group Al-Shabaab. The user in the second case stated that the aim of the post was to inform and raise awareness about the activities of Al-Shabaab and to condemn the group.

After the Board brought these two cases to Meta's attention, the company determined that the posts did not violate its policies. Although the posts refer to Al-Shabaab, a designated dangerous organization, they do not praise Al-Shabaab but instead report on and condemn the group. Meta concluded that its initial removals were incorrect, as the posts fell within the exception to the DOI policy, and restored both pieces of content to the platform.

Board Authority and Scope

The Board has authority to review Meta's decisions following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

Where Meta acknowledges that it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for people who use Facebook and Instagram.

Case Significance

These cases highlight over-enforcement of Meta's DOI policy in a country experiencing armed conflict and terrorist attacks. This kind of error undermines genuine efforts to condemn, report on, and raise awareness about terrorist organizations, including alleged human rights abuses and atrocities committed by such groups.

Previously, the Board has issued several recommendations regarding Meta's DOI policy. These include a recommendation to "assess the accuracy of reviewers enforcing the reporting allowance under the DOI policy to identify systemic issues causing enforcement errors," on which Meta has shown progress on implementation ("Mention of the Taliban in News Reporting," recommendation no. 5). The Board has also recommended that Meta "add criteria and illustrative examples to Meta's DOI policy to increase understanding of exceptions, specifically around neutral discussion and news reporting," a recommendation for which Meta demonstrated implementation through published information ("Shared Al Jazeera Post," recommendation no. 1). Furthermore, the Board has recommended that Meta "implement an internal audit procedure to continuously analyze a statistically representative sample of automated content removal decisions to reverse and learn from enforcement mistakes" ("Breast Cancer Symptoms and Nudity," recommendation no. 5). Meta described this recommendation as work it already does but did not publish information to demonstrate implementation.

Decision

The Board overturns Meta’s original decisions to remove the two pieces of content. The Board acknowledges Meta’s correction of its initial error once the Board brought these cases to Meta’s attention.