Multiple Case Decision

Proud Boys News Article

2 cases included in this bundle

FB-2JHTL3QD (Overturned)

Case about dangerous individuals and organizations on Facebook

Platform: Facebook
Topic: Freedom of expression, Journalism, News events
Standard: Dangerous individuals and organizations
Location: United States
Date: Published on February 27, 2024

FB-ZHVJLX60 (Overturned)

Case about dangerous individuals and organizations on Facebook

Platform: Facebook
Topic: Freedom of expression, Journalism, News events
Standard: Dangerous individuals and organizations
Location: United States
Date: Published on February 27, 2024

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors and inform the public about the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not consider public comments and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

The Board reviewed two Facebook posts, removed by Meta, which linked to a news report about the criminal sentencing of Proud Boys members. After the Board brought these two appeals to Meta’s attention, the company reversed its original decisions and restored both posts.

Case Description and Background

In September 2023, two Facebook users posted a link to a news article about the conviction and sentencing of a member of the Proud Boys who participated in the January 6, 2021, attack on the U.S. Capitol. The article includes a picture of a group of men, each wearing a T-shirt bearing the text “Proud Boys” and the group’s logo. Neither user added a comment or caption when sharing the link.

The Proud Boys is a far-right group founded in 2016 that quickly became known for violence and extremism, including playing a significant role in the January 6, 2021, attack on the U.S. Capitol, for which many group members have been prosecuted.

Meta originally removed both posts from Facebook, citing its Dangerous Organizations and Individuals policy, under which the company prohibits representation of, and certain speech about, individuals and organizations that Meta designates as dangerous, as well as unclear references to them. However, the policy recognizes that “users may share content that includes references to designated dangerous organizations and individuals to report on, condemn or neutrally discuss them or their activities.”

In their appeals to the Board, both users argued that their content did not violate Meta’s Community Standards. The user in the first case said they had reposted the news article to inform people about the conviction of the Proud Boys leader, and argued that a human reviewer, rather than a bot, would have concluded that the content did not violate Meta’s Community Standards. In the second case, the user stated that the purpose of the post was to inform people that justice had been done with regard to an act of terrorism. They also emphasized the importance of human moderation in such instances, since Meta’s automated systems made an incorrect decision, likely influenced by the words used in the article rather than its context.

After the Board brought these two cases to Meta’s attention, the company determined that the posts did not violate its policies. Although the posts refer to the Proud Boys, a designated organization, they simply report on the group. Meta concluded that its initial removal was incorrect as the posts fall into the exception that permits users “to report on, condemn or neutrally discuss” dangerous organizations and individuals. Meta restored both pieces of content to the platform.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

These cases illustrate the challenges associated with enforcement of exceptions for news reporting, as set out in Meta’s Dangerous Organizations and Individuals Community Standard. This kind of error directly impacts users’ ability to share, on Meta’s platforms, external links to news reports that relate to a designated organization, even when the posts are neutral and have public value.

Previously, the Board has issued several recommendations regarding Meta’s Dangerous Organizations and Individuals policy and news reporting. These include a recommendation to “add criteria and illustrative examples to Meta’s DOI policy to increase understanding of exceptions, specifically around neutral discussion and news reporting,” which Meta has implemented, as demonstrated through published information (Shared Al Jazeera Post, recommendation no. 1). The Board has also urged Meta to “assess the accuracy of reviewers enforcing the reporting allowance under the DOI policy to identify systemic issues causing enforcement errors” (Mention of the Taliban in News Reporting, recommendation no. 5). Furthermore, the Board has recommended that Meta “conduct a review of the HIPO ranker [high-impact false positive override system] to examine if it can more effectively prioritize potential errors in the enforcement of allowances to the Dangerous Organizations and Individuals policy, including news reporting content, where the likelihood of false-positive removals that impact freedom of expression appears to be high” (Mention of the Taliban in News Reporting, recommendation no. 6). Meta reported implementing the last two recommendations but did not publish further information, so this implementation cannot be verified.

Despite Meta’s report that it has implemented all these recommendations, the Board remains concerned: these two cases underscore the need for more effective measures in line with the Board’s recommendations.

The Board emphasizes that full adoption of these recommendations, together with published information demonstrating their successful implementation, could reduce the number of incorrect removals of news reports under Meta’s Dangerous Organizations and Individuals policy.

Decision

The Board overturns Meta’s original decisions to remove these two pieces of content. The Board acknowledges Meta’s correction of its initial errors once the Board brought these cases to Meta’s attention.
