Multiple Case Decision

Goebbels Quote

4 cases included in this bundle

FB-EZ2SSLB1 (Overturned)
Case about dangerous individuals and organizations on Facebook
Platform: Facebook
Topic: Freedom of expression, Misinformation
Standard: Dangerous individuals and organizations
Location: Canada, United Kingdom, United States
Date: Published on December 18, 2023

FB-GI0MEB85 (Overturned)
Case about dangerous individuals and organizations on Facebook
Platform: Facebook
Topic: Freedom of expression, Misinformation
Standard: Dangerous individuals and organizations
Location: Germany, United States
Date: Published on December 18, 2023

FB-2X73FNY9 (Overturned)
Case about dangerous individuals and organizations on Facebook
Platform: Facebook
Topic: Freedom of expression, Misinformation
Standard: Dangerous individuals and organizations
Location: Australia
Date: Published on December 18, 2023

FB-PFP42GAJ (Overturned)
Case about dangerous individuals and organizations on Facebook
Platform: Facebook
Topic: Freedom of expression, Misinformation
Standard: Dangerous individuals and organizations
Location: United States
Date: Published on December 18, 2023

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors and inform the public about the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not involve a public comments process and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

In this summary decision, the Board considers four posts together. Four separate users appealed Meta’s decisions to remove posts that contain a quote attributed to Joseph Goebbels, the Nazi Party’s propaganda chief. Each post shared the quote to criticize the spread of false information in the present day. After the Board brought the appeals to Meta’s attention, the company reversed its original decisions and restored each of the posts.

Case Description and Background

The four posts contain a variation of the same quote attributed to Joseph Goebbels, which states: “A lie told once remains a lie, but a lie told a thousand times becomes the truth.” Each user added a caption to accompany the quote. The captions contain the users’ opinions on perceived historical parallels between Nazi Germany and present-day political discourse, as well as threats to free expression posed by the normalization of false information.

Meta originally removed the four posts from Facebook, citing its Dangerous Organizations and Individuals policy, under which the company removes content that “praises,” “substantively supports” or “represents” individuals and organizations it designates as dangerous, including the Nazi Party. The policy allows content that discusses a dangerous organization or individual in a neutral way or that condemns its actions.

The four users appealed the removal of their content to the Board. In their appeals, each stated that they had included the quote not to endorse Joseph Goebbels or the Nazi Party, but to criticize the negative effect of false information on their political systems. They also highlighted the relevance of historical lessons to the dangers of propaganda.

After the Board brought these cases to Meta’s attention, the company determined that the content did not violate its Dangerous Organizations and Individuals policy and that the four removals were incorrect. The company then restored the content to Facebook. Meta stated that the content did not contain any support for the Nazi Party, but rather included descriptions of “the Nazi regime’s campaign to normalize falsehoods in order to highlight the importance of ethics and epistemic standards for free speech.”

Board Authority and Scope

The Board has authority to review Meta's decisions following appeals from the person whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

These cases highlight Meta’s failure to distinguish between supportive references to organizations it designates as dangerous, which are prohibited, and neutral or condemning references, which the company allows. The Board has previously issued multiple recommendations on Meta’s Dangerous Organizations and Individuals policy. Continued errors in applying the exceptions to this Community Standard significantly limit users’ free expression, making this a crucial area for further improvement by the company.

In a previous decision, the Board recommended that “Meta should assess the accuracy of reviewers enforcing the reporting allowance under the Dangerous Organizations and Individuals policy in order to identify systemic issues causing enforcement errors” (Mention of the Taliban in News Reporting decision, recommendation no. 5).

The Board further urged Meta to “conduct a review of the high impact false positive override (HIPO) ranker to examine if it can more effectively prioritize potential errors in the enforcement of allowances to the Dangerous Organizations and Individuals policy” (Mention of the Taliban in News Reporting decision, recommendation no. 6). The HIPO ranker prioritizes content decisions for additional review; Meta uses it to identify cases in which it has acted incorrectly, for example, by wrongly removing content. Meta is still assessing the feasibility of this recommendation.

The Board also asked Meta to “enhance the capacity allocated to HIPO review across languages to ensure more content decisions that may be enforcement errors receive additional human review” (Mention of the Taliban in News Reporting decision, recommendation no. 7). Meta has reported that this recommendation reflects work the company already does, but it has not published information to demonstrate this.

In addition, the Board recommended that Meta “explain and provide examples of the application of key terms used in the Dangerous Organizations and Individuals policy, including the meanings of ‘praise,’ ‘support’ and ‘representations,’” and said those public explanations “should align with the definitions used in Facebook's Internal Implementation Standards” (Nazi Quote decision, recommendation no. 2). Meta implemented this recommendation.

As these cases illustrate, invoking a notoriously dangerous figure by analogy to criticize a present-day person or practice is a common and entirely legitimate form of political discourse. These four cases demonstrate the need for more effective measures along the lines of the Board’s recommendations.

Decision

The Board overturns Meta’s original decisions to remove the content. The Board acknowledges Meta’s correction of its initial errors once the Board brought these cases to Meta’s attention.
