Overturned
Statement Against the Former President of Senegal
July 31, 2025
A user appealed Meta's decision to remove a Facebook post that criticized Senegal's debt levels under former President Macky Sall and stated that Mr. Sall deserves "the public guillotine." After the Board brought the appeal to Meta's attention, the company reversed its original decision and restored the post.
Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company's attention and include information about Meta's acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta's decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.
Summary
A user appealed Meta's decision to remove a Facebook post that criticized Senegal's debt levels under former President Macky Sall and stated that Mr. Sall deserves "the public guillotine." After the Board brought the appeal to Meta's attention, the company reversed its original decision and restored the post.
About the Case
In February 2025, a user posted on Facebook a series of crying emojis followed by a statement that Senegal's national debt under former President Macky Sall appeared to be nearly equivalent to the entirety of the country's gross domestic product and that Mr. Sall deserves "the public guillotine."
Around the time the content was posted, Senegal's Court of Auditors released a report that showed former President Sall's government had misreported economic data, including debt figures.
The user who appealed the case to the Board claimed that the removal of their content demonstrated "unfair restrictive measures that undermine our freedom of expression."
Meta's Violence and Incitement policy states that the company removes "threats of violence that could lead to death (or other forms of high severity violence)" targeted at any individual, including public figures. However, the policy rationale states that the company tries to "consider the language and context in order to distinguish casual or awareness-raising statements from content that constitutes a credible threat to public or personal safety." The policy rationale also highlights that Meta "considers additional information such as a person’s public visibility and the risks to their physical safety" to determine whether threats are credible.
After the Board brought this case to Meta's attention, the company determined that the content did not violate Meta's Violence and Incitement policy and that its removal was incorrect. The company stated that, given the "broader social and political situation in Senegal," the post appears to "express condemnation of the government's misreporting of key economic data rather than a genuine call for violence against Sall." Furthermore, when assessing whether the content represented a credible threat, the company noted that Mr. Sall currently lives outside of Senegal and that the guillotine "is not a modern method of execution." The company then restored the content to Facebook.
Board Authority and Scope
The Board has authority to review Meta's decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
Where Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.
Significance of Case
This case highlights the over-enforcement of Meta's Violence and Incitement policy and how the company's shortcomings in differentiating between credible and non-credible violent threats at scale may impact political speech. Enforcement errors of this nature are particularly concerning when they affect political speech in countries such as Senegal, where press freedom and, more broadly, freedom of expression face constraints.
Meta's errors when distinguishing, at scale, between credible and non-credible threats against public figures have been discussed by the Board in previous cases. In the Iran Protest Slogan decision, the Board determined that a widely used protest slogan – which translates literally as a call for the death of Iran's Supreme Leader Ayatollah Khamenei – was used rhetorically to express disapproval. Additionally, in the Statement About the Japanese Prime Minister decision, the Board highlighted that "the threat against a political leader [former Japanese Prime Minister Fumio Kishida] was intended as non-literal political criticism calling attention to alleged corruption, using strong language." The Board also expressed concern that Meta’s Violence and Incitement policy "does not clearly distinguish literal from figurative threats."
The Board has issued recommendations in the Iran Protest Slogan and Statement About the Japanese Prime Minister decisions that are relevant to this case. First, the Board recommended that Meta "amend the Violence and Incitement Community Standard to (i) explain that rhetorical threats like 'death to X' statements are generally permitted, except when the target of the threat is a high-risk person; (ii) include an illustrative list of high-risk persons, explaining they may include heads of state; (iii) provide criteria for when threatening statements directed at heads of state are permitted to protect clearly rhetorical political speech in protest contexts that does not incite to violence" (Iran Protest Slogan decision, recommendation no. 1). Meta has reported progress towards implementing this recommendation, stating that the company is undertaking a policy development process to evaluate the enforcement of "calls for death" on its platforms (Meta's H2 2024 Bi-Annual Report on the Oversight Board – Appendix).
Additionally, the Board recommended that Meta "update [the company's] internal guidelines for at-scale reviewers about calls for death using the phrase 'death to' when directed against high-risk persons," specifically to "allow posts that, in the local context and language, express disdain or disagreement through non-serious and casual ways of threatening violence" (Statement About the Japanese Prime Minister decision, recommendation no. 2). Meta has also reported progress towards implementing this recommendation, explaining that the company's policy development related to "calls for death" includes re-examining the policy to "consider potential non-serious and figurative statements." Meta stated the company is "exploring refinements" to the Violence and Incitement policy to "enable more nuance" (Meta's H2 2024 Bi-Annual Report on the Oversight Board – Appendix).
The Board believes that the full implementation of both recommendations would contribute to decreasing the number of errors in the enforcement of Meta's Violence and Incitement policy by making the company's assessment of whether threats are credible more nuanced and context-focused.
Decision
The Board overturns Meta's original decision to remove the content. The Board acknowledges Meta's correction of its initial error once the Board brought the case to Meta's attention.