Decision on Multiple Cases

Eating Disorder Awareness Posts

The Oversight Board, in analyzing two cases on posts raising awareness about eating disorders during Eating Disorders Awareness Week, finds Meta should strengthen its preparedness for spiked engagement during these periods, when public interest speech could be wrongly removed.

2 cases included in this bundle

Overturned

IG-XMV96DJO

Case about suicide and self-injury on Instagram

Platform
Instagram
Topic
Freedom of expression, Health
Standard
Suicide and Self-Injury
Location
United States
Date
Published on November 4, 2025
Overturned

IG-3YAH57M8

Case about suicide and self-injury on Instagram

Platform
Instagram
Topic
Freedom of expression, Health
Standard
Suicide and Self-Injury
Location
United States
Date
Published on November 4, 2025

Summary

The Oversight Board, in analyzing two cases on posts raising awareness about eating disorders during Eating Disorders Awareness Week, finds Meta should strengthen its preparedness for spiked engagement during these periods, when public interest speech could be wrongly removed. The Board is concerned about the potential impact on the visibility of awareness-raising content. Meta’s platforms should allow users to support one another rather than impede their freedom of expression around helpful content. Users need to be able to provide additional context in their appeals, explaining whether their content falls within exceptions to the Suicide, Self-Injury and Eating Disorders policy, to improve reviews and reduce enforcement errors. The Board overturns Meta's original decision to take down the content in these cases.

About the Cases

In the first case, an Instagram user in the United States posted a photo carousel including photos of the user. The caption shared a personal account of experiencing an eating disorder, a desire to educate people on such disorders and gratitude for support.

In the second case, another Instagram user, also in the United States, shared a photo carousel involving images with text providing advice on how to talk about people perceived to be skinny or underweight. The third image in the carousel advised people not to guess someone’s clothing size and to avoid commenting that people may be wasting away.

Both posts had commonly used awareness-raising hashtags and were shared during Eating Disorders Awareness Week, separately, in 2023 and 2025.

Meta’s automated systems in 2025 identified the post in the first case and the third image in the second case as potentially violating. Human reviewers determined they both violated the Suicide, Self-Injury and Eating Disorders policy. For the first post, the full photo carousel was visible to the reviewer, and for the second post, only the third image was visible. Meta removed the first post entirely and the third image in the second post.

After both users appealed the removals, Meta eventually upheld its decisions. The users then appealed to the Board. When the Board selected these cases, Meta concluded the posts were shared in non-violating contexts and restored them.

Key Findings

The Board finds that the posts did not violate Meta’s Suicide, Self-Injury and Eating Disorders Community Standard. Removing them was also inconsistent with Meta’s human rights responsibilities, as it was not necessary and proportionate to protect public health.

These cases indicate three areas of improvement for fulfillment of Meta’s human rights commitments related to awareness-raising and supportive content: preparedness for global recurring awareness-raising periods; visibility of awareness-raising content; and improvements to appeal review. Meta’s platforms should allow users to support one another rather than impede their freedom of expression around helpful content.

Meta should strengthen its preparedness for awareness-raising weeks as predictable, recurring periods when public interest speech could be wrongly removed. As a global company, Meta should develop a calendar of global awareness-raising periods and use it to adjust enforcement practices.

Adequate tooling is necessary, and users need to be able to provide additional context in their appeals to improve appeal review and reduce enforcement errors. The Board is concerned that, on appeal, secondary reviewers in both cases were unable to complete the review due to content loading failures in internal tooling. While another reviewer had already found both posts to be non-violating, the initial decision to remove both posts was enforced. Meta should provide adequate tooling for holistic initial human review and appeal review across all content types and product features, including photo carousels.

The Oversight Board’s Decision

The Oversight Board overturns Meta's original decision to take down the content.

The Board also recommends that Meta:

  • Share the specific measures it takes to prevent overenforcement of content during awareness-raising periods and whether these measures differ from those applied at other times in the enforcement of such content.

The Board reiterates the importance of its previous recommendations on reviewer accuracy-rate assessments, that Meta should:

  • Conduct regular assessments on reviewer accuracy rates.
  • Improve its transparency reporting by increasing public information on error rates via making this information viewable by country and language for each Community Standard.

*Case summaries provide an overview of cases and do not have precedential value.

Full Case Decision

  1. Case Description and Background

This decision addresses two cases from the United States with posts in English featuring multiple images that Meta terms a “photo carousel.” Both posts also had captions and commonly used awareness-raising hashtags and were shared during Eating Disorders Awareness Week, separately, in 2023 and 2025.

In the first case, an Instagram user posted a photo carousel that included photos of the user. The caption shared the user’s personal experience of an eating disorder, a desire to educate people on such disorders, gratitude for support and included hashtags raising awareness of eating disorders and of the Eating Disorders Awareness Week.

In the second case, another Instagram user, who identifies as a mental health professional, shared a photo carousel involving several images with text providing advice on how to talk about the weight and size of people perceived to be skinny or underweight. The images included examples of inappropriate statements with alternative suggestions on how to address these issues more sensitively. The third image in the carousel advised people not to guess someone’s clothing size and to avoid commenting that people may be wasting away. The caption noted that while the user had not personally experienced an eating disorder, people have made comments to them about their perceived low weight. The caption contained several hashtags, including to raise awareness of eating disorders and Eating Disorders Awareness Week.

In March 2025, more than two years after it was posted, Meta’s automated systems identified the first post as potentially violating and sent it for human review. Similarly, in late February 2025, the day after the second carousel was posted, Meta’s automated systems identified its third image as potentially violating and sent it for human review. The reviewers determined both posts violated Meta’s Suicide, Self-Injury and Eating Disorders policy. For the first post, the full photo carousel was visible to the reviewer, and for the second post, only the third image of the carousel was visible.

Meta removed the first post entirely, and the third image in the second post, leaving the rest of that carousel on Instagram. The user who posted the first post did not receive a strike as the reviewer determined the content was shared in a “positive” promotional context. The second user received a severe strike and a 30-day feature limit, preventing them from going live and posting ads, as the reviewer concluded the post was shared in an “encouraging” promotional context. In its submissions to the Board, Meta said that it differentiates between content that “encourages” eating disorders, either explicitly or through means such as providing instructions, and content that speaks “positively” about an eating disorder, without encouraging others into disordered eating. Both are subject to removal, with only the former resulting in a strike.

Both users appealed Meta’s decisions. During the first review of each post, reviewers found them non-violating. The posts were then sent for a second review, but the additional reviewers were unable to complete their review as the images failed to load in the moderator’s internal review tool. While the first reviewers had already found both posts to be non-violating, Meta upheld its initial removal decisions. The users appealed Meta’s decisions to the Board.

When the Board selected these cases, Meta’s subject matter experts reviewed the posts again and concluded that they were shared in non-violating contexts. The company reversed its original decisions, restored both posts in their entirety and reversed the strike on the second user’s account.

2. User Submissions

The users who appealed to the Board explained that they intended to raise awareness about eating disorders and recovery. The first user noted that they shared a personal story without any graphic imagery. The second user stated that they contrasted harmful expressions with advice on how to communicate more sensitively.

3. Meta’s Content Policies and Submissions

I. Meta’s Content Policies

The Suicide, Self-Injury and Eating Disorders policy prohibits people from “intentionally or unintentionally celebrat[ing] or promot[ing] suicide, self-injury or eating disorders,” but allows users to “share their experiences, raise awareness about these issues and seek support from one another.”

Written or verbal admissions of suicide, self-injury or eating disorders are allowed, but only adults aged 18 and over can view the content, and Meta sends resources to posting users. Meta’s internal guidelines to human reviewers allow such content only if there is no graphic self-injury imagery present.

The policy rationale notes that content about recovery from suicide, self-injury or eating disorders is allowed. According to the internal guidelines, content is shared in a recovery context if it contains a clear statement that the user has or is healing from a past eating disorder. The internal guidelines specify that content about past eating disorders is allowed if there is a clear indication of recovery and no graphic imagery or healed cuts are present.

The internal guidelines also allow content created as a support resource for victims of eating disorders, if there is no self-injury imagery. Support resources are defined as sharing information about treatment options including therapy and inpatient programs, contact information to crisis helplines, names of organizations and websites offering support, and encouraging others to seek medical help or professional advice.

II. Meta’s Submissions

After the Board selected these cases, Meta found that neither post violated the Suicide, Self-Injury and Eating Disorders Community Standard.

Meta considered that the first post shares non-graphic imagery of the user and describes the user’s personal experience with eating disorders in a non-graphic way. Meta noted that the post was made during Eating Disorders Awareness Week and contains an awareness-raising caption about eating disorder recovery.

Meta determined that the second post does not promote or encourage eating disorders. Rather, the content aimed “to provide helpful resources to those dealing with eating disorders and their loved ones to help encourage thoughtful, productive conversation.”

The Board asked Meta 14 questions on the specifics and enforcement of the Suicide, Self-Injury and Eating Disorders Community Standard. Meta did not answer any of the Board’s questions.

4. Public Comments

The Oversight Board received four public comments that met the terms for submission. Three of the comments were submitted from the United States and Canada and one from Asia Pacific and Oceania. To read public comments submitted with consent to publish, click here.

The submissions covered the following themes: approaches to distinguishing content that promotes eating disorders from recovery-focused content and flagging it accordingly; studies on the effects of sharing or receiving information or resources on eating disorders on social media; and the importance of prioritizing eating disorder content for human review and considering alternative approaches beyond content removal.

5. Oversight Board Analysis

The Board selected these cases to assess how Meta’s policies and enforcement practices address awareness-raising content or support resources related to eating disorders and recovery. These cases fall within the Board’s priority of Automated Enforcement of Policies and Curation of Content.

The Board analyzed Meta’s decisions in these cases against Meta’s content policies, values and human rights responsibilities. The Board also assessed the implications of these cases for Meta’s broader approach to content governance.

5.1 Compliance With Meta’s Content Policies

Content Rules

The Board finds that the posts did not violate Meta’s Suicide, Self-Injury and Eating Disorders Community Standard and were clearly made to raise awareness and provide helpful resources on eating disorders.

The first post contains a personal story of recovery from an eating disorder. Posting a recovery account during Eating Disorders Awareness Week, together with awareness-raising hashtags in the caption, clearly indicates that the user was spreading awareness about eating disorder recovery. Therefore, this post falls under the awareness-raising exception under the Suicide, Self-Injury and Eating Disorders policy.

In the second post, the user, who identifies as a mental health professional, shared advice on how to talk about the weight and size of people perceived to be skinny or underweight. In the third image, the user was unambiguously contrasting harmful expressions with advice on how to communicate more thoughtfully. Even if viewed in isolation, it is difficult to understand how this image could have been considered violating. Sharing the post during Eating Disorders Awareness Week, coupled with awareness-raising hashtags and explanations in the caption, additionally confirms that the user was sharing supportive advice and resources.

5.2. Compliance With Meta’s Human Rights Responsibilities

The Board finds that removing the content was not consistent with Meta’s human rights responsibilities.

Freedom of Expression (Article 19 ICCPR)

Article 19 of the International Covenant on Civil and Political Rights (ICCPR) provides for broad protection of expression. This right includes the “freedom to seek, receive and impart information and ideas of all kinds.” Access to information is a key part of freedom of expression. Article 12 of the International Covenant on Economic, Social and Cultural Rights (ICESCR) guarantees the right to health, including the right to access health-related education and information (Article 12, ICESCR; Committee on Economic, Social and Cultural Rights, General Comment No. 14 (2000), para. 3).

When restrictions on expression are imposed by a state, they must meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). These requirements are often referred to as the “three-part test.” The Board uses this framework to interpret Meta’s human rights responsibilities in line with the United Nations (UN) Guiding Principles on Business and Human Rights, which Meta itself has committed to in its Corporate Human Rights Policy. The Board does this both in relation to the individual content decision under review and what this says about Meta’s broader approach to content governance. As the UN Special Rapporteur on freedom of expression has stated, although “companies do not have the obligations of governments, their impact is of a sort that requires them to assess the same kind of questions about protecting their users’ right to freedom of expression” (A/74/486, para. 41).

I. Legality (Clarity and Accessibility of the Rules)

The principle of legality requires rules limiting expression to be accessible and clear, formulated with sufficient precision to enable an individual to regulate their conduct accordingly (General Comment No. 34, para. 25). Additionally, these rules “may not confer unfettered discretion for the restriction of freedom of expression on those charged with [their] execution” and must “provide sufficient guidance to those charged with their execution to enable them to ascertain what sorts of expression are properly restricted and what sorts are not” (ibid.). The UN Special Rapporteur on freedom of expression has stated that when applied to private actors’ governance of online speech, rules should be clear and specific (A/HRC/38/35, para. 46). People using Meta’s platforms should be able to access and understand the rules and content reviewers should have clear guidance regarding their enforcement.

The Board finds that the rules on admission of and recovery from eating disorders are sufficiently clear as applied to these cases.

II. Legitimate Aim

Any restriction on freedom of expression should also pursue one or more of the legitimate aims listed under Article 19, para. 3 of the ICCPR. In these cases, the Board finds that the Suicide, Self-Injury and Eating Disorders Community Standard regarding eating disorder content serves the legitimate aim of protecting public health and respecting the rights of others to physical and mental health, especially of young people and adolescents (Article 12, ICESCR; Article 19, CRC; General Comment No. 13 (2011), para. 28; General Comment No. 15 (2013), para. 84; see also Fruit Juice Diet).

III. Necessity and Proportionality

Under ICCPR Article 19(3), the principles of necessity and proportionality require that restrictions on expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected” (General Comment No. 34, para. 34).

Without the benefit of Meta’s answers to the Board’s questions in these cases, the Board’s ability to explore the enforcement errors in depth and provide more specific guidance on Meta’s human rights responsibilities, especially those related to awareness-raising content or content providing supportive resources, was limited. The removal of these posts was clearly neither necessary nor proportionate to protect public health. Both posts contained unambiguous signals that the users were sharing awareness-raising and supportive content during Eating Disorders Awareness Week.

The Board has previously stated that, in order to show that Meta had selected the least intrusive means in pursuing a legitimate objective, the company should consider the following questions: (1) whether the public interest objective can be addressed through measures that do not infringe on speech; (2) if not, whether, among the measures that infringe on speech, the platform has selected the least intrusive measure; and (3) whether the selected measure actually helps achieve the legitimate aim (Claimed Covid Cure, citing A/74/486, para. 52). Consideration of these questions with respect to these cases indicates three areas of improvement to fulfill Meta’s commitment to respecting freedom of expression by not resorting to speech removals that are not necessary: preparedness for global recurring awareness-raising periods; visibility of awareness-raising content; and improvements to appeal review. Improvements in each of these areas will help Meta demonstrate that it is using the UN Special Rapporteur’s framework to take affirmative due diligence steps to avoid unnecessarily removing speech from its platforms.

Preparedness for Awareness-Raising Events and Enforcement Accuracy

These cases raise questions about Meta’s preparedness for periodic public awareness-raising weeks with spiked engagement, including whether Meta sufficiently leverages enforcement accuracy assessments to assist the company in improving its readiness for these recurring periods.

Awareness-raising events are dedicated periods intended to educate people about issues of public importance. These events provide a focused opportunity to raise understanding, share information, attract attention and generate support through community engagement, education and fundraising efforts, among others. Social media is one of the most important means by which people can communicate and express themselves freely. Enforcement errors that restrict content raising awareness or sharing resources hinder such conversations and opportunities. This is especially so when, as happened in the second case, strikes and account restrictions are imposed on posting users, impeding them from openly engaging in such conversations during these important periods.

Meta should therefore strive to ensure its platform policies and enforcement practices are appropriate during awareness events. This includes events of both a local and global nature, such as eating disorder awareness and breast cancer awareness, among others.

As part of its research into this case, the Board searched Meta’s Content Library for eating disorders awareness and recovery content on Instagram between 2023 and 2025. The Board analyzed a representative sample of more than 18,000 posts, which only included posts visible on the platform. This analysis revealed consistent spikes in the volume of such content during Eating Disorders Awareness Week, indicating a recurring pattern of increased user engagement in these periods. More than half of the posts published encouraged the public to share content or participate in awareness events. The remaining posts primarily involved individuals sharing lived experiences with eating disorders, including personal stories of diagnosis, struggle and recovery, as well as posts offering therapeutic guidance. This indicates the importance of the company being attentive to and preparing for such awareness-raising events.

The Board has assessed Meta’s preparedness for other periods of spiked engagement that involve public interest discussions and awareness-raising moments associated with public health campaigns. For example, the Board examined 15 posts raising awareness on breast cancer, shared during Breast Cancer Awareness Month in 2024. The Board called on Meta to continue its efforts to implement the Board’s recommendations from the Breast Cancer Symptoms and Nudity decision to prevent and reverse errors and continue to improve its ability to accurately detect content that falls within exceptions to the Adult Nudity and Sexual Activity policy.

The Board has also addressed the need for Meta to establish mechanisms that evaluate the effectiveness of efforts to enforce its policies in other periods of heightened need for accurate enforcement, such as elections. In response to recommendations in Brazilian General’s Speech, Meta developed a framework for evaluating the company’s election integrity efforts, establishing eight core election integrity pillars, including election risk management processes, cooperation with external stakeholders and tools to support civic engagement (see H1 2025 Bi-Annual Report Appendix).

While efforts should be made to preserve awareness-raising content every day, Meta should strengthen its preparedness for awareness-raising weeks, as predictable, recurring periods when public interest speech could be wrongly removed. As a global company, Meta should develop a calendar of global awareness-raising periods and use it to adjust enforcement practices. Specific measures during these periods could include establishing procedures to actively oversee eating-disorder content during these awareness-raising periods, monitoring campaign hashtags to reduce wrongful removals of supportive posts or implementing a fast-track review process for appeals related to violations connected to awareness-raising events. Additional measures could include real-time data monitoring to detect spikes in erroneous removals.

The Board has previously addressed this issue in several cases across different policy areas, highlighting the importance of periodically conducting accuracy audits. This enables the company to assess and report on reviewer accuracy and use these results to inform its enforcement operations and policy development. This is particularly important with respect to the enforcement of the relevant policies preceding and following specific awareness-raising periods. The full implementation of the Board’s prior relevant recommendations (see below) addressing reviewer accuracy and enforcement errors is necessary for Meta to ensure awareness-raising speech is preserved, which may help to combat harmful narratives relating to eating disorders and recovery. The Board also calls on Meta to share the specific measures it takes to prevent overenforcement of content during awareness-raising periods, indicating whether these measures differ from those applied at other times in the enforcement of such content. This is important not only for raising awareness on eating disorders and recovery, but also for other public engagement periods that focus on harm prevention and awareness raising.

The Board has repeatedly called on Meta to conduct regular assessments of reviewer accuracy rates, and to share how these results would inform improvements to policy development and enforcement (see, e.g., Asking for Adderall®, recommendation no. 3). In response, Meta stated that it “already collect[s] and assess[es] data on the basis of takedowns and restoration,” and “report[s] the amount of appealed content and content that is restored on Facebook and Instagram in its Community Standards Enforcement Report” (see Meta’s Q3 2023 Quarterly Update on the Oversight Board). The Board considers this recommendation was omitted or reframed, as the takedown and restoration metrics related to user appeals that Meta referenced are not the same as the reviewer accuracy metrics mentioned in the recommendation. The Board also notes that the company did not share how the data it collects informs improvement of enforcement operations and policy development.

The Board also called on Meta to increase public information on error rates by making this viewable by country and language for each Community Standard in its transparency reporting (Punjabi Concern Over RSS in India, recommendation no. 3). In response, Meta confidentially shared with the Board a summary of enforcement data, which includes “an overview of enforcement accuracy data” (see H1 2025 Bi-Annual Report Appendix). The Board considers this recommendation as omitted or reframed. The data Meta shared is not the same as providing more detailed transparency reports on error rates that are viewable by country and language for each Community Standard. The Board reiterates that more detailed transparency reports will help the public spot areas where errors are more common, including potential specific impacts on minority groups, and alert Meta to correct them.

Visibility of Awareness-Raising Content

The Board is concerned about the potential impact on visibility of awareness-raising content that is already limited to adult users.

Without the benefit of having Meta’s answers to the Board’s questions in these cases, the Board’s ability to explore how alternative content moderation interventions, beyond removals, impact content raising awareness or sharing supportive content on eating disorders was limited.

The Board has learned that such alternative interventions are often accompanied by limitations to content visibility. For instance, content rated False or Altered by third-party fact-checkers is demoted, meaning it ranks lower in users’ feeds (see, e.g., Altered Video of President Biden; Posts Supporting UK Riots). Similarly, when content is obscured with a warning screen, it is removed from recommendations to users who do not follow the posting account (see, e.g., Al-Shifa Hospital; Hostages Kidnapped from Israel; Candidate for Mayor Assassinated in Mexico).

While Meta allows people to use its platforms to support one another and share helpful resources on eating disorders and recovery, Meta’s systems should not impede such content from reaching the intended audiences, especially when the content is visible only to adult users. For instance, the Board previously found that the exclusion from recommendations of awareness-raising content placed behind a warning screen, and visible only to people over the age of 18, was not a necessary or proportionate restriction on freedom of expression (Al-Shifa Hospital; Hostages Kidnapped from Israel).

Appeal Review

These cases indicate the need to ensure adequate tooling and allow users to provide additional context in their appeals, to improve appeal review and reduce enforcement errors.

It is concerning that on appeal, secondary reviewers in both cases were unable to complete the review of the carousels, as images failed to load in the reviewer’s internal tool. While another reviewer had already found both posts to be non-violating, the initial decision to remove both posts was nevertheless enforced.

Without the benefit of Meta’s answers to the Board’s questions in these cases, the Board’s ability to understand what caused the enforcement errors, whether and when the tooling malfunction was fixed, and the nature and scale of its impact on content posted during Eating Disorders Awareness Week was limited. Meta should ensure that it has systems in place to quickly fix any problems within the tools it provides to its reviewers.

6. The Oversight Board’s Decision

The Oversight Board overturns Meta's original decision to take down the content in both cases under review.

7. Recommendations

Transparency

  1. Meta should share the specific measures it takes to prevent overenforcement of content during awareness-raising periods, such as Eating Disorders Awareness Week, and whether these measures differ from those applied at other times in the enforcement of such content.

The Board will consider this recommendation implemented when Meta shares the concrete steps it has taken to avoid overenforcement regarding awareness-raising periods.

The Board also reiterates the importance of its previous recommendations on reviewer accuracy-rate assessments. In line with those recommendations, Meta should:

  • Conduct regular assessments on reviewer accuracy rates (Asking for Adderall®, recommendation no. 3).
  • Improve its transparency reporting by increasing public information on error rates via making this information viewable by country and language for each Community Standard (Punjabi Concern Over RSS in India, recommendation no. 3).

*Procedural Note:

  • The Oversight Board’s decisions are made by panels of five Members and approved by a majority vote of the full Board. Board decisions do not necessarily represent the views of all Members.
  • Under its Charter, the Oversight Board may review appeals from users whose content Meta removed, appeals from users who reported content that Meta left up, and decisions that Meta refers to it (Charter Article 2, Section 1). The Board has binding authority to uphold or overturn Meta’s content decisions (Charter Article 3, Section 5; Charter Article 4). The Board may issue non-binding recommendations that Meta is required to respond to (Charter Article 3, Section 4; Article 4). Where Meta commits to act on recommendations, the Board monitors their implementation.
