
Oversight Board announces Fruit Juice Diet cases and a case about Violence in the Indian State of Odisha


August 2023

Today, the Board is announcing new cases for consideration. As part of this, we are inviting people and organizations to submit public comments.

Case selection

As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about Meta's policies.

The cases that we are announcing today are:

Video of Communal Violence in Indian State of Odisha

2023-018-FB-MR

Case referred by Meta

Submit public comments here.

To read this announcement in Odia, click here.

ଏହି ଘୋଷଣାକୁ ଓଡ଼ିଆରେ ପଢ଼ିବା ପାଇଁ, ଏଠାରେ କ୍ଲିକ୍ କରନ୍ତୁ

In April 2023, a Facebook user posted a video that depicts a street procession in the Indian state of Odisha related to the Hindu festival of Hanuman Jayanti. The video caption reads “Sambalpur,” which is a town in Odisha, where communal violence broke out between Hindus and Muslims during the festival. These clashes were followed by arrests, curfew and suspension of internet services in Odisha.

The video starts with a depiction of the procession of people carrying saffron-coloured flags. The camera zooms in and shows a person standing on a building nearby, who then throws what appears to be a stone at the procession. In response, people from the procession start throwing stones back at the building. They also call for the person on the building to be “beaten” or “hit.” The content has been viewed about 2,000 times, has received fewer than 1,000 comments and reactions, and has not been shared or reported by anyone.

Shortly after the events depicted in the video, Meta received a report from Odisha law enforcement requesting that another video, identical to the one later referred to the Board, be taken down. That video had a different caption and was posted by a different user. Upon review, Meta found that the content violated the spirit of its Violence and Incitement Community Standard. The rationale for the Violence and Incitement policy states that Meta “aim[s] to prevent potential offline harm that may be related to content on Facebook” and that Meta “remove[s] content, disable[s] accounts, and work[s] with law enforcement when we believe there is a genuine risk of physical harm or direct threats to public safety.” The content was added to a Media Matching Services (“MMS”) bank, which locates and flags content that is identical or nearly identical to previously flagged photos, videos, and text for possible further action.

However, the user who posted the identical content deleted that video before Meta could remove it. Meta then identified and removed the content in this case, as described above, under its Violence and Incitement Community Standard. Meta explains that the content is violating because it contains clear and accessible calls for high-severity violence. Under this policy, Meta prohibits “[t]hreats that could lead to death (and other forms of high-severity violence)...targeting people or places,” including “[c]alls for high-severity violence including content where no target is specified but a symbol represents the target and/or includes a visual of an armament or method that represents violence.”

Meta also explained that the content “was not shared to condemn or raise awareness” since there was no academic or news report context, nor discussion of the author’s experience being a target of violence. Additionally, Meta noted that the caption does not condemn nor express “any kind of negative perspective about the events depicted in the video.” The company highlighted, however, that even if the content had included an awareness raising or condemning caption, Meta would still have removed it “given the significant safety concerns and ongoing risk of Hindu and Muslim communal violence.”

Meta referred the identical content to the Board, stating that this case is difficult due to the tensions between Meta’s values of “Voice” and “Safety,” and because of the context required to fully assess and appreciate the risk of harm posed by the video. Meta asked the Board to assess whether Meta’s decision to remove the content represents an appropriate balancing of Facebook’s values of “Privacy,” “Safety,” “Dignity,” and “Voice,” and whether it is consistent with international human rights standards.

The Board selected this case to assess Meta’s moderation policies and practices in contexts involving communal violence. This case falls within the Board’s “crisis and conflict situations,” “hate speech against marginalized groups” and “government use of Meta’s platforms” strategic priorities.

The Board would appreciate public comments that address:

  • How social media platforms may be used to contribute to violence and discrimination against religious and ethnic groups in India and elsewhere.
  • Insights into the socio-political context regarding the treatment of religious and ethnic groups in India, including the Indian government’s policies and practices.
  • How Meta's Violence and Incitement policy should treat video content depicting scenes of communal violence, and how to assess whether such content may cause or contribute to offline violence.
  • How social media platforms should manage law enforcement requests to review or remove content that may not violate national laws but may breach platforms' content rules.
  • How social media platforms should incorporate law enforcement requests for content removal, especially requests not based on alleged illegality, into their transparency reporting.

In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

Fruit Juice Diet

2023-019-FB-UA, 2023-020-FB-UA

User appeals to remove content from Facebook

Submit public comments here.

These cases concern two content decisions made by Meta, which the Oversight Board will address together. Both cases involve videos posted on the same Facebook page, described as featuring content about life, culture, and food in Thailand. In the videos, a man interviews a woman about her experience observing a fruit juice-only diet and shares her social media page about the diet at the end of each video.

In the first video, posted in 2022, the woman explains why she only drinks fruit juice, eating nothing solid. According to the woman, she used to suffer from skin problems and swollen legs, which she described as huge and heavy. The woman claims that switching to a fruit juice-only diet has brought her multiple health benefits, including increased mental focus, improved skin, happiness and a “feeling of lightness.” After acknowledging that some may comment that this is anorexia, the woman states that dietary changes are often accompanied by sudden weight loss, which she says explains why she initially lost more than 10 kilograms (22 pounds). The woman states her weight has now “normalized.”

In the second video, posted in 2023, the same man asks the same woman, who appears extremely thin, how she feels after observing the fruit juice-only diet for almost a year. The woman laments that she will soon break her fast and eat solid fruit. When asked about her weight, the woman states she has not lost any more weight, only “four kilos of impurities.” The woman encourages the man to try the diet. She shares that she is now becoming a “fruitarian” and wants to begin “prana,” which she describes as not eating or drinking and instead living “only on energy.”

The first post received about 3,000 reactions, about 1,000 comments, and over 200,000 views. The second post received about 8,000 reactions, about 14,000 comments, and over two million views. The Facebook page where both videos were posted has about 130,000 followers.

Both posts were reported multiple times to Meta for violating Meta’s Suicide and Self Injury Community Standard. A separate user in each case ultimately appealed to the Board. These users' initial appeals to Meta to remove the content were immediately closed through automation because prior human reviews had found the content non-violating. The users then appealed the decisions further with Meta, and in both cases human reviewers left the videos on Facebook. In their subsequent appeals to the Board, the users stated that the content promotes an unhealthy lifestyle and may encourage others, especially teenagers, to do the same. They described the content as “inaccurate” and presenting anorexia “as a good thing,” which can pose health risks to people exposed to the content.

The Board selected these cases to address how Meta’s content policies and enforcement practices address diet, fitness, and eating disorder-related content on Facebook.

The Board would appreciate public comments that address:

  • Information about fruit juice-only, fruitarian, and pranic diets, their health impact, and whether these types of diets should be understood as eating disorders.
  • Whether Meta’s policy on self-injury sufficiently addresses the harms to physical and mental health posed by content that praises or promotes eating disorders.
  • How Meta should moderate content relating to eating disorders to protect vulnerable users, including teenagers, while also respecting freedom of expression.
  • Information on social media policy and regulatory approaches around the world to address eating disorder, fitness, and diet-related content posted by influential users.
  • Insights on the impact of diet and fitness-related content on social media to physical and mental health, particularly of teenagers, globally.

In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to these cases.

Public comments

If you or your organization feel that you can contribute valuable perspectives that can help with reaching a decision on the cases announced today, you can submit your contributions using the links above. The public comment window for these cases is open for 14 days, closing at 23:59 your local time on Tuesday, August 15.

What's next

Over the next few weeks, Board members will be deliberating these cases. Once they have reached their final decisions, we will post them on the Oversight Board website.

To receive updates when the Board announces new cases or publishes decisions, sign up here.

Attachments

Odia translation
Download