Oversight Board upholds Meta’s decision in “India sexual harassment video” case
The Oversight Board has upheld Meta’s decision to restore a post to Instagram containing a video of a woman being sexually assaulted by a group of men. The Board has found that Meta’s “newsworthiness allowance” is inadequate for resolving cases such as this at scale and that the company should introduce an exception to its Adult Sexual Exploitation policy.
About the case
In March 2022, an Instagram account describing itself as a platform for Dalit perspectives posted a video from India showing a woman being assaulted by a group of men. “Dalit” people have previously been referred to as “untouchables,” and have faced oppression under the caste system. The woman’s face is not visible in the video and there is no nudity. The text accompanying the video states that a "tribal woman" was sexually assaulted in public, and that the video went viral. “Tribal” refers to indigenous people in India, also referred to as Adivasi.
After a user reported the post, Meta removed it for violating the Adult Sexual Exploitation policy, which prohibits content that “depicts, threatens or promotes sexual violence, sexual assault or sexual exploitation.”
A Meta employee flagged the content removal via an internal reporting channel upon learning about it on Instagram. Meta's internal teams then reviewed the content and applied a “newsworthiness allowance.” This allows otherwise violating content to remain on Meta’s platforms if it is newsworthy and in the public interest. Meta restored the content, placing the video behind a warning screen which prevents anyone under the age of 18 from viewing it, and later referred the case to the Board.
The Board finds that restoring the content to the platform, with the warning screen, is consistent with Meta’s values and human rights responsibilities.
The Board recognizes that content depicting non-consensual sexual touching can lead to a significant risk of harm, both to individual victims and more widely, for example by emboldening perpetrators and increasing acceptance of violence.
In India, Dalit and Adivasi people, especially women, suffer severe discrimination, and crime against them has been rising. Social media is an important means of documenting such violence and discrimination and the content in this case appears to have been posted to raise awareness. The post therefore has significant public interest value and enjoys a high degree of protection under international human rights standards.
Given that the video does not include explicit content or nudity, and a majority of the Board finds that the victim is not identifiable, the majority concludes that the benefits of allowing the video to remain on the platform, behind a warning screen, outweigh the risk of harm. Where a victim is not identifiable, their risk of harm is reduced significantly. The warning screen, which prevents people under 18 from viewing the video, helps to protect the dignity of the victim, and protects children and victims of sexual harassment from exposure to disturbing or traumatizing content.
The Board agrees the content violates Meta's Adult Sexual Exploitation policy and that the newsworthiness allowance could apply. However, echoing concerns raised in the Board’s “Sudan graphic video” case, the Board finds that the newsworthiness allowance is inadequate for dealing with cases such as this at scale.
The newsworthiness allowance is rarely used. In the year ending June 1, 2022, Meta applied it only 68 times globally, a figure that was made public following a recommendation by the Board. Only a small portion of those allowances related to the Adult Sexual Exploitation Community Standard. The newsworthiness allowance can only be applied by Meta’s internal teams. However, this case shows that the process for escalating relevant content to those teams is not reliable: the content reached them only because a Meta employee happened to see it on Instagram and flagged its removal through an internal reporting channel.
The newsworthiness allowance is vague, leaves considerable discretion to whoever applies it, and cannot ensure consistent application at scale. Nor does it include clear criteria for assessing the potential harm caused by content that violates the Adult Sexual Exploitation policy. The Board finds that Meta’s human rights responsibilities require it to provide clearer standards and more effective enforcement processes for cases such as this one. A policy exception is needed that is tailored to the Adult Sexual Exploitation policy and can be applied at scale. This should provide clearer guidance to distinguish posts shared to raise awareness from those intended to perpetuate violence or discrimination, and help Meta to balance competing rights at scale.
The Oversight Board's decision
The Oversight Board upholds Meta's decision to restore the post with a warning screen.
The Board also recommends that Meta:
- Include an exception to the Adult Sexual Exploitation Community Standard for depictions of non-consensual sexual touching. This exception would be applied only by Meta’s internal teams and would permit content where the victim is not identifiable and that, in Meta’s judgment, is shared to raise awareness, is not shared in a sensationalized context, and does not involve nudity.
- Update its internal guidance to at-scale reviewers on when to escalate content reviewed under the Adult Sexual Exploitation Community Standard that may be eligible for the above policy exception.
For further information
To read the full decision, click here.
To read a synopsis of public comments for this case, please click the attachment below.