Case Description
In September 2025, an Instagram user posted a short video of a woman in a form-fitting dress. In the video, the woman is adjusting her dress and moving her body, with her underwear visible in a few frames.
The next day, Meta’s automated system that detects and prioritizes content that may pose harm to individuals, especially content with a high likelihood of virality, identified the post. However, the report was not prioritized for human review.
A few days later, two users reported the content for pornography, but their reports were not prioritized for human review either, and the content remained on Instagram. One of the users who had reported the content appealed to Meta to take the content down, but once again the post was not prioritized for human review. The user then appealed to the Board.
When the Board brought the case to Meta’s attention, Meta’s subject matter experts reviewed the post and concluded that it did not violate the company’s Adult Nudity and Sexual Activity Community Standard, as it did not contain any “visual representations of a sexual encounter” or “language that facilitates or encourages sexual encounter.” The company therefore did not remove the content but determined it should only be visible to adults. Meta highlighted that under this policy, the company restricts visibility of “photorealistic or digital imagery of persons where crotch, buttock or female breast(s) are the focus of the image” to users aged 18 and up. It further stated that the individual featured in the video, whom Meta does not consider a public figure, draws attention to her crotch area and underwear by moving her legs and adjusting the clothing. The company added that keeping the content on the platform with age restrictions reflects its fundamental commitment to expression, while shielding younger audiences from material that may be inappropriate for their age.
The Board selected this case to assess Meta’s moderation practices in enforcing its Adult Nudity and Sexual Activity and related Community Standards, particularly in the context where a person’s image or likeness is used in a sexually explicit or suggestive manner without their consent. This case falls within the Board’s Automated Enforcement of Policies and Curation of Content and Gender strategic priorities.
The Board would appreciate public comments that address:
- Contextual information about the use and prevalence of AI-generated sexually explicit or suggestive imagery that is created without the consent of the person depicted.
- Views on protection of image or likeness in the context of AI-generated sexually explicit or suggestive imagery, while preserving, e.g., artistic expression.
- Best practices and recommendations for enforcing policies on AI-generated non-consensual intimate explicit or suggestive imagery, including approaches to determining non-public figures’ consent on image use.
- Views on the effectiveness of age-gating approaches to AI-generated non-consensual intimate explicit or suggestive imagery.
- Approaches to designing effective mechanisms to report AI-generated content where a person’s image or likeness is used in a sexually explicit or suggestive manner without their consent.
In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
Public Comments
If you or your organization feel you can contribute valuable perspectives that can help with reaching a decision on the case announced today, you can submit your contributions using the button below. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23:59 Pacific Standard Time (PST) on Thursday 26 February.
What’s Next
Over the next few weeks, Board Members will be deliberating this case. Once they have reached their decision, we will post it on the Decisions page.