Public Comments Portal

Videos of Teachers Hitting Children

May 7, 2025: Case Selected
May 21, 2025: Public Comments Closed
July 31, 2025: Decision Published
Upcoming: Meta implements decision

Comments


Organization: UNICEF
Country: United States
Language: English
Attachments: 20250520_UNICEF_Oversight-Board_violent-discipline.pdf

Name: Bilyana Petkova
Organization: University of National and World Economy, Bulgaria and Yale ISP
Country: Bulgaria
Language: English
Attachments: Submission-before-the-Meta-Oversight-Board.docx

Name: River Saxton
Country: United States
Language: English

While videos depicting non-sexual abuse of minors carry the potential of retraumatizing the victims, removing the videos in cases such as this (where the parents have given consent and/or the intent is to spread awareness and demand accountability) can be equally harmful. This is an area where human oversight is essential to determine whether the content should be reinstated and, if it is appropriate, to do so as quickly as possible so the abuser can be held accountable.

Name: Philip McGinnis
Country: United States
Language: English

I feel that these decisions are extremely complicated, and I appreciate your taking the effort to seek input from the general public in this way. I feel that blurring the minors' faces is essential in these cases. I also feel that any identifying information visible in the room, such as namecards on desks, should be obscured. I think you need to consider the psychological impacts that may arise when students get older and this content is still easily available online. My personal opinion is that content like this should be hosted on a separate, secure platform specifically for research or controversial content, not mixed in with sports news and birthday posts. I think that harmful and divisive content should be de-emphasized, and that US laws should be changed so that social media policies reflect the same standards that traditional public broadcasting companies were once expected to abide by, relating to decency laws and other concepts that seem to have eroded. Thank you for showing interest in my perspective.

Name: Christian Melby
Country: United States
Language: English

I believe very strongly that such videos should remain on the platform with the current, or added, blurring of the child's features to the largest extent possible.

I believe that the extent of humiliation, or "retraumatizing," attributed to the child is vastly overrated by Meta. There is no shame in being victimized by an irresponsible caretaker or other adult, especially in the case of a child, to whom no culpability can be assigned and in whose favor the sentiment and advocacy entirely lie.

Showing such videos in an awareness-raising or condemnation context is exactly what is necessary to arouse righteous indignation, locate where and by whom the abuse is being perpetrated on defenceless children, express civilized society's outrage and condemnation, and change attitudes in places where such abuse is still seen as unexceptional and, in some, even normative.

The location of the abuse, the type of abuser, and the abuser's relationship to the child are crucial to identifying the facilities and areas in which this mistreatment occurs, and appropriate worldwide condemnation is crucial to ensuring that those areas, those types of adults, and the institutional, cultural, subcultural, and personal beliefs that lead to this abuse are strongly condemned and reformed according to worldwide standards for the treatment of children.

Country: United States
Language: English

It should be allowed. People need to know this. These teachers don't deserve privacy. It should be a parent's choice whether or not to spank their kids.

Name: Jim
Country: United States
Language: English

Corporal punishment is a necessary component of raising children, and it used to be commonplace in school settings to keep unruly students in line. Obviously such punishment should be used in moderation and only when extremely necessary, but to call the act inappropriate in general is ludicrous. Facebook almost always gets moderation requests wrong, routinely penalizing commonplace wording as offensive while blatantly ignoring threats of violence, bullying, and racial abuse. Meta claims to use “technology” to moderate content, but the technology is so broken we’re better off having human moderation or none at all. As for your “child sexual exploitation” rule? Delete it; you misapply it so often it has lost all usefulness. I was penalized under this very rule for a harmless meme that didn’t show any kind of exploitation at all, and almost a year later my account is still listed as at risk because “technology” got it wrong and instantly denied the appeal with the same error. Don’t remove those videos; let the world see them.

Case Description

To read this announcement in Punjabi, click here.

Note: Please be aware before reading that the following summary includes disturbing material about violence against minors.

The Oversight Board will address the two cases below together, choosing either to uphold or overturn Meta’s decisions on a case-by-case basis.

Meta has referred two cases to the Board, both about videos that show teachers hitting children in school settings.

The first case involves a video posted on Facebook by a media organization in India. In the video, a teacher yells at a young school student for not studying. She repeatedly hits his head and appears to pull at his turban. A blurry patch is superimposed over the child's face, but he periodically moves out of the blurred area. The teacher and other students are visible. The caption notes that a state official has called for accountability.

The post received several thousand views, and 10 people reported the content. Because the account receives cross-check protections, one of the reports was escalated to policy experts who determined the content violated Meta’s Child Sexual Exploitation, Abuse and Nudity policy and removed the content.

The second post, also involving a video on Facebook, was posted by a page in France that appears to share local news. The video shows a group of very young children in an educational setting, with one child crying. The teacher hits the child and she falls to the ground, while the other children watch. All faces are blurred in this video. The caption and video reference the specific neighborhood where this was apparently filmed and mention an investigation.

The post received several thousand views and was both reported by a user and identified by an automated system as potentially violating the Child Sexual Exploitation, Abuse and Nudity policy. The content was then removed without human review. It was later escalated internally to policy experts and that decision was confirmed. Then, when Meta was preparing its submissions to the Board, its policy experts decided to allow the content on the platform with a newsworthy allowance and warning screen. According to Meta, media reported that the child’s parents’ attorney had shared the video. For the company, this meant the public interest value outweighed the harm, as the “parents' consent mitigated the privacy and dignity concerns.”

Meta referred both cases to the Board. Under its Child Sexual Exploitation, Abuse and Nudity policy, the company removes “[v]ideos or photos that depict real or non-real non-sexual child abuse regardless of sharing intent,” with no exceptions. Meta stated it takes “a firm stance against sharing non-sexual child abuse content, regardless of the intent, to prioritize the safety, dignity, and privacy of the minor.” According to Meta: “Allowing non-sexual child abuse content in an awareness-raising or condemnation context risks re-traumatizing the victim, while prohibiting such content may be viewed as infringing on the public's ability to be informed.”

The Board selected these cases to explore the tension between sharing information about non-sexual child abuse, including efforts to promote accountability, and protecting children.

The Board would appreciate public comments that address:

  • The impact on child victims of abuse and their families of having depictions of their abuse circulate online.
  • The circumstances, if any, in which it is appropriate for social media companies to allow content that shows children being abused, in light of both the human right to freedom of expression and the human rights principle of respecting the best interests of the child.
  • How limiting depictions of child abuse may affect efforts to seek accountability for such abuse.
  • Standards for reporting on child victims of abuse, and whether blurring and/or other measures serve to limit attempts to identify child victims of abuse.


As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to these cases.

Public Comments

If you or your organization feel you can contribute valuable perspectives that can help with reaching a decision on the cases announced today, you can submit your contributions using the button below. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23.59 Pacific Standard Time (PST) on Wednesday 21 May.

What’s Next

Over the next few weeks, Board Members will be deliberating these cases. Once they have reached their decision, we will post it on the Decisions page.