Overturned
Post Stating “Stop Hiring Khmer People”
March 3, 2026
Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.
Summary
A user appealed Meta’s decision to leave up a Facebook post in Thai stating: “Stop hiring Khmer people.” After the Board brought the appeal to Meta’s attention, the company reversed its original decision and removed the post.
About the Case
In July 2025, a Facebook user posted a photo of four uniformed individuals holding what appear to be sandbags at a construction site, accompanied by a caption in Thai stating: “#Stop hiring Khmer people.” The caption also urged people not to support shops or companies employing Cambodians and asked other users to share the message if they agreed. The content was posted in the context of border clashes between Thailand and Cambodia that injured and killed civilians. Fearing discrimination and violence amid the unrest, thousands of Cambodian migrants left Thailand and returned to Cambodia.
In their appeal to the Board, the reporting user noted that the post “spreads ethnic hatred and discrimination against Cambodians,” fueling “social division” and putting “innocent people at risk.”
Under the Hateful Conduct Community Standard, Meta removes “content targeting a person or group of people on the basis of their protected characteristic[s]” with “calls or support for exclusion or segregation or statements of intent to exclude or segregate.” That covers “economic exclusion, which means denying access to economic entitlements and limiting participation in the labor market.”
After the Board brought this case to Meta’s attention, the company determined that the content violated the Hateful Conduct policy and that its original decision to leave up the content was incorrect. Meta found that the post explicitly calls for others not to hire Khmer people, which constitutes a “call to deny them access to the labor market,” and removed it from Facebook.
Board Authority and Scope
The Board has authority to review Meta’s decision following an appeal from the user who reported content that was left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.
Significance of Case
The content in this case is an example of underenforcement of Meta’s Hateful Conduct policy, potentially contributing to discrimination against Cambodians in Thailand, especially in light of the recent clashes between Thailand and Cambodia. The Board has been paying particular attention to Meta’s enforcement errors involving this Community Standard since the company announced changes to it on January 7, 2025.
The Board has issued a recommendation aimed at improving Meta’s enforcement of its Hateful Conduct policy. In the Criminal Allegations Based on Nationality decision, the Board stated that Meta should “share [with the public] the results of the internal audits it conducts to assess the accuracy of human review and performance of automated systems in the enforcement of its Hate Speech [now Hateful Conduct] policy […] in a way that allows these assessments to be compared across languages and/or regions” (recommendation no. 2). In its initial response to the Board, Meta stated that it would confidentially share data on enforcement accuracy with the Board rather than make it public. In a recent update concerning this recommendation, Meta reported that, “As part of [the company’s] efforts to change how [it] enforce[s] [its] policies to reduce mistakes, [it is] relying more on reports from users instead of proactive detection for many violation types, including Hateful Conduct” (Meta’s H1 2025 Report on the Oversight Board). According to Meta, an assessment will be conducted “at a later time in order to allow teams to fully implement these changes.” The implementation appears to be in progress, with data yet to be shared with the Board.
The Board believes that fully implementing recommendation no. 2 from the Criminal Allegations Based on Nationality decision would improve the company’s ability to reduce underenforcement against harmful content impacting vulnerable groups. It would allow Meta to compare accuracy data across languages and/or regions, and to allocate resources to improve accuracy rates where necessary. Furthermore, public reporting on the accuracy of reviews under the Hateful Conduct policy would increase transparency and generate engagement with Meta, which could lead to future improvements as the company receives and acts on stakeholder feedback.
Decision
The Board overturns Meta’s original decision to leave up the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.