
Overturned

Human Trafficking in Thailand

A user appealed Meta’s decision to remove a Facebook post calling attention to human trafficking practices in Thailand.

Type of Decision
Summary

Policies and Topics
Topic: Freedom of expression, Safety
Community Standard: Human exploitation

Region/Countries
Location: Thailand

Platform
Facebook

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors, and inform the public about the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not involve a public comment process, and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

A user appealed Meta’s decision to remove a Facebook post calling attention to human trafficking practices in Thailand. This case underlines the importance of designing moderation systems that are sensitive to contexts of awareness-raising, irony, sarcasm, and satire. After the Board brought the appeal to Meta’s attention, the company reversed its earlier decision and restored the post.

Case Description and Background

A Facebook user posted in Thai about a human trafficking business targeting Thais and transporting them for sale in Myanmar. The post discusses what the user believes are common practices that the business employs, such as pressuring victims to recruit others into the business. It also makes ironic statements, such as “if you want to be a victim of human trafficking, don't wait.” The content also contains screenshots of what appear to be messages from the business attempting to recruit victims, and of content promoting the business.

Meta originally removed the post from Facebook, citing its Human Exploitation policy, under which the company removes “[c]ontent that recruits people for, facilitates or exploits people through any of the following forms of human trafficking,” such as “labor exploitation (including bonded labor).” The policy defines human trafficking as “the business of depriving someone of liberty for profit.” It allows “content condemning or raising awareness about human trafficking or smuggling issues.”

After the Board brought this case to Meta’s attention, the company determined that its removal was incorrect and restored the content to Facebook. The company told the Board that, while the images in isolation would violate the Human Exploitation policy, the overall context makes clear that the post was shared to condemn and raise awareness of human trafficking, which makes the content non-violating under the policy’s allowance.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case that is under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, to reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

This case underlines the importance of designing moderation systems that are sensitive to contexts of awareness-raising, irony, sarcasm, and satire. These are important forms of commentary and should not be removed as a result of overly literal interpretations. In terms of automation, the Board has urged Meta to implement an internal audit procedure to continually analyze a statistically representative sample of automated removal decisions to reverse and learn from enforcement mistakes (“Breast Cancer Symptoms and Nudity,” recommendation no. 5). In terms of human moderation, the Board has asked Meta to ensure that it has adequate procedures in place to assess satirical content and relevant context properly, and that appeals based on policy exceptions are prioritized for human review (“Two Buttons Meme,” recommendations nos. 3 and 5). Meta has reported implementing the first two of these recommendations but has not published information demonstrating complete implementation. For the first recommendation, as of Q4 2022, Meta reported having “completed the global roll out of new, more specific messaging that lets people know whether automation or human review led to the removal of their content from Facebook,” but did not provide information evidencing this. For the third recommendation, Meta reported that it had “nearly completed work on ways to allow users to indicate if their appeal falls under a policy exception”; once this work is complete, Meta will begin to assess whether taking policy exceptions into account improves the overall prioritization workflow.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention.
