Oversight Board announces new cases
Today, the Board is announcing new cases for consideration. As part of this, we are inviting people and organizations to submit public comments.
Case selection
As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about Meta's policies.
The cases that we are announcing today are:
Political dispute ahead of Turkish elections
(2023-007-FB-UA, 2023-008-FB-UA, 2023-009-IG-UA)
User appeals to restore content to Facebook and Instagram
Submit public comments here.
These three cases concern content decisions made by Meta, which the Oversight Board intends to address together.
On May 14, 2023, the first round of voting in Turkey’s presidential and parliamentary elections took place. President Recep Tayyip Erdoğan, a member of the Justice and Development Party (AKP), ran against Kemal Kılıçdaroğlu, the leader of Turkey’s main opposition group, the Republican People’s Party (CHP). A key issue in the electoral campaigns was the public’s attitudes toward Turkey’s preparedness for, and response to, the recent earthquakes in the country. On February 6, 2023, a series of powerful earthquakes struck southern and south-eastern Turkey. The disaster killed over 50,000 people, injured more than 100,000, and displaced two million people in the provinces most affected by the tremors.
Shortly after the earthquakes, Istanbul Municipality Mayor Ekrem İmamoğlu, a member of the CHP, visited Kahramanmaraş, one of the cities impacted by the disaster. During his visit, another politician, Nursel Reyhanlıoğlu, confronted him. Ms. Reyhanlıoğlu previously served as a Member of Parliament (MP) with the AKP. In the recorded confrontation, former MP Reyhanlıoğlu shouts at Mayor İmamoğlu for making “a show” with his visit, calls him a “British servant” (Turkish: İngiliz uşağı), and demands that he “get out” and return to “his” Istanbul.
Three media outlets, BirGün Gazetesi, Bolu Gündem, and Komedya Haber, were among those that reported on the confrontation by sharing segments of the recording on Instagram and Facebook. One of the Facebook posts was a live stream that became a permanent post after the livestream ended. It included further video footage of Mayor İmamoğlu, along with CHP leader and presidential candidate Kemal Kılıçdaroğlu, speaking with two members of the public who were requesting more aid to rescue loved ones trapped under rubble and expressing frustration at the government’s emergency response.
Meta removed all three posts from the media outlets under its Hate Speech Community Standard, which prohibits “the usage of slurs that are used to attack people on the basis of their protected characteristics.” At the time the videos were removed, the phrase “İngiliz uşağı” was on Meta’s non-public slur list for the Turkish language market. The Facebook posts were reported by several users and underwent multiple human reviews, including one escalated review by an internal Meta team, and were ultimately found to violate Meta’s Hate Speech Community Standard. The Instagram post was reported by a user and was also detected by a classifier designed to identify the “most viral and potentially violating content.”
All three media outlets separately appealed the removal decisions to Meta, and the company maintained its decision to remove each post. The outlets then individually appealed these removal decisions to the Board. In their statements to the Board, the outlets pointed out that other news channels also shared the video, emphasized the important role of news reporting in crisis situations, and disputed that the content constituted hate speech.
The Board selected these cases to further explore Meta’s policies and moderation practices in times of crisis, including the use of crisis protocols. This is particularly relevant given that media coverage of the earthquake and the government’s response to it occurred in the months leading up to Turkey’s presidential elections in May 2023. These cases fall within the Board’s “elections and civic space,” “crisis and conflict situations,” and “hate speech against marginalized groups” strategic priorities.
As a result of the Board selecting these cases, Meta determined that its removals of all three posts were incorrect. While Meta’s internal policies identified the phrase as a slur when the company originally removed the posts, Meta has informed the Board that it has now removed the phrase from its slur list following an internal audit, as the phrase is “no longer used as a slur.”
The Board would appreciate public comments that address:
- The meaning of the phrase “İngiliz uşağı” in the context of this case and more broadly, as well as the potential consequences resulting from its use.
- The situation for freedom of expression in Turkey since the February 2023 earthquake and ahead of Turkey’s presidential elections in May 2023, in particular observed impacts of Meta’s content moderation on political discourse in this period.
- Meta’s approach to developing, maintaining, and updating its confidential global and market-specific slur lists to enforce its prohibition on hate speech, the potential advantages and risks in this approach, and its alignment with international standards on freedom of expression and other human rights.
- How Meta should approach slurs as a form of hate speech where the target is a public figure, such as a politician in a pre-electoral setting, and/or the usage of such slurs is reported by the media.
- Meta’s enforcement of its policies on Turkish-language content on Facebook and Instagram, including the use of automation to detect or enforce content in Turkish on both platforms.
In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board also welcomes public comments proposing recommendations that are relevant to these three cases.
Promoting ketamine for non-FDA-approved treatments
Case referred by Meta
Submit public comments here.
On December 29, 2022, a verified Instagram user posted a series of related images, including a colorful drawing of an office. The caption on the post explains that the user was given ketamine at the clinic depicted in the drawing as treatment for anxiety and depression, and at the top of the post is the username of another account that appears to belong to a well-known ketamine therapy provider. The user describes their experience with ketamine at the clinic as an entry “into another dimension,” and refers to ketamine as “medicine.” They also explain that they believe “psychedelics” (a category that includes ketamine, but also other substances) are an important emerging mental health medicine. The other images in the series have text overlaid on drawings, all related to the user’s experience at the clinic. They range from a drawing of a person wearing an eye mask and lying under a blanket, preparing to receive ketamine treatment, to a colorful drawing of a person with rainbows, planets, and other objects coming out of their head. The post has about 10,000 likes, fewer than 1,000 comments, and has been viewed around 85,000 times. The user account has about 200,000 followers.
In total, three users reported one or more of the 10 images included in the post, and the content was removed three times under Meta’s Restricted Goods and Services Community Standard. After the first report, the content was removed following human review, and the user who posted the content appealed the removal. On appeal and re-review by a human reviewer, the content was restored. This process of human review, removal, appeal, and restoration happened twice. The third report was reviewed by an automated system that pulls from previous enforcement actions. Based on the previous enforcement actions taken on this content, the automated system determined that the content violated the Instagram Community Guidelines, specifically the Restricted Goods policy. After this removal, the content creator, who is a “managed partner,” brought the content to Meta’s attention for an additional review at an escalation level. “Managed partners” are entities across different industries that may receive varying levels of support, including training on how to use Meta’s products and a dedicated partner manager who can work with them to meet their goals on the company’s platforms. Meta then restored the content a third time and referred the case to the Board.
In its referral, Meta states that the increasing use of mind-altering drugs in the United States for purposes that blur the line between medical treatment, self-help, and recreation makes it particularly difficult to ascertain whether this content should be treated as promoting pharmaceutical drugs, which is generally allowed on the platform, or as endorsing drugs for non-prescribed purposes or to achieve a high, which is generally not allowed.
The Board selected this case because it addresses a current issue within the United States and other jurisdictions: the legalization and normalization of certain drugs, specifically for medical uses.
The Board would appreciate public comments that address:
- The impact of Meta’s Restricted Goods and Services policy on the ability of users to share relevant experience and information about new mental health treatments.
- Whether ketamine and other psychedelic drugs should be considered as “pharmaceutical drugs” in the context of discussions about new treatments for mental health issues.
- Current socio-political context and discussions amongst doctors, mental health practitioners, and providers and recipients of ketamine treatment about the drug’s use for treating mental health conditions.
- Fairness considerations related to the “managed partner” status and relevant escalation channels available for content review.
- Transparency considerations related to the level of automation of Meta’s appeal systems and information provided to users in this regard.
In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to these cases.
Public comments
If you or your organization feel that you can contribute valuable perspectives that can help the Board reach a decision on the cases announced today, you can submit your contributions using the links above. The public comment window for both cases is open for 14 days, closing at 23:59 your local time on Thursday, June 8, 2023.
What's next
Over the next few weeks, Board members will deliberate on these cases. Once they have reached their final decisions, we will post them on the Oversight Board website.
To receive updates when the Board announces new cases or publishes decisions, sign up here.