Oversight Board announces two new cases and upholds Meta’s decision in the “Sri Lanka pharmaceuticals” case
Today, the Oversight Board is announcing two new cases for consideration, one related to Brazil’s elections, the other to content containing the testimony of a survivor of gender-based violence, as well as a policy advisory opinion on Meta’s moderation of the Arabic word “shaheed.” We are also publishing our decision in the “Sri Lanka pharmaceuticals” case, in which we uphold Meta’s decision to allow content asking for donations of pharmaceuticals during the country’s financial crisis. Finally, the Board is announcing the appointment of a new Trustee, senior technology executive Marie Wieck.
Two new cases
Today, the Board is announcing two new cases for consideration. As part of this, we are inviting people and organizations to submit public comments.
As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about Meta's policies.
The cases that we are announcing today are:
Brazilian general’s speech
User appeal to remove content from Facebook
Submit public comments here.
On January 3, 2023, two days after Luiz Inácio Lula da Silva had been sworn in as Brazil’s president, a Facebook user posted a video with a caption in Portuguese. The caption includes a call to “besiege” Brazil’s congress as “the last alternative.” The video shows part of a speech given by a prominent Brazilian general and supporter of Lula’s electoral opponent, in which he calls for people to “hit the streets” and “go to the National Congress... [and the] Supreme Court.” A sequence of images follows the general’s speech, including one of a fire raging in the Three Powers Plaza in Brasília, which houses Brazil’s presidential offices, Congress, and Supreme Court. Text overlaying the image reads, “Come to Brasília! Let’s Storm it! Let’s besiege the three powers.” Text overlaying another image reads “we demand the source code,” a slogan that protestors have used to question the reliability of Brazil’s electronic voting machines. The video was played over 18,000 times, was not shared, and was reported seven times.
Mr. Lula da Silva’s swearing-in had been accompanied by civil unrest, including protests and roadblocks. On January 8, more than a thousand supporters of former president Jair Bolsonaro broke into the National Congress, Supreme Court, and presidential offices, intimidating the police and destroying property. Meta designated Brazil a temporary high-risk location ahead of the country’s October 2022 general election and, as a consequence, has been removing content “calling for people to take up arms or forcibly invade ... federal buildings.” However, Meta did not announce it had done so until January 9.
On the same day the content was posted, a user reported it for violating Meta’s Violence and Incitement Community Standard, which prohibits calls to “forcibly enter locations … where there are temporary signals of a heightened risk of violence or offline harm.” In total, four users reported the content seven times between January 3 and January 4. Following the first report, the content was reviewed by a human reviewer and found not to violate Meta’s policies. The user appealed the decision, but it was upheld by a second human reviewer. The next day, the other six reports were reviewed by five different moderators, all of whom found that it did not violate Meta’s policies. The content was not escalated to policy or subject matter experts for additional review.
One of the users who had reported the content appealed Meta’s decision to the Oversight Board. In their appeal to the Board, they link the content’s potential to incite violence to the movement of people in Brazil “who do not accept the results of elections.”
The Board selected this case to examine how Meta moderates election-related content, and how it is applying its Crisis Policy Protocol in a designated “temporary high-risk location.” Meta developed the Protocol in response to the Board’s recommendation in the “Former President Trump’s suspension” case. This case falls within the Board’s “Elections and civic space” priority.
As a result of the Board selecting this case, Meta determined that its repeated decisions to leave the content on Facebook were in error. Because at-scale reviewers do not record their reasons for making decisions, the company does not have further information about why they found the content did not violate its policies in this case. On January 20, 2023, Meta removed the content, issued a strike against the content creator’s account, and applied a feature-limit, preventing them from creating new content.
The Board would appreciate public comments that address:
- The political situation in Brazil in advance of October’s election, and how it shifted between October 2022 and January 8, 2023.
- The relationship between political violence, election denialism, and calls for offline mobilization on social media.
- When Meta’s election integrity efforts should begin and end, and what criteria should guide decisions about those timeframes, particularly as they relate to transitions of power.
- How Meta should distinguish between legitimate political organizing and harmful coordinated action.
- How Meta should treat content attacking or delegitimizing democratic institutions and processes.
In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
Violence against women testimony
User appeal to restore content to Instagram
Submit public comments here.
In November 2022, an Instagram user posted a video with a caption in Swedish. The video contains an audio recording, also in Swedish, of a woman describing her experience in a violent intimate relationship, including feeling unable to discuss her situation with her family. The audio does not contain specific descriptions of violence. The caption notes that the woman in the audio recording consented to it being published, and that the voice has been modified. It says that there is a culture of blaming victims of gender-based violence, and little understanding of how difficult it is for women to leave a violent partner. The caption says, “men murder, rape and abuse women mentally and physically - all the time, every day.” It also provides a helpline number and says it hopes women reading the post will realize they are not alone. The post has been viewed about 10,000 times, shared fewer than 20 times and has not been reported by anyone.
Meta removed the content from Instagram under its Hate Speech Community Standard. The Hate Speech Community Standard prohibits making general claims, or “unqualified behavioural statements,” that people of a particular sex or gender are “violent criminals” or “sexual predators.” Meta's automated systems identified the content as potentially violating. After two human reviews, Meta removed the post and applied a “standard strike” to the user’s account. The user appealed, and a third human reviewer upheld the company’s decision. The content was then identified by Meta's automated High Impact False Positive Override (HIPO) system, which aims to identify content that has been wrongly removed despite not violating Meta’s policies. This sent the content for additional review, where two more moderators found that it violated the Hate Speech policy.
The user then appealed to the Board. In their appeal, they said they frequently speak about men’s violence against women and aim to reach women who have survived violence. As a result of the Board selecting this case, Meta determined that its decision to remove the content was in error, restored the post, and reversed the strike.
The Board selected this case to explore Meta’s policies and practices in moderating content that targets people based on a protected characteristic, such as sex and gender. This case falls within the Board’s “Gender” and “Hate speech against marginalized groups” strategic priorities.
The Board would appreciate public comments that address:
- How Meta's Hate Speech policy may result in the removal of content containing testimonies or condemnation of gender-based violence.
- Insights on potential challenges and benefits of Meta’s approach to power imbalances between different “protected characteristics” in its Hate Speech policy.
- Insights on any challenges faced in sharing testimonies and condemnation of gender-based violence on Facebook and Instagram.
- Insights on the socio-political context in Sweden (and around the world), regarding violence against women, particularly intimate partner violence.
- How Meta’s strike system could be improved to better protect activists, human rights defenders, journalists and others against having their content mistakenly removed and penalties applied to their accounts.
In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to these cases.
If you or your organization feel that you can contribute valuable perspectives that can help with reaching a decision on the cases announced today, you can submit your contributions using the links above. The public comment window for both cases is open for 14 days, closing at 15:00 UTC on Thursday, March 23.
The Board has also announced today that it has accepted a request for a policy advisory opinion on Meta’s moderation of the Arabic term “shaheed.” As part of this, it is accepting public comments. The public comment window for the policy advisory opinion request announced today is open until 15:00 UTC on Monday, April 10.
Over the next few weeks, Board members will be deliberating these cases. Once they have reached their final decisions, we will post them on the Oversight Board website.
To receive updates when the Board announces new cases or publishes decisions, sign up here.
Announcing Marie Wieck as a new Trustee
Today, we are also announcing Marie Wieck as a new Trustee of the Oversight Board. Our Trustees play a key role in protecting the Board's independence, and as a senior technology executive who worked at IBM for more than 30 years, Marie brings a wealth of knowledge about the industry to help the Board operate better in the future.
Oversight Board upholds Meta’s decision in the “Sri Lanka pharmaceuticals” case (2022-014-FB-MR)
The Oversight Board has upheld Meta’s decision to leave up a Facebook post asking for donations of pharmaceutical drugs to Sri Lanka during the country’s financial crisis. However, the Board has found that secret, discretionary policy exemptions are incompatible with Meta’s human rights responsibilities, and has made recommendations to increase transparency and consistency around the “spirit of the policy” allowance. This allowance permits content where a strict reading of a policy produces an outcome that is at odds with that policy’s intent.
About the case
In April 2022, an image was posted on the Facebook page of a medical trade union in Sri Lanka, asking for people to donate drugs and medical products to the country, and providing a link for them to do so.
At the time, Sri Lanka was in the midst of a severe political and financial crisis, which emptied the country’s foreign currency reserves. As a result, Sri Lanka, which imports 85% of its medical supplies, did not have the funds to import drugs. Doctors reported that hospitals were running out of medicine and essential supplies, and said they feared an imminent health catastrophe.
The Meta teams responsible for monitoring risk during the Sri Lanka crisis identified the content in this case. The company found that the post violated its Restricted Goods and Services Community Standard, which prohibits content that asks for pharmaceutical drugs, but applied a scaled “spirit of the policy” allowance.
“Spirit of the policy” allowances permit content where the policy rationale, and Meta’s values, demand a different outcome than a strict reading of the rules. Scaled allowances apply to entire categories of content, rather than just individual posts. The rationale for the Restricted Goods and Services policy includes “encouraging safety.” Meta referred this case to the Board.
The Oversight Board finds that the post violates the Restricted Goods and Services Community Standard. However, it finds that applying a scaled “spirit of the policy” allowance to permit this and similar content was appropriate, and in line with Meta’s values and human rights responsibilities.
In the context of the Sri Lankan crisis, where people’s health and safety were in grave danger, the allowance pursued the Community Standard’s aim of “encouraging safety,” and the human right to health. Though allowing drug donations can present risks, the acute need in Sri Lanka justified Meta’s actions.
However, the Board is concerned that Meta has said that the “spirit of the policy” allowance “may” apply to content posted in Sinhala outside Sri Lanka, in addition to the Sri Lanka market. Meta should be clear about where its allowances apply. It should also ensure that at-scale allowances are sensitive to the ethnic and linguistic diversity of the people they may impact in order to avoid inadvertent discrimination. Sri Lanka has two official languages, Sinhala and Tamil, the latter largely spoken by Tamil and Muslim minorities.
The Board also finds that, to meet its human rights responsibilities, Meta should take action to increase users’ understanding of the “spirit of the policy” allowance, and to ensure it is applied consistently.
Users who report content are not notified when it benefits from a “spirit of the policy” allowance, nor do users have any way of knowing that the exception exists. The “spirit of the policy” allowance is not mentioned in the Community Standards, and Meta has not published information on it in the Transparency Center, as it has on the newsworthiness exception, partly thanks to recommendations from the Board. Secret, discretionary exemptions to Meta’s policies are incompatible with Meta’s human rights responsibilities.
There appear to be no clear criteria in place to govern when “spirit of the policy” allowances are issued and terminated. The Board emphasizes the importance of such criteria in ensuring decisions are made consistently, and recommends Meta make them public. It also finds that where Meta regularly uses an allowance for the same purpose, it should assess whether a standalone exception to the relevant policy is needed.
The Oversight Board’s decision
The Oversight Board upholds Meta’s decision to leave the post on Facebook.
The Board also recommends that Meta:
- Publish information on the “spirit of the policy” allowance in its Transparency Center, including the criteria Meta uses to decide whether to scale the allowance.
- Explain in the Community Standards that allowances may be made when a policy’s rationale, and Meta’s values, demand a different outcome than a strict reading of the rules. This should link to the “spirit of the policy” allowance information in the Transparency Center.
- Notify users when content they have reported benefits from the “spirit of the policy” allowance.
- Publicly share aggregated data in the Transparency Center on the “spirit of the policy” allowances issued, including the number, and the regions and languages impacted.
For further information
To read the full decision, click here.
To read a synopsis of public comments for this case, please click the attachment below.