Oversight Board overturns original Facebook decision: Case 2021-009-FB-UA
The Oversight Board agrees that Facebook was correct to reverse its original decision to remove a post that shared a news item about a threat of violence from the Izz al-Din al-Qassam Brigades, the military wing of the Palestinian group Hamas. Facebook originally removed the content under its Dangerous Individuals and Organizations Community Standard, and restored it after the Board selected this case for review. The Board concludes that removing the content did not reduce offline harm and restricted freedom of expression on an issue of public interest.
About the case
On May 10, 2021, a Facebook user in Egypt with more than 15,000 followers shared a post by the verified Al Jazeera Arabic page consisting of text in Arabic and a photo.
The photo portrays two men in camouflage fatigues with faces covered, wearing headbands with the insignia of the Al-Qassam Brigades. The text states "The resistance leadership in the common room gives the occupation a respite until 18:00 to withdraw its soldiers from Al-Aqsa Mosque and Sheikh Jarrah neighborhood otherwise he who warns is excused. Abu Ubaida – Al-Qassam Brigades military spokesman." The user shared Al Jazeera’s post and added a single-word caption, “Ooh,” in Arabic. The Al-Qassam Brigades and their spokesperson Abu Ubaida are both designated as dangerous under Facebook’s Dangerous Individuals and Organizations Community Standard.
Facebook removed the content for violating this policy, and the user appealed the removal to the Board. As a result of the Board selecting this case, Facebook concluded it had removed the content in error and restored it.
After the Board selected this case, Facebook found that the content did not violate its rules on Dangerous Individuals and Organizations, as it did not contain praise, support or representation of the Al-Qassam Brigades or Hamas. Facebook was unable to explain why two human reviewers originally judged the content to violate this policy, noting that moderators are not required to record their reasoning for individual content decisions.
The Board notes that the content is a republication of a news item from a legitimate news outlet on a matter of urgent public concern. The original Al Jazeera post it shared was never removed, and the Al-Qassam Brigades’ threat of violence was widely reported elsewhere. In general, individuals have as much right to repost news stories as media organizations have to publish them in the first place.
The user in this case explained that their purpose was to update their followers on a matter of current importance, and their addition of the expression “Ooh” appears to be neutral. As such, the Board finds that removing the user’s content did not materially reduce offline harm.
Reacting to allegations that Facebook has censored Palestinian content due to Israeli government demands, the Board asked Facebook questions including whether the company had received official and unofficial requests from Israel to remove content related to the April-May conflict. Facebook responded that it had not received a valid legal request from a government authority related to the user’s content in this case, but declined to provide the remaining information requested by the Board.
Public comments submitted for this case included allegations that Facebook has disproportionately removed or demoted content from Palestinian users and content in Arabic, especially in comparison to its treatment of posts threatening anti-Arab or anti-Palestinian violence within Israel. At the same time, Facebook has been criticized for not doing enough to remove content that incites violence against Israeli civilians. The Board recommends an independent review of these important issues, as well as greater transparency regarding Facebook’s treatment of government requests.
The Oversight Board’s decision
The Oversight Board affirms Facebook’s decision to restore the content, noting that Facebook’s original decision to remove the content was not warranted.
In a policy advisory statement, the Board recommends that Facebook:
- Add criteria and illustrative examples to its Dangerous Individuals and Organizations policy to increase understanding of the exceptions for neutral discussion, condemnation and news reporting.
- Ensure swift translation of updates to the Community Standards into all available languages.
- Engage an independent entity not associated with either side of the Israeli-Palestinian conflict to conduct a thorough examination to determine whether Facebook’s content moderation in Arabic and Hebrew, including its use of automation, has been applied without bias. This examination should review not only the treatment of Palestinian or pro-Palestinian content, but also content that incites violence against any potential targets, no matter their nationality, ethnicity, religion or belief, or political opinion. The review should cover content posted by Facebook users located both in and outside of Israel and the Occupied Palestinian Territories. The report and its conclusions should be made public.
- Formalize a transparent process on how it receives and responds to all government requests for content removal, and ensure that these requests are included in transparency reporting. The transparency reporting should distinguish government requests that led to removals for violations of the Community Standards from requests that led to removal or geo-blocking for violating local law, in addition to requests that led to no action.
For further information:
To read the full case decision, click here.
To read a synopsis of public comments for this case, please click the attachment below.