Multiple Case Decision
Kenya’s Cabinet Secretary Remark
December 9, 2025
2 cases included in this bundle
- FB-H5X3TL3A: Case about violence and incitement on Facebook
- FB-KW8A63JE: Case about violence and incitement on Facebook
Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.
Summary
Two users separately appealed Meta’s decisions to remove two posts reporting on Kenya Interior Cabinet Secretary Kipchumba Murkomen's speech concerning protestors. After the Board brought the appeals to Meta’s attention, the company reversed its original decisions and restored both posts.
About the Cases
In the first case, a Facebook user, whose biography says they run a “political news page,” shared a video featuring Kenya Interior and National Administration Cabinet Secretary Kipchumba Murkomen at a press conference. In the video, Murkomen says, "We are telling the police, anyone who approaches a police station, shoot him.” The video includes a caption in Swahili saying Murkomen was “heckled” after making the remark in the video. In the second case, another Facebook user published a post in Swahili that quotes Murkomen calling for protestors who approach police stations to be shot.
Both posts were published in June 2025, shortly after Murkomen's speech. The speech drew intense criticism from Kenyans and human rights advocates for violating constitutional safeguards. Murkomen made his statement amid a broader government crackdown on nationwide protests that had resulted in violent clashes, casualties and widespread human rights concerns. The protests began in June 2024 in response to a controversial finance bill that proposed significant tax increases, and the movement reignited in June 2025 following the death of blogger and teacher Albert Ojwang in police custody.
The user who appealed the first case to the Board explained that they are a news outlet, and that highlighting Murkomen’s remark was not incitement. They also claimed that the removal is “limiting freedom of media.” The user who appealed the second case said that their post was “merely a statement” that was made by Murkomen “during the press conference after the Gen Z protest in Kenya” when addressing the police, and that this statement was “captured by Facebook users and media houses.”
Under its Violence and Incitement policy, Meta removes "threats of violence that could lead to death (or other forms of high-severity violence)." However, the policy explicitly allows threats “when shared in awareness-raising or condemning context.”
After the Board brought these cases to Meta's attention, the company determined that the posts did not violate its Violence and Incitement policy, and that their removal was incorrect. The company explained that the post in the first case is “reporting on the heckling [of Murkomen]" and concluded that it does not violate the policy. In the second case, Meta noted that “although the post does not expressly mention the surrounding context, it was shared during a time when news outlets were reporting on Murkomen's comments.” Additionally, Meta highlighted that the post “contains no other indicators of violent speech. Taken together, this suggests the content was shared to raise awareness – an example of citizen journalism – which is permitted under [its] policy.” The company then restored both pieces of content to Facebook.
Board Authority and Scope
The Board has authority to review Meta's decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.
Significance of Cases
Both pieces of content are examples of overenforcement of Meta's Violence and Incitement policy, which infringes on users' freedom of expression in a country facing protests marked by violent clashes, casualties and serious human rights concerns. The Board noted that after it brought these cases to Meta’s attention, the company assessed both in light of the context in Kenya, highlighting that the second post was “shared during a time when news outlets were reporting on Murkomen’s statement.” This revised approach is consistent with the Board’s guidance to Meta on how to address policy enforcement against content on its platforms, though the moderators who originally reviewed both posts did not follow it. In the Call for Women’s Protest in Cuba decision, the Board underscored the importance of ensuring that this contextual approach to content review is adopted at scale “for Meta to reduce false positives (mistaken removal of content that does not violate its policies).”
The Board has issued a recommendation regarding Meta’s Violence and Incitement policy and its enforcement that is relevant in these cases: “Meta should add to the public-facing language of its Violence and Incitement Community Standard that the company interprets the policy to allow content containing statements with 'neutral reference to a potential outcome of an action or an advisory warning,' and content that 'condemns or raises awareness of violent threats'” (Russian Poem, recommendation no. 1). Meta demonstrated partial implementation of this recommendation through published information. The company updated its Violence and Incitement policy (see Meta’s Q4 2023 Report) to include the following: “We do not prohibit threats when shared in awareness-raising or condemning context, when less severe threats are made in the context of contact sports, or when threats are directed against certain violent actors, like terrorist groups.” However, Meta noted in its Q3 2023 Report that the company decided against adding specific language regarding “neutral references to a potential outcome of an action or an advisory warning, where the poster is not making a violent threat,” as this is “generally covered in the rationale of the policy and the definition and framing of a threat.”
Additionally, the Board has issued recommendations to increase its understanding of, and overall transparency around, Meta’s enforcement accuracy and the company's approach to measuring it:
- “In order to inform future assessments and recommendations to the Violence and Incitement policy and enable the Board to undertake its own necessity and proportionality analysis of the trade-offs in policy development,” the Board recommended that Meta “provide the Board with the data that it uses to evaluate its policy enforcement accuracy. This information should be sufficiently comprehensive to allow the Board to validate Meta’s arguments that the type of enforcement errors in these cases are not a result of any systemic problems with Meta’s enforcement processes” (United States Posts Discussing Abortion, recommendation no. 1).
- Meta “should improve its transparency reporting to increase public information on error rates by making this information viewable by country and language for each Community Standard.” The Board underscored that “more detailed transparency reports will help the public spot areas where errors are more common, including potential specific impacts on minority groups, and alert [Meta] to correct them” (Punjabi Concern over the RSS in India, recommendation no. 3).
In response to both recommendations, Meta shared a confidential “summary of enforcement data ... including an overview of enforcement accuracy data” for the Violence and Incitement and the Dangerous Organizations and Individuals policies (Meta’s H1 2025 Report on the Oversight Board). The Board assessed both recommendations as omitted or reframed, since Meta’s responses did not address their core objectives. For the first recommendation, the Board asked Meta for the data the company uses to assess the accuracy of policy enforcement, but Meta shared only the results of an assessment, not the underlying data. The goal of the second recommendation was not achieved because it asked Meta to make this enforcement data public and to break it down by country and language.
While underenforcement of the company’s Violence and Incitement Community Standard can result in Meta’s platforms being exploited to spread threats, as the Board highlighted in the Cambodian Prime Minister decision, overenforcement of the same policy undermines users' ability to post and share news reporting and information about events of public interest or concern. Meta should prioritize minimizing these errors. Full implementation of the recommendations mentioned above could reduce the number of incorrect removals under the company’s Violence and Incitement policy. First, it would prevent, and help reverse, erroneous takedowns by increasing both users’ and content reviewers’ awareness of policy exceptions. Moreover, it would allow the company to refine its approach to measuring and comparing accuracy data across languages and regions, to better allocate resources to improve accuracy rates where needed. Finally, public reporting on the accuracy of reviews under the Violence and Incitement policy would increase transparency and generate engagement with Meta that has the potential to lead to further improvements.
Decision
The Board overturns Meta’s original decisions to remove the two pieces of content. The Board acknowledges Meta’s corrections of its initial errors once the Board brought the cases to Meta’s attention.