Case Description
In February 2025, a Facebook user commented on a post discussing former Kenyan Prime Minister Raila Odinga’s candidacy in the election for the African Union’s Chairperson. Both the post and the comment are largely in English, with some words in Swahili.
The Facebook post contains a photograph of Rigathi Gachagua, former Deputy President of Kenya (2022–2024), with text overlay quoting Gachagua’s support for Odinga and noting why his success in the election would be good for the African continent. The caption claims that Gachagua, who is from the Kikuyu ethnic group, was endorsing Odinga, who is from the Luo ethnic group, to please Odinga’s Luo constituency and increase his own popularity for domestic political gain. The caption suggests Luo people are gullible and would vote for anyone who benefits Odinga, including politicians from other ethnic groups allegedly responsible for the violence committed against the Luo people in the past.
The user’s comment on the post mocks the reaction to Gachagua’s statement and dismisses his explanation as being meant for “tugeges,” referring to Gachagua’s Kikuyu supporters. The comment argues that Gachagua’s endorsement is aimed at an external rather than a Kenyan audience.
Meta removed the user’s comment for violating the Hateful Conduct Community Standard and left up the post the user had commented on. Meta designated the term “tugeges” a prohibited slur under this policy in January 2024, after the term gained traction during the 2022 presidential elections. Meta translates “tugeges” to mean “retarded Kikuyu,” noting it is the plural form of “kagege,” a Kikuyu term describing “a person who is extremely confused to the point of gaping vacantly at the world.” Meta notes the term stems from the word “gega,” which means to “stare in puzzlement.”
Under the Hateful Conduct Community Standard, Meta removes content that “describes or negatively targets people with slurs.” Meta defines slurs as “words that inherently create an atmosphere of exclusion and intimidation against people on the basis of a protected characteristic, often because these words are tied to historical discrimination, oppression and violence.” The policy includes an exception for content that uses slurs to condemn hate speech, report on it, reclaim the term or use it self-referentially.
Meta’s automated systems detected the use of a slur in the content and determined it was violating, resulting in the removal of the comment and a standard strike to the account of the commenting user. The user appealed Meta's decision to remove their comment. A human reviewer confirmed the content was violating, so it was not restored.
The user who posted the comment then appealed to the Oversight Board. In their statement to the Board, the user stated they did not insult, threaten or use abusive language in the comment. They were just replying to the post “in a civil way” and made a simple political comment.
The Board selected this case to examine Meta’s respect for political expression in countries with a recent history of intercommunal violence. This case falls within the Board’s Hate Speech Against Marginalized Groups priority.
The Board would appreciate public comments that address:
- The meaning of the term “tugeges” and its contemporary use in Kenyan politics and society, including evidence of its connection to harms or lack thereof.
- The political context in Kenya, including the situation for freedom of expression online and risks related to incitement of intercommunal violence.
- How ethnicity is discussed in relation to current affairs in Kenya, and the prevalence of contested or potentially insulting language or generalizations in everyday discourse.
- How reliance on automated enforcement of Meta’s Hateful Conduct policy impacts human rights, including freedom of expression, especially in countries with a recent history of intercommunal violence.
As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.
Public Comments
If you or your organization feel you can contribute valuable perspectives that can help with reaching a decision on the case announced today, you can submit your contributions using the button below. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23:59 Pacific Standard Time (PST) on Tuesday 17 June.
What’s Next
Over the next few weeks, Board Members will be deliberating this case. Once they have reached their decision, we will post it on the Decisions page.
Comments
Introduction
We, the National Coalition and Content Moderation in Kenya, welcome this opportunity to offer freedom of expression perspectives on Meta’s decision to take down a comment for violating its Hateful Conduct Policy. The Coalition is a collaborative initiative aimed at fostering a digital ecosystem where freedom of expression is upheld in alignment with international human rights standards. We are committed to ensuring that content moderation on social media platforms reflects the unique context and diversity of Kenya, encompassing linguistic, social, cultural, and political dimensions.
We will weigh Meta’s decision against international human rights standards on freedom of expression, specifically the three-part test, which requires that any restriction:
1. Be provided by law; any law or regulation must be formulated with sufficient precision to enable individuals to regulate their conduct accordingly;
2. Pursue a legitimate aim, exclusively: respect of the rights or reputations of others; or the protection of national security or of public order (ordre public), or of public health or morals;
3. Be necessary in a democratic society, requiring the State to demonstrate in a specific and individualised manner the precise nature of the threat, and the necessity and proportionality of the specific action taken, in particular by establishing a direct and immediate connection between the expression and the threat.
1. The meaning of the term “tugeges” and its contemporary use in Kenyan politics and society, including evidence of its connection to harms or lack thereof.
Origins
The term “tugege” first gained traction in the run-up to Kenya’s 2022 elections to describe a person who does not question government decisions. We understand the term to mean people who have no mind of their own, and it has been used to criticise people from Central Kenya for voting for the current president, William Ruto. The term comes from the Kikuyu language and was first used by former Nakuru Governor Lee Kinyanjui; its use has since taken on a broader political meaning that applies to any person who supports the current government regime.
In 2023, the National Cohesion and Integration Commission, a government body established after the 2007 post-election violence to foster national cohesion, refuted claims that the word “tugege” would be banned on grounds of hate speech. The NCIC has since changed its position: in June 2024, it warned Kenyans against using the term, describing it as hurtful, demeaning and dehumanising to people who support a certain political side.
2. The political context in Kenya, including the situation for freedom of expression online and risks related to incitement of intercommunal violence.
Elections in Kenya are volatile and ethnically charged. This can be seen from the 2007-2008 election violence that was sparked by contested presidential elections. The post-election violence led to the loss of lives, a case being referred to the ICC and, most importantly, reforms to Kenya’s legal framework. Kenya promulgated a new constitution in 2010 that provides a robust human rights framework and safeguards the right to freedom of expression under Article 33.
Freedom of expression online has deteriorated under the current government regime, as evidenced by heightened government surveillance, killings and abductions of bloggers for dissenting political views, and excessive use of force by law enforcement officers. On 7 June 2025, Albert Ojwang was arrested in connection with a social media post insulting a police boss; he died while in custody and the murder is being investigated. The Computer Misuse and Cybercrimes Act, the primary cybersecurity law, has been weaponised to charge dissenting voices.
3. How ethnicity is discussed in relation to current affairs in Kenya, and the prevalence of contested or potentially insulting language or generalizations in everyday discourse.
This is not the first time that a Kikuyu word has gained political use: “Wanjiku,” a common female name among the Agikuyu people, is used in political debates to refer to the ordinary Kenyan citizen. Ethnicity is a tool of division used by the political class, who weaponise online spaces through psyops. There is a tribal psyop every week intended to alienate politicians from a certain region ahead of the 2027 elections.
4. How reliance on automated enforcement of Meta’s Hateful Conduct policy impacts human rights, including freedom of expression, especially in countries with a recent history of intercommunal violence.
The use of automated enforcement of Meta’s Hateful Conduct policy poses significant risks to freedom of expression. Moderating political speech requires nuanced assessment, and automated systems fall short: they fail to capture the nuance, tone and context that are crucial to moderating content in line with international human rights standards. Meta should conduct a human rights impact assessment of its content moderation automation systems to ensure that human rights, specifically freedom of expression, are not unlawfully violated. Given the nuance and context needed in moderating political speech, we recommend that Meta prioritise increasing the number of local content moderators and adhere to labour laws with regard to how content moderators are treated and paid. Ultimately, if Meta fails to distinguish between harmful ethnic hate speech and legitimate political commentary, freedom of expression is jeopardised.