"The Most Important Thing Was to Show That Systems Within Platforms Actually Work"
April 16, 2026

In December 2025, we issued a decision overturning Meta’s removal of a comment on Kenyan politics that included the word “tugeges.” Meta had incorrectly categorized the word as a slur.
Four months later, we followed up with Rachel Olpengs and Angela Minayo, two people involved in the National Coalition on Freedom of Expression and Content Moderation (FeCoMo) in Kenya, which is made up of representatives from tech, civil society, media, academia and government, and which submitted a public comment in the case.
They told us about the impact of the case.
+++
Oversight Board: Why was this decision so important?
Rachel Olpengs, Lead Coordinator at FeCoMo:
Because it went beyond looking at the word as an identified slur and examined its contextual application at the time it was used. What was the environment, what was the event, what was the reference point? Bringing in these contextual meanings and understanding, even when a word is already identified as problematic. Bringing in the issue of the changing meanings of words, which happens quite often. Even now, as we are entering the [Kenyan election] campaign period, starting from August, you'll find that, again, some other words will have different meanings within that context, [that] political situation.
"The Board is one of those accesses to remedies ... Being listened to and having your input considered is something that we wanted to show our stakeholders" – Angela Minayo
Angela Minayo, Programs Officer, Digital Rights and Policy, Article 19 Eastern Africa:
The decision is important in protecting political speech, which is crucial in democratic events, like elections. It also confirms what we in civil society have been saying for a long time: that platforms should not over-moderate or over-remove posts they may deem to be hate speech when the elements of hate speech have not been established. It was an important decision to show that political expression should be protected, given its importance in shaping public discourse and allowing people to express themselves.
[We want] platforms to understand the unique role they play and the fact that moderation is continuous. It's also a call to platforms to invest in having more technical or moderation teams based here, or at least people who understand the political context in this country. We understand that Meta is not going to consult us all the time, that these decisions are made at the highest level in the company. [But] we hope that this case shows the need to have teams and to invest in moderation of content in the Global South.
Why was it important for FeCoMo to make a public comment?
Angela Minayo:
The most important reason for this submission was to show that systems within platforms actually work. This appeal mechanism at the Oversight Board is one of those accesses to remedies that we talk to our stakeholders about. And the best way to show them how it works is for us to actually participate in the process, so that they don't feel like these platforms are in the Global North, and they have nothing to do with us, and they cannot listen to us. Being listened to and having your input considered is something that we wanted to show our stakeholders.
Rachel Olpengs:
[In Kenya], you find that depending on current events, for example, an election period, speech may be limited, and the laws that we have are applied arbitrarily. And you're confined in what you can say and what you cannot say. So having an avenue through which you can appeal or correct some of these [erroneous content moderation decisions] – it is a win. Because you can say that it's something that you can act against, despite being constrained within the environment that we operate in.
"Sometimes platforms change policies, and they don't even inform users, while, for instance, Meta shows the changes when it says it is following the decisions of the Oversight Board" - Angela Minayo
How much of an opportunity was the case for you, and how often do you get opportunities like that for redress of content moderation issues?
Angela Minayo:
This was a big opportunity because we are in the Global South; Kenya is in East Africa. The challenge of content moderation in the Global South and in Kenya specifically is, of course, the language barrier. The fact is that we have 47 languages, tribes and ethnic groups. We understand that Meta is operating in a global context, and it might not be able to cater to all these different contexts and nuances that exist. While we are based here, we work here and we operate here; we understand the nuance and context much better.
It's important in the sense that we are able to shape freedom of expression norms and standards at the highest level, which is at the Oversight Board level. It was also an opportunity to share the context, that we use vernacular language a lot in this country, and it does not necessarily mean that it's been used in a bad context.
An automated review removed the word in this case. Why did you highlight in your public comment that there can be problems with automation?
Angela Minayo:
Moderation is a complex process. You're making a decision on whether a post stays up or not. But you will be impacting lives; you could be changing political positions in a country. We understand the use of automation, the need for it, but we are asking for [human] review of how we are using automation in moderation, especially when we look at things like political speech. Maybe ... certain speech should not be moderated automatically, but should be given to a human reviewer, or just paused to give room for consultation. We are asking that removal should not be the first step and that automation should not be applied to all types of speech, given that certain speech is complex and needs more review.
Rachel Olpengs:
We cannot do away with automation. It makes moderation efficient. But we must bring in that human aspect as well to add the nuances of certain contexts, like, in our case, that it was a volatile time. So, it was a time for [human] review. That will help to make it more effective.
How many opportunities for users and those in your region generally are there for interaction with Meta?
Angela Minayo:
I would say that the engagement is still low, and we still need to keep pushing and asking for more engagement and opportunities like this to share our perspectives. That's why I was very happy that this user [who appealed to the Board about Meta removing the comment] asked for the case. We still need to do more as stakeholders to inform users that these opportunities exist. There are many times when civil society organizations themselves lose accounts or their posts are removed. [It’s important] to bring back a sense of agency on the part of the users, that platform content moderation policies are not just shaped by the platforms themselves, but by the users and the communities that are using these platforms as well.
"It can be replicated for other platforms. Just seeing the Oversight Board making such a decision shows this is something that can be ... put in place" - Rachel Olpengs
Industry-wide, how does moderation need to improve?
Angela Minayo:
Engagement and transparency [for users] on policies is where we see a lapse. For instance, when platforms don't even engage Global South actors when coming up with these rules on moderation. Or, when we feel that platforms are too aligned with government requests, and therefore almost become an arm of government agencies that want a post to be taken down, and so on.
What we ask for is a system that is transparent. What we ask for is an access-to-remedy process that follows the rules of natural justice and fairness. So, if your account is removed or a moderation action is taken, there is a chance to appeal, and the company responds to the appeal, giving reasons and justifications for its decision-making. We want more consultations with actors, consultations that actually shape policies and are not cosmetic consultations that have no follow-through, no transparency on what was agreed and what was changed. Sometimes platforms change policies, and they don't even inform users, while, for instance, Meta shows the changes when it says it is following the decisions of the Oversight Board.
We are asking for transparency and efficiency because we believe that Global South markets, the Kenyan market, African markets, are shaping platforms and need a voice at the table.
Rachel Olpengs:
It can be replicated for other platforms. Just seeing the Oversight Board making such a decision shows this is something that can be considered and put in place. They know that there are bodies that could be involved in the whole process.
** The conversation was lightly edited for brevity and clarity.
Special thanks to Rhodri Davies, staff member of the Oversight Board, for conducting the interview for this impact story.