The untold impacts of online abuse on Ethiopian women
CIR


Illustration by Richard Drury via Getty Images
Lella Misikir uses her TikTok and YouTube accounts to talk about various topics, from wellness to women’s rights, pet peeves, and things she would like to see change in the world. “It’s really my way of talking to the universe,” she explains.
From this description, it is hard to foresee that her content would elicit an online response so severe that it would eventually force her to flee her country out of fear for her safety.
However, as a vocal advocate for gender equality who shares her views on public platforms, Lella became a target for online hate, doxxing, and even death threats. She is not alone – what Lella has experienced is part of an alarming pattern of online abuse against Ethiopian women in the public eye.
CIR’s research into Technology-Facilitated Gender Based Violence – or TFGBV – has revealed how unchecked online abuse, harmful stereotypes, and mockery are driving Ethiopian women and girls out of public life.
Researchers manually annotated over 17,000 comments from YouTube and TikTok across four languages (Amharic, Afaan Oromo, Tigrigna, and English) on over 600 YouTube channels and TikTok accounts that are popular and influential in Ethiopia. CIR also held workshops with Ethiopian subject matter experts and survivors to better understand the abuse women and girls are exposed to.
CIR found that Ethiopian women and girls are being routinely dismissed or ridiculed on TikTok and YouTube. This builds on CIR’s previous findings of widespread TFGBV on Facebook, X and Telegram in Ethiopia.
Unfolding amid a crisis in content moderation and regulatory enforcement, this unchecked online misogyny is shaping who gets to speak, lead, and exist safely in digital spaces, in turn discouraging Ethiopian women’s participation in public life.
To gain insight into the personal impact of this, CIR compared the broader dataset on TFGBV to the abuse aimed directly at Lella in the comments on her TikTok and YouTube accounts. Sitting down with Lella, it became clear that comments are only part of the picture, with online abuse leaving long-lasting impacts, and reflecting broader social issues in Ethiopia.

Ethiopian wellness expert and influencer Lella Misikir
Traditional ideas of femininity
Lella began posting on TikTok in 2021. The platform gave her access to a community she felt she could talk to, enabling her to say things she had been “waiting to say forever”. In time, Lella’s following grew to 86K, many of whom value her advice and musings on life, growth, and being a woman.
However, Lella soon received insults, threats, aggressive language, and slurs in her comments and inbox. Much of the abuse focused on her appearance, views, and work, as well as attempts to discredit her by questioning her sexuality and her ability to provide for herself.
There’s a saying they use a lot, 'wey alamersh wey alafersh', and it means you’re not pretty but you’re not even shameful […] you’re not beautiful enough to speak like this. They’ve created a tiny box of femininity, and you’re either doing too much or too little, so you’ll never fit in the box.
Lella’s experience mirrors CIR’s findings on trends in abuse targeting Ethiopian women and girls: insults, sexualisation, degrading stereotypes, and discrediting tactics that reinforce traditional gender narratives. Women who challenge traditional social norms – such as sportswomen, or those in leadership positions or with public platforms – are particularly targeted.
Feminists like Lella face additional attacks, often being discredited through accusations of financial fraud or through the conflation of feminism with lesbianism or trans rights – with their views perceived as a threat to traditional values.
‘Not Ethiopian enough’
Often fuelled by political events and conflicts, gendered hate speech intersects with ethnicity, religion, and politics, creating multi-layered abuse. For example, increased ethnic hate speech was observed following the outbreak of the Tigray conflict in November 2020, and in the run-up to Ethiopia’s 2021 General Election.
Religious rhetoric is used to demonise women and justify misogyny, such as linking women to Eve’s “original sin”. In Lella’s case, she was labelled a “devil worshipper” to associate her with black magic, and some users commented biblical references like “obey your husband” on her posts.
In CIR’s workshops with TFGBV experts and survivors, colourism and anti-Blackness were identified as pervasive but underexplored themes in online abuse in Ethiopia.
Rooted in historical norms and reinforced by Western beauty standards, these biases manifest in everyday microaggressions – where lighter skin is often seen as more beautiful or advantageous, and darker-skinned individuals face heightened abuse.
However, in some regions, those perceived as “too white” are accused of not belonging in Ethiopia, whilst dark skin is conversely viewed as a marker of belonging to Ethiopia and the broader African continent.
Lella herself experienced colourism, recalling that people used comparisons to other African countries to insult her – claiming she does not look or act “Ethiopian enough”. She was also frequently told that Ethiopian women should not act like her and that her views do not reflect the realities of “normal” Ethiopian women.
‘It has changed me forever’
Lella recalls that the recent abuse she experienced has had “severe mental impacts” on her, causing considerable anxiety for both her and her family. Even day-to-day activities like travelling through a city now make her anxious.
Since then, she says she has felt a loss of loyalty to Ethiopia and its people, leaving her with a feeling that she is “exiled” with “nowhere to go”. What scares her the most, however, is that the abuse will affect her in the future in ways she does not yet know.
When asked how she coped with the online hate, Lella responded, “I left the country and just kind of shut down for a while so I could tune everyone out. I didn’t respond to messages for months”. Lella’s recollection of how she stopped posting online and “shut down” suggests she experienced the silencing effect of TFGBV that CIR has observed on a wider scale.
When she eventually began posting again, Lella recalls having an “audience” in her mind that she “needed to appease” – a potential side effect of attempting to avoid further online backlash by conforming to her viewers’ expectations.
However, upon further reflection, Lella realised that by censoring herself on her social media, she was experiencing an “extension of Ethiopian society’s policing and expectations of women”. Now, she has stopped “censoring” and has decided to be unashamedly herself.
They say I’m not Ethiopian enough, that I’m not Ethiopian woman enough. They ask me to speak Amharic, then say I’m not good enough at speaking Amharic.
‘Real-world violence’
The online abuse Lella faced reflects a society governed by rigid gender norms that diminish women’s agency both online and offline. Although threats and aggressive speech appeared less frequently in the broader dataset, they still indicate serious risks to women’s safety.
“In Ethiopia, what happens online does not stay online,” says Felicity Mulford, who manages CIR’s research into TFGBV in Ethiopia. “The doxxing of individuals during the Tigray conflict resulted in real-world violence and even killings.”
Today, the normalisation of hate speech has serious offline repercussions, from the silencing of women in leadership to real acts of violence, including acid attacks. This is not an abstract issue.
CIR found that TFGBV does not just affect women. Men and boys were also constrained by gender roles, with hate speech targeting perceived weakness or a lack of masculinity. Notably, men who publicly support gender equality were particularly targeted with gendered abuse.
Sentiment analysis of online hate speech in Ethiopia revealed that men and boys face a higher proportion of offensive and aggressive hate speech, while women and girls are more often subjected to stereotypical abuse and mockery. As such, even the nature of TFGBV differs based on societal expectations of masculinity and femininity.

Pie chart showing the sentiments of hate speech for each gender subgroup in CIR’s dataset
A content moderation crisis
Based on her own experience, Lella reflects that TFGBV is partly fuelled by social media users spreading harmful abuse without considering its real-world repercussions. She notes that TikTok accounts are often anonymised, allowing TFGBV perpetrators to avoid scrutiny. Additionally, Lella criticises authorities for failing to recognise the severity of the issue, allowing online abuse to persist.
Workshop participants similarly criticised the culture of anonymous hate speech on TikTok, with one stating “TikTok is the most toxic, violent and scary platform for women – influencers say outrageous things to take advantage of the virality it gives”. CIR’s research also found that while there was more politicised hate on YouTube, more hate speech and objectification was observed on TikTok.
Despite platform guidelines against hate speech, CIR’s research shows that enforcement remains inadequate. Workshop participants shared concerns that YouTube and TikTok would follow X and Meta in increasingly relying on automated large language models (LLMs) for content moderation. These are less effective in what are considered “low-resource” languages, such as Amharic, Afaan Oromo, and Tigrigna.
Felicity agrees that widespread online abuse in Ethiopia is partly due to LLMs’ limited ability to detect abuse in less widely spoken languages.
“Automated content moderation simply does not work effectively in Ethiopian languages, allowing unchecked misogyny and vitriolic messaging to continue to thrive,” she adds. “This is especially the case when individuals use a mixture of languages within a single comment – a common technique to evade detection.”
Combatting online hate
There is little doubt that violence against women is an immense and likely growing problem in Ethiopia, both offline and online. This is unfolding amid global setbacks in content moderation and regulatory enforcement, exacerbated by a widespread lack of trust in platforms’ abilities to tackle online abuse effectively.
“With Ethiopia’s 2026 election on the horizon, failure to act risks further polarisation, violence, and the exclusion of women from public discourse,” says Felicity. “This would undermine democratic participation. Platforms must act decisively before it’s too late.”
Lella suggests that integrating more Ethiopian languages into LLMs could help identify cases of hate speech in Ethiopia, alongside stronger education initiatives to raise awareness of its severe impacts and improve authorities’ ability to hold perpetrators accountable.
CIR calls for a similar “multi-pronged” approach to combat online hate in Ethiopia, combining educational initiatives with improved content moderation, stronger regulatory enforcement and greater platform accountability, as well as support for those impacted.
Lella’s experience highlights that TFGBV is deeply harmful, with long-term consequences for survivors, making the need for solutions more urgent than ever.
CIR’s full report ‘No safe scroll: Investigating gendered hate speech on TikTok and YouTube in Ethiopia’ can be found here. It is available in English, Amharic and Tigrigna, with an Afaan Oromo version coming soon.