As the world marks the 2025 International Day to End Impunity for Crimes against Journalists (IDEI), the Media Foundation for West Africa (MFWA) is calling for urgent action to confront a growing human rights crisis: the weaponisation of artificial intelligence (AI) to perpetrate gender-based violence (GBV) against women journalists.
This year’s UNESCO theme, “Chat GBV: Raising Awareness on AI-facilitated Gender-Based Violence against Women Journalists,” sheds light on how emerging technologies are being misused to silence, discredit, and endanger women in the media. Across West Africa, digital tools once celebrated for connecting people are now being deployed to harass, intimidate, and erase women’s voices from public discourse.
The rise of AI-generated deepfakes, image manipulation, impersonation, and automated hate campaigns has opened a new front in the battle for gender equality and press freedom. These technologies are being exploited to reinforce stereotypes, humiliate women journalists, and punish them for speaking truth to power.
The weaponisation of AI amplifies gender bias and creates new layers of vulnerability for women whose work already attracts scrutiny and hostility. Such abuse violates women’s rights to safety, privacy, dignity, and equality. These are rights guaranteed under the Universal Declaration of Human Rights, the Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW), and the Maputo Protocol.
In many cases, women journalists are targeted through doctored sexual images, fake quotes, or online impersonation that damages their credibility and personal lives. These attacks push many women into self-censorship, force them off digital platforms, or drive them out of journalism altogether, weakening democratic debate and media diversity.
Despite the growing scale of AI-facilitated gender-based violence, most countries in West Africa lack specific laws or mechanisms to address it. Cybercrime and media laws rarely cover AI-generated content or gendered online abuse. Even when victims report such violations, impunity remains the norm. Investigations are scarce, perpetrators are seldom prosecuted, and survivors receive little legal or psychosocial support.
This failure to protect women journalists emboldens abusers and deepens inequality. It sends a chilling message that digital violence is consequence-free and that women’s rights end where the internet begins.
The MFWA has documented violations against women journalists across Burkina Faso, Mali, Niger, Guinea, Nigeria, and Ghana. Our recent country-level studies on the safety of women journalists reveal patterns of harassment, sexual exploitation, and online abuse, with AI-powered manipulation likely to worsen the situation.
On this year’s IDEI, MFWA urges governments, technology companies, media institutions, and human rights bodies to treat AI-facilitated GBV as a grave human rights violation and act decisively to end impunity:
- Governments must review and strengthen digital safety and gender-protection laws to cover AI-generated abuse and ensure prompt investigation and prosecution of offenders.
- Technology companies must improve content moderation, swiftly remove AI-generated sexual or defamatory material, and establish survivor-friendly redress mechanisms.
- Media organisations must adopt and enforce anti-harassment and digital-safety policies, provide confidential reporting systems, and offer psychosocial and legal support to survivors.
- Regional bodies such as ECOWAS, the African Union, and the African Commission on Human and Peoples’ Rights must integrate AI-facilitated GBV into press freedom, digital rights, and gender equality frameworks.
The Foundation continues to provide legal assistance and digital safety training for affected journalists while engaging policymakers and regional institutions to strengthen protection frameworks.


