AI in Ghanaian Journalism: Navigating opportunity and uncertainty

By: CLEMENT EDWARD KUMSAH, AI enthusiast and journalist with The Fourth Estate

This opinion piece was inspired by a recent invitation I received from the Ghana Broadcasting Corporation (GBC) to speak on their flagship 8:00 PM news analysis program, “Behind the News.” The discussion focused on the rise of Artificial Intelligence (AI) in journalism: its opportunities, its risks, and its implications for the media landscape in Ghana. This write-up is my attempt to crystallize my thoughts on that important conversation.

In 2015, I was assigned to cover a press conference at the headquarters of one of Ghana’s biggest political parties in Accra. TV and radio stations were broadcasting it live, leaving little room for real-time fact-checking. Meanwhile, print reporters, myself included, scrambled to jot down quotes that would headline the next day’s newspapers, only after a round of editorial back-and-forth.

Fast forward to today, and something has shifted. The chaos is still there, but it now shares space with algorithm-driven tools that recommend headlines, bots that transcribe interviews, and AI models that can summarize court documents and press releases in seconds. Journalism, as we know it, is changing.

In Ghana, where press freedom walks a tightrope and public trust is easily swayed by viral misinformation, the rise of artificial intelligence in the media space is a defining moment. We stand at the crossroads of opportunity and uncertainty.

As a journalist who has worked in both radio and online newsrooms, I see AI not just as a tool, but as a test of our ethics, our adaptability, and our courage to remain human in a world that’s becoming more machine-led.

Now let’s take the five questions sent to me by GBC one at a time.

Given the rapid rise of AI in the global media space, what risks do you foresee for media freedom and public trust in Ghana?

AI in journalism feels like a breakthrough. The idea that a machine can transcribe an interview in seconds or generate story angles based on trending conversations sounds like a dream. But beneath the convenience lies a real concern: that truth, freedom, and trust, the very pillars of journalism, could be quietly eroded if we’re not careful.

Ghana’s media ecosystem is very active, but also vulnerable. Pressure from politicians, financial over-dependence on advertisers, and the constant battle with misinformation already make it difficult for journalists to operate freely. Now, with AI entering newsrooms, there’s a new risk: that the tools designed to help us could be used to control us instead. In fact, no single newsroom can tackle mis/disinformation alone, which is why every newsroom now needs a fact-checking desk.

Media freedom depends on journalists having control over their editorial processes: deciding what stories to tell, how to tell them, and when to publish. But what happens when AI tools begin influencing those decisions? For instance, if newsrooms start relying on AI-powered analytics to determine which stories are “worth publishing” based only on engagement metrics, then stories that serve the public interest but don’t trend online may be ignored.

It gets scarier when we look at surveillance. AI tools can now scan phone records, trace online behavior, and monitor journalists’ activities without them even knowing. Such tools, in the hands of governments or private actors with the wrong intentions, can be used to track down reporters, unmask sources, and silence whistleblowers. And in a country like Ghana, where some journalists have been followed, harassed, or, in tragic cases like that of Ahmed Suale, murdered, the idea that AI could make journalists even more vulnerable is alarming.

Now let’s talk about public trust in the media, which is already shaky. Misinformation is no longer spread only by bloggers or political actors. It can now be mass-produced by AI. Generative AI tools can fabricate press releases, fake audio recordings, or produce realistic videos, making it difficult for ordinary people to know what’s true. During the 2024 elections, the Ghana Fact-Checking Coalition found that a majority of false claims circulating online were in video and image form, many of them manipulated using AI. When people consume these in good faith and later find out they’ve been misled, it breeds a deep skepticism in which even the most credible journalists and newsrooms are viewed with suspicion.

This is where the danger lies. If citizens can’t tell the difference between a journalist’s story and an AI-generated lie, trust in the media collapses. And once that happens, the backbone of our democracy starts to shake.

As someone passionate about both journalism and the potential of AI, I don’t believe we should fear the technology, but neither should we adopt it blindly. I think we need to build Ghanaian AI tools trained on our data, shaped by our values, and used under strong ethical guidelines. Journalists must stay in control, not the algorithms. And the public must always know who is telling the story and why.

From all this, it is clear that the task ahead, confronting and countering the malign effects of AI on political efficacy, is necessary, though daunting.

How can the Ghanaian media industry leverage AI to improve investigative journalism, content diversity, and audience engagement?

One of the biggest myths about AI in journalism is that it’s here to replace us. But I believe its true power lies in helping us do what we already do, only faster, deeper, and smarter.

Most Ghanaian newsroom budgets are shrinking, and reporters are stretched thin, especially in houses that run radio, TV, and online portals and expect every story on every platform. This is where AI can be the behind-the-scenes assistant we’ve always needed but never had.

Take investigative journalism, which is ordinarily tedious and time-consuming: hundreds of documents to review, months of calls, gigabytes of interviews to transcribe. With the right tools, AI can help. Imagine a system that scans procurement records and flags unusual contracts, recurring names, or shell companies with suspicious links.
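To make the idea concrete, here is a minimal, purely illustrative sketch of what such a flagging step might look like, assuming a hypothetical procurement.csv file with supplier, contract_value, and award_date columns; a real newsroom tool would need far richer data, and every flag would still require human verification.

```python
# Illustrative sketch only, not a production tool.
# Assumes a hypothetical procurement.csv with columns: supplier, contract_value, award_date.
import pandas as pd

records = pd.read_csv("procurement.csv", parse_dates=["award_date"])

# Lead 1: suppliers that win an unusually large number of contracts.
counts = records["supplier"].value_counts()
frequent_suppliers = counts[counts >= 10].index.tolist()

# Lead 2: contracts far above the typical value (here, more than 3x the median).
median_value = records["contract_value"].median()
outsized = records[records["contract_value"] > 3 * median_value]

# Lead 3: repeat awards to the same supplier within a short window,
# a pattern sometimes associated with split or recycled contracts.
records = records.sort_values(["supplier", "award_date"])
gap_days = records.groupby("supplier")["award_date"].diff().dt.days
rapid_repeats = records[gap_days <= 7]

print("Suppliers with 10+ contracts:", frequent_suppliers)
print("Contracts above 3x the median value:", len(outsized))
print("Repeat awards within 7 days of the previous one:", len(rapid_repeats))
```

The thresholds here are arbitrary; the point is that the machine narrows thousands of records down to a handful of leads a reporter can then interrogate.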

At The Fourth Estate, where I work, we’ve begun experimenting with tools that transcribe interviews instantly and organize research notes using AI. What once took hours or days now takes minutes, freeing us to focus on what machines still can’t do: asking the right questions, connecting the human dots, and telling stories that move people.

AI can also help diversify our content. Ghana’s mainstream media largely focus on politics and sports. But AI tools that analyze social media trends and search patterns can reveal serious underreported topics such as youth mental health and rural climate challenges. This data-informed approach could push newsrooms to broaden their coverage, include more voices, and challenge assumptions about what is really “newsworthy,” all while the newsroom keeps its role as gatekeeper.

And on the issue of audience engagement, AI can personalize news delivery, ensuring a 22-year-old student in Cape Coast sees different stories from a cocoa farmer in the Ashanti Region, without compromising editorial integrity. Visual AI tools can help generate infographics from hard data, making stories more interactive and digestible.

But for all of this to work in Ghana, we need intentionality. We can’t just import foreign AI tools and expect them to understand our dialects and our contexts. We need to train AI on Ghanaian datasets, and build models that recognize local languages like Twi, Ewe, and Dagbani.

Above all, we need to empower journalists, especially young ones, to embrace AI not as a threat but as a partner. Imagine journalism schools in Ghana teaching students not just how to conduct interviews, but how to fact-check AI output or spot algorithmic bias. That’s the future we all have to believe in.

In a media ecosystem dominated by social media algorithms and AI tools, how can traditional journalism maintain relevance and authority?

With the rise of viral TikToks and AI-generated content, traditional journalism risks being drowned in a sea of speed and spectacle.

As in many parts of the world, including Ghana, the news cycle now moves at the speed of likes and shares. A single tweet can set the national agenda. A misleading WhatsApp voice note can ignite panic. And often, long before traditional newsrooms publish their first verified story, the public has already made up its mind.

So how do we stay relevant? In one word: trust.

Trust is the new currency of journalism. Social media may be loud, but it is volatile. Traditional journalism must step in, not to chase trends but to contextualize and verify them.

However, trust must be earned. Newsrooms must treat digital platforms as core spaces, not as dumping grounds for headlines; the dump-and-move-on approach no longer works. A 19-year-old university student scrolling through Instagram is not looking for a 1,500-word press release. She’s looking for something that speaks to her with visuals, emotion, and clarity.

And finally, newsroom collaboration. Let me repeat: no single newsroom can tackle disinformation alone. In the AI era, we need coalitions, shared tools, and joint investigations more than ever. Fact-Check Ghana, Dubawa, and FactSpace West Africa collaborated before and during Ghana’s 2024 election, and that is commendable work, but their efforts must be amplified, mainstreamed, and embedded in every newsroom.

Indeed, the game has changed. But relevance is not about being loud or fast. It’s about being trusted. And in this algorithmic age, trust is journalism’s greatest asset.

What are some of the ethical breaches involving AI, and what role should media houses play in adopting strategies to guard against deepfakes, misinformation, and algorithmic manipulation of news?

AI makes it easier to manipulate content. From realistic fake videos to stories that never happened, the ethical threats are multiplying.

Tools like GPT-3, Claude, and DeepSeek can write convincing articles, mimic a journalist’s tone, or even fabricate interviews that sound entirely plausible, while image and video generators can produce almost perfect fakes.

The Ghana Fact-Checking Coalition’s post-election report reveals that videos (40.15%) and images (29.2%) were the most common formats of misinformation. Many were created using AI or recycled to appear new. Despite swift debunking efforts, the damage often lingers. In a click-first, correct-later culture, truth is always playing catch-up.

Media houses must respond with rigor:

  1. Verification must extend beyond text to video, images, and even AI-generated commentary.
  2. Training must evolve. Journalists need tools to detect deepfakes and spot algorithmic manipulations.
  3. Policies must be developed. Ethical guidelines for using AI in newsrooms must be clearly defined.

Moreover, we must empower the public. Media literacy campaigns can help audiences question, verify, and critically engage with content. Initiatives like those of the Media Foundation for West Africa, which train journalists to spot mis/disinformation, are steps in the right direction.

In short, technology alone won’t save us. Ethics, transparency, and education will.

What message do you have for media practitioners and persons who use AI to aid in their work?

To every journalist, editor, and media manager, AI is not the future of journalism. Ethical, courageous humans using AI are.

So to the young journalist in Kumasi using AI to transcribe faster: keep going, but always fact-check the machine. To the editor in Accra using AI to track trends: use the data, but let your human instincts guide the story.

Final Reflections

As of July 7, 2025, ChatGPT incorrectly identified Ghana’s Minister for Information as Hon. Fatimatu Abubakar. Errors like this underline the urgent need for journalists to verify AI-generated information before relying on it.

Meanwhile, China’s DeepSeek AI refuses to answer sensitive questions about President Xi Jinping or human rights in China. This shows how countries are curating datasets to shape their narratives.

The machines may be learning fast. But our judgment, empathy, and courage remain the most powerful newsroom tools in an AI era.

 
