
Russia Runs ChatGPT-Powered Propaganda Farms to Target Africa—OpenAI Report


While Moscow boasts of technological self-reliance and rejects Western systems, a new OpenAI report reveals that Kremlin-linked actors are quietly relying on American AI tools to wage influence campaigns in Africa.

Illia Kabachynskyi, Feature Writer

OpenAI has released a major new report, Disrupting Malicious Uses of Our Models, in which researchers describe how ChatGPT and other company systems have been used not for the public good, but with harmful intent. Among the cases outlined are two operations in which Russian actors used the company's models to generate large volumes of fabricated content. We provide a brief overview of each.

Operation “Fish Food”

Rybar is one of the largest information networks operating in the Kremlin's interests. The network runs a major Telegram channel and maintains a presence on other platforms.


Using ChatGPT, members of the network wrote articles, drafted posts and accompanying comments, and planned their work. Content production targeted both Russian and foreign audiences: prompts were typically submitted in Russian, while outputs were generated in multiple languages, including English and Spanish. In addition to ChatGPT, the network also used the AI video generator Sora.

The content produced as part of the operation reflected typical patterns of covert Russian information campaigns:

  • praise of Russia and its allies (including Belarus);

  • criticism of Ukraine;

  • accusations of Western interference.

In effect, the Rybar team built a content farm on ChatGPT to run disinformation campaigns on social media. The effort gained traction: one of the most popular ChatGPT-generated posts on Rybar's X account received 150,000 views.

One article accused Germany of creating an influence network in Moldova; in addition to drafting the piece, users asked ChatGPT to generate comments to accompany it.

Another request involved editing a proposal for an election interference team, reportedly deployed in Africa. The plan included both offline activities (building networks of local agents and organizing mass events) and online campaigns. Prompts also addressed:

  • an information campaign regarding the Democratic Republic of the Congo;

  • electoral processes in Burundi and Cameroon;

  • campaign scenarios in Madagascar, including ideas for inciting protests.

Operation “No Bell”

OpenAI blocked a cluster of accounts that were generating lengthy analytical articles and social media content on African geopolitics. The accounts were likely of Russian origin.

Its primary activity consisted of creating social media posts and long-form analytical materials on geopolitics in sub-Saharan African countries. The user primarily submitted prompts in English, though instructions were sometimes entered in Russian. Some of the content was published by Facebook pages posing as news outlets in South Africa, Ghana, Kenya, and Angola.

Typical narratives included:

  • praise of Russia;

  • criticism of Ukraine, the United Kingdom, and the United States;

  • personalized attacks on Ukrainian President Volodymyr Zelenskyy and US President Donald Trump.

Beyond global geopolitics, the content also addressed local issues:

  • allegations that German arms manufacturer Rheinmetall used its South African subsidiary to circumvent export controls;

  • accusations that the British organization Crisis Action organized protests in South Africa;

  • separate texts concerning court cases involving British soldiers in Kenya.

At the same time, articles were published defending Russia’s role in Africa. For example, one piece praised Russia’s presence in the Central African Republic. Another—along with accompanying Facebook posts—accused Western leaders of running a disinformation campaign against South Africa. A separate article promoted the idea of awarding the Nobel Peace Prize to the President of Angola, reportedly to “irritate” the team of US President Donald Trump.

These articles were often published under the bylines of individuals who did not exist; the OpenAI team searched for each listed author and found no evidence they were real people. Users also asked ChatGPT to write in the style of a "real, living journalist" and to remove em dashes, which are often perceived as a telltale sign of AI-generated text.

Destabilization worldwide

Each of these examples directly demonstrates that Russia is leveraging every available tool to destabilize regions, using new technologies to create a negative information environment. The Kremlin seeks to glorify itself after launching Europe’s largest war since World War II, while simultaneously discrediting leaders of the free world. Efforts to preserve its influence in Africa are a particularly vivid example.

This underscores a broader reality: although active combat is confined to Ukrainian territory, Moscow is running operations across the globe. Hybrid warfare has long been underway in Europe, and the information domain is actively engaged in Africa. There is little indication that these efforts will stop.

FAQ

What is OpenAI?

OpenAI is a US-based artificial intelligence research company behind ChatGPT and other powerful generative tools that can produce text, images, audio, and video.

What are AI content farms?

AI content farms are coordinated networks that use artificial intelligence to mass-produce articles, posts, and digital media at scale. While they can serve legitimate marketing purposes, they are also used to flood the internet with propaganda and disinformation.


Support UNITED24 Media Team

Your donation powers frontline reporting from Ukraine.
United, we tell the war as it is.