
US, allies take down Kremlin-backed AI bot farm

The bot farm allegedly originated with a deputy editor at RT, a Russian state-backed news agency, and was used to spread disinformation on the X social media platform.

The FBI dismantled two websites and nearly 1,000 accounts on the X social media platform that Kremlin spin doctors used to run an AI-powered campaign spreading disinformation in the U.S. and abroad.

The disinformation operation was carried out by a bot farm, a network of automated accounts, or “bots,” controlled to perform tasks across digital platforms, including spreading propaganda, amplifying messages or engaging in coordinated campaigns to influence public opinion. 

The FBI and international partners targeted a total of 968 accounts, seizing a portion of them, while the X platform, formerly known as Twitter, voluntarily suspended the remaining trove of sham users identified in court documents that authorized their seizure, the Justice Department said in a prepared statement. It did not specify how many accounts were seized by the U.S., nor how many were taken down by X, which did not immediately respond to a request for comment.

“Russia intended to use this bot farm to disseminate AI-generated foreign disinformation, scaling their work with the assistance of AI to undermine our partners in Ukraine and influence geopolitical narratives favorable to the Russian government,” FBI Director Christopher Wray said in a statement.

The seized websites, mlrtr.com and otanmail.com, display takedown notices bearing signatures from American, Canadian and Dutch intelligence authorities. Hundreds of fake accounts affiliated with both domains now appear as suspended when searched for on X.

A separate affidavit indicates that the bot operation began in 2022 with an unnamed “Individual A” who worked as deputy editor-in-chief of the state-run Russian news organization RT. At the time, RT leadership was looking for new ways to disseminate information beyond its traditional TV broadcasts. Individual A spearheaded the creation of software to run the social media bot farm, which would generate fake online personas to distribute information widely, DOJ said.

RT's website names Anna Belkina as its deputy editor-in-chief and head of communications, marketing and strategic development throughout the period described in the affidavit.

The bot farm’s development was carried out by an unnamed “Individual B” and other team members, who concealed their identities and location in Russia while working to acquire the necessary infrastructure for the bot operation in April 2022, just two months after Russia began its war in Ukraine.

Early last year, an unnamed officer in Russia’s federal security service, or FSB, created, with Kremlin approval, a private intelligence network composed of Belkina, the bot farm developer and other RT employees, DOJ says.

The entity then leveraged the bot apparatus to push narratives about Russia’s actions in Ukraine and other slanted geopolitical talking points meant to deflect negative U.S. pressure on Russia. One example was a fake U.S. resident of a city identified only as “Gresham” who claimed that the death tolls of foreign fighters embedded with Ukrainian forces are lower than publicly available estimates.

"I’m more than happy to tend to my farm (dacha) - made up mostly of tomatoes and strawberries, but sadly without any help from the FSB," Belkina told Nextgov/FCW in a statement shared by RT's press office.

The takedown marks a major U.S. push to clamp down on Russian information operations that aim to sow doubt about domestic and international politics on social media. An investigation into the operation is still ongoing, DOJ said. The case underscores yet another attempt by the Kremlin’s cadre of information operations specialists, whom intelligence officials have persistently warned about as elections take place around the world in 2024.

The U.S. has frequently warned of Russia’s attempts to use disinformation campaigns. Moscow views the coming presidential election as an opportunity to deploy influence operations, a recent intelligence community threat assessment says. Russia is contemplating how election outcomes will affect U.S. support for Ukraine and “probably will attempt to affect the elections in ways that best support its interests and goals,” the Office of the Director of National Intelligence said in that assessment.

The move could serve as another warning to Moscow to refrain from deploying similar tactics ahead of November, after the Treasury Department in March sanctioned Kremlin-backed firms for operating a collection of fake news sites that sought to tip the outcomes of past European elections. Russia’s embassy in Washington, D.C. did not return a request for comment.

The RT news organization in early 2022 came under fire as Western nations scrutinized its ties to the Kremlin.

Officials and researchers fear that consumer-facing AI tools, or similar offerings available on the dark web, will supercharge hackers’ abilities to craft realistic-sounding campaigns that seek to instill distrust in domestic democratic processes, including the upcoming election. Last October, a U.S. intelligence report was sent to more than 100 countries warning that Russia is using spies, social media and state-run media accounts for disinformation purposes.