Ahead of the US election, Russian disinformation videos take aim at Biden.

Last month, a video began circulating on social media telling the story of an internet troll farm in Kiev targeting the US election.

Speaking in English with a Slavic accent, “Olesya” gives a first-person account of how she and her colleagues initially worked in support of Ukrainian President Volodymyr Zelensky. Then, she says, after a visit from mysterious Americans who were “probably CIA,” the group began sending messages to American audiences in support of President Biden.

“We were told that our new target is the United States of America, specifically the upcoming elections,” the woman says in the video. “Long story short, we were told to do everything we could to prevent Donald Trump from winning the election.”

The video is fake, part of an effort to cloud the political debate ahead of the US election.

The video is consistent with Russian disinformation operations, U.S. officials say, as Russia-linked internet trolls appear to be honing their tactics, recycling techniques from 2016 and 2020 with new twists.

While there has been much debate over the role artificial intelligence could play in fooling voters this year, current and former officials said such videos are among the most immediate threats.

Microsoft said the video featuring "Olesya" likely came from a group it calls Storm-1516, which is now focused on making videos it hopes will go viral across the United States.

The group likely includes veterans of the Internet Research Agency, a Kremlin-linked troll farm that tried to influence the 2016 election. The agency was run by Yevgeny Prigozhin, the founder of the Wagner Group, who led a failed mutiny against the Kremlin and was later killed in a plane crash that U.S. and allied officials believe was arranged by Russian intelligence agencies.

Microsoft said the group also includes people associated with Valery Korovin, who heads an obscure Moscow-based think tank called the Center for Geopolitical Expertise, a conservative organization linked to Alexander Dugin, an ultranationalist writer facing US sanctions for his role in recruiting fighters for the war.

Russian operatives have flocked to videos, many of which falsely claim to be made by independent journalists or whistleblowers. Videos, as opposed to blogs or social media posts, are more likely to spread beyond the conspiratorial corners of the American internet and become part of the mainstream conversation.

On Wednesday afternoon, senior officials, including Director of National Intelligence Avril Haines, are scheduled to testify before the Senate on election threats from Russia, China, Iran and other countries. Senator Mark Warner, Democrat of Virginia and chairman of the Senate Intelligence Committee, warned that Americans seem increasingly drawn to conspiracy theories that bolster their own views, making them, in turn, more susceptible to foreign influence campaigns.

Clint Watts, general manager of Microsoft's Threat Analysis Center, said that bots pushing written disinformation are largely a waste of time; in 2024, it is disinformation video that has the best chance of reaching an American audience.

The CIA video, Mr. Watts said, reflected a classic Russian tactic: accusing your adversary of doing exactly what you are doing. "When they say Zelensky has a troll farm in Ukraine going after the US election, they are telling you what they themselves are doing," Mr Watts said.

CIA spokesman Walter Truson said the agency was not involved in the activities described in the video.

“This claim is patently false and precisely the kind of misinformation that the intelligence community has long warned about,” Mr Truson said. “The CIA is a foreign-focused organization that takes very seriously our responsibility to remain uninvolved in American politics and elections.”

Several groups in Russia spread disinformation aimed at the United States. In addition to the videos, researchers and government officials say, Russia has created a handful of fake American local news sites and is using them to push Kremlin propaganda, including stories on crime, politics and culture.

Gen. Paul M. Nakasone, who retired from the military earlier this year and is a former director of the National Security Agency, said the best defense against Russian disinformation remained the same: identifying and exposing the propaganda. He said the United States needs to expand its information sharing, domestically and around the world, so people can identify and discount disinformation being spread by Moscow.

"The great antidote to all of this is to shine a light on it," said General Nakasone, who was named founding director of Vanderbilt University's new Institute for National Defense and Global Security last week. "If they are trying to influence or interfere in our elections, we should make it as difficult as possible for them."

Some mainstream Republicans have already warned fellow lawmakers to be wary of repeating claims that stemmed from Russian disinformation or propaganda.

"We see direct communications from Russia attempting to mask anti-Ukraine and pro-Russian messages, some of which we hear repeated on the floor of the House," Representative Michael R. Turner, Republican of Ohio and chairman of the House Intelligence Committee, told CNN's "State of the Union" on April 7.

Russia's information warriors have pushed fake videos to spread lies about Ukraine, aimed at discrediting it or framing it as corrupt. Republican politicians opposed to sending more aid to Ukraine have repeated unsubstantiated allegations that Mr. Zelensky had tried to buy a yacht through associates, false information that first appeared in videos posted to YouTube and other social media sites.

Most of the videos produced by Storm-1516 fail to gain traction, but others have come close. One video that appeared on a Russian Telegram channel showed purported Ukrainian soldiers burning an effigy of Mr Trump and blaming him for delays in aid deliveries.

The video was featured on Alex Jones' right-wing conspiracy site, InfoWars, and other English-language outlets. But it was quickly discounted: the supposed Ukrainian soldiers reportedly had Russian accents and were wearing masks.

“This campaign serves to further some of Russia's key objectives, particularly portraying Ukraine as a corrupt, rogue state that cannot be trusted with Western aid,” Mr Watts said.

Since last August, Microsoft has identified at least 30 videos produced by Storm-1516. The first ones targeted Ukraine, but others have tried to influence American politics by pushing messages to right-wing audiences that Mr. Biden is benefiting from aid to Ukraine.

Intelligence officials, lawmakers and security firms have warned about the use of artificial intelligence by China, Russia and other countries to spread disinformation. But so far, Russian groups like Storm-1516 have mostly avoided using AI tools, according to security firms.

“Many AI campaigns are easy to detect or unravel,” said Brian Murphy, general manager of national security at Logically, which tracks disinformation. “AI is improving, but it's not yet at the stage this year where it will be used at the scale and quality that some predict. Maybe in a year or so.”

Basic videos, like the CIA troll farm video or the yacht video, which aim to give false information a seemingly authentic narrator, appear to have a better shot at working.

In 2016, Russia-controlled propagandists could push fake news articles or social media posts and, in some cases, gain influence. But those older techniques no longer work as well.

"Nobody would even notice that today," Mr Watts said. "You have to have it in video format to really reach an American audience today, which wasn't technologically possible 10 years ago."
