A small army fighting the deluge of deepfakes in India's elections

In the midst of a high-stakes election held during a scorching heat wave, a blizzard of confusing deepfakes is sweeping across India. The variety seems endless: AI-powered mimicry, ventriloquism and deceptive editing effects. Some of it is crude, some is clearly meant as a joke, and some is so obviously fake that it could never be mistaken for the real thing.

The overall effect is confusing, adding to a social media landscape already swamped with misinformation. The sheer volume of online detritus is far too great for India's Election Commission, or anyone else, to keep track of.

A diverse group of vigilant fact-checking organizations has emerged to fill the breach. While the wheels of the law grind slowly and unevenly, the task of tracing deepfakes has been taken up by hundreds of India-based government workers and private fact-checking groups.

“We have to be prepared,” said Surya Sen, a forestry officer in Karnataka state who was reassigned during the election to lead a 70-person team hunting for misleading AI-generated content. “Social media is the battleground this year.” When Mr. Sen's team finds content it believes is illegal, it asks social media platforms to take it down, publicizes it as fraudulent or asks for criminal charges to be filed.

Celebrities, too, have become familiar fodder for politically pointed fakes, including Ranveer Singh, a star of Hindi cinema.

During a videotaped interview with an Indian news agency on the Ganges River in Varanasi, Mr. Singh praised Prime Minister Narendra Modi for celebrating “our rich cultural heritage.” But that is not what viewers heard when an altered version of the video, with a voice that sounded like Mr. Singh's and near-perfect lip sync, took social media by storm.

“We call this a lip-sync deepfake,” said Pamposh Raina, who heads the Deepfake Analysis Unit, a consortium of Indian media houses that has opened a tip line on WhatsApp where people can send suspicious videos and audio to be verified. She said Mr. Singh's video was a typical example of authentic footage edited with an AI-cloned voice. The actor filed a complaint with the Cyber Crime Unit of the Mumbai Police.

No party has a monopoly on misleading content in this election. Another manipulated clip opened with authentic footage showing Mr. Modi's most prominent rival, Rahul Gandhi, taking part in the mundane ritual of being sworn in as a candidate. It was then layered with an AI-generated audio track.

In the fabricated audio, Mr. Gandhi appears to resign from his party; he did not actually do so. The clip also includes a personal dig, with the fake voice having Mr. Gandhi say he “can no longer pretend to be a Hindu.” The ruling Bharatiya Janata Party portrays itself as a defender of the Hindu faith, and its opponents as traitors or impostors.

Sometimes, political deepfakes veer into the supernatural. Dead politicians have a way of coming back to life through uncanny, AI-generated likenesses that endorse the real-life campaigns of their descendants.

In a video that surfaced days before voting began in April, a resurrected H. Vasanth Kumar, who died of Covid-19 in 2020, spoke indirectly of his own death and blessed his son Vijay, who is running for his father's former parliamentary seat in the southern state of Tamil Nadu. The appearance followed a precedent set by two other late titans of Tamil politics, Muthuvel Karunanidhi and Jayalalitha Jayaram.

Mr. Modi's government has enacted laws that are supposed to protect Indians from deepfakes and other kinds of misleading content. A set of “IT Rules” from 2021 makes online platforms, unlike in the United States, liable for all types of objectionable content, including impersonations intended to cause insult. The Internet Freedom Foundation, an Indian digital rights group that has argued these powers are far too broad, is tracking 17 legal challenges to the rules.

But the Prime Minister himself seems to be embracing some AI-generated content. A pair of videos made with AI tools show two of India's biggest politicians, Mr. Modi and Mamata Banerjee, one of his staunchest opponents, imitating a viral YouTube video of the American rapper Lil Yachty doing the “hardest walkout ever.”

Sharing the video on X, Mr. Modi said such creativity was a “joy.” Election officials like Mr. Sen in Karnataka deemed it political satire: “A Modi rock star is fine. It's not a violation. People know it's fake.”

But police in West Bengal, where Ms. Banerjee is chief minister, issued notices to some people for posting “offensive, malicious and inflammatory” content.

When hunting for deepfakes, Mr. Sen said, his team in Karnataka, which works for a state government controlled by the opposition, scrolls carefully through social media platforms like Instagram and X, searching for keywords and frequently updated accounts of popular influencers.

The Deepfake Analysis Unit has 12 media fact-checking partners, including a pair close to Mr. Modi's national government. Ms. Raina said her unit also works with outside forensic labs, including one at the University of California, Berkeley. They use AI-detection software such as TrueMedia, which scans media files and determines whether they should be trusted.

Some tech-savvy engineers are refining AI forensic software to identify which parts of a video were manipulated, down to the level of individual pixels.
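The idea of localizing manipulation at the pixel level can be illustrated with a much older and simpler technique called error-level analysis, which is not the software described in this article but a toy stand-in for the concept: recompressing an image as a JPEG and mapping where the per-pixel error is largest, since spliced-in regions often recompress differently from their surroundings. The sketch below, using the Pillow library, is a minimal illustration under those assumptions, not a real forensic tool.

```python
# Toy error-level analysis (ELA): a classic, simple illustration of
# pixel-level forgery localization. Real deepfake forensics is far
# more sophisticated; this only sketches the concept.
import io
from PIL import Image, ImageChops

def error_level_map(image: Image.Image, quality: int = 90) -> Image.Image:
    """Return a grayscale map of per-pixel JPEG recompression error.

    Bright areas recompress poorly, which can hint at spliced content.
    """
    original = image.convert("RGB")
    # Re-save the image as a JPEG at the given quality.
    buf = io.BytesIO()
    original.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    recompressed = Image.open(buf)
    # Per-pixel absolute difference between original and recompressed.
    diff = ImageChops.difference(original, recompressed)
    return diff.convert("L")

# Demo on a small synthetic image with a sharp "pasted" square.
img = Image.new("RGB", (64, 64), (120, 120, 120))
for x in range(16, 32):
    for y in range(16, 32):
        img.putpixel((x, y), (255, 0, 0))
ela = error_level_map(img)
print(ela.size)  # (64, 64)
```

In practice an analyst would inspect the resulting map visually; pixel-accurate attribution of AI edits, as described above, requires trained detection models rather than compression heuristics.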

Pratik Sinha, founder of Alt News, one of India's most respected independent fact-checking sites, said the potential of deepfakes has yet to be fully exploited. Someday, videos may show politicians not only saying things they didn't say, but also doing things they didn't do, he said.

Dr. Hany Farid has taught digital forensics at Berkeley for over 25 years and collaborates with the Deepfake Analysis Unit on some cases. While “we're catching the bad deepfakes,” he said, if more sophisticated fakes enter the field, they may go undetected.

In India, as elsewhere, an arms race is underway between the deepfakers and the fact-checkers. “This is the first year, I'd say, that we've really started to see the impact of AI in interesting and sinister ways,” Dr. Farid said.
