Law enforcement struggles to prosecute AI-generated child pornography, asks Congress to act


Law enforcement agencies are struggling to prosecute abusive, sexually explicit images of minors created by artificial intelligence (AI), Rep. Anna Paulina Luna (R-Fla.) told fellow members of a House Oversight subcommittee at a hearing Tuesday.

Laws against child sexual abuse material (CSAM) require “a real picture, a real photograph, of a child, to be prosecuted,” Carl Szabo, vice president of the nonprofit NetChoice, told lawmakers. With generative AI, ordinary photos of minors are being transformed into fictional yet vivid explicit content.

“Bad actors are taking images of minors, using AI to modify them into sexually compromising positions, and then escaping the letter of the law, not the intent of the law but the letter,” Szabo said.

Attorneys general from all 50 states wrote a bipartisan letter urging Congress to “study the means and methods [AI] is used to exploit children” and to “propose solutions to prevent and combat such exploitation in order to protect America’s children.”

The letter calls on Congress to “explicitly cover AI-generated CSAM” to enable prosecution.

“That’s actually something that the FBI, in talking to them about cybercrimes, specifically asked us to look into, because they currently have trouble prosecuting these serious, sick people. Because, technically, a child doesn’t get hurt in the process, since it’s a created image,” Luna said.

The Hill has reached out to the FBI for comment.

Although AI-generated CSAM represents a small fraction of the abusive content currently circulating online, the ease-of-use, versatility, and highly realistic nature of AI programs mean their use for CSAM will increase, said John Sheehan, vice president of the Exploited Children Division at the National Center for Missing and Exploited Children (NCMEC).

Lawmakers and witnesses frequently cited research from the Stanford Internet Observatory, which found that generative AI is enabling the creation of more CSAM, and that training data for publicly available AI models is tainted with CSAM.

NCMEC provides “the nation’s central reporting system for online child exploitation,” called the CyberTipline. According to Sheehan, despite the “explosion” in the number of apps or services available, only five generative AI companies have submitted reports to the tipline to date.

“State and local law enforcement agencies are having to deal with these issues, because technology companies are not taking steps on the front end to make these tools secure by design,” he said.

Sheehan also noted that “nudifying” or “declothing” AI applications and web services were particularly significant in the generation of CSAM.

“None of the platforms offering ‘nudify’ or ‘unclothe’ apps have registered to report to NCMEC’s CyberTipline, none have spoken with NCMEC about preventing child sexual exploitation, and none have submitted a report to NCMEC’s CyberTipline,” he said.

“The sheer volume of cybertips often prevents law enforcement from conducting proactive investigations in the first place that would effectively target the most serious criminals,” said Rep. Nick Langworthy (R-N.Y.).

“In just three months, from Nov. 1, 2022, to Feb. 1, 2023, more than 99,000 IP addresses across the United States distributed known CSAM, and only 782 were investigated. Law enforcement agencies, through no fault of their own, simply don’t have the capacity to investigate and prosecute a large number of these cases,” Langworthy said, citing previous testimony by John Pizzuro, CEO of the nonprofit Raven, during a February 2023 Senate Judiciary hearing on protecting children online.

Copyright 2024 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.

