How Congress is fighting the rise of non-consensual AI porn.

(The Hill) — Political momentum to control the spread of non-consensual explicit deepfakes is building as the problem of digitally altered images moves from a potential threat to a reality.

Several bipartisan bills introduced in Congress aim to curb the spread of non-consensual explicit images created using artificial intelligence (AI), an issue that has plagued not only public figures and celebrities but also everyday people, including children.


“It's really just this last year where it's forced itself on us — where we've had a big problem,” said Anna Olivarius, founding partner of McAllister Olivarius, a transatlantic law firm specializing in race and gender discrimination issues.

In January, apparently AI-generated explicit images of Taylor Swift circulated online, drawing widespread attention to the issue. The outcry prompted lawmakers and the White House to press platforms to enforce their own rules and curb the spread of such images.

While the Swift deepfake scandal has highlighted the rise of non-consensual AI porn, the problem is much broader. Even schools have been forced to confront new forms of cyberbullying and harassment as students create and spread deepfakes of their peers in a largely unregulated space.

“It's affecting a lot of everyday people,” Olivarius said.

Lawmakers have been targeted as well. Rep. Alexandria Ocasio-Cortez (D-N.Y.), one of the lawmakers spearheading a bill to fight explicit deepfakes, spoke in an April interview with Rolling Stone about being targeted by explicit deepfakes herself.

The issue is gaining support from lawmakers across the political spectrum. One bill, the DEFIANCE Act, is being led by Ocasio-Cortez and Senate Judiciary Committee Chair Dick Durbin (D-Ill.), while another, the Take It Down Act, is being led by Sens. Ted Cruz (R-Texas) and Amy Klobuchar (D-Minn.).

“The support on both ends is amazing,” Olivarius said.

“It looks like we finally have something here that lawmakers can agree on, or at least enough to actually pass it,” she said.

The two bills aim to address the issue from different angles. The DEFIANCE Act, which was introduced in March, would create a federal civil cause of action allowing victims to sue individuals who produce, distribute or solicit deepfakes.

The Take It Down Act, which was introduced last month, would make it a federal criminal offense to publish or threaten to publish non-consensual digitally altered images online. It would also create a process allowing victims to force tech platforms to remove non-consensual explicit deepfakes that depict them.

Durbin spokeswoman Emily Hempston said the two bills are interrelated and that Durbin's staff is in discussions with the offices of the other bill's sponsors.

While there is bipartisan support for the bills, getting them passed could be an uphill battle — especially in the months leading up to a contested election in which control of the White House and both chambers of Congress is at stake.

Durbin, the Senate majority whip, put the DEFIANCE Act up for a unanimous consent vote in June, but it was blocked by Sen. Cynthia Lummis (R-Wyo.), a co-sponsor of the Take It Down Act.

Lummis spokeswoman Stacey Daniels said the senator “supports the intent of the DEFIANCE Act” but is “concerned that the legislation contains overly broad language that could unintentionally threaten privacy technology and stifle innovation while failing to protect victims.”

Lummis' team is working with Durbin's to try to resolve the issues, Daniels said.

“Senator Lummis supports the Take It Down Act for its more tailored approach, which ensures that those who produce or knowingly distribute deepfake pornography are held accountable,” Daniels said in an email.

Olivarius said the civil remedies created in the DEFIANCE Act are “very powerful” because they would give individuals the power to bring an action themselves. The Take It Down Act, by contrast, is “more narrow.”

Victims' rights advocate Carrie Goldberg said the Take It Down Act is an “interesting new approach” but also highlighted potential hurdles in how it would be enforced as a criminal law.

“I'm very skeptical of laws that just put power back in the government,” Goldberg said.

“Then it becomes an issue of whether law enforcement is taking it seriously,” she said.

At the same time, Goldberg said, one goal of such a bill is to establish that the conduct is illegal, which can itself deter criminals.

She also said tech companies could argue that Section 230 of the Communications Decency Act preempts the bill's notice-and-takedown provision. Section 230 protects platforms from being held liable for content posted by third parties.

“But because it's a federal law that kind of conflicts a little bit with another federal law, it's going to be interesting to see how it plays out,” Goldberg said.

Another bill to counter non-consensual explicit deepfakes was introduced in May by Sens. Maggie Hassan (D-N.H.) and John Cornyn (R-Texas). The legislation would make it a criminal offense to share deepfake pornographic images and videos without consent. The bill would also create a private right of action for victims to file lawsuits against the parties that shared the images.

Olivarius urged Congress to act on the issue, emphasizing its impact on women in particular and its serious, even potentially fatal, effects, citing cases in which victims have died by suicide after images of them were leaked.

“Society hasn't done much to show that many people care about women,” she said. “This [support for the bills] is extraordinary. I think it's great, and hopefully we can get it on the books as soon as possible.”

Given the potential obstacles posed by Section 230, though, Goldberg said Congress should prioritize eliminating the controversial provision to help victims.

“The best way to handle many of the harms that occur on platforms is for the platforms themselves to share the cost and responsibility,” Goldberg said.

“The power needs to shift to people, and they need to be able to sue or demand that content be removed from platforms,” she added.
