A Wisconsin man has been arrested for allegedly creating AI-generated child sexual exploitation material.


A Wisconsin software engineer was arrested Monday for allegedly creating and distributing thousands of AI-generated images of child sexual exploitation material (CSAM).

Court documents describe Steven Andregue as “highly technically savvy” with a background in computer science and “decades of experience in software engineering.” Andregue, 42, is accused of sending AI-generated images of nude minors to a 15-year-old boy via Instagram DM. Andregue was put on the radar of law enforcement agencies after the National Center for Missing and Exploited Children flagged the messages, which he allegedly sent in October 2023.

According to information law enforcement obtained from Instagram, Andregue posted an Instagram Story in 2023 that “contains a realistic GenAI image of a minor wearing BDSM-themed leather clothing” and encouraged others to check out his Telegram channel. In private messages with other Instagram users, Andregue allegedly “discussed his desire to have sex with prepubescent boys” and told one Instagram user that he had “tons” of AI-generated CSAM images on his Telegram.

Andregue allegedly continued sending the photos to another Instagram user after learning that he was just 15 years old. “When the minor stated his age, the defendant did not reprimand him or question him further. Instead, he wasted no time describing how he creates sexually explicit GenAI images, and sent customized content to the child,” the charging documents claim.

When law enforcement searched Andregue's computer, they found more than 13,000 images, including “hundreds — if not thousands — of nude or semi-clothed prepubescent minors,” according to prosecutors. Andregue created the images on the text-to-image model Stable Diffusion, a product developed by Stability AI, and used “highly specific and explicit prompts” to create the images, charging documents say. Andregue also allegedly used “negative prompts” to avoid creating images depicting adults, and used third-party Stable Diffusion add-ons that “specialized in generating genitalia.”

Last month, several major tech companies, including Google, Meta, OpenAI, Microsoft, and Amazon, said they would review their AI training data for CSAM. The companies agreed to a new set of principles that include “stress testing” models to ensure they are not capable of creating CSAM. Stability AI has also signed on to the principles.

According to prosecutors, this is not the first time Andregue has drawn law enforcement attention over alleged possession of CSAM. Prosecutors claim that in 2020, someone using the internet connection at Andregue's Wisconsin home attempted to download several known CSAM files over a peer-to-peer network. Law enforcement searched his home that year, and Andregue admitted to running peer-to-peer software on his computer and frequently resetting his modem, but he was not charged.

In a brief supporting Andregue's pretrial detention, the government noted that he has worked as a software engineer for more than 20 years, and that his résumé includes a recent job at a startup where he used his “excellent technical understanding” to build AI models.

If convicted, Andregue could face up to 70 years in prison, though prosecutors say the “recommended sentence could be as high as life in prison.”

