Kids and teens face new AI-generated nude photos of themselves, too
For the first time and surely not the last, a Wisconsin man was arrested for generating child pornography using the Stable Diffusion AI image generator.
Court filings describe 42-year-old software engineer Steven Anderegg as “extremely technologically savvy,” with a background in computer science and “decades of experience in software engineering.”
Anderegg is accused of sending AI-generated images of naked minors to a 15-year-old boy via Instagram DMs. The National Center for Missing & Exploited Children flagged the messages, which he allegedly sent in October 2023, and helped Wisconsin law enforcement begin an investigation.
AI represents a new challenge in the fight against child sexual abuse material (CSAM)
Anderegg allegedly produced and shared thousands of AI-generated CSAM images. We already know thousands of such photos and videos circulate online through peer-to-peer (P2P) networks.
P2P networks work much like the song-sharing app Napster did years ago: no central server hosts the files, and there is no website URL people visit to find this material. The material simply passes from phone to phone and laptop to laptop, with no commercial service in between, like information whispered from person to person without ever being written down. That makes these networks hard to shut down, because the files, the literal bits of data that make up the photos or videos, exist everywhere and nowhere at once.
But generating images with AI models poses new challenges, including the ability to produce immense amounts of CSAM within minutes. AI generators could plausibly flood the darkest corners of the Internet with more CSAM in the next six months than has been produced in the past 30 years.
About a year ago, students at a Westfield, New Jersey high school used AI to create fake nude photos of fellow students, mostly young girls. The images combined photos of the students' faces with AI-generated nude bodies. Even when students knew the images were fake, the damage to self-esteem from gossip and from being seen as "dirty" was impossible to overcome.
Legal challenges are also likely, as defendants argue that no one was harmed and no child was a victim because the images depict no specific individual. The Justice Department says, "CSAM generated by AI is still CSAM, and we will hold accountable those who exploit AI to create obscene, abusive, and increasingly photorealistic images of children."
Lawmakers and courts will have to grapple with this question, since models are trained on existing known images. In Anderegg's filing, prosecutors say he was also generating images of minors in sexually explicit clothing. AI models' training data contains plenty of images of children and plenty of images of adults in explicit clothing, and an image model could plausibly combine the two from a single prompt, with nothing stopping it but the guardrails and the quality of the restrictions the image-generation service puts in place.
AI images are already super-realistic
AI images have evolved quickly from the mangled, seven-fingered, cartoonishly garish output of just 18 months ago. New AI systems can now create photorealistic images of people that may be hard to recognize as AI-generated or that, to most people, look "good enough."
AI-generated CSAM and pornographic images will be a new frontier for parents, too, who will need to talk to kids about online safety, safe sex, respect for others, and the risks of addiction. Schools will need to update their student handbooks with stiff penalties for students who create nude AI images, and law enforcement will have to be ready to investigate a new category of cyber crime.
Services like Stable Diffusion, among others, have policy restrictions that prohibit generating explicit, illegal, or otherwise sensitive content. But those restrictions are hard to enforce at the edges. Models like Stable Diffusion can be downloaded and run on a person's own computer, and even when a text or image AI system says it cannot perform an operation, the locally run code can be tweaked, or simply repeating or rephrasing the prompt can circumvent the block, since the AI is "adapting" during conversation.
For now, much of this use requires technical skills most people do not have. But as these tools are built into more software platforms and services, it seems inevitable that pornographic material, whether of children or adults, will become more available.