Study: Teen girls are using AI to create sexual imagery
Nudification tools are surprisingly popular with teen girls. Researchers wonder why.

A new study suggests that teen girls use so-called nudification apps at the same rate as teen boys. The artificial intelligence-powered undressing tools allow users to create sexualized images of a person, typically by uploading a picture of them.
The results surprised Dr. Chad M.S. Steel, a digital forensics researcher at George Mason University who studies technology-facilitated crimes against children.
"Males tend to be more involved in any type of online sexual endeavors, whether it's sexting or viewing pornographic material or the like, there's usually a much stronger signal for males than females," Steel said of the findings, which were published Wednesday in the journal PLOS One.
In January 2025, Steel conducted an online survey of 557 English-speaking adolescents ages 13 to 17. Even a year ago, Steel found widespread use of nudification tools: 55 percent of the respondents said they'd created a sexualized image, and 54 percent said they'd received one.
Teens were often victims of the technology, too: more than a third reported that someone had made a non-consensual image of them, and a third said an image of theirs had been shared without their permission.
Roughly 1 in 6 teen girls and boys used nudification tools frequently to see how they looked. About the same share of teen girls shared such imagery "once or twice" with someone else. A slightly smaller percentage of boys reported the same behavior.
Why girls might be using nudification tools
Steel didn't ask the teens why they used nudification tools, though sexting is a common practice among adolescents. He suspects that the popularity of "try it on" clothing and makeup visualization tools among girls builds familiarity with the kind of engagement nudification apps offer. Combined with male coercion for sexually explicit imagery, that familiarity may lead teen girls to reach for a known technology to deal with the pressure, Steel explained.
Dr. Linda Charmaraman studies girls' wellbeing with an emphasis on social media and digital health but wasn't involved in the study. She reviewed the findings and told Mashable that teens are in a delicate developmental period as they form their identities and seek social connection and acceptance.
"When you combine that time of development with AI, it can bring further risks," Charmaraman, director of the Youth, Media, & Wellbeing Research Lab at Wellesley College, wrote in an email. "For example, there might be a lot of pressure for girls to create certain kinds of content in order to fit in with their peers and to possibly promote their social status."
Boys did report using generative AI to create and distribute sexual imagery at higher rates than girls, both with and without the subject's permission.
Steel said that he would like to see his results replicated among a much larger sample of teens.
"In this case, I'd love to find out that I had an extremely unusual subset," Steel said.
Charmaraman said that the survey's nationally representative sample and effective quality checks indicate it reached diverse households. Yet she wondered whether the way the survey was advertised could have attracted "technology-savvy" participants, potentially skewing the results.
Top takeaways for parents
Nudification has become normal.
Steel said the survey results suggest that teen use of nudification tools has become widespread, and that "we have no idea what the effects will be."
How to talk to your teen about nudification imagery.
Steel urges parents to consider the likelihood that their child will encounter nudification tools and imagery, and talk to them nonjudgmentally about the risks. Focusing on abstention won't work, he added, given that teens may see AI-created sexual imagery as a natural extension of exploring their sexuality.
Charmaraman recommends regular conversations about what's happening in teens' digital lives. This builds a strong foundation so that if parents learn about distressing incidents like non-consensual sharing of AI-generated sexual imagery, the lines of communication are already open. Instead of immediately restricting an app or device, Charmaraman suggests learning more about a teen's intentions, such as why they wanted to create sexual imagery and whether they were coerced by strangers or peers.
Deterring illegal imagery.
Steel said teens often don't grasp that they're creating what's known as child sexual abuse material when they use nudification tools, even though they're unlikely to face legal consequences when that imagery is shared consensually with an adolescent peer. To deter teens from creating and sharing images without permission, Steel recommends that policymakers explore a bystander approach, in which teens are taught the value of speaking up if they learn their friends or peers are going to use AI to generate sexual imagery of a victim.
Charmaraman has previously advocated for a "duty of care" standard that shifts safety responsibility from the user to the tech companies that design platforms.
"Tech companies must also provide tools that allow minors and their parents to manage their digital experience, including the ability to disable certain product features and protect their personal information," she said.
The risk of sextortion.
Teens might not always understand that predators are highly interested in amassing collections of child sexual abuse material, including AI-generated imagery. Adult predators may ask teens for this content online, or they may use a nudification app to create that imagery from publicly available pictures of the victim. Some predators may even try to sextort a teen using a nudified image they created themselves.
Steel said both parents and teens should be aware of this possibility. Teens might consider using social media account protections, such as keeping their accounts private and allowing only well-known followers to access their photos.
If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information and a list of international resources.