Study Finds Majority of U.S. Teens Use AI to Create Nude Images

Digital Trends

Key Points

  • A PLOS ONE study surveyed 557 U.S. teens ages 13‑17 about AI‑generated nude images.
  • 55.3% reported creating at least one AI‑generated nude image of themselves or others.
  • 54.4% said they had received AI‑generated nude images.
  • 36.3% said an AI‑generated nude image of themselves had been created without their consent.
  • 33.2% reported that non‑consensual images were shared without permission.
  • Male teens reported higher rates of creation and distribution.
  • Researchers warn AI nudification can bypass consent and cause harm similar to child sexual exploitation.
  • The study calls for legislative and educational responses to address the issue.
  • Industry leaders are debating adult‑content AI models, highlighting the broader context.

A new study published in PLOS ONE surveyed 557 U.S. teens ages 13 to 17 and found that more than half have used AI tools to generate nude images of themselves or others. Over half also reported receiving AI‑generated nude images, and a third said such images were shared without consent. Male participants reported higher rates of both creation and distribution. Researchers warn the ease of AI‑nudification could worsen consent issues and call for action by lawmakers and educators.

Background

A research team led by Chad Steel of George Mason University conducted an anonymous survey of 557 English‑speaking U.S. residents between the ages of 13 and 17. The study, published in the open‑access journal PLOS ONE, was carried out with parental consent and focused on the use of artificial‑intelligence (AI) tools for creating sexualized images.

Key Findings

The results show that 55.3% of teens surveyed reported using nudification tools to create at least one image of themselves or others. A similar proportion, 54.4%, said they had received AI‑generated nude images. Moreover, 36.3% indicated that a sexualized AI image of themselves had been created by someone else without their consent, and 33.2% reported that those images were shared without permission.

Male participants reported higher rates of both creating and distributing these images, whether consensually or non‑consensually. Aside from this gender difference, the study notes that the prevalence of these behaviors was largely consistent across demographic groups.

Implications and Reactions

Researchers warn that AI‑nudification tools remove the need for a willing participant, allowing anyone with a photo and access to an app to generate a fake nude image. Victims experience consequences similar to other forms of child sexual exploitation material, including a sense of dehumanization and lasting disruption to their lives.

Steel summed up the shift, stating, "Teens are no longer just digital natives but AI‑natives. ‘Nudification’ and GenAI apps are their new ‘sexting,’ only with more challenging issues surrounding consent." The study’s authors hope the findings will prompt lawmakers and educators to address the problem before it becomes even more difficult to manage.

Industry Context

The report arrives amid broader debates about adult content generation in AI. OpenAI CEO Sam Altman is pushing to release an “adult version” of ChatGPT, while Elon Musk has advocated allowing Grok to generate R‑rated content. These discussions underscore the growing concern about how AI‑generated sexual content may be used and regulated.
