Teen Sues Nudify App Over AI-Generated Fake Nudes

Key Points
- Teen sues the creator of a popular "nudify" app over AI‑generated fake nudes of her.
- She seeks to block the app in the United States and hold the creator accountable.
- The plaintiff reports lasting emotional distress and constant fear of the images resurfacing.
- A Telegram spokesperson says nonconsensual pornography is prohibited by the platform’s terms of service.
- Around 45 states have criminalized fake nudes, and the Take It Down Act requires platforms to remove such content within 48 hours of a victim’s report.
A teenage girl has filed a lawsuit against the creator of a popular "nudify" app after AI‑generated fake nudes of her were circulated online, causing lasting emotional distress. She seeks to block the app in the United States and hold the responsible parties accountable. The case highlights growing concerns over nonconsensual pornography, the role of platforms in removing such content, and recent legal measures aimed at curbing AI‑generated sexual imagery.
Background of the Lawsuit
A teenage girl filed a lawsuit alleging that a "nudify" app was used to create AI‑generated fake nude images of her without her consent. The complaint states that the images have caused her profound emotional distress; she says she was "mortified and emotionally distraught" and fears the images could resurface at any time.
Allegations Against the App and Its Operators
The suit names the app’s operators as defendants, arguing that if they fail to respond, the court could enter a default judgment and block the app’s distribution in the United States. She also sued the boy who allegedly used the app to generate the images, saying his actions led her to consider dropping out of school.
Impact on the Victim
The teen describes a constant sense of fear, saying she will spend "the remainder of her life" monitoring for any resurfacing of the images. She expresses hopelessness over the possibility that the images could reach pedophiles, traffickers, friends, family, future partners, colleges, employers, or the public at large.
Platform Response and Legal Context
A Telegram spokesperson told The Wall Street Journal that "nonconsensual pornography and the tools to create it are explicitly forbidden by Telegram’s terms of service and are removed whenever discovered." The lawsuit arrives amid broader efforts to combat AI‑generated nonconsensual intimate imagery. Approximately 45 states have criminalized fake nudes, and recent legislation, such as the Take It Down Act, requires platforms to remove both real and AI‑generated nonconsensual intimate images within 48 hours of a victim’s report.
Potential Outcomes
If the court grants the plaintiff’s request, the app could be blocked in the United States, setting a precedent for how AI‑generated sexual content is addressed legally. The case underscores the challenges victims face in controlling the spread of deepfake imagery and the responsibilities of online platforms to enforce content policies.