
Teen Sues ClothOff Developer Over Fake Nude Images

The lawsuit targets the app developer and Telegram


Highlights:

  • A 17-year-old New Jersey girl has filed a lawsuit over manipulated images created using the AI-powered app ClothOff.
  • The lawsuit targets the app developer and Telegram, which hosted bots providing access to the tool.
  • The case underscores growing concerns over digital privacy and the misuse of emerging technologies.

Lawsuit targets app and distribution platforms

A 17-year-old from New Jersey has sued AI/Robotics Venture Strategy 3 Ltd., the developer of ClothOff, a web-based “clothes removal” tool, after a classmate allegedly used it to create a fake nude image from her Instagram photo when she was 14.

The case, filed with the support of a Yale Law School professor, his students, and a trial attorney, also names Telegram as a nominal defendant because it hosted bots that provided access to the app.


According to the complaint, the teen’s Instagram photo, showing her in a bathing suit, was altered into a realistic nude image and circulated among male classmates. The plaintiff seeks the deletion of all nonconsensual images created with the software and a court order to prevent its distribution.

Developer denies wrongdoing amid allegations

ClothOff’s developer, registered in the British Virgin Islands and believed to operate from Belarus, claims its system cannot process images of minors and automatically deletes all data.

However, the plaintiff’s lawyers allege the software has been misused to create sexually explicit content involving minors, violating federal and state laws.

The teenage boy accused of creating the manipulated images is not part of this lawsuit, though the plaintiff has filed a separate case against him. His attorneys said he lacks sufficient knowledge to respond to the allegations.

Growing pressure to regulate AI-generated imagery

The lawsuit adds to increasing calls for regulation as AI-generated sexual content becomes more widespread. In May, Congress passed the Take It Down Act, criminalizing the nonconsensual publication of intimate imagery, whether real or AI-generated, and requiring platforms to remove such content within 48 hours of a valid complaint.

The teen’s filing says she now “lives in constant fear” that the fake image will resurface online, highlighting the ongoing challenges of digital privacy and safety.