New Jersey Lawsuit Targets Deepfake Pornography App ClothOff

A lawsuit filed by a clinic at Yale Law School aims to dismantle ClothOff, a controversial app that has been used to create non-consensual pornographic images of young women. Despite being removed from major app stores and banned from many social platforms, the app remains accessible online and through a Telegram bot. The case highlights the significant challenges in combating deepfake technology and protecting victims from its repercussions.

The legal action was initiated in October 2023 and seeks to force the app’s owners to delete all generated images and cease operations entirely. Identifying the defendants, however, has proven complex. According to John Langford, co-lead counsel in the lawsuit, “It’s incorporated in the British Virgin Islands, but we believe it’s run by a brother and sister in Belarus. It may even be part of a larger network around the world.”

The spread of deepfake pornography, which has drawn renewed alarm since the launch of Elon Musk’s xAI, is especially troubling because many of the victims are underage. Child sexual abuse material is strictly illegal, yet there are limited avenues for addressing the proliferation of such content generated by tools like ClothOff. While individual users can face prosecution, the platforms that facilitate this content remain difficult to regulate, leaving victims with few options for legal recourse.

The clinic’s complaint describes the alarming case of an anonymous high school student in New Jersey. When she was only 14 years old, classmates ran her Instagram photos through ClothOff to create AI-altered images. These altered images meet the legal definition of child sexual abuse material (CSAM). Despite the clear illegality of such content, local authorities opted not to prosecute, citing difficulties in gathering evidence from the suspects’ devices. The legal document states, “Neither the school nor law enforcement ever established how broadly the CSAM of Jane Doe and other girls was distributed.”

Progress in the court case has been slow. Since the filing, Langford and his team have struggled to serve the defendants because of the operation’s global reach. Once service is complete, the clinic can push for a court appearance and a judgment, but the timeline remains uncertain for ClothOff’s victims.

In contrast, the case against Grok, xAI’s chatbot, appears more straightforward. Unlike ClothOff, Grok operates openly, and successful legal claims could yield substantial monetary compensation. Yet Grok’s general-purpose design complicates accountability in court. Langford explained, “ClothOff is designed and marketed specifically as a deepfake pornography image and video generator. When you’re suing a general system that users can query for all sorts of things, it gets a lot more complicated.”

Recent US legislation, such as the Take It Down Act, has attempted to address deepfake pornography. But while specific users may violate these laws, holding entire platforms accountable remains a significant hurdle. Current legal frameworks require clear evidence of intent to harm, which complicates proving that xAI knowingly facilitated the creation of non-consensual content. Langford emphasized, “In terms of the First Amendment, it’s quite clear Child Sexual Abuse material is not protected expression.”

Regulatory responses vary significantly across regions. While countries like Indonesia and Malaysia have taken steps to block access to Grok, authorities in the United Kingdom have opened investigations that could lead to similar actions. The European Commission is also exploring preliminary measures, while no official response has emerged from US regulatory agencies.

The ongoing investigations underscore the pressing need for clarity in the legal landscape surrounding deepfake technology and child protection. Langford remarked, “If you are posting, distributing, disseminating Child Sexual Abuse material, you are violating criminal prohibitions and can be held accountable. The hard question is, what did X know? What did X do or not do? What are they doing now in response to it?”

As the legal battle against ClothOff unfolds, it serves as a stark reminder of the urgent need for robust safeguards against the misuse of technology and for the protection of vulnerable individuals online.