AI

A New Jersey lawsuit shows how hard it is to fight deepfake porn

By IQ TIMES MEDIA | January 12, 2026 | 5 min read


For more than two years, an app called ClothOff has been terrorizing young women online — and it’s been maddeningly difficult to stop. The app has been removed from both major app stores and banned from most social platforms, but it’s still available on the web and through a Telegram bot. In October, a clinic at Yale Law School filed a lawsuit that seeks to take the app down entirely, forcing its owners to delete all images and cease operations. But simply finding the defendants has been a challenge.

“It’s incorporated in the British Virgin Islands,” explains Professor John Langford, a co-lead counsel in the lawsuit, “but we believe it’s run by a brother and sister in Belarus. It may even be part of a larger network around the world.”

It’s a bitter lesson in the wake of the recent flood of non-consensual pornography generated by Elon Musk’s xAI, a flood that included many underage victims. Child sexual abuse material is the most legally toxic content on the internet — illegal to produce, transmit, or store, and routinely scanned for on every major cloud service. But despite those strict legal prohibitions, there are still few ways to deal with image generators like ClothOff, as Langford’s case demonstrates. Individual users can be prosecuted, but platforms like ClothOff and Grok are far more difficult to police, leaving few options for victims hoping to find justice in court.

The clinic’s complaint, which is available online, paints an alarming picture. The plaintiff is an anonymous high school student in New Jersey, whose classmates used ClothOff to alter her Instagram photos. She was 14 years old when the original Instagram photos were taken, which means the AI-modified versions are legally classified as child abuse imagery. But even though the modified images are straightforwardly illegal, local authorities declined to prosecute the case, citing the difficulty of obtaining evidence from suspects’ devices.

“Neither the school nor law enforcement ever established how broadly the CSAM of Jane Doe and other girls was distributed,” the complaint reads.

Still, the court case has moved slowly. The complaint was filed in October, and in the months since, Langford and his colleagues have been in the process of serving notice to the defendants — a difficult task given the global nature of the enterprise. Once they’ve been served, the clinic can push for a court appearance and, eventually, a judgment, but in the meantime the legal system has given little comfort to ClothOff’s victims.

The Grok case might seem like a simpler problem to fix. Elon Musk’s xAI isn’t hiding, and there’s plenty of money waiting for lawyers who can win a claim. But Grok is a general-purpose tool, which makes it much harder to hold accountable in court.


“ClothOff is designed and marketed specifically as a deepfake pornography image and video generator,” Langford told me. “When you’re suing a general system that users can query for all sorts of things, it gets a lot more complicated.”

A number of US laws have already banned deepfake pornography — most notably the Take It Down Act. But while specific users are clearly breaking those laws, it’s much harder to hold the entire platform accountable. Existing laws require clear evidence of an intent to harm, which would mean proving that xAI knew its tool would be used to produce non-consensual pornography. Without that evidence, xAI’s basic First Amendment rights would provide significant legal protection.

“In terms of the First Amendment, it’s quite clear child sexual abuse material is not protected expression,” Langford says. “So when you’re designing a system to create that kind of content, you’re clearly operating outside of what’s protected by the First Amendment. But when you’re a general system that users can query for all sorts of things, it’s not so clear.”

The easiest way around those obstacles would be to show that xAI had willfully ignored the problem. It’s a real possibility, given recent reporting that Musk directed employees to loosen Grok’s safeguards. But even then, it would be a far riskier case to take on.

“Reasonable people can say, we knew this was a problem years ago,” Langford says. “How can you not have had more stringent controls in place to make sure this doesn’t happen? That is a kind of recklessness or knowledge, but it’s just a more complicated case.”

Those First Amendment issues are why xAI’s biggest pushback has come from court systems without robust legal protections for free speech. Both Indonesia and Malaysia have taken steps to block access to the Grok chatbot, while regulators in the United Kingdom have opened an investigation that could lead to a similar ban. Other preliminary steps have been taken by the European Commission, France, Ireland, India and Brazil. In contrast, no US regulatory agency has issued an official response.

It’s impossible to say how the investigations will resolve, but at the very least, the flood of imagery raises lots of questions for regulators to investigate — and the answers could be damning.

“If you are posting, distributing, disseminating child sexual abuse material, you are violating criminal prohibitions and can be held accountable,” Langford says. “The hard question is, what did X know? What did X do or not do? What are they doing now in response to it?”


