Safety
Women sue the men who used their Instagram feeds to create AI porn influencers
Three Arizona men allegedly earned more than $50K a month from non-consensual deepfake pornography, selling both the synthetic content itself and courses teaching others to replicate the scheme. The lawsuit tests how liability for such imagery falls on platforms and AI developers.
Friday, May 1, 2026, 12:00 PM UTC · 2 min read · Source: Ars Technica
Tags
safety