Safety
App Stores Push Users Toward Nudify Apps, New Research Shows
Apple and Google app stores actively promote deepfake "nudify" apps that enable non-consensual intimate imagery creation, according to Tech Transparency Project research—exposing a critical gap in platform accountability for AI-facilitated abuse.
Thursday, April 16, 2026, 12:00 PM UTC · 2 min read · Source: 404 Media
Tags
safety
/// RELATED
Products · 6d ago
Letting AI play my game – building an agentic test harness to help play-testing
Game developers can use AI agents as autonomous testers to automatically discover edge cases and iterate faster, reducing manual play-testing bottlenecks.
Products · Apr 22
Database world trying to build natural language query systems again – this time with LLMs
AWS and Microsoft are racing to commercialize LLM-powered natural language-to-SQL tools, but researchers are exposing a critical vulnerability: these systems excel at syntactic correctness while being blind to semantic errors, risking silent data misinterpretations at scale.