Apple's App Store Under Fire for Hosting AI-Generated Porn Apps

In a finding that cuts against Apple's carefully cultivated image as a privacy champion, researchers at the Tech Transparency Project (TTP) have uncovered dozens of AI-powered pornographic apps operating in the App Store, exposing significant gaps in the company's content moderation system.
How These Apps Slip Through
The so-called "AI nudification" apps use artificial intelligence to generate fake nude images from ordinary photos, often without the subject's knowledge or consent. Most alarming of all, many of the victims appear to be minors.
Simple keyword searches surface these apps in large numbers, suggesting they aren't hiding in dark corners but operating in the open. Collectively, they have amassed over 700 million downloads worldwide and generated more than $100 million in revenue, with Apple taking its customary 30% commission.
A Game of Cat and Mouse
While Apple's guidelines explicitly ban pornographic content, developers have found workarounds. Many submit their apps as innocuous photo editors or video creation tools during the review process, then reveal the real functionality after approval.
The situation echoes concerns raised during previous App Store controversies. "This isn't just about policy violations," says digital rights advocate Maria Chen. "We're talking about technology that can ruin lives through non-consensual deepfakes - and Apple is profiting from it."
Slow Response Raises Eyebrows
After being presented with TTP's findings, Apple removed some of the offending apps but left many others available. The selective takedowns have fueled criticism that the company prioritizes profit over protection.
The controversy comes at an awkward time for Apple, which recently launched new privacy-focused marketing campaigns. Security experts argue the company needs to invest as heavily in ethical safeguards as it does in privacy features.
What Comes Next?
Pressure is mounting on both Apple and Google (whose Play Store hosts similar apps) to implement more robust screening processes specifically targeting AI-generated content. Some lawmakers are already calling for regulatory intervention.
The episode is a wake-up call about the challenges platform operators face in policing increasingly sophisticated AI tools, and about whether current moderation systems can keep pace with rapidly evolving technology.
Key Points:
- Widespread availability: Dozens of AI nudification apps remain accessible despite violations
- Massive scale: Over 700 million downloads and more than $100 million in revenue
- Minors at risk: Technology being used to create explicit images without consent
- Revenue sharing: Both Apple and Google profit through commission fees
- Policy gaps: Current moderation systems failing to catch disguised apps
- Growing backlash: Calls for stronger regulation of AI-generated content