It’s not just Grok: Apple and Google app stores are infested with nudifying AI apps

We tend to think of the Apple App Store and Google Play Store as digital “walled gardens” – safe, curated spaces where dangerous or sleazy content gets filtered out long before it hits our screens. But a grim new analysis by the Tech Transparency Project (TTP) suggests that the walls have some serious cracks. The report reveals a disturbing reality: both major storefronts are currently infested with dozens of AI-powered “nudify” apps. These aren’t obscure tools hidden on the dark web; they are right there in the open, allowing anyone to take an innocent photo of a person and digitally strip the clothing off them without their consent.

Earlier this year, the conversation around this technology reached a fever pitch when Elon Musk’s AI chatbot, Grok, was caught generating similar sexualized images on X. But while Grok became the lightning rod for public outrage, the TTP investigation shows it was just the tip of the iceberg. A simple search for terms like “undress” or “nudify” in the app stores brings up a laundry list of software designed specifically to create non-consensual deepfake pornography.

The scale of this industry is frankly staggering

We aren’t talking about a few rogue developers slipping through the cracks. According to TTP’s data, these apps have collectively racked up over 700 million downloads and generated an estimated $117 million in revenue. And here is the uncomfortable truth: because Apple and Google typically take a 15 to 30 percent commission on in-app purchases and subscriptions, they are effectively profiting from the creation of non-consensual sexual imagery – if even most of that revenue flowed through their billing systems, the platforms’ share could plausibly run into the tens of millions of dollars. Every time someone pays to “undress” a photo of a classmate, a coworker, or a stranger, the tech giants get their cut.

The human cost of this technology cannot be overstated. These tools weaponize ordinary photos. A selfie from Instagram or a picture from a yearbook can be twisted into explicit material used to harass, humiliate, or blackmail victims. Advocacy groups have been screaming about this for years, warning that “AI nudification” is a form of sexual violence that disproportionately targets women and, terrifyingly, minors.

So, why are they still there?

On paper, both Apple and Google have strict policies banning pornographic and exploitative content. The problem is enforcement. It has become a digital game of Whac-A-Mole: when a high-profile report comes out, the companies might ban a few specific apps, but developers often just tweak the logo, change the name slightly, and re-upload the exact same code a week later. The automated review systems seem completely incapable of keeping up with the rapid evolution of generative AI.

For parents and everyday users, this is a wake-up call. We can no longer assume that just because an app is on an “official” store, it is safe or ethical. As AI tools get more powerful and easier to access, the safeguards we relied on in the past are failing. Until regulators step in – or until Apple and Google decide to prioritize safety over commission fees – our digital likenesses remain uncomfortably vulnerable.
