Apple and Google position themselves as the ultimate gatekeepers of the mobile ecosystem, promising users a curated experience governed by strict safety guidelines. However, a recent investigation reveals a jarring disconnect between these public policies and the actual behavior of their algorithms, showing that both companies are actively steering users to nudify apps—AI-powered tools designed to digitally strip clothing from photos of real people.
The findings suggest that the Apple App Store and Google Play Store are not merely passive hosts for these tools but are actively elevating them through search autocomplete functions and sponsored advertising. By directing users toward software capable of creating nonconsensual sexual imagery, the platforms are facilitating the spread of tools that can turn any photo—of a colleague, a classmate, or a celebrity—into an explicit deepfake.
The scale of this ecosystem is massive. According to data from mobile analytics firm AppMagic, the nudify apps identified in the research have been downloaded 483 million times and have generated more than $122 million in lifetime revenue. This financial incentive creates a troubling paradox: while these apps often violate the platforms’ own terms of service, the companies continue to profit from their subscriptions and ad spend.
Algorithmic Steering and Sponsored Content
The investigation found that the “steering” happens through three primary mechanisms: search results, sponsored ads, and autocomplete suggestions. When researchers searched for terms like “nudify,” “undress,” and “deepnude,” roughly 40 percent of the top results in both stores were apps capable of rendering women nude or scantily clad.
In several instances, the platforms didn’t just return these apps in organic search; they placed them at the very top as paid advertisements. In the Apple App Store, ads for these tools appeared as the first result for searches like “deepfake” and “adult AI.” These ads are distinguishable by a light blue background and an ad badge, indicating they are placed and sold directly by Apple.
Google Play utilized a similar strategy, deploying “Suggested for You” carousels mid-way through search results for “AI NSFW” and “adult AI.” These carousels featured dozens of apps, some of which were blatantly pornographic. One such sponsored app, Magic AI: Dream Image Maker, featured a specific “AI remove clothes” template that explicitly instructed users to upload a photo of a real person to “take off your clothes.”
The platforms also proactively suggested these tools through autocomplete. For example, when a user began typing “AI NS” in the Apple App Store, the system recommended the search term “image to video ai nsfw,” which led directly to explicit video generators.
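This behavior is consistent with how any popularity-ranked completer works: once enough users search for a phrase, it gets suggested to everyone who types a fragment of it, with no policy judgment in the loop. The Python sketch below is purely illustrative; the query counts are invented, and the stores’ actual ranking systems are proprietary and far more complex.

```python
from collections import Counter

# Hypothetical aggregated query log; a real store would build this
# from live search traffic at a vastly larger scale.
query_log = Counter({
    "ai nsfw image generator": 50_000,
    "image to video ai nsfw": 42_000,
    "ai note taker": 30_000,
    "ai nsfw chat": 25_000,
})

def suggest(partial: str, k: int = 3) -> list[str]:
    """Return the k most-searched logged queries containing `partial`.

    A completer ranked purely by query popularity has no notion of
    policy: whatever users search for most is what gets suggested.
    """
    partial = partial.lower()
    hits = [(q, n) for q, n in query_log.items() if partial in q]
    hits.sort(key=lambda pair: pair[1], reverse=True)
    return [q for q, _ in hits[:k]]

print(suggest("ai ns"))
# ['ai nsfw image generator', 'image to video ai nsfw', 'ai nsfw chat']
```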
The Safety Gap and Risks to Minors
Perhaps the most concerning finding is the failure of the stores’ age-rating systems: the investigation identified 31 nudify apps rated as suitable for minors. This comes at a time when schools around the world are reporting a surge in sexual deepfake scandals involving students.

When questioned, Google spokesperson Dan Jackson stated that the International Age Rating Coalition, rather than Google, sets the age ratings for apps in the Play Store. Jackson added that many of the identified apps have since been suspended and that the company investigates and takes action when violations are reported.
Apple declined to comment on why its search functions point to these apps, how they bypass the review process, or how the company handles revenue collected from apps that violate its policies. Following the investigation, Apple removed 14 apps, and Google removed seven.
Platform Performance Comparison
| Platform | Unique Apps Tested | Apps Capable of Nudifying | Share Capable (%) |
|---|---|---|---|
| Apple App Store | 46 | 18 | 39.1 |
| Google Play Store | 49 | 20 | 40.8 |
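As a quick sanity check, the percentages follow directly from the raw counts in the table:

```python
# Recompute the table's percentages from the raw counts.
tested = {"Apple App Store": (18, 46), "Google Play Store": (20, 49)}

for store, (capable, total) in tested.items():
    print(f"{store}: {capable}/{total} = {capable / total:.1%}")
# Apple App Store: 18/46 = 39.1%
# Google Play Store: 20/49 = 40.8%
```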
Privacy and National Security Implications
Beyond the immediate harm of nonconsensual imagery, some of these tools introduce significant data privacy risks. Several apps, such as AI Replace & Remove, list developers based in China and state in their privacy policies that they are governed by the laws of the People’s Republic of China.
Under Chinese national security laws, companies can be compelled to share user data with the government. In the context of nudify apps, this means highly sensitive, edited images of real people could potentially be accessed by state authorities, creating a secondary layer of vulnerability for users.
The investigation also highlighted the role of third-party APIs. One app, Uncensored AI, used xAI’s Grok for image generation to produce topless images of women. While the standalone Grok app reportedly blocks attempts to remove clothing from images, the API allowed a third-party developer to bypass those safeguards until the issue was flagged.
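The underlying design issue is that safeguards implemented in a vendor’s own app do not automatically cover raw API traffic. The sketch below is a generic, hypothetical illustration of that gap; none of the function names correspond to xAI’s actual API.

```python
# Hypothetical illustration of the app-vs-API safety gap.
# No names here correspond to a real vendor SDK.

BLOCKED_TERMS = ("undress", "nudify", "remove clothes")

def is_disallowed(prompt: str) -> bool:
    """Toy product-layer moderation check."""
    return any(term in prompt.lower() for term in BLOCKED_TERMS)

def model_generate(prompt: str) -> bytes:
    """Stand-in for a raw image-generation API call. At this layer the
    only protections are whatever the vendor enforces server-side."""
    return b"<generated image bytes>"

def consumer_app_generate(prompt: str) -> bytes:
    """The vendor's own app wraps the model with a policy gate."""
    if is_disallowed(prompt):
        raise PermissionError("request blocked by app-level policy")
    return model_generate(prompt)

def third_party_generate(prompt: str) -> bytes:
    """A third-party developer calling the API directly can simply skip
    the wrapper; only server-side checks remain in the request path."""
    return model_generate(prompt)
```

Unless the moderation check runs on the vendor’s servers, each integrator effectively decides whether the safeguard exists at all.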
The Profitability of Policy Violations
Both Apple and Google have explicit policies against “creepy” or “offensive” content. Apple prohibits material that is “overtly sexual or pornographic,” while Google Play bars apps that “degrade or objectify people, such as apps that claim to undress people.”

However, the continued presence and promotion of these apps suggest a systemic failure in enforcement. Because the platforms take a percentage of all in-app purchases and subscription fees, there is a direct financial incentive to allow high-revenue apps to persist, even if they operate in a legal and ethical gray area.
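To put rough numbers on that incentive: both stores’ published commission schedules run from 15 percent (small-business programs and long-running subscriptions) to 30 percent (the default rate). Applied to the $122 million lifetime revenue figure cited above, and assuming that revenue flowed through in-app purchases, the platforms’ cut would fall somewhere in this range:

```python
lifetime_revenue = 122_000_000  # USD, reported lifetime figure

# Standard store commission tiers: 15% (small-business programs and
# year-two subscriptions) to 30% (the default rate).
for rate in (0.15, 0.30):
    print(f"At {rate:.0%} commission: ${lifetime_revenue * rate:,.0f}")
# At 15% commission: $18,300,000
# At 30% commission: $36,600,000
```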
As legislative efforts to criminalize nonconsensual AI-generated pornography increase across various jurisdictions, the role of the “steering” algorithms used by Apple and Google is likely to face greater regulatory scrutiny. The focus is shifting from the developers of the apps to the platforms that not only host them but actively market them to the public.
Developments to watch include upcoming legislative sessions on AI safety and potential updates to the International Age Rating Coalition’s standards for generative AI tools.
