Key findings:
- Apple and Google have rules against “nudify” apps, but their app store search and advertising systems actually point users to them, a TTP investigation found.
- Searches for terms like "nudify," "undress," and "deepnude" in the app stores produced multiple apps capable of digitally stripping the clothes off women in photos.
- These apps can take images of real people and use AI to make them look naked, put them into pornographic videos, or turn them into sexually explicit chatbots.
- Apple and Google ran ads for nudify apps in some of the search results, and the app stores even suggested additional nudify search terms through their autocomplete function.
- The apps identified by TTP have been downloaded 483 million times and made more than $122 million in lifetime revenue, according to data compiled by a mobile analytics firm.
- The investigation found 31 nudify apps that were rated suitable for minors, a notable finding given the growing number of sexual deepfake scandals in schools.
Apple and Google are helping users to find apps that create deepfake nude images of women, a new Tech Transparency Project investigation has found, showing how the platforms are key participants in the spread of AI tools that can turn real people into sexualized images.
TTP first revealed in January that the Apple and Google app stores each hosted dozens of “nudify” and undressing apps that can digitally strip the clothes off women. Our new investigation found that the app stores’ search and advertising systems actually point users to such apps, giving them increased visibility.
The findings shed light on the role that Apple and Google play in the burgeoning industry of AI tools capable of turning photos of anyone—a classmate, co-worker, or celebrity—into a realistic-looking nude image or pornographic video. Far from passive bystanders to this trend, the app stores are actively elevating and promoting these apps.
For the new investigation, TTP conducted a series of searches in the Apple App Store and Google Play Store, using terms like “nudify,” “undress,” and “deepnude.” We then downloaded and tested the top ten apps returned for each search.
Roughly 40 percent of the apps that came up in each store's search results could render women nude or scantily clad, TTP found. Apple and Google ran ads for nudify apps in some of the search results—including, in Google's case, a carousel of ads for some of the most sexually explicit apps encountered in the investigation.
TTP also recorded the autocomplete suggestions that Apple and Google made as we typed in the different search terms. In many cases, the app stores recommended entirely new search queries that led to more nudify apps.
In total, the nudify apps surfaced in TTP’s app store searches have been downloaded 483 million times and made more than $122 million in lifetime revenue, according to data from app analytics firm AppMagic.
What's more, 31 of the apps were rated suitable for minors. That's noteworthy given mounting concern about AI sexual deepfake scandals in schools.
Apple declined to comment, and neither company responded to questions about why their app store search functions point to nudify apps, how nudify apps are getting through their review process, why apps with nudify features are being approved for minors, and what they do with the revenue collected from nudify apps that violate their policies.
In a statement, Google spokesperson Dan Jackson said many of the apps identified by TTP have been suspended and the company's enforcement process is ongoing. "When violations of our policies are reported to us, we investigate and take appropriate action," he said.
Jackson also said the International Age Rating Coalition, not Google, sets age ratings for apps in the Google Play Store.
Apple removed 15 apps after TTP and Bloomberg News, which covered this report, shared them with the company. Google removed seven apps.
Warning: This report contains images that some readers may find offensive or disturbing.
Background and methodology
In a January 2026 report, TTP found more than 100 nudify apps spread across the Apple and Google app stores. After TTP and CNBC contacted the companies about the findings, they each removed more than two dozen apps. Other apps increased their listed age ratings.
Following that investigation, TTP turned its attention to the app stores’ search functions, to determine what role, if any, they play in directing users to nudify apps.
We conducted tests on an iPhone and Android phone, using newly created Apple and Google accounts with no other activity. The search terms were “nudify,” “undress,” “deepfake,” “deepnude,” “adult AI,” “face swap,” and “AI NSFW.”
As we entered each term into the search field, we recorded the autocomplete suggestions generated by the app stores after each keystroke. After fully typing out the term and hitting return, we downloaded and tested the top 10 apps returned for each search. If the app store showed a sponsored app in the top ten, TTP included that in the tally.
A total of 46 unique apps came up in the Apple App Store searches, and 49 unique apps came up in the Google Play searches.
To test the apps, TTP used AI-generated photos of fake women. With apps that offered image editing or video generation, TTP uploaded a photo of a clothed woman and prompted the app to undress her. With face swap apps, TTP uploaded a photo of a clothed woman and prompted it to swap faces with a naked woman. TTP only used free features available on the apps. In some cases, the apps did not have free features but TTP could see that they offered AI templates for creating images of scantily clad or naked women using uploaded photos.
The testing showed that 18 of the 46 Apple apps (39.1%) could nudify or undress women. Of the 49 Google Play apps, 20 (40.8%) met that criterion.
TTP then returned to the autocomplete search terms suggested by the app stores. We found that in multiple cases, the Apple- and Google-suggested search terms pointed to yet more nudify and undressing apps.
These findings show that both app stores direct people to apps that appear to violate their policies. Apple prohibits apps that are “offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy,” including “overtly sexual or pornographic material." The Google Play Store bars apps that "contain or promote sexual content" or "sexually suggestive poses in which the subject is nude, blurred or minimally clothed." Google Play also prohibits apps that "degrade or objectify people, such as apps that claim to undress people or see through clothing."
A note about Grok: In a number of the app store searches, Elon Musk’s Grok came up as the first result. Grok is subject to multiple investigations and litigation over allegations that it allowed sexual deepfakes of people, including minors. However, TTP’s testing of both the iOS and Android versions of Grok found that the app blocked attempts to remove the clothes from women in uploaded images. Therefore, TTP did not include Grok in its tally of nudify apps.
However, Grok’s parent company xAI (now part of Musk’s SpaceX) did come up in one of the examples below.
Search results
TTP found that the app stores directed users to a range of nudify and undressing apps in response to our search terms.
Most of the apps made it extremely easy to take the clothes off women in photos. Take Best Body AI — Fashion Editor, which came up in an Apple App Store search for “nudify.” TTP uploaded an image of a woman in a white sweater to the app, highlighted the woman's clothes, and entered "remove all clothes." The app quickly showed the woman nude from the waist up, with a built-in slider to compare before and after images.
Best Body AI bills itself as a way to make your “perfect appearance come true,” but it’s easy to see how the app could be used to create nonconsensual nudes. The app developer did not respond to a request for comment.
Another app called AI Replace & Remove — Fill App, which came up in the App Store search for “undress,” worked in a similar way. TTP uploaded an image of a woman on a city street wearing an orange sweater and jeans, highlighted the woman’s clothes, and entered the prompt, “Remove their top. They are wearing nothing underneath.” The app showed a warning message that “security check” was turned off and asked the user to confirm that they were over 18. When TTP clicked “OK,” the app generated a blurred image which required a paid subscription to access. However, the app showed a thumbnail of the woman fully nude without any restrictions.
The app lists its developer, in Chinese, as Xiao Yong Meng, and its privacy policy says it is “governed by the laws of the People's Republic of China.”
That language raises significant privacy and security concerns for users. Under China's national security laws, Chinese apps can be forced to share user data with the Chinese government, as the FBI recently warned. In the case of nudify apps, the Chinese government could get access to highly sensitive images of real people that have been edited to make them appear nude or in sexualized poses.
TTP asked AI Replace & Remove for comment via an email listed on its privacy policy page but did not hear back.
One app called Uncensored AI — No Filter Chat, which came up in the Apple App Store search for “undress,” promotes itself as a way to “experience AI with more freedom” and offers private AI chats and photo editing. To test the app, TTP uploaded an image of a woman in a green sweatshirt standing on a sidewalk. A message popped up alerting the user that chats and uploaded images may be processed by the app’s service provider, xAI. When TTP hit “continue” and entered the prompt “Show this person topless,” the app generated an image of the woman naked from the waist up.
Asked for comment, the app’s developer, Masaki Matsushita in Tokyo, said "while we were using Grok for image generation, we had no idea it was capable of producing such extreme content."
"Thank you for letting us know," the developer added. "We’ve tightened the moderation settings for image generation, so I don’t think such content can be generated anymore."
The app has since changed its name to Chat AI - Simple AI and raised its age rating from 16+ to 18+.
The developer confirmed the app is still using Grok. xAI and its new parent company, Musk’s SpaceX, did not respond to a request for comment.
The app Best Body AI, which came up in an Apple App Store search for “nudify,” removed a woman's top in response to a prompt.
Searches for the terms “face swap” and “deepfake” in the Google Play Store both returned an app called FaceTool: Face Swap & Generate. The app lets users swap faces onto images they supply themselves. To test this feature, TTP uploaded an image of a woman in a green sweater and an image of a topless woman facing the camera. The app successfully swapped the face of the clothed woman onto the naked body. At no point did the app flag the nude images as inappropriate.
FaceTool is rated suitable for all ages in the Google Play Store. The app developer, Vietnam-based Tran Van Su, did not respond to a request for comment.
Searches on the Google Play Store for “deepfake” and “face swap” also surfaced three nudify apps that were previously identified by TTP: DreamFace: AI Video Generator, RemakeFace: AI Face Swap, and Reface: Face Swap AI Generator.
DreamFace allows users to create AI videos. In our January report, we found that the app would make an image of a clothed woman topless in response to a text prompt. When TTP tested the app again as part of the research for this report, it would no longer render a woman naked, but it would show her in a bikini. It was the same story with RemakeFace, which earlier would swap a woman’s face onto a topless figure but now only does that with a bikini-clad woman.
This suggests DreamFace and RemakeFace have revised their offerings to stop allowing generation of nudes. But by generating images of women in bikinis, they still appear to violate Google Play’s policy against apps that degrade or objectify people. DreamFace is rated suitable for ages 13 and up, and RemakeFace is rated suitable for all ages.
DreamFace developer New Port LLC of Redwood City, California, and RemakeFace developer PT. Dirga Sena of Jakarta, Indonesia, did not respond to requests for comment.
The third app, Reface, offered no free features, so TTP did not test it. But the app’s home page showed AI templates for women in bikinis and Playboy magazine covers.
Reface lists its developer as Neocortext Inc. of Wilmington, Delaware. In a statement, Vlad Demianets, who identified himself as the app's legal counsel, said Google conducted an investigation of Reface following TTP's January report on nudify apps and confirmed that the app is "in full compliance with all Google Play policies."
The bikini and other templates observed by TTP are "curated creative assets intended for entertainment and parody," he said, adding, "They do not contain nudity, nor do they violate the 'sexual content' policies of the app stores."
The app FaceTool, which came up in Google Play Store searches for “face swap” and “deepfake," could make a woman naked from the waist up.
AI chatbots
Several of the apps that appeared in the search results offered AI chatbots. TTP did some testing of these apps but did not engage in conversations with the bots.
In the Apple App Store, the search for “adult AI” returned an app called Adult AI Chat, Uncensored: AIs, which describes itself as a safe place to “explore fantasies” and “enjoy customized roleplay experiences.”
Adult AI Chat can create companion chatbots based on real people. Users can upload a photo of anyone and write a backstory for that person. TTP created a companion bot using a photo of a woman wearing a purple sweater in a book shop. The app offered various options for chatbot personalities, including “lover.” However, the bot did not have a free feature to generate selfies, so TTP could not test whether the AI companion would take her clothes off in response to a prompt.
Beyond the custom bot feature, the app offered a selection of preset female AI companions, including a woman in a black leather corset and horns named “Seraphine” and a woman in a silver mini-dress named “Naomi” whose opening line to the user reads, “Let's find somewhere more private... I need to feel you.” Other bots looked like pre-teen or teen girls. All of the preset companion bots offered a series of blurred images that users could unlock using in-app coins or through continued conversation. Despite the blurring, images of the women in bikinis or sexually suggestive poses were visible.
Since TTP was not able to test whether this app would nudify a real person, we did not include it in the tally, but the app did appear to be "offensive" and "creepy"—characteristics that Apple prohibits.
Adult AI Chat lists its developer as Sichuan Shanghu Network Technology Co., Ltd., and states that its terms and conditions are “governed by and construed in accordance with the laws of the State of China.” An email sent to the developer bounced back as undeliverable.
One app called Kindroid: Your Personal AI, which came up in the Google Play Store search results for “adult AI" and "AI NSFW," says it “enables you to build a digital friend so realistic, it feels like conversing with a human.”
The app allows users to create a custom AI companion by uploading a photo of any person and entering a backstory. TTP created a companion bot from an image of a woman in a green sweater. We then requested a selfie of her topless, which she refused. But she did comply with a second request to send an image of herself in a bikini. This appears to violate the Google Play Store’s prohibition against apps that degrade or objectify people.
Asked for comment, Kindroid, which lists its developer as Beautifully Inc. in Los Angeles, said TTP's test showed the app's "content filtering working exactly as designed." Referring to the bikini image, Kindroid said "swimwear is not sexual content under any standard definition," and said its users are responsible for ensuring the content they generate is not used to humiliate or harass people.
"Our Terms of Use are also clear that users bear sole responsibility for the content they seek to generate, and that using our platform to degrade or objectify individuals is prohibited," the app said. "We are proud of the safeguards we have built and are confident in our compliance."
An app called Adult AI Chat, which surfaced in an Apple App Store search for “adult AI,” offered a selection of preset female AI companions, including some that look like pre-teen or teen girls.
Sponsored Ads
TTP found that ads for nudify apps came up as the top result in three of the Apple searches. Apple, which controls all of the advertising in its app store, is selling and placing these ads.
App Store ads are shown to users based on their search relevance and advertiser bidding, and are distinguishable by their light blue background and ad badge. They often appear at the top of search results, making them the first thing users see. Apple says it prohibits ad content that “promotes adult-oriented themes or graphic content.”
But TTP’s findings suggest Apple is not always enforcing that policy. For example, the first result from an App Store search for “deepfake” was an ad for FaceSwap Video by DuoFace. The app allows users to swap anyone’s face from a still image onto a video. To test the app, TTP uploaded an image of a woman in a white sweater standing on a sidewalk and a video of a topless woman. After first showing a short ad, the app generated a video showing the clothed woman’s face on the nude woman’s body.
DuoFace’s terms of service prohibit content that is “pornographic, indecent, lewd… or otherwise objectionable,” and says that DuoFace may remove “obscene or pornographic materials.” But at no point during TTP’s test did the app flag the nude content or prevent TTP from generating the deepfake video.
The app's developer, DavKon Tech LLC in Yerevan, Armenia, did not respond to a request for comment.
Likewise, an App Store search for “adult AI” returned an ad for Movely – AI Photo to Video. The app offers a suite of AI photo and video editing tools including a try-on feature that will replace a woman’s clothes with outfits including bikinis and lingerie. One tool allows users to select part of any photo and edit it with a text prompt. To test this feature, TTP uploaded an image of a woman in a white T-shirt standing next to a river. After using the selection tool to highlight the woman’s shirt, we entered the prompt “topless.” The app immediately generated four versions of the woman nude from the waist up. It required a paid subscription to download the AI images.
TTP could not reach Movely's developer, FES2 Inc., for comment. Emails sent to the developer bounced back as undeliverable.
Another App Store search, for the term “face swap,” yielded an ad for an app called AI Face Swap. The app offers preset face swap templates and allows users to swap faces on images they upload themselves. TTP uploaded a photo of a woman in a blue sweater standing in a living room and an image of a topless woman, and the app swapped their faces with no restrictions.
Asked for comment, the app's developer, 360 Company LLC of Istanbul, said it was "not previously aware of this specific scenario, and we have carefully reviewed it."
"Following this, we have implemented updates and additional safeguards to prevent any potential misuse and ensure the app aligns with platform guidelines," the developer said. "Our app is designed strictly for fun and creative use, and we do not support or promote any inappropriate use."
Three of the Apple App Store searches produced a nudify ad as the first result. The ads are distinguishable by their light blue background and ad badge.
The searches in the Google Play Store also produced multiple ads.
Google Play shows ads based on advertiser keywords, and they can appear beside, above, or below search results. Google says ad content on its platforms cannot show images or videos of “graphic sexual acts intended to arouse,” content promoting nonconsensual sexual themes “whether simulated or real,” or synthetic content “that has been altered or generated to be sexually explicit or contain nudity.”
When TTP searched for “AI NSFW,” one of the sponsored apps shown at the top of the results was Talkie: Creative AI Community. The app offers an AI companion bot as well as AI photo editing via text prompt. TTP tested the photo editing feature, uploading a photo of a woman in a coffee shop wearing a red sweater along with the prompt, “Remove the person’s top. They are wearing nothing underneath.” Talkie refused that request, saying it would violate the app’s content safety guidelines. But when TTP asked the app to replace the woman’s top with a red bikini, it quickly complied. That appears to violate Google’s prohibition against apps that can degrade or objectify people.
Talkie is rated suitable for ages 13 and up. The app’s developer, Subsup Pte. Ltd. in Singapore, did not respond to a request for comment.
Two of TTP’s searches in the Google Play Store—for “adult AI” and “AI NSFW”—produced a “Suggested for You” carousel of sponsored apps midway through the top ten results. The horizontally scrolling box featured dozens of apps, many of them blatantly pornographic. The carousel showed the same sponsored apps in both searches.
One of the apps was Magic AI: Dream Image Maker. The first thing TTP encountered when opening the app was a field for text prompts, followed by a gallery of nude AI women in sexualized poses.
Under one tab called “AI Studio,” the app offered a dizzying array of image and video templates. One called “AI remove clothes" displayed a side-by-side comparison of the same woman fully dressed and completely nude, with the instructions: "Upload a photo of a real person and AI will help you take off your clothes." It was only available to paying subscribers. Other templates depicted women engaged in various sex acts. One showed “forced sex,” saying, “Upload a portrait, and we'll generate her forced sex for you.”
Another sponsored app in the “Suggested For You” carousel was SwapX PRO: AI & Video. The app’s home page is covered in AI video templates of scantily clad women. They are divided into categories like “Erotic Uniform” featuring women in schoolgirl and nurse uniforms, “Bikini,” and “Underwear.”
The app includes an AI image-to-video generator. TTP tested this feature, uploading an image of a woman in a white sweater and entering the prompt, “A video of this person removing their top. They are wearing nothing underneath.” The app immediately produced a video matching the instructions. SwapX Pro is rated suitable for ages 13 and up in the Google Play Store.
Magic AI developer "Asma" of Khairpur, Pakistan and SwapX PRO developer SprayJoy Ltd. of Hong Kong did not respond to requests for comment.
The carousel also included an ad for Collart AI, an image and video generator app that TTP first identified in January. When TTP tested Collart for its previous report, the app took an image of a fully clothed woman and generated a video of her removing her dress and standing naked on a street. But when TTP did a retest in March using similar language, the app refused, calling the prompt an “inappropriate input.” However, the app did agree to render the woman in a red bikini—an apparent violation of Google’s rules against apps that degrade or objectify people. Collart is rated suitable for all ages.
Asked for comment, Collart, which lists its developer as AtlasV Global Pte. Ltd. of Singapore, said it has been "integrating external moderation tools such as the OpenAI Moderation API to review user inputs and applying post-generation checks on outputs." The app said it currently categorizes bikini content "under fashion or lifestyle scenarios" but added that "we recognize that in certain contexts it could be misused or perceived as inappropriate."
"We are therefore strengthening our safety filters to apply stricter standards, including further limiting prompts and outputs that, while not explicitly nude, could be considered objectifying or degrading," Collart said. "In light of your findings, we are conducting an internal review of our advertising keywords and search associations to ensure that our app does not unintentionally appear in connection with inappropriate queries."
OpenAI did not immediately respond to a request for comment.
One of the sponsored apps at the top of the search results for “AI NSFW" in the Google Play Store was Talkie.
Search Suggestions
As part of the investigation, TTP also examined the autocomplete search suggestions that the app stores make as users begin to type in the search field.
The Apple App Store made no search suggestions for the term “nudify.” But it did make multiple suggestions for the other search terms, which led to nine nudify apps, some of them highly explicit.
For example, after we typed the letters “AI NS”—a partial spelling of “AI NSFW”—the App Store recommended the search term “image to video ai nsfw.” Clicking on that term returned several nudify apps in the top ten results, including AI Moment: AI Video Generator and PicsVid AI Hot Video Generator.
AI Moment, which promises to “instantly transform any photo into a stunning animated video,” did not offer free trials or credits so TTP did not test the features. But the app’s homepage showed a variety of sexual AI video templates with names like “Clothes fell off,” “Wiggle your hips,” and “Subway touch.”
PicsVid AI likewise offered video templates including a woman removing her shirt to expose her bra and a woman whose clothes are dissolving off her body. Several of the templates also show women mimicking oral sex.
AI Moment, which lists its developer as Shenzhen Jingguangda Technology Co. Ltd., and PicsVid AI developer Kanchanben Bhalani did not respond to requests for comment.
When TTP searched for “adult AI,” the App Store recommended the alternative search “adult ai photo editor.” When TTP clicked on the search term, the app Pixnova: AI Photo&Video Maker was the first result.
Pixnova boasts “thousands of AI filters” that can transform photos into anime characters, action figures, and Disney-style cartoons. But the home page of the app quickly pointed to other uses. The first AI template called “Takeoff” shows a video of a woman ripping off her black dress. Other templates depict women in various sexual positions, lingerie and bikinis.
The app provided a handful of free daily credits that TTP used for testing. We selected the “Takeoff” template and uploaded an image of a woman in a blue sweater standing in front of a white background. After clicking the generate button, Pixnova generated a video of the same woman ripping off her sweater to expose her bare chest.
Pixnova, which lists its developer as Runtopia Technology Co., did not respond to an email. The developer’s website does not give an address but includes Chinese characters. According to Crunchbase, the company is based in Chengdu, China.
As we began typing “AI NSFW” into the Apple App Store search box, it recommended the search term “image to video ai nsfw.” Clicking on that term returned several nudify apps in the top ten results.
Google offered fewer suggested searches than Apple, and the search terms it did suggest led to just two nudify apps.
When TTP began typing “nudify” into the Google Play search box, it suggested the search “nudie video apps.” When TTP clicked on that search term, a sponsored app called Videa: AI Video Maker, Effects was among the first results.
The app featured AI video templates of women in minimal clothing, including a bikini-clad woman perched on the edge of a hot tub and a woman strutting toward the camera with lingerie showing through her fur coat. The app did not offer free features for TTP to test, but it did give instructions on the best way to upload photos for AI editing.
Videa lists its developer as Pure Yazilim Ltd. Sirketi of Istanbul. Asked for comment, the app said it has "taken steps to strengthen our content moderation processes," including giving users a way to "flag potentially inappropriate outputs."
It also said it plans to change the age rating from "Everyone" to 18+.
In another case, when TTP started typing “deepfake” into the Google Play search box, it generated the suggestion “deepfake video maker.” The top ten results for that search included Vidu – AI Video Maker.
Vidu allows users to generate short videos based on a photo and prompt. TTP uploaded an image of a woman in a white sweater and entered the text, “A video of this person removing their top. They are wearing nothing underneath.” Vidu blocked this request, saying it may violate the app’s content policy. However, the app did agree to produce an image of the woman in a bikini, which appears to violate the Google Play Store’s inappropriate content policy.
Vidu, which lists its developer as ShengShu AI HK Ltd. of Hong Kong, did not respond to a request for comment. It is rated suitable for all ages.
When TTP began typing “nudify” into the Google Play search box, it suggested the search term “nudie video apps.”
Conclusion
These findings show that Apple and Google are not neutral platforms when it comes to nudify and undressing apps. Their search and advertising systems are actively elevating and promoting these apps, which can create nonconsensual nude images or pornographic videos using AI.
The app stores are generating revenue from these apps in two ways: when they run ads for the apps and when they take a cut of paid subscriptions or in-app payments. This revenue stream may be why the two companies have been less than vigilant when it comes to nudify apps that violate their policies.
But as stories accumulate of women and girls being targeted by sexual deepfakes, the role Apple and Google play in this ecosystem may soon attract more scrutiny.
Note: Report includes updated tally of apps removed by Apple.