Following its January report highlighting the many "nudify" apps available in the App Store, the Tech Transparency Project (TTP) has conducted new research into how Apple's own search and advertising systems help users find these apps. Here are the details.
Nudify apps are still a problem in app stores
According to the new report, both the App Store and the Google Play Store are helping users find apps that "create deepfake naked images" of women, sometimes through promoted search results and autocomplete search suggestions.
In its report, TTP states that Apple and Google are still failing to keep nudify apps out of their stores, and that some are even rated as suitable for minors. The group found that nearly 40% of the top 10 results for searches like "nudify," "undress," and "deepnude" were apps that could make women appear "naked or scantily clad."
Additionally, some searches surfaced sponsored results for these apps. From the report:
(...) The first result of an App Store search for the term 'deepfake' was an ad for the FaceSwap Video app developed by DuoFace. This app allows users to swap someone's face from a still image onto a video. To test the app, TTP uploaded an image of a woman in a white sweater standing on the sidewalk and a video of a naked woman. After showing a short ad, the app produced a video placing the clothed woman's face onto the naked woman's body.
And
Another App Store search for the term 'face swap' revealed an ad for an app called AI Face Swap. This app offers pre-set face swapping templates and allows users to swap faces in images they upload. TTP uploaded a photo of a woman in a blue sweater and an image of a naked woman, and the app swapped their faces without any restrictions.
Interestingly, in addition to reaching out to Apple and Google about these findings, TTP also contacted the developers of several of these apps. In at least one case, a developer confirmed that the app used Grok for image generation but claimed to be "unaware that it could produce such extreme content." The developer promised to tighten the app's moderation settings for image generation.
Elsewhere in the report, TTP noted that typing "AI NS" into the App Store search box, presumably on the way to "AI NSFW," prompted the autocomplete suggestion "image to video ai nsfw." That search returned several nudify apps among the top ten results.
Although Apple did not respond to TTP's request for comment, the company removed most of the apps identified in the report.
Follow this link to read TTP's full report.