A new report has highlighted the steep challenge Apple faces in policing 'face swap' apps on the App Store that can be used to create deepfake images - including pornography.
The company regulates its App Store with a "walled garden" policy that requires apps to comply with its guidelines, but a report from 404 Media suggests that "dual use" apps with face-swapping features can be used to swap faces onto pornographic content - sometimes using images of minors.
Apple's challenges with dual-use apps
The reporter found an ad for a face swap app on Reddit; the app suggested a series of websites, including pornographic ones, from which users could pull videos to swap faces into.
As the report itself says: "I tested the app and found that, for users willing to pay a subscription fee of $20 a month, it makes it incredibly easy to generate nonconsensual deepfake porn of anyone."
"All I had to do was provide one image of the person I wanted to deepfake, and use an in-app internet browser to navigate to the video I wanted them to appear in. As the ad I saw on Reddit suggested, when I navigated to a specific Pornhub video, the app automatically pulled the video and created deepfake porn of the person in the image I provided."
"The entire process, from the moment I saw the ad on one of the most popular websites in the world to the completed deepfake video, took about five minutes, and created a highly convincing result.
Given that Apple doesn't allow porn apps on the App Store, this looks like a way to circumvent that policy while sourcing content from an adult site.
While Apple Intelligence cannot generate images of that kind, the company may need to take a closer look at third-party apps on its App Store before long.