Apple has removed an AI-generated nude image app from the App Store over concerns about privacy violations.
According to a report by 404 Media, Apple has removed several apps from the App Store that used artificial intelligence to create nude images of people. The apps raised concerns because they could be used to generate nude photos of someone without their consent.
Here's how it worked: ads for these apps were found on Instagram, some of them linking to downloadable apps available on the App Store. The apps offered a variety of features, such as swapping faces onto adult photos and using AI to digitally remove clothing from ordinary photos.
The report raises concerns about Apple's ability to detect such apps on its own. Apple apparently took action only after 404 Media flagged the apps and their ads, which suggests the company may need to improve how it identifies apps that violate its App Store Guidelines.
This isn't the first time tech companies have faced problems with AI-powered apps that can create fake or misleading content. In 2022, similar apps were found on both the Apple App Store and the Google Play Store. At the time, both companies asked the developers to stop promoting these features on adult sites, but the apps themselves were not removed.
The prevalence of these "undressing apps" is particularly concerning because they have been found in use at schools and universities around the world, where they can become a tool for privacy invasion and harassment.
By removing these apps, Apple is sending a message that it will not tolerate tools that can be used to create non-consensual nude images. However, the report also highlights the need for Apple to improve its app review process so that such apps are kept off the App Store in the first place.
