Trendingger Posted April 26

Apple has reportedly removed several AI image generation apps capable of creating fake nudes from its App Store. The misuse of generative AI to create deepfakes and nonconsensual pornography has been a growing problem, and despite the potential for harm, Apple had previously been relatively passive in addressing it.

The alarm was raised by 404 Media, which informed Apple of the apps' existence. Marketed as able to create nonconsensual nude images, these apps were a ticking time bomb. Following the alert, Apple took action and removed three of them from the App Store.

Despite this positive step, questions remain. Apple's App Store Review process failed to catch these apps initially; Apple acted only after a third-party alert. This raises concerns about the effectiveness of the review process and its ability to keep such apps off the App Store in the first place.

While Apple's decision to remove these apps is a step in the right direction, it also highlights the challenges tech companies face in policing their platforms. As AI technology continues to evolve, how can we ensure it is used responsibly? We invite you to share your thoughts below.

Read more:
https://www.404media.co/apple-removes-nonconsensual-ai-nude-apps-following-404-media-investigation/
https://appleinsider.com/articles/24/04/26/apple-finally-pulls-generative-ai-nude-apps-from-the-app-store
https://9to5mac.com/2024/04/26/apple-pulls-multiple-ai-nude-image-apps-from-app-store/

Image: Bigtunaonline | Dreamstime.com