In a letter to U.S. senators that had not previously been made public, Apple explained how it worked behind the scenes to address the viral spread of sexually explicit deepfakes produced by Grok earlier this year. Here are the details.
Apple Worked Behind the Scenes to Address the Grok Controversy
Earlier this year, Apple came under intense pressure to pull the Grok and X apps from the App Store after users discovered that the chatbot readily fulfilled requests to digitally undress people in photos, particularly women.
Apple largely stayed silent during the controversy, but according to NBC News, the company privately determined that "X and Grok violated the rules" and "secretly threatened to remove Grok from the App Store."
The report goes on to say that Apple "contacted the teams of X and Grok after receiving complaints about the scandal and seeing the news" and "requested that the app developers create a plan to improve content moderation."
In response, X submitted an update to the Grok app for review, but it was rejected because "the changes did not go far enough." Elon Musk's company later submitted revised versions of the X and Grok apps, only one of which was approved.
From Apple's letter, as obtained by NBC News:
"Apple reviewed subsequent applications made by the developers and determined that X had significantly resolved its violations, but the Grok app remained non-compliant. As a result, we rejected the Grok application and informed the developer that additional changes would be necessary to rectify the violation; otherwise, the app could be removed from the App Store. […] After additional interactions and changes from the Grok developer, we determined that Grok had significantly improved and thus approved its latest application."
These details had not been publicly disclosed until now, but they help explain the sweeping moderation changes xAI announced at the height of the backlash. Those changes included restrictions on who could use Grok's image tools and limits on edits involving photos of real people.
However, in a separate report published today, NBC News states that Grok continues to "produce sexually explicit images without the consent of individuals," noting that it has documented dozens of such instances over the past month.
The report notes that the volume of images has significantly decreased since January, but a group of users is still able to bypass restrictions to dress women in "more revealing clothing, such as towels, sports bras, body-hugging Spider-Woman costumes, or bunny suits."
Click here to read the full report on the letter Apple sent to U.S. senators. Click here to read the report on Grok's ongoing issues with sexually explicit deepfakes.