Elon Musk’s artificial intelligence tool, Grok, has restricted its image creation function for most users following widespread criticism over its use to generate sexually explicit and violent content.
The decision follows threats of fines and regulatory action, as well as reports that X, Musk’s social media platform, could face a ban in the UK.
Grok had been used to manipulate images of women, removing their clothing and placing them in sexualised poses. Access to the image generation and editing feature has now been limited to paying subscribers.
In a post on X, Grok said: “Image generation and editing are currently limited to paying subscribers.”
As a result, most users can no longer create images using the tool. Those who retain access are required to provide full personal and payment details, meaning they can be identified if the feature is abused.
Research reported by the Guardian found Grok had been used to produce pornographic videos of women without their consent, as well as images depicting women being shot or killed.
Musk is now facing the prospect of regulatory action in multiple countries after the tool was used to generate non-consensual sexual imagery.
On Wednesday, the UK prime minister, Keir Starmer, warned that strong action could be taken against X.
He called on the company to “get a grip” on the spread of AI-generated images of partially clothed women and children, describing the material as “disgraceful” and “disgusting”.
Starmer said the communications regulator Ofcom had the government’s full backing to intervene. “It’s unlawful. We’re not going to tolerate it,” he said. “I’ve asked for all options to be on the table. It’s disgusting. X need to get their act together and get this material down.”
He added that action would be taken if the platform failed to respond, saying the content was “simply not tolerable”.
Thousands of sexualised images of women have been created without their consent over the past two weeks, following an update to Grok’s image creation feature at the end of December.
Musk had faced repeated public pressure to remove or restrict the tool, but until now the platform had taken no visible action.