X has restricted access to Grok's AI image creation feature for the majority of its users, following explicit warnings from British authorities about a potential nationwide ban tied to the tool's ability to digitally remove clothing from individuals.
Responses from the official Grok account on X, reviewed by reporters, indicate that image generation and editing functions are now available exclusively to paid subscribers. Previously, any account holder could simply mention the bot in a post and request custom visuals without any payment barrier.
This unrestricted earlier approach contributed significantly to disturbing patterns where people submitted ordinary photos of dressed individuals, including some who were minors, then directed the AI to strip away garments or place figures in provocative scenarios. Grok followed those instructions without resistance.

The decision to limit availability arrives against a backdrop of growing official discussions in various countries about possibly blocking or shunning X entirely unless stronger controls prevent misuse of its artificial intelligence capabilities.
Screenshots of Grok's outputs circulating in Britain soon caught the eye of government officials and oversight bodies, prompting scrutiny over compliance with the country's Online Safety Act.
Jess Phillips, the minister responsible for safeguarding, spoke bluntly, describing the creation through Grok of humiliating, non-consensual intimate-style images as thoroughly shameful.
She stressed how such instruments can ruin lives by enabling harassment and mistreatment directed at females and young people.
The government has pledged to outlaw applications focused on AI-driven digital undressing and aims to lead globally by criminalizing the ownership, production, or sharing of artificial intelligence systems built specifically for child sexual exploitation imagery, with prison terms of up to five years.
Prime Minister Keir Starmer reinforced that position, labeling the content surfacing via Grok as wholly intolerable. When pressed on whether continued governmental engagement with the service remained viable, he replied that every possibility, including severe measures, stays under consideration.
Starmer urged X to promptly address and eliminate the problematic material declaring that authorities would step in because such circumstances cannot be allowed to persist, according to coverage in The Telegraph.
The parliamentary group focused on women and equality matters indicated it might reassess its official activity on the site, noting that a platform generating overt abusive imagery targeting females and minors does not align with standards for formal government interactions.
Oversight agencies adopted equally stringent positions. Ofcom cautioned that failure to suppress unlawful or damaging material could trigger enforcement action and substantial fines under existing safety rules. Meanwhile, the data watchdog signaled interest in potential violations of privacy law, especially involving unauthorized modifications to photographs of identifiable persons.
A spokesperson for the Information Commissioner's Office informed reporters that it is monitoring accounts of troubling Grok-produced material. It has contacted X and xAI to obtain details on the safeguards implemented for meeting British data rules and protecting personal rights.
Although X provided no reply to inquiries from reporters, the company maintains on its safety page that it actively combats prohibited material, such as child exploitation content, through post removals, account terminations, and police collaboration. It further states that users attempting to generate illegal outputs via Grok will face the same repercussions as those who directly post such items.
Even with the subscription requirement potentially reducing widespread abuse, many watchdogs remain unconvinced that restricting the feature to paid users sufficiently resolves the core question of why the capability existed unrestricted in the first place.