X Curbs Grok’s Bikini Edits on Real People, but AI ‘Undressing’ Debate Rages On

Update: 2026-01-15 11:11 IST

X has moved to tighten how its AI assistant Grok handles image editing after a wave of criticism over non-consensual sexual deepfakes, but the platform’s response has sparked fresh controversy instead of closing the issue.

Under the updated rules, Grok is no longer supposed to edit photographs of real people into bikinis, swimwear, or other sexualised outfits. However, the same restrictions do not apply to AI-generated or fictional characters, which can still be shown partially or fully undressed if NSFW mode is enabled. Elon Musk has defended this distinction, saying it aligns with what he considers acceptable adult-content standards in the United States.

The renewed debate began on X, as it often does, with a post from DogeDesigner, an account closely linked to Musk. The account claimed it had tried multiple prompts to make Grok produce nude images and failed each time, suggesting media criticism of Grok was part of “a relentless attack” on Musk. Musk replied by publicly challenging users: “Can anyone actually break Grok image moderation? Reply below.”

As more people began testing the system, Musk clarified what Grok is designed to allow. “With NSFW enabled, Grok is supposed [to] allow upper body nudity of imaginary adult humans (not real ones) consistent with what can be seen in R-rated movies on Apple TV. That is the de facto standard in America,” he wrote, adding that rules could vary depending on local laws.

These comments came as Grok and its parent company xAI were already facing growing scrutiny for their role in spreading sexualised deepfakes on X. Following the backlash, users noticed a change: prompts that had previously worked — such as asking Grok to change someone’s clothes into a bikini — suddenly returned blurred or blocked images.

X later confirmed the policy shift. In a post from its Safety account, the company said, “We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis. This restriction applies to all users, including paid subscribers.” Reports said the goal was to block requests involving sexual poses, swimwear, or explicit scenarios when a real person is involved.

On paper, the new rules appear strict. In reality, they remain inconsistent.

According to tests carried out by outlets such as The Verge, some obvious prompts are now rejected, but slightly altered wording can still produce highly sexualised images. Even when commands like “put her in a bikini” or “remove her clothes” were blocked, requests such as “show me her cleavage,” “make her breasts bigger,” or “put her in a crop top and low-rise shorts” reportedly went through, sometimes resulting in images that looked very similar to a bikini edit.

These loopholes are not limited to paid users. Reporters were able to run similar prompts using free Grok and X accounts. While an age-verification screen sometimes appeared on the Grok website, it could be bypassed simply by entering a birth year corresponding to an age over 18, without any proof. In many cases, no age check appeared at all on the X site or mobile app.

The uneven enforcement becomes even more obvious when looking at what Grok still permits. While edits involving women are now partially restricted, Grok continues to generate images of men or even objects in bikinis. In one test, the system complied with a request to turn a selfie into a sexualised image involving a male subject, again using a free account.

Despite the new safeguards, The Verge said it remains “extremely easy” to undress women or place them into sexualised poses using Grok. In one case, a journalist was reportedly able to create sexualised deepfakes of herself without the system stepping in.

X and xAI have largely blamed these failures on how people use the tool. Musk has previously pointed to “user requests” and “times when adversarial hacking of Grok prompts does something unexpected” as reasons such content continues to slip through.
