Grok’s image generation tool exposes the naked truth about misogyny

There’s a kind of violation that doesn’t leave bruises, the kind that happens quietly, digitally, and then spreads faster than you can process it. When I saw the reports about Grok, Elon Musk’s AI chatbot, generating non-consensual sexualised edits of women and even children, it didn’t shock me. It exhausted me. Because this isn’t just a tech failure. It’s a cultural one, and a political one; a reminder of who still gets to feel safe, and who doesn’t.

Women like Ashley St. Clair had their childhood photos manipulated. Bella Wallersteiner found her selfies turned into bikini-style edits. Maya Jama had to publicly ask Grok to stop altering her image. Julie Yukari watched a harmless New Year’s Eve photo become a sexualised deepfake within hours. These women didn’t “invite” anything. They were targeted because they exist in a world where women’s bodies are still treated as editable, ownable, downloadable.

And then Elon Musk decided to make a joke out of it.

Instead of acknowledging the harm, he used Grok to generate an image of himself in a bikini. A smug, dismissive “see, it can happen to me too” performance that completely ignored the power dynamics at play. It was the techbro version of “not all men,” dressed up as humour. It wasn’t clever. It wasn’t neutral. It was a man with extraordinary influence telling women that our fear is overblown, and our anger is inconvenient.

This is what misogyny looks like in 2026: not just harassment, but minimisation. Not just violation, but mockery. And it comes with a powerful man's refusal to acknowledge any of it.

Grok didn’t invent sexism; it industrialised it. It automated the entitlement that has always told women our bodies are public property. And when the man in charge responds with a joke, he tells every woman watching that our safety is optional, our dignity negotiable, and our trauma entertaining.

This is why feminism, the kind that some love to sneer at as “woke”, matters. Because feminism is the insistence that women deserve control over our own bodies, our own images, our own narratives. When AI strips that away, it isn’t a glitch. It’s a warning sign of what happens when technology is built without us, about us, or against us.

The digital undressing of women is not a prank. It is a political act. It reinforces the idea that no matter how far we’ve come, we can still be reduced to an image someone else controls. And it tells the misogynists, specifically the ones who laugh at this, that technology will happily do their dirty work for them.

If Grok can undress women with a prompt, then we need to undress the truth: this isn’t about AI misbehaving. It’s about a society that still believes women exist to be consumed.