Use of AI to harm women is in its infancy, experts warn

“Since discovering Grok AI, regular porn is no use any more. It just seems absurd now,” one enthusiast of the Elon Musk-owned AI chatbot wrote on Reddit. Another agreed: “If you really want a specific person, then yeah.”

If those horrified by the spread of sexual images on Grok hoped that last week’s belated safety measures might put the genie back in the bottle, plenty of posts on Reddit and elsewhere tell a different story.

And while there is no doubt that Grok has changed the public’s understanding of the power of artificial intelligence, it also points to a much broader problem: the growing availability of these tools and the channels for distributing them pose a challenge that many of the world’s regulators regard as impossible to meet. The UK has announced that producing non-consensual sexual and intimate images will soon become a crime, but experts say the use of AI to harm women is still in its infancy.

Other AI tools have more stringent safeguards in place. “I can’t do that,” replies Claude, a large language model (LLM), when asked to take a photo of a woman and alter it so she is wearing a bikini. “I can’t edit an image to change someone’s clothing or create a doctored photo of a real person.” ChatGPT and Google’s AI tool Gemini will create bikini images, but nothing more explicit.

However, there are far fewer restrictions elsewhere. Users on Reddit’s Grok forum share tips on how to use photos of real women to produce the most hardcore porn images possible. In one thread, users complained that Grok, “after a struggle”, allowed them to create images of topless women but refused to produce genitalia. Some have found that requesting “artistic nudity” circumvents the safeguards around naked women.

Grok was also used to generate a deepfake image of Elon Musk in a bikini. Photo: Leon Neal/Getty Images

Beyond LLMs and the major platforms, there is an entire ecosystem of websites, forums and apps dedicated to female nudity and humiliation. These communities are increasingly finding a pipeline into the mainstream, says Anne Kranen, a researcher at the Institute for Strategic Dialogue (ISD) who works on technology-facilitated gender-based violence.

Communities on Reddit and Telegram discuss ways to circumvent guardrails and get LLMs to produce porn, a process known as “jailbreaking.” Threads on X amplify “nudify” apps that generate AI images of unclothed women, and explain how to use them.

Kranen said the routes for misogynistic content to reach the broader internet have expanded, adding: “There’s a very fertile ground for misogyny to thrive there.”

An ISD investigation last summer found dozens of nudify apps and websites, which together drew nearly 21 million visitors in May 2025. There were 290,000 mentions of these tools on X in June and July last year. An investigation by the American Sunlight Project in September found thousands of ads for such apps on Meta, despite the platform’s enforcement efforts.

“Mainstream app stores like Apple and Google host hundreds of apps that make this possible,” said Nina Jankowicz, co-founder of the American Sunlight Project and a disinformation expert. “Much of the infrastructure of deepfake sexual abuse is supported by the companies we all use on a daily basis.”

Clare McGlynn, a law professor at Durham University and an expert on violence against women and girls, said she feared things would get worse. “OpenAI announced last November that it would allow ‘erotica’ on ChatGPT. What happened with X shows that every new technology is used to abuse and harass women and girls. So what will we see on ChatGPT?”

“Women and girls are much more reluctant to use AI. This should not surprise any of us. Women do not see this as an exciting new technology, but simply as a new way to harass and abuse us and try to force us offline.”

Jess Asato, the Labour MP for Lowestoft, has campaigned on the issue and said her critics have gleefully created and shared explicit images of her even since the restrictions were placed on Grok. “It’s still happening to me, and it’s being published on X because I’m speaking up about it,” she added.

Asato added that AI deepfake abuse has been happening to women for years and is not limited to Grok. “I don’t know why [action] took so long. I’ve spoken to victims who have suffered far worse.”

Grok

Users can create sexually explicit images based on clothed photographs of real people, and free users of X face no restrictions. Asked to strip a photographed woman down to a bondage outfit, Grok will comply. It has also placed women in sexually degrading positions and smeared them with a white semen-like substance.

Kranen said the point of creating deepfake nudes is often not just to share erotic images but to make a spectacle of it, especially when the images flood platforms such as X.

“It’s actually about going back and forth, [trying] to shut someone up by saying, ‘Grok, put her in a bikini,’” she said.

“The performative element there is very important and really shows the underpinnings of a misogyny that seeks to punish or silence women. It also has cascading effects on democratic norms and on women’s role in society.”
