AI video app’s ad banned for implying it could digitally undress women



An AI video editing app ran a YouTube ad that appeared to show a woman being digitally undressed, and Britain’s advertising watchdog has stepped in and banned it.

The ad, for an app called PixVideo – AI Video Maker, was spotted in January and showed “before” and “after” images of a young woman. In the “before” image, red scribbles covered her abdomen; in the “after” image, that bare skin was exposed. Text along the bottom of the image read “Erase anything”, followed by a heart-eyes emoji.

Eight people lodged complaints with the Advertising Standards Authority (ASA), arguing that the ad sexualized women and was irresponsible, offensive and harmful. The ASA agreed and banned the ad.

What the ad actually showed — and why the ASA acted

The central problem wasn’t poor taste alone; it was what the ad implied viewers could do with the app.

The ASA acknowledged that PixVideo, made by a company called Saeta Tech, does not actually let users remove clothing from digital images to create sexually explicit content. But that distinction did not save the ad: the regulator concluded that the way it was presented would lead most viewers to believe that was exactly what they were seeing.

“We determined that this ad condoned the digital alteration and exposure of women’s bodies without their consent, as it implied that viewers could use the app to undress women,” the ASA said in a statement.

The regulator went further, describing the ad as “irresponsible”, finding that it contained harmful gender stereotypes and was likely to give rise to a serious offence.

One detail adds another layer to the story: it is not even clear whether the woman in the ad is a real person or an AI-generated image. The ASA told the BBC that determining this was not part of its investigation.

Key facts of the ASA ruling

App name: PixVideo – AI Video Maker
Developer: Saeta Tech
Platform where the ad ran: YouTube
When the ad was seen: January (year not specified in source)
Number of complaints filed: eight
Investigating regulator: Britain’s Advertising Standards Authority (ASA)
Outcome: the ad was banned
Does the app actually allow clothing removal? No, according to PixVideo/Saeta Tech
Was the woman in the ad real or AI-generated? Undetermined; not part of the ASA investigation
  • The ad showed a “before” image of a woman with red scribbles over her abdomen, and an “after” image with her bare skin exposed.
  • The text “Erase anything” appeared alongside a heart-eyes emoji.
  • Complaints cited sexualization, objectification, irresponsibility and potential harm.
  • The ASA found the ad condoned altering a woman’s body without her consent.

Why this matters beyond just one banned ad

This ruling touches on something much bigger than a single YouTube ad being removed. The rise of AI-powered image and video editing tools has made creating manipulated content faster and cheaper than ever before. Additionally, advertising around these tools increasingly tests the limits of what is acceptable.

So-called “nudify” and clothing-removal tools have existed in various forms for years, and they have consistently been associated with image-based abuse: the creation and sharing of sexually explicit images of real people without their consent. Even if an app doesn’t actually offer that feature, promoting it in a way that suggests it does sends a clear message about who the intended audience is and what they want.

The ASA’s decision suggests that merely implying a product can be used to digitally undress someone is enough to cross the line, even if the product can’t actually do it. The impression an ad creates matters as much as the fine print.

For women in particular, this type of advertising normalizes the idea that their bodies are objects to be digitally manipulated for others’ enjoyment. Critics argue that the framing itself is damaging even when no actual image abuse takes place, because it reinforces the idea that such manipulation is acceptable.

What this ruling means for AI app advertising

The PixVideo story is a reminder that AI tool developers don’t get a free pass just because their technology is new. UK advertising rules apply regardless of whether a product uses artificial intelligence, and the ASA has made it clear that it will assess the impression an ad creates, not just the technical capabilities of the product being promoted.

For companies building and marketing AI editing tools, the message is direct: how you frame what your app can do has regulatory consequences. As far as the ASA is concerned, ads that wink at non-consensual image manipulation, however implicitly, are not a gray area.

It was not stated whether Saeta Tech could face further penalties beyond the ban, under which the ad must not appear again in its current form.

FAQ

Which app’s ad was banned?
This ad was for PixVideo – AI Video Maker, developed by a company called Saeta Tech.

Where did the ad appear?
The ad was posted on YouTube and was seen in January.

Why was the ad banned?
Britain’s Advertising Standards Authority ruled that the ad implied users could use the app to undress women, condoned digitally altering and exposing women’s bodies without their consent, was irresponsible, contained harmful gender stereotypes and was likely to give rise to a serious offence.

Does PixVideo actually allow users to remove clothing from images?
According to PixVideo and Saeta Tech, the app does not allow users to remove clothing from digital images to create sexually explicit content, but the ASA found that the ad implied it could.

Is the woman in the ad a real person?
This has not been confirmed. The ASA told the BBC that determining whether the images were of a real person or generated by AI was not part of the investigation.

How many people complained about the ad?
Eight people lodged a complaint with the Advertising Standards Authority over concerns that the ad sexualized and objectified women, was harmful and offensive.


