Meta takes legal action against AI apps that generate fake nude images



As Meta continues to encourage content creation through its own AI-generation tools, it's also working to filter the more harmful AI-generated images, videos, and tools out of its apps.

Today, Meta announced that it's pursuing legal action against a company called Joy Timeline HK Limited, which promotes an app called "CrushAI" that enables users to create AI-generated nude or sexually explicit images of individuals without their consent.

As explained by Meta:

Across the internet, we've seen a concerning growth of so-called "nudify" apps, which use AI to create fake non-consensual nude or sexually explicit images. Meta has long-standing rules against non-consensual intimate imagery, and over a year ago we updated these policies to make it even clearer that we don't allow the promotion of nudify apps or similar services. We remove ads, Facebook Pages, and Instagram accounts promoting these services when we become aware of them, block links to websites hosting them so they can't be accessed from Meta platforms, and restrict search terms like "undress" and "delete clothing" on Facebook and Instagram so they don't show results.

However, some of these tools are still slipping through Meta's systems, whether via user posts or paid promotions.

So now, Meta's going after the developers themselves, in its first legal action against a "nudify" app provider.

Meta has filed a lawsuit against Hong Kong-based Joy Timeline HK Limited to prevent it from advertising the CrushAI app on Meta's platforms. This follows multiple attempts by Joy Timeline HK to circumvent Meta's ad review process and continue placing these ads after they were repeatedly removed for breaking the rules.

As noted, it's a difficult area for Meta, as it's encouraging people to use its own AI visual creation tools at every opportunity, while at the same time not wanting those tools to be used for less savory purposes.

Which will happen. If the expansion of the internet has taught us anything, it's that every innovation, whatever its intended purpose, will also be used to amplify the worst elements, and generative AI has proven no different.

Indeed, just last month, researchers at the University of Florida reported on the rise of AI-generated sexually explicit images created without the subject's consent.

Worse still, based on UF's analysis of 20 AI "nudification" websites, the technology is also being used to create images of minors, while women are disproportionately targeted by these apps.

This is why there's now a big push to support the Take It Down Act, backed by the National Center for Missing & Exploited Children (NCMEC), which aims to introduce official laws banning non-consensual imagery, among other measures to combat AI misuse.

Meta has put its support behind this push, and this latest legal effort is another step towards stamping out, and ideally eliminating, the use of such tools.

They won't be stamped out entirely. Again, the history of the internet shows that people will always find ways to use new technology for questionable purposes, and the capacity to generate explicit images with AI will remain a problem.

But ideally, this will at least help to reduce the prevalence of such content, and the availability of nudify apps.


