You don’t have to accept being in deepfake porn

I recently found out through a Google alert that I'm being targeted with deepfake porn. I wasn't shocked. For over a year, I have been the target of widespread online harassment campaigns, and deepfake porn (creators using artificial intelligence to portray real people in sexual situations that never actually happened) is a favored weapon that misogynists use to drive women out of public life. When I reported this latest violation of my privacy to my lawyer, my only feelings were weariness with this technology and deep disappointment in the lawmakers and regulators who have delivered no justice to people who appear in porn clips without their consent. Many commentators have tied themselves in knots over the hypothetical threats posed by artificial intelligence: deepfake videos that tip elections or start wars, ChatGPT and other generative technologies destroying jobs. But policymakers have largely ignored an urgent AI problem that already affects many lives, including mine.

Last year, I resigned as head of the Department of Homeland Security's Disinformation Governance Board, a policy-coordination body that the Biden administration let founder amid criticism, mostly from the right. In the months that followed, at least three artificially generated videos purporting to show me engaging in sexual acts were uploaded to a website dedicated to deepfake porn. The images don't look much like me; the generative AI model that produced them appears to have been trained on the official US government portrait of me, taken when I was six months pregnant. Whoever made the videos likely used a free "face swap" tool, essentially pasting my photo onto an existing porn video. In some moments, as the deepfake Frankenstein moves and my face flickers, you can see the original performer's mouth underneath. But these videos aren't meant to be convincing; the website and the individual videos it hosts are clearly labeled as fakes. They may provide viewers with cheap thrills, but their deeper purpose is to humiliate, shame, and objectify women, especially women who have the temerity to speak out. Having studied and written about this abuse for years, I am somewhat inured to it. But for other women, especially those in more conservative or patriarchal settings, appearing in a deepfake porn video can be deeply stigmatizing, even career- or life-threatening.

As if to underscore the video makers' fixation on punishing women who speak out, one of the videos Google alerted me to depicts me alongside Hillary Clinton and Greta Thunberg. Because of their global fame, deepfakes of the former presidential candidate and the climate activist are far more numerous and more graphic than those of me. Users can also easily find deepfake porn videos of the singer Taylor Swift, the actress Emma Watson, and the former Fox News host Megyn Kelly; of Democratic officials such as Kamala Harris, Nancy Pelosi, and Alexandria Ocasio-Cortez; of the Republicans Nikki Haley and Elise Stefanik; and of countless other prominent women. Simply by existing as women in public life, we have all become targets, stripped of our accomplishments, our intellect, and our activism and reduced to sexual objects for the pleasure of millions of anonymous viewers.

Men, of course, are subjected to this abuse far less often. In reporting this article, I searched the name Donald Trump on one well-known deepfake porn website and found a single video of the former president, alongside three full pages of videos depicting his wife, Melania, and his daughter Ivanka. A 2019 study by Sensity, a company that monitors synthetic media, estimated that more than 96 percent of the deepfakes then in existence were nonconsensual pornography of women. The reasons for this imbalance are interconnected, both technical and motivational: the people making these videos are presumably heterosexual men who value their own gratification more than they value women's personhood, and because AI systems are trained on an internet that is awash in images of women's bodies, much of the nonconsensual porn they generate is more believable than, say, computer-generated clips of cute animals playing would be.

Because I am a disinformation researcher, I couldn't resist investigating the provenance of the videos in which I appear, and in doing so I stumbled upon deepfake porn forums. There, users are strikingly nonchalant about the privacy violations they are committing. Some seem to believe they have a right to distribute these images: that by feeding public photos of women into an application designed to create pornography, they have produced legitimate art or parody. Others apparently think that simply labeling their videos and images as fake shields them from legal consequences. The sites' operators claim the videos are for "entertainment and educational purposes" only. But applying that description to videos in which a famous woman is "humiliated" or "beaten," as the titles of some of the clips put it, reveals a great deal about what these men find entertaining, and profitable.

Ironically, some of the creators who post in deepfake forums show great concern for their own safety and privacy. In one forum thread I found, a man was ridiculed for signing up for a face-swapping app that failed to protect user data. The women these men depict are apparently not entitled to the same consideration, because they have chosen public-facing careers. The most chilling page I found listed women who turn 18 this year; only on their birthdays are they removed from a "blacklist" that the forum's organizers maintain to avoid violating child pornography laws.

Effective laws are exactly what victims of deepfake porn need. Several states, including Virginia and California, have banned the distribution of deepfake porn, but these laws offer little recourse to victims who live outside those jurisdictions or who seek justice against perpetrators based elsewhere. In my own case, identifying the creators of these videos is probably not worth the time and money. I could try to subpoena the platform for information about the user who uploaded the videos, but even if the site had those details and shared them with me, there is very little I could do to bring the abuser to justice if he lives out of state or in another country.

Representative Joseph Morelle of New York is seeking to close these jurisdictional loopholes by reintroducing the Preventing Deepfakes of Intimate Images Act, an amendment to the 2022 reauthorization of the Violence Against Women Act. Morelle's bill would ban the nationwide distribution of deepfakes made without the explicit consent of the people depicted in the image or video. The measure would also give victims who unknowingly find themselves in nonconsensual pornography a relatively straightforward path to relief.

In the absence of a strong federal law, the remedies available to me for mitigating the harm of my deepfakes are not encouraging. I can ask Google to delist the videos' web addresses from its search results, and my lawyers can ask the platforms hosting the videos to take them down permanently, even though the legal basis for such requests is questionable. But even if those websites complied, the videos would very likely surface elsewhere. Women targeted by deepfake porn are trapped in an expensive, endless game of whack-a-troll.

Laws against intimate-image deepfakes would not solve the deepfake problem; the internet is forever, deepfake technology is proliferating, and its output is only becoming more convincing. But adapting the law to this new category of misogynistic abuse, especially as AI grows more powerful by the month, is increasingly important to protecting women's privacy and safety. As policymakers worry about whether AI will destroy the world, I ask them: First, stop the men who are using AI to degrade and humiliate women.


