An image of former President Donald Trump hugging and kissing former chief medical adviser Dr. Anthony Fauci. Pornographic depictions of Hollywood actresses and internet influencers. A photo of the explosion at the Pentagon.
These all turned out to be ‘deepfakes’, highly realistic audio and video content created by rapidly advancing artificial intelligence technology.
Lawmakers across the country are scrambling to close the gap, as there is little legal recourse for those affected by digital forgeries, especially women who appear in nonconsensual, sexually explicit deepfakes.
“Porn deepfakes presented honestly do not necessarily violate existing laws,” said Matthew Kugler, a Northwestern University law professor who supports the anti-deepfake bill currently awaiting the governor’s signature in Illinois.
“Because you’re taking something that’s publicly visible, your face, and putting it on something that is entirely someone else’s, under many current laws and torts there was no clear way to sue people for it,” he said.
Recent interest in the power of generative AI has already led to several congressional hearings this year, along with proposals to regulate the burgeoning technology. But as the federal government falters, state legislatures are moving faster, pushing legislation aimed at addressing AI’s most immediate harms.
Nine states have enacted laws regulating deepfakes, primarily targeting pornography and election interference, and at least four other states have bills at various stages of the legislative process.
California, Texas, and Virginia were the first states to enact deepfake legislation in 2019, before the current AI frenzy. Minnesota enacted its deepfake law in May, and Illinois has a similar bill awaiting the governor’s signature.
“It is often said that legislation moves at a slow, glacial pace, but that is not the case in this field,” said Matthew Ferraro, an attorney at WilmerHale who has tracked deepfake laws.
The technology driving the laws
The term “deepfake” first appeared on the internet in 2017, when a Reddit user by that name began posting fake pornographic videos created with an AI algorithm that digitally swapped celebrities’ faces into real adult videos without their consent.
Earlier this year, the proliferation of nonconsensual porn deepfakes sparked controversy in the video game streaming community, highlighting the immense harm of unchecked deepfakes and the lack of legal remedies. Popular streamer QTCinderella said she was harassed by internet users sending her the images and vowed to sue those behind the deepfakes, but she later said her lawyers told her she had no case.
Since then, the number of deepfakes circulating on the internet has exploded. In a widely read 2019 report, Deeptrace Labs, a service that identifies deepfakes, found nearly 15,000 deepfake videos online, 96% of which were pornographic content featuring women. Deepfake videos have increased exponentially since 2018, according to Sensity AI, which also detects deepfakes.
“As the technology continues to advance, it is very difficult to tell whether something is fake unless you’re an expert in digital forensics,” said Rebecca Delfino, a law professor at Loyola Marymount University who studies deepfakes.
The technology also adds to the spread of misinformation online and in political campaigns. An attack ad from Republican presidential candidate Ron DeSantis showed a series of photos of Trump embracing Fauci, some of which were generated by AI.
A fake but realistic photo that began circulating on Twitter in May showed an explosion at the Pentagon, resulting in a temporary drop in the stock market.
In a sense, synthetic media has been around for decades, from basic photo-manipulation techniques to, more recently, programs like Photoshop. But the ease with which even non-technical internet users can now create highly realistic digital forgeries is what is driving the new legislation.
“The speed, scale, realism and accessibility of this technology bring all kinds of elements together to create this witches’ brew,” Ferraro said.
Finding a remedy
Without specific laws addressing porn deepfakes, victims have limited legal options. A patchwork of intellectual property, privacy, and defamation laws could, in theory, allow victims to sue and seek redress.
A federal court in Los Angeles is currently hearing a right-of-publicity lawsuit from a reality TV celebrity who says he never gave permission to an AI app that lets users digitally paste their faces onto his image. But publicity rights laws vary from state to state and protect a person’s likeness only when it is used for commercial purposes.
Forty-eight states have criminal bans on revenge porn, and some have laws banning “upskirting,” photographing another person’s private parts without consent. Victims can also sue for defamation, said Northwestern law professor Kugler, but those claims don’t necessarily succeed if a deepfake includes a disclaimer saying it is fake.
Caroline Ford, an attorney at Minc Law, a firm that specializes in helping victims of revenge porn, said that while many victims could find relief under these laws, the statutes were not written with deepfakes in mind.
“In situations like this, it’s always preferable to have legislation that makes very clear to courts that the legislature recognized the great harm done here and sought to remedy it,” she said.
A state patchwork
The extent of legislation enacted in each state to date varies.
In Hawaii, Texas, Virginia, and Wyoming, nonconsensual porn deepfakes carry only criminal penalties, while New York and California laws create only a private right of action that lets victims bring civil suits. Minnesota’s recent law provides for both criminal and civil penalties.
Finding the right parties to sue can be difficult, and local law enforcement isn’t always cooperative, Ford said of the revenge porn cases she has handled. Many of her clients simply want the images and videos taken down and lack the resources to file a lawsuit.
The definition of a deepfake also varies by state. Some, like Texas’s, refer directly to artificial intelligence, while others use only language such as “computer-generated images” or “digitization.”
Many of these states have also amended their election laws to ban deepfakes in election ads during a specified window before an election.
Free speech concerns
Like most new technologies, deepfakes can be used for harmless purposes. Making parodies, bringing historical figures to life, dubbing movies, and more, are all activities protected by the First Amendment to the United States Constitution.
Banning harmful deepfakes while protecting legitimate ones is not an easy balance to strike. “You can see that policymakers are really struggling,” said Delfino, the Loyola law professor.
The ACLU of Illinois initially opposed the state’s porn deepfake bill, arguing that while deepfakes can cause real harm, the bill’s sweeping provisions and immediate-takedown clause could chill or silence “vast amounts of protected speech.”
A recent amendment changed the bill to add deepfakes to Illinois’ existing revenge porn law, a “significant improvement,” the group’s public affairs director Ed Yohnka said in an email. “We continue to have concerns that this language undermines existing legal standards,” he said.
Delfino said a federal deepfake bill introduced in Congress last month could raise similar concerns because its exemptions are limited to matters of “legitimate public concern.”
She pointed out that California’s statute contains explicit references to First Amendment protections. Congress “will have to do a little more work on the proposal if it wants to take this seriously,” she said.
Kugler said the first deepfake laws aimed primarily at nonconsensual porn because such cases are “easy wins” when it comes to free speech issues. The emotional distress and the harm to dignity and reputation are obvious, while the free-speech value is minimal, he said.
Delfino has long advocated for stronger revenge porn laws and has followed the rise of deepfake porn since it first came to prominence. She said she is pleased that renewed interest in AI in general is driving stronger legislation.
“Like a lot of things involving crimes against women and the objectification of women and minorities, they get attention for a time, and then public interest fades,” she said. “But now people are focused on deepfake technology again and are concerned again.”
