The Rise of Deepfake Pornography and the Fight Against It
Introduction: The Shocking Reality of Deepfake Pornography
Creating realistic, sexually explicit images or videos of someone without their consent has become alarmingly simple, thanks to so-called "nudification" tools that generate fake nude imagery from ordinary photos. Victims like Molly Kelley have learned this firsthand: in June, Kelley discovered that someone she knew had used such a tool to create explicit content from family photos she had posted on social media. The same person targeted roughly 80 to 85 other women, most of them in Minnesota. Kelley's story is not just a personal tragedy but a wake-up call for lawmakers to act against this invasive and harmful technology. Minnesota is now at the forefront of a new strategy to combat deepfake pornography, with a bipartisan bill aimed at cracking down on companies that let users upload photos and turn them into explicit images or videos.
Why Advocates Say the Bill is Needed
The bill's lead author, Democratic state Senator Erin Maye Quade, says the measure is urgent because AI technology has advanced so rapidly that additional restrictions are necessary. The proposed legislation would require operators of "nudification" sites and apps to disable access for users in Minnesota or face civil penalties of up to $500,000 for each unlawful access, download, or use; it would be up to the developers to figure out how to block Minnesota users. Maye Quade argues that the harm goes beyond the distribution of these images: the very creation of such content is deeply harmful to victims. Kelley agreed, noting that anyone can now produce "hyper-realistic nude images or pornographic videos" in mere minutes. Law enforcement has so far focused mainly on the distribution and possession of such material; the Minnesota bill aims to prevent these images from being created in the first place.
Congress, States, and Cities Are Taking Action
Deepfake pornography has caught the attention of lawmakers well beyond Minnesota. San Francisco has filed a first-of-its-kind lawsuit against several "nudification" websites, alleging violations of state laws against fraudulent business practices, nonconsensual pornography, and the sexual abuse of children; the case is still pending, but it highlights growing concern over the technology's legal and ethical implications. At the federal level, the U.S. Senate unanimously approved a bill introduced by Senators Amy Klobuchar (D-MN) and Ted Cruz (R-TX) that would make it a federal crime to publish nonconsensual sexual imagery, including AI-generated deepfakes, and would require social media platforms to remove such content within 48 hours of notice from a victim. Melania Trump has urged the Republican-controlled House to pass the legislation, a sign of bipartisan concern over the issue.
States like Kansas have also taken action, expanding the definition of illegal sexual exploitation of a child to include images generated with AI if they are indistinguishable from real children, morphed from a real child’s image, or generated without any actual child involvement. Similar bills have been introduced in states such as Illinois, Montana, New Jersey, New York, North Dakota, Oregon, Rhode Island, South Carolina, and Texas. Senator Maye Quade plans to share her proposal with other states, recognizing that federal action may be slow and that state-level legislation could be a more immediate solution. "If we can’t get Congress to act, then we can maybe get as many states as possible to take action," she said.
Victims Share Their Stories
The impact of deepfake pornography on victims is hard to overstate. Sandi Johnson, senior legislative policy counsel for RAINN (the Rape, Abuse & Incest National Network), a victims' rights organization, testified that the Minnesota bill would hold websites accountable for the creation and dissemination of these harmful images. She emphasized that once the images exist, they can be posted anonymously and spread rapidly, making them nearly impossible to remove. Megan Hurley, another victim, described the horror of discovering that someone had generated explicit images and videos of her using a "nudification" site. As a massage therapist, she said she felt especially humiliated because her profession is already sexualized in some people's minds. "It is far too easy for one person to use their phone or computer and create convincing, synthetic, intimate imagery of you, your family, and friends, your children, your grandchildren," Hurley said, expressing outrage that such technology exists and that companies profit from it.
AI Experts Urge Caution
While advocates argue that the Minnesota bill is necessary to protect victims, experts on AI law caution that the proposal could face constitutional challenges, particularly on free speech grounds. Wayne Unger of the Quinnipiac University School of Law and Riana Pfefferkorn of Stanford University's Institute for Human-Centered Artificial Intelligence both said the bill is drafted too broadly and may not survive a court challenge. Pfefferkorn suggested that narrowing its scope to images of real children could help it withstand First Amendment scrutiny, since such images generally fall outside free speech protections. She also noted, however, that the bill could conflict with Section 230 of the federal Communications Decency Act, which shields websites from liability for user-generated content. Unger agreed, stating, "If Minnesota wants to go down this direction, they'll need to add a lot more clarity to the bill. And they'll have to narrow what they mean by nudify and nudification." Despite these concerns, Maye Quade remains confident that her legislation is on solid constitutional ground, arguing that it regulates conduct rather than speech. "These tech companies cannot keep unleashing this technology into the world with no consequences. It is harmful by its very nature," she said.
Conclusion: The Challenges Ahead
Deepfake pornography presents a complex challenge for lawmakers, who must balance protecting victims against potential infringement of free speech. The Minnesota bill and similar legislative efforts across the country are a step in the right direction, but they are only the beginning of what promises to be a long and contentious fight. As the technology evolves, lawmakers will have to remain vigilant and adaptable, keeping protections for victims robust while respecting constitutional rights. The stories of Molly Kelley, Megan Hurley, and others are a stark reminder of the urgent need for action. Whether through state legislation, federal law, or litigation, the fight against deepfake pornography is far from over.