Salazar Introduces Legislation to Protect Victims of Deepfake Revenge Porn

WASHINGTON, D.C. – Today, Reps. María Elvira Salazar (R-FL), Madeleine Dean (D-PA), August Pfluger (R-TX), Stacey Plaskett (D-VI), Vern Buchanan (R-FL), and Debbie Dingell (D-MI) introduced the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act. This bill protects victims of real and deepfake ‘revenge pornography,’ a crime that is unfortunately having an increasing impact on our nation’s youth.
While artificial intelligence (AI) can provide countless benefits to society, some bad actors have used it to exploit innocent Americans for nefarious purposes. The spread of non-consensual intimate imagery (NCII), more commonly known as ‘revenge porn,’ has accelerated with the rising popularity of AI and demands decisive legal protections that will empower victims, most of whom are women and girls.
The TAKE IT DOWN Act criminalizes the publication of these harmful images and requires websites to quickly remove them.
“The alarming rise of deepfakes is threatening to destroy innocent individuals’ and families’ lives,” said Rep. Salazar. “Non-consensual deepfake imagery is a cancer that can no longer go untreated. The TAKE IT DOWN Act is the best way to hold online platforms accountable and protect victims of these horrendous crimes.”
The TAKE IT DOWN Act solves the problem of inconsistent, or non-existent, state-level legislation protecting victims of deepfake pornographic images. While nearly all states have laws protecting their citizens from revenge porn, only 20 states have explicit laws covering deepfake NCII. Among those 20 states, there is a high degree of variance in the classification of the crime, the penalties imposed, and even whether criminal prosecution is available. Victims also struggle to have images depicting them removed from websites in a timely manner, potentially contributing to further spread and retraumatization.
In 2022, Congress passed legislation creating a civil cause of action for victims to sue individuals responsible for publishing NCII. However, bringing a civil action can be incredibly impractical. It is time-consuming, expensive, and may force victims to relive trauma. Further exacerbating the problem, it is not always clear who is responsible for publishing the NCII.
The TAKE IT DOWN Act addresses these issues while protecting lawful speech by:
- Criminalizing the publication of NCII or the threat to publish NCII in interstate commerce;
- Protecting good faith efforts to assist victims by permitting the good faith disclosure of NCII for the purpose of law enforcement or medical treatment;
- Requiring websites to take down NCII within 48 hours of receiving notice from a victim; and
- Requiring that computer-generated NCII meet a ‘reasonable person’ test for appearing to realistically depict an individual, so as to conform to current First Amendment jurisprudence.
“Artificial Intelligence is rapidly evolving—our government must meet this moment with urgency, especially when addressing the dangers of explicit deepfakes and non-consensual intimate imagery (NCII) that often devastate girls and women,” said Rep. Dean. “We must defend victims—regardless of whether their attackers used deepfake technology or a simple camera—and ensure these images are removed from the Internet. I’m thankful to work with Congresswoman Salazar on this bipartisan legislation that will better protect victims and meaningfully regulate AI.”
“In recent years, we’ve witnessed a stunning increase in exploitative sexual material online, largely due to bad actors taking advantage of newer technologies like generative artificial intelligence. Many women and girls are forever harmed by these crimes, having to live with being victimized again and again,” said Senator Cruz. “While some states provide legal remedies for victims of non-consensual intimate imagery, states would be further supported by a uniform federal statute that aids in removing and prosecuting the publication of non-consensual intimate images nationwide. By creating a level playing field at the federal level and putting the responsibility on websites to have in place procedures to remove these images, our bill will protect and empower all victims of this heinous crime. I’m grateful to Reps. Salazar and Dean for leading this bipartisan effort in the House and look forward to working with all cosponsors to get this bill passed through both chambers.”
“There is an urgent need for Congress to act by putting protections in place for victims of exploitative deep fakes and levying consequences on those creating and facilitating this sickening practice,” said Rep. Pfluger. “As a father to three young girls, I am proud to join this bipartisan bicameral effort to prevent explicit material from circulating and harming innocent victims.”
“Federal policymakers and AI experts must continue to expose and address the dangers of explicit deepfakes and non-consensual intimate imagery (NCII) that profoundly impact women and girls,” said Rep. Plaskett. “This bipartisan legislation is a necessary and pragmatic approach that protects victims, holds websites accountable, and protects lawful speech. I’m thankful for the opportunity to co-lead this timely bill and provide relief for these victims.”
“If there’s one thing everyone can agree on, it’s the need to protect our vulnerable children and grandchildren,” said Rep. Buchanan. “While the rise in artificial intelligence (AI) brings countless potential benefits, I am deeply disturbed by the rise of so-called ‘revenge porn’ and explicit AI-generated images of young girls circulating on social media. I’m pleased to help introduce this legislation with Congresswoman Salazar and Congresswoman Dean and am hopeful Congress will pass this common sense legislation with broad bipartisan support.”
“I’m very concerned about the increasing use of artificial intelligence to create and circulate deep fake pornography, which threatens the mental and emotional health and financial security of its victims, primarily women. Perpetrators have used deep fake pornography as a tool to harass, humiliate, and intimidate women online, often in response to them speaking out or advocating for themselves,” said Rep. Dingell. “The TAKE IT DOWN Act provides a critical remedy for victims to ensure these images are removed and that perpetrators are held accountable. As new technology emerges, so too does the potential for new forms of abuse, and we must act swiftly to protect women from tech-facilitated abuse.”
Senators Ted Cruz (R-TX), Amy Klobuchar (D-MN), Shelley Moore Capito (R-WV), Richard Blumenthal (D-CT), Cynthia Lummis (R-WY), Jacky Rosen (D-NV), Ted Budd (R-NC), Laphonza Butler (D-CA), Todd Young (R-IN), Joe Manchin (I-WV), John Hickenlooper (D-CO), Bill Cassidy (R-LA), Martin Heinrich (D-NM), John Barrasso (R-WY), John Thune (R-SD), and Roger Wicker (R-MS) introduced the Senate companion bill.
The TAKE IT DOWN Act is supported by over three dozen organizations across the political spectrum, including: National Center for Missing & Exploited Children (NCMEC); TechNet; National Center on Sexual Exploitation (NCOSE); Rape, Abuse, & Incest National Network (RAINN); SAG-AFTRA; Public Citizen; IBM; Center for American Progress; American Psychological Association; American College of Pediatricians; National Association of Chiefs of Police; National Association of Police Organizations (NAPO); National Collegiate Athletic Association (NCAA); Major League Baseball (MLB); U.S. Olympic & Paralympic Committee (USOPC); Becca Schmill Foundation; David’s Legacy Foundation; S.E.A.S.A.M.E. (Stop Educator Sexual Abuse, Misconduct, & Exploitation); National Decency Coalition; SWGfL (StopNCII.org); Talk More. Tech Less; National Organization for Women (NOW); Reclaim Coalition; Joyful Heart Foundation; Institute for Strategic Dialogue; Family Policy Alliance; Hope for Justice; Thistle Farms; Citizens for Decency; Stop Sexual Assault in Schools (SSAIS); 3Strands Global Foundation; Interparliamentary Taskforce on Human Trafficking; Street Grace; Enough Abuse; P@rn Free Colorado; National Children’s Alliance; Institute for Family Studies; American Principles Project; National Association of Counties (NACo); Bull Moose Project; HSA Coalition; Parents Television and Media Council; Enough is Enough; Match Group; The Danbury Institute; Digital First Project; National Consumers League (NCL); National Alliance to End Sexual Violence; and Bumble.
To read a summary of the legislation, click here.
To read the full text of the legislation, click here.
###