An unfortunate byproduct of technological advancement is that there are new ways of being terrible to one another.
Deepfake AI-generated pornography is the latest arrival from the high-tech frontier of human cruelty: images and videos that superimpose a real person’s face onto a body engaged in sexual acts. The resulting content is startling and grows more realistic as the technology advances.
The harrowing tale of Elliston Berry, a teenager who sat beside first lady Melania Trump during Tuesday’s joint session, underscores the suffering deepfake images inflict. In October 2023, Berry woke up for a normal day at Aledo High School in Texas and discovered that deepfake pictures of her naked had been circulating throughout the student body on Snapchat and Instagram. The original images had been lifted from her social media accounts by a male classmate and manipulated by artificial intelligence.
“It was so realistic. It is child porn,” said Berry’s mother, Anna McAdams, at the time. “We really could not take care of her. We, in that moment, were helpless. More and more pictures were coming out throughout that first day into the second day.”
It wasn’t until McAdams traveled to Washington to meet Sen. Ted Cruz (R-TX) that she was able to get the pictures of her daughter removed from social media. Cruz contacted Snapchat, and the images were taken down within 24 hours.
“If you’re a victim of revenge porn or AI-generated explicit imagery, your life changes forever,” Cruz said at a recent roundtable discussion event. “Disturbingly, many of these victims are teenagers at American high schools.”
Available research suggests that teenage girls are most frequently targeted by deepfake pornography. A 2023 analysis found that 99% of all deepfake pornography content targets women and girls. According to a 2024 survey by the Center for Democracy and Technology, 15% of public high school students were aware of at least one sexually explicit deepfake image depicting someone from their school in the prior school year.
This is especially concerning given the worsening mental health crisis among teenage girls in recent years. According to the National Institutes of Health, 38% of adolescent girls today have an anxiety disorder, compared to 26% of boys. In 2023, the Centers for Disease Control and Prevention found that 57% of high school girls reported feeling persistently sad or hopeless, a drastic rise from 36% in 2011.
The threat of deepfake AI pornography is an anchor dragging girls and young women deeper into this crisis.
The Senate recently passed the Take It Down Act, introduced by Cruz and co-sponsored by Sen. Amy Klobuchar (D-MN). The bill, which passed unanimously, would require websites, including social media platforms, to establish a “notice and takedown” process and remove deepfake pornography within two days of being notified by a victim.
The bill received strong support from the first lady, who went to Capitol Hill on Monday to make the case for its passage through the House.
“Every young person deserves a safe online space to express themself freely, without the looming threat of exploitation or harm,” she said at a roundtable discussion to which only one Democrat, Rep. Ro Khanna (D-CA), bothered to show up.
Khanna’s support for the bill is crucial not just because it should be bipartisan but also because his congressional district includes Silicon Valley, which has raised free-speech objections to the bill. In particular, there is legitimate worry that the bill contains overbroad definitions. It is also argued, less persuasively, that the takedown mandate infringes on due process. Free-speech advocates further worry that the bill could stifle whistleblowers who rely on anonymity for legitimate reasons.
On Monday, Khanna remarked that the bill “balances the free-speech concerns with the protections,” and we believe he is right.
The language is limited in scope, targeting “nonconsensual intimate imagery,” which would curtail overbroad censorship. The 48-hour window for taking down such content offers enough due-process protection to balance against the urgent need to protect young girls from harm.
The House must follow the Senate in passing the Take It Down Act. This would let creators and purveyors of deepfake AI pornography know that their conduct will no longer be tolerated.